Science.gov

Sample records for advanced statistical techniques

  1. Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR

    DTIC Science & Technology

    2014-07-12

    based ground penetrating radars for the detection of subsurface objects that are low in metal content and hard to detect. The derived techniques include the exploitation...T. Glenn, J. Wilson, D. Ho. A MULTIMODAL MATCHING PURSUITS DISSIMILARITY MEASURE APPLIED TO LANDMINE/CLUTTER DISCRIMINATION

  2. Advanced statistics: applying statistical process control techniques to emergency medicine: a primer for providers.

    PubMed

    Callahan, Charles D; Griffen, David L

    2003-08-01

    Emergency medicine faces unique challenges in the effort to improve efficiency and effectiveness. Increased patient volumes, decreased emergency department (ED) supply, and an increased emphasis on the ED as a diagnostic center have contributed to poor customer satisfaction and process failures such as diversion/bypass. Statistical process control (SPC) techniques developed in industry offer an empirically based means to understand our work processes and manage by fact. Emphasizing that meaningful quality improvement can occur only when it is exercised by "front-line" providers, this primer presents robust yet accessible SPC concepts and techniques for use in today's ED.
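
    As a concrete illustration of the SPC methods this record advocates, the sketch below builds an individuals (XmR) control chart in Python. The daily door-to-doctor times are invented placeholders, not data from the paper; 2.66 is the standard XmR chart constant (3 divided by d2 for moving ranges of size 2).

    ```python
    # Minimal XmR control-chart sketch; the ED timing data are hypothetical.
    import numpy as np

    times = np.array([42, 38, 51, 45, 40, 47, 39, 44, 58, 41, 43, 46], dtype=float)

    moving_ranges = np.abs(np.diff(times))   # ranges between consecutive days
    center = times.mean()
    mr_bar = moving_ranges.mean()
    ucl = center + 2.66 * mr_bar             # upper control limit
    lcl = center - 2.66 * mr_bar             # lower control limit

    for day, x in enumerate(times, start=1):
        flag = "OUT OF CONTROL" if not (lcl <= x <= ucl) else ""
        print(f"day {day:2d}: {x:5.1f} {flag}")
    print(f"center={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f}")
    ```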

  3. Classification of human colonic tissues using FTIR spectra and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Zwielly, A.; Argov, S.; Salman, A.; Bogomolny, E.; Mordechai, S.

    2010-04-01

    One of the major public health hazards is colon cancer. There is a great necessity to develop new methods for early detection of cancer. If colon cancer is detected and treated early, a cure rate of more than 90% can be achieved. In this study we used FTIR microscopy (MSP), which has shown good potential over the last 20 years in the fields of medical diagnostics and early detection of abnormal tissues. A large database of FTIR microscopic spectra was acquired from 230 human colonic biopsies. Five different subgroups were included in our database: normal and cancer tissues, as well as three stages of benign colonic polyps, namely mild, moderate and severe polyps, which are precursors of carcinoma. In this study we applied advanced mathematical and statistical techniques, including principal component analysis (PCA) and linear discriminant analysis (LDA), to human colonic FTIR spectra in order to differentiate among these tissue subgroups. Good classification accuracy between normal, polyp and cancer groups was achieved, with an approximately 85% success rate. Our results showed that there is great potential for developing FTIR microspectroscopy as a simple, reagent-free, viable tool for early detection of colon cancer, in particular the early stages of premalignancy among benign colonic polyps.
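
    A minimal sketch of the PCA-plus-LDA classification step described in this record, using synthetic stand-ins for the FTIR spectra; the dimensions, class labels and cross-validation choice are assumptions for illustration, since the actual data are not public.

    ```python
    # PCA for dimensionality reduction followed by LDA classification,
    # on synthetic "spectra" shaped like the study's 230-biopsy database.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(0)
    X = rng.normal(size=(230, 800))    # 230 spectra x 800 wavenumbers (synthetic)
    y = rng.integers(0, 5, size=230)   # 5 subgroups: normal, 3 polyp grades, cancer

    clf = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")
    ```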

  4. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage process performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks for improving business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study identifies and discusses in detail the gap analysis findings on the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization

  5. Advanced echocardiographic techniques

    PubMed Central

    Perry, Rebecca

    2015-01-01

    Echocardiography has advanced significantly since its first clinical use. The move towards more accurate imaging and quantification has driven this advancement. In this review, we will briefly focus on three distinct but important recent advances: three-dimensional (3D) echocardiography, contrast echocardiography and myocardial tissue imaging. The basic principles of these techniques will be discussed as well as current and future clinical applications. PMID:28191159

  6. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
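
    The detection-versus-false-alarm quantification described above amounts to tracing an ROC curve. A small sketch, with synthetic detector scores standing in for the vibration statistics:

    ```python
    # Probability of detection (P_D) versus probability of false alarm (P_FA)
    # as the decision threshold sweeps; the scores are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(1)
    scores_fault = rng.normal(1.5, 1.0, 200)   # statistic under a seeded fault
    scores_normal = rng.normal(0.0, 1.0, 200)  # statistic under normal operation

    for t in np.linspace(-2.0, 4.0, 7):
        p_d = (scores_fault > t).mean()        # probability of detection
        p_fa = (scores_normal > t).mean()      # probability of false alarm
        print(f"threshold={t:+.1f}  P_D={p_d:.2f}  P_FA={p_fa:.2f}")
    ```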

  7. COLLABORATIVE RESEARCH: USING ARM OBSERVATIONS & ADVANCED STATISTICAL TECHNIQUES TO EVALUATE CAM3 CLOUDS FOR DEVELOPMENT OF STOCHASTIC CLOUD-RADIATION

    SciTech Connect

    Somerville, Richard

    2013-08-22

    The long-range goal of several past and current projects in our DOE-supported research has been the development of new and improved parameterizations of cloud-radiation effects and related processes, using ARM data, and the implementation and testing of these parameterizations in global models. The main objective of the project reported here has been to develop and apply advanced statistical techniques, including Bayesian posterior estimates, to diagnose and evaluate features of both observed and simulated clouds. The research carried out under this project has been novel in two important ways. The first is that it is a key step in the development of practical stochastic cloud-radiation parameterizations, a new category of parameterizations that offers great promise for overcoming many shortcomings of conventional schemes. The second is that this work has brought powerful new tools to bear on the problem, because it has been a collaboration between a meteorologist with long experience in ARM research (Somerville) and a mathematician who is an expert on a class of advanced statistical techniques that are well-suited for diagnosing model cloud simulations using ARM observations (Shen).

  8. Large ensemble modeling of the last deglacial retreat of the West Antarctic Ice Sheet: comparison of simple and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert

    2016-05-01

    A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ˜ 20 000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
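
    A sketch of the "simple averaging weighted by the aggregate score" step, with invented misfit scores and sea-level contributions standing in for the actual 625-run ensemble; the exponential score-to-weight mapping is an assumption, not necessarily the paper's choice.

    ```python
    # Score-weighted ensemble averaging over hypothetical model runs.
    import numpy as np

    rng = np.random.default_rng(2)
    misfit = rng.uniform(0.5, 5.0, 625)   # aggregate model-data misfit per run
    esl = rng.normal(3.0, 1.0, 625)       # equivalent sea-level rise (m) per run

    weights = np.exp(-misfit)             # one common scoring choice (assumed)
    weights /= weights.sum()

    mean_esl = np.sum(weights * esl)
    var_esl = np.sum(weights * (esl - mean_esl) ** 2)
    print(f"weighted ESL: {mean_esl:.2f} +/- {np.sqrt(var_esl):.2f} m")
    ```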

  9. Analysis of the two-dimensional turbulence in pure electron plasmas by means of advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Romé, M.; Lepreti, F.; Maero, G.; Pozzoli, R.; Vecchio, A.; Carbone, V.

    2013-03-01

    Highly magnetized, pure electron plasmas confined in a Penning-Malmberg trap allow one to perform experiments on the two-dimensional (2D) fluid dynamics under conditions where non-ideal effects are almost negligible. Recent results on the freely decaying 2D turbulence obtained from experiments with electron plasmas performed in the Penning-Malmberg trap ELTRAP are presented. The analysis has been applied to experimental sequences with different types of initial density distributions. The dynamical properties of the system have been investigated by means of wavelet transforms and Proper Orthogonal Decomposition (POD). The wavelet analysis shows that most of the enstrophy is contained at spatial scales corresponding to the typical size of the persistent vortices in the 2D electron plasma flow. The POD analysis allows one to identify the coherent structures which give the dominant contribution to the plasma evolution. The statistical properties of the turbulence have been investigated by means of Probability Density Functions (PDFs) and structure functions of spatial vorticity increments. The analysis evidences how the shape and evolution of the dominant coherent structures and the intermittency properties of the turbulence strongly depend on the initial conditions for the electron density.
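
    The POD step described above can be computed with a singular value decomposition of the snapshot matrix. A minimal sketch on synthetic vorticity snapshots standing in for the ELTRAP data:

    ```python
    # Proper Orthogonal Decomposition (POD) via the SVD of mean-subtracted
    # snapshots; the vorticity fields here are random placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    n_snapshots, ny, nx = 50, 64, 64
    snapshots = rng.normal(size=(n_snapshots, ny * nx))  # flattened 2D fields

    fluctuations = snapshots - snapshots.mean(axis=0)

    # Rows of vh are spatial POD modes; s**2 ranks their energy content.
    u, s, vh = np.linalg.svd(fluctuations, full_matrices=False)
    energy_fraction = s**2 / np.sum(s**2)
    print("energy captured by leading POD modes:", np.round(energy_fraction[:5], 3))
    ```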

  10. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  11. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  12. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  13. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of...

  14. Intermediate/Advanced Research Design and Statistics

    NASA Technical Reports Server (NTRS)

    Ploutz-Snyder, Robert

    2009-01-01

    The purpose of this module is to provide Institutional Researchers (IRs) with an understanding of the principles of advanced research design and of the intermediate/advanced statistical procedures consistent with such designs.

  15. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques....

  16. Advanced qualification techniques

    SciTech Connect

    Winokur, P.S; Shaneyfelt, M.R.; Meisenheimer, T.L.; Fleetwood, D.M.

    1993-12-01

    This paper demonstrates use of the Qualified Manufacturers List (QML) methodology to qualify commercial and military microelectronics for use in space applications. QML "builds in" the hardness of product through statistical process control (SPC) of technology parameters relevant to the radiation response, test structure to integrated circuit (IC) correlations, and techniques for extrapolating laboratory test results to low-dose-rate space scenarios. Each of these elements is demonstrated and shown to be a cost-effective alternative to expensive end-of-line IC testing. Several examples of test structure-IC correlations are provided and recent work on complications arising from transistor scaling and geometry is discussed. The use of a 10-keV x-ray wafer-level test system to support SPC and establish "process capability" is illustrated, and a comparison of 10-keV x-ray and Co-60 gamma irradiations is provided for a wide range of CMOS technologies. The x-ray tester is shown to be cost-effective and its use in lot acceptance/qualification is recommended. Finally, a comparison is provided between MIL-STD-883D, Test Method 1019.4, which governs the testing of packaged semiconductor microcircuits in the DoD, and ESA/SSC Basic Specification No. 22900, Europe's Total Dose Steady-State Irradiation Test Method. Test Method 1019.4 focuses on conservative estimates of MOS hardness for space and tactical applications, while Basic Specification 22900 focuses on improved simulation of low-dose-rate space environments.

  17. Tools & techniques--statistics: propensity score techniques.

    PubMed

    da Costa, Bruno R; Gahl, Brigitta; Jüni, Peter

    2014-10-01

    Propensity score (PS) techniques are useful if the number of potential confounding pretreatment variables is large and the number of analysed outcome events is rather small so that conventional multivariable adjustment is hardly feasible. Only pretreatment characteristics should be chosen to derive PS, and only when they are probably associated with outcome. A careful visual inspection of PS will help to identify areas of no or minimal overlap, which suggests residual confounding, and trimming of the data according to the distribution of PS will help to minimise residual confounding. Standardised differences in pretreatment characteristics provide a useful check of the success of the PS technique employed. As with conventional multivariable adjustment, PS techniques cannot account for confounding variables that are not or are only imperfectly measured, and no PS technique is a substitute for an adequately designed randomised trial.
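
    A sketch of the PS workflow outlined above: estimate propensity scores with logistic regression, then check covariate balance with standardized differences. All data and variable counts are simulated for illustration.

    ```python
    # Propensity-score estimation and balance checking on simulated data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(4)
    X = rng.normal(size=(500, 6))                      # pretreatment covariates
    treated = rng.integers(0, 2, size=500).astype(bool)

    ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

    def std_diff(x, t):
        """Standardized difference of one covariate between groups."""
        m1, m0 = x[t].mean(), x[~t].mean()
        s = np.sqrt((x[t].var(ddof=1) + x[~t].var(ddof=1)) / 2)
        return (m1 - m0) / s

    print("PS range:", round(ps.min(), 2), "-", round(ps.max(), 2))
    print("standardized differences:",
          [round(std_diff(X[:, j], treated), 3) for j in range(X.shape[1])])
    ```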

  18. Advancement on Visualization Techniques

    DTIC Science & Technology

    1980-10-01

    Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, MA 02139 USA. This AGARDograph was prepared at the request of the...the fields of science and technology relating to aerospace for the following purposes: - Exchanging of scientific and technical information...Techniques for providing the pilot visualization have grown rapidly. Technology has developed from mechanical gauges through electro-mechanical

  19. Advanced Coating Removal Techniques

    NASA Technical Reports Server (NTRS)

    Seibert, Jon

    2006-01-01

    An important step in the repair and protection against corrosion damage is the safe removal of the oxidation and protective coatings without further damaging the integrity of the substrate. Two such methods that are proving to be safe and effective in this task are liquid nitrogen and laser removal operations. Laser technology used for the removal of protective coatings is currently being researched and implemented in various areas of the aerospace industry. Delivering thousands of focused energy pulses, the laser ablates the coating surface by heating and dissolving the material applied to the substrate. The metal substrate will reflect the laser and redirect the energy to any remaining protective coating, thus preventing any collateral damage the substrate may suffer throughout the process. Liquid nitrogen jets are comparable to blasting with an ultra-high-pressure water jet but without the residual liquid that requires collection and removal. As the liquid nitrogen reaches the surface it is transformed into gaseous nitrogen and reenters the atmosphere without any contamination to surrounding hardware. These innovative technologies simplify corrosion repair by eliminating hazardous chemicals and repetitive manual labor from the coating removal process. One very significant advantage is the reduction of particulate contamination exposure to personnel. With the removal of coatings adjacent to sensitive flight hardware, a benefit of each technique for the space program is that no contamination such as beads, water, or sanding residue is left behind when the job is finished. One primary concern is the safe removal of coatings from thin aluminum honeycomb face sheet. NASA recently conducted thermal testing on liquid nitrogen systems and found that no damage occurred on 1/6" aluminum substrates. Wright Patterson Air Force Base, in conjunction with Boeing and NASA, is currently testing the laser removal technique for process qualification. Other applications of liquid

  20. Advanced Wavefront Control Techniques

    SciTech Connect

    Olivier, S S; Brase, J M; Avicola, K; Thompson, C A; Kartz, M W; Winters, S; Hartley, R; Wihelmsen, J; Dowla, F V; Carrano, C J; Bauman, B J; Pennington, D M; Lande, D; Sawvel, R M; Silva, D A; Cooke, J B; Brown, C G

    2001-02-21

    In this project, work was performed in four areas: (1) advanced modeling tools for deformable mirrors, (2) low-order wavefront correctors with Alvarez lenses, (3) a direct phase-measuring heterodyne wavefront sensor, and (4) high-spatial-frequency wavefront control using spatial light modulators.

  1. A Comparison of the Performance of Advanced Statistical Techniques for the Refinement of Day-ahead and Longer NWP-based Wind Power Forecasts

    NASA Astrophysics Data System (ADS)

    Zack, J. W.

    2015-12-01

    Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors, including the limited space-time resolution of the NWP models and shortcomings in the models' representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There is an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques, often referred to as "machine learning methods", a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day-ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble
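
    A sketch contrasting two of the MOS approaches named above, a linear-regression baseline and a random forest, on synthetic NWP-forecast/observation pairs; the predictor set and response function are invented for illustration.

    ```python
    # MOS-style correction: learn a mapping from raw NWP predictors to
    # observed generation, comparing a linear baseline against random forests.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    nwp = rng.uniform(0, 15, size=(2000, 3))     # e.g. speed, direction, shear
    obs = 0.8 * nwp[:, 0] + np.sin(nwp[:, 1]) + rng.normal(0, 1, 2000)

    X_tr, X_te, y_tr, y_te = train_test_split(nwp, obs, random_state=0)
    for name, model in [("linear MOS", LinearRegression()),
                        ("random forest", RandomForestRegressor(random_state=0))]:
        pred = model.fit(X_tr, y_tr).predict(X_te)
        rmse = np.sqrt(np.mean((pred - y_te) ** 2))
        print(f"{name}: RMSE = {rmse:.2f}")
    ```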

  2. Techniques in Advanced Language Teaching.

    ERIC Educational Resources Information Center

    Ager, D. E.

    1967-01-01

    For ease of presentation, advanced grammar teaching techniques are briefly considered under the headings of structuralism (belief in the effectiveness of presenting grammar rules) and contextualism (belief in the maximum use by students of what they know in the target language). The structuralist's problem of establishing a syllabus is discussed…

  3. LHC Olympics: Advanced Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Armour, Kyle; Larkoski, Andrew; Gray, Amanda; Ventura, Dan; Walsh, Jon; Schabinger, Rob

    2006-05-01

    The LHC Olympics is a series of workshops aimed at encouraging theorists and experimentalists to prepare for the soon-to-be-online Large Hadron Collider in Geneva, Switzerland. One aspect of the LHC Olympics program consists of the study of simulated data sets which represent various possible new physics signals as they would be seen in LHC detectors. Through this exercise, LHC Olympians learn the phenomenology of possible new physics models and gain experience in analyzing LHC data. Additionally, the LHC Olympics encourages discussion between theorists and experimentalists, and through this collaboration new techniques could be developed. The University of Washington LHC Olympics group consists of several first-year graduate and senior undergraduate students, in both theoretical and experimental particle physics. Presented here is a discussion of some of the more advanced techniques used and the recent results of one such LHC Olympics study.

  4. Statistical and Economic Techniques for Site-specific Nematode Management.

    PubMed

    Liu, Zheng; Griffin, Terry; Kirkpatrick, Terrence L

    2014-03-01

    Recent advances in precision agriculture technologies and spatial statistics allow realistic, site-specific estimation of nematode damage to field crops and provide a platform for the site-specific delivery of nematicides within individual fields. This paper reviews the spatial statistical techniques that model correlations among neighboring observations and develops a spatial economic analysis to determine the potential of site-specific nematicide application. The spatial econometric methodology, applied in the context of site-specific crop yield response, contributes to closing the gap between data analysis and realistic site-specific nematicide recommendations and helps to provide a practical method of site-specifically controlling nematodes.

  5. Statistical problems in design technique validation

    SciTech Connect

    Cohen, J.S.

    1980-04-01

    This work is concerned with the statistical validation process for measuring the accuracy of design techniques for solar energy systems. This includes a discussion of the statistical variability inherent in the design and measurement processes and the way in which this variability can dictate the choice of experimental design, choice of data, accuracy of the results, and choice of questions that can be reliably answered in such a study. The approach here is primarily concerned with design procedure validation in the context of the realistic process of system design, where the discrepancy between measured and predicted results is due to limitations in the mathematical models employed by the procedures and the inaccuracies of input data. A set of guidelines for successful validation methodologies is discussed, and a simplified validation methodology for domestic hot water heaters is presented.

  6. Enhanced bio-manufacturing through advanced multivariate statistical technologies.

    PubMed

    Martin, E B; Morris, A J

    2002-11-13

    The paper describes the interrogation of data, from a reaction vessel producing an active pharmaceutical ingredient (API), using advanced multivariate statistical techniques. Due to the limited number of batches available, data augmentation was used to increase the number of batches, thereby enabling the extraction of more subtle process behaviour from the data. A second methodology investigated was that of multi-group modelling. This allowed between-cluster variability to be removed, thus allowing attention to focus on within-process variability. The paper describes how the different approaches enabled a better understanding of the factors causing the onset of an impurity formation, as well as demonstrating the power of multivariate statistical data analysis techniques to provide an enhanced understanding of the process.

  7. Advanced techniques for microwave reflectometry

    SciTech Connect

    Sanchez, J.; Branas, B.; Luna, E. de la; Estrada, T.; Zhuravlev, V.; Hartfuss, H.J.; Hirsch, M.; Geist, T.; Segovia, J.; Oramas, J.L.

    1994-12-31

    Microwave reflectometry has been applied in recent years as a plasma diagnostic of increasing interest, mainly due to its simplicity, no need for large access ports, and low radiation damage of exposed components. Those characteristics make reflectometry an attractive diagnostic for the next generation of devices. Systems used either for density profiles or density fluctuations have also shown great development, from the original single-channel heterodyne to the multichannel homodyne receivers. In the present work we discuss three different advanced reflectometer systems developed by CIEMAT members in collaboration with different institutions. The first one is the broadband heterodyne reflectometer installed on W7AS for density fluctuation measurements. The decoupling of the phase and amplitude of the reflected beam allows for quantitative analysis of the fluctuations. Recent results showing the behavior of the density turbulence during the L-H transition on W7AS are shown. The second system shows how the effect of the turbulence can be used for density profile measurements by reflectometry in situations where the complicated geometry of the waveguides cannot avoid many parasitic reflections. Experiments from the TJ-I tokamak will be shown. Finally, a reflectometer system based on the Amplitude Modulation (AM) technique for density profile measurements is discussed and experimental results from the TJ-I tokamak are shown. The AM system offers the advantage of being almost insensitive to the effect of fluctuations. It is able to take a direct measurement of the time delay of the microwave pulse which propagates to the reflecting layer and is reflected back. In order to achieve fast reconstruction for real-time monitoring of the density profile, the application of neural network algorithms will be presented; the method can reduce computing times by about three orders of magnitude. 10 refs., 10 figs.

  8. Advances in Procedural Techniques - Antegrade

    PubMed Central

    Wilson, William; Spratt, James C.

    2014-01-01

    There have been many technological advances in antegrade CTO PCI, but perhaps the most important has been the evolution of the "hybrid" approach, where ideally there exists a seamless interplay of antegrade wiring, antegrade dissection re-entry and retrograde approaches as dictated by procedural factors. Antegrade wire escalation with intimal tracking remains the preferred initial strategy in short CTOs without proximal cap ambiguity. More complex CTOs, however, usually require either a retrograde or an antegrade dissection re-entry approach, or both. Antegrade dissection re-entry is well suited to long occlusions where there is a healthy distal vessel and limited "interventional" collaterals. Early use of a dissection re-entry strategy will increase success rates, reduce complications, and minimise radiation exposure, contrast use and procedural times. Antegrade dissection can be achieved with a knuckle wire technique or the CrossBoss catheter, whilst re-entry will be achieved in the most reproducible and reliable fashion by the Stingray balloon/wire. It should be avoided where there is potential for loss of large side branches. It remains to be seen whether use of newer dissection re-entry strategies will be associated with lower restenosis rates compared with the more uncontrolled subintimal tracking strategies such as STAR, and whether stent insertion in the subintimal space is associated with higher rates of late stent malapposition and stent thrombosis. It is to be hoped that the algorithms which have been developed to guide CTO operators allow for a better transfer of knowledge and skills to increase uptake and acceptance of CTO PCI as a whole. PMID:24694104

  9. Air Quality Forecasting through Different Statistical and Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Mishra, D.; Goyal, P.

    2014-12-01

    Urban air pollution forecasting has emerged as an acute problem in recent years because of severe environmental degradation due to the increase in harmful air pollutants in the ambient atmosphere. In this study, different statistical and artificial intelligence techniques are used for the forecasting and analysis of air pollution over the Delhi urban area. These techniques are principal component analysis (PCA), multiple linear regression (MLR) and artificial neural networks (ANN), and the forecasts are in good agreement with the concentrations observed by the Central Pollution Control Board (CPCB) at different locations in Delhi. However, such methods provide limited accuracy because they are unable to predict extreme points, i.e., the pollution maximum and minimum cut-offs cannot be determined using such approaches. With advances in technology and research, an alternative to these traditional methods has been proposed: the coupling of statistical techniques with artificial intelligence (AI) can be used for forecasting purposes. The coupling of PCA, ANN and fuzzy logic is used for forecasting air pollutants over the Delhi urban area. The statistical measures, e.g., correlation coefficient (R), normalized mean square error (NMSE), fractional bias (FB) and index of agreement (IOA), of the proposed model are in better agreement than those of all the other models. Hence, the coupling of statistical and artificial intelligence techniques can be used for forecasting air pollutants over urban areas.
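
    The four evaluation statistics named above (R, NMSE, FB, IOA) have standard definitions in air-quality model evaluation; the sketch below computes them on placeholder observed/predicted concentration arrays.

    ```python
    # Standard model-evaluation metrics: correlation coefficient (R),
    # normalized mean square error (NMSE), fractional bias (FB),
    # and Willmott's index of agreement (IOA). Data are placeholders.
    import numpy as np

    obs = np.array([80., 95., 110., 60., 150., 130., 90., 75.])
    pred = np.array([85., 90., 100., 70., 140., 120., 95., 80.])

    r = np.corrcoef(obs, pred)[0, 1]
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    fb = 2 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())
    ioa = 1 - np.sum((pred - obs) ** 2) / np.sum(
        (np.abs(pred - obs.mean()) + np.abs(obs - obs.mean())) ** 2)

    print(f"R={r:.3f}  NMSE={nmse:.3f}  FB={fb:.3f}  IOA={ioa:.3f}")
    ```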

  10. Deterministic and Advanced Statistical Modeling of Wind-Driven Sea

    DTIC Science & Technology

    2015-07-06

    Technical report, covering 01/09/2010-06/07/2015: Deterministic and advanced statistical modeling of wind-driven sea. Vladimir Zakharov, Andrei Pushkarev, Waves and Solitons LLC. Development of accurate and fast advanced statistical and dynamical nonlinear models of ocean surface waves, based on first physical principles, which will

  11. Writing to Learn Statistics in an Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Northrup, Christian Glenn

    2012-01-01

    This study investigated the use of writing in a statistics classroom to learn if writing provided a rich description of problem-solving processes of students as they solved problems. Through analysis of 329 written samples provided by students, it was determined that writing provided a rich description of problem-solving processes and enabled…

  12. Advanced Spectroscopy Technique for Biomedicine

    NASA Astrophysics Data System (ADS)

    Zhao, Jianhua; Zeng, Haishan

    This chapter presents an overview of the applications of optical spectroscopy in biomedicine. We focus on the optical design aspects of advanced biomedical spectroscopy systems, Raman spectroscopy system in particular. Detailed components and system integration are provided. As examples, two real-time in vivo Raman spectroscopy systems, one for skin cancer detection and the other for endoscopic lung cancer detection, and an in vivo confocal Raman spectroscopy system for skin assessment are presented. The applications of Raman spectroscopy in cancer diagnosis of the skin, lung, colon, oral cavity, gastrointestinal tract, breast, and cervix are summarized.

  13. Stitching Techniques Advance Optics Manufacturing

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Because NASA depends on the fabrication and testing of large, high-quality aspheric (nonspherical) optics for applications like the James Webb Space Telescope, it sought an improved method for measuring large aspheres. Through Small Business Innovation Research (SBIR) awards from Goddard Space Flight Center, QED Technologies, of Rochester, New York, upgraded and enhanced its stitching technology for aspheres. QED developed the SSI-A, which earned the company an R&D 100 award, and also developed a breakthrough machine tool called the aspheric stitching interferometer. The equipment is applied to advanced optics in telescopes, microscopes, cameras, medical scopes, binoculars, and photolithography.

  14. Advanced Geophysical Environmental Simulation Techniques

    DTIC Science & Technology

    2007-11-02

    cloud property retrieval algorithms for processing of large multiple-satellite data sets; development and application of improved cloud-phase and... cloud optical property retrieval algorithms; investigation of techniques potentially applicable for retrieval of cloud spatial properties from very... Subject terms: cirrus cloud retrieval, satellite meteorology, polar-orbiting, geostationary.

  15. Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge

    ERIC Educational Resources Information Center

    Haines, Brenna

    2015-01-01

    The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…

  16. Application of multivariate statistical techniques in microbial ecology.

    PubMed

    Paliy, O; Shankar, V

    2016-03-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological data sets. In particular, noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amount of data, powerful statistical techniques of multivariate analysis are well suited to analyse and interpret these data sets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular data set. In this review, we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and data set structure.
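
    As one concrete example of the exploratory procedures this review covers, the sketch below runs an ordination (PCA) on a Hellinger-transformed abundance table; the taxon counts are invented, and the choice of transformation is illustrative rather than prescribed by the review.

    ```python
    # Hellinger transformation of a sample-by-taxon count table, then PCA
    # ordination; all counts are simulated for illustration.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(6)
    counts = rng.poisson(20, size=(30, 100)).astype(float)  # 30 samples x 100 taxa

    rel = counts / counts.sum(axis=1, keepdims=True)  # relative abundances
    hellinger = np.sqrt(rel)                          # downweights dominant taxa

    coords = PCA(n_components=2).fit_transform(hellinger)
    print("ordination coordinates of the first 3 samples:")
    print(np.round(coords[:3], 3))
    ```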

  17. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large scale ecological datasets. Especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791

  18. Septoplasty: Basic and Advanced Techniques.

    PubMed

    Most, Sam P; Rudy, Shannon F

    2017-05-01

    Nasal septal deviation is a prevalent problem that can have significant quality of life ramifications. Septoplasty is commonly performed to provide qualitative and quantitative benefit to those with nasal obstruction owing to septal deviation. Although a standard, basic technique is often adequate for individuals with mild to moderate mid to posterior septal deviation, unique challenges arise with caudal septal deviation. Herein, multiple strategies that attempt to address anterior septal deviation are discussed. Anterior septal reconstruction has been shown to be a safe and effective means by which to address severe caudal septal deviation and long-term reduction in preoperative symptoms.

  19. Advanced Algorithms and Statistics for MOS Surveys

    NASA Astrophysics Data System (ADS)

    Bolton, A. S.

    2016-10-01

    This paper presents an individual view on the current state of computational data processing and statistics for inference and discovery in multi-object spectroscopic surveys, supplemented by a historical perspective and a few present-day applications. It is more op-ed than review, and hopefully more readable as a result.

  20. Advance Report of Final Mortality Statistics, 1985.

    ERIC Educational Resources Information Center

    Monthly Vital Statistics Report, 1987

    1987-01-01

    This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of these factors is included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…

  1. Statistical Analysis Techniques for Small Sample Sizes

    NASA Technical Reports Server (NTRS)

    Navard, S. E.

    1984-01-01

    The small-sample-size problem encountered in the analysis of space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
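
    A sketch of the kind of assumption checking described above: test a small sample for normality, then choose a parametric or nonparametric test accordingly. The measurements and the 0.05 cutoff are illustrative.

    ```python
    # Check normality with Shapiro-Wilk, then pick a one-sample t-test or
    # the Wilcoxon signed-rank test; the data are made-up placeholders.
    import numpy as np
    from scipy import stats

    sample = np.array([4.1, 3.8, 5.0, 4.4, 4.7, 3.9, 4.2])
    hypothesized_mean = 4.0

    w_stat, p_normal = stats.shapiro(sample)
    if p_normal > 0.05:
        t_stat, p = stats.ttest_1samp(sample, hypothesized_mean)
        print(f"one-sample t-test: t={t_stat:.2f}, p={p:.3f}")
    else:
        w, p = stats.wilcoxon(sample - hypothesized_mean)
        print(f"Wilcoxon signed-rank: W={w:.1f}, p={p:.3f}")
    ```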

  2. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  3. Selected Bibliography on Optimizing Techniques in Statistics

    DTIC Science & Technology

    1981-08-01

    (1958). Iterative solutions of likelihood equations, Biometrika 14, 128-130. Unland, A. W. and Smith, W. N. (1959). The use of Lagrange multipliers...373. Kubicek, M., Marek, M. and Eckert, E. (1971). Quasilinearized regression, Technometrics 13 (3), 601-608. Smith, F. B. and ham, D. F. (1971). Pm...parameter, J. Amer. Statist. Ass. 67, 641-646. Theobald, C. M. (1975). An inequality with application to multivariate analysis, Biometrika 62, 461-466

  4. Recent advances in statistical energy analysis

    NASA Technical Reports Server (NTRS)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary feature of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.

  5. A "Jigsaw Classroom" Technique for Undergraduate Statistics Courses.

    ERIC Educational Resources Information Center

    Perkins, David V.; Saris, Renee N.

    2001-01-01

    Describes an activity called the jigsaw classroom technique, for use with working groups of undergraduate statistics students, that divides a worksheet into two to four steps. Reveals that the students viewed the technique positively because it helped them understand statistical procedures and offered a variety of learning experiences. (CMK)

  6. RBF-based technique for statistical demodulation of pathological tremor.

    PubMed

    Gianfelici, Francesco

    2013-10-01

    This paper presents an innovative technique, based on the joint approximation capabilities of radial basis function (RBF) networks and the estimation capability of the multivariate iterated Hilbert transform (IHT), for the statistical demodulation of pathological tremor from electromyography (EMG) signals in patients with Parkinson's disease. We define a stochastic model of the multichannel high-density surface EMG by means of RBF networks applied to the reconstruction of the stochastic process (characterizing the disease) modeled by the multivariate relationships generated by the Karhunen-Loève transform in Hilbert spaces. Next, we perform a demodulation of the entire random field by means of the estimation capability of the multivariate IHT in a statistical setting. The proposed method is applied both to simulated signals and to data recorded from three Parkinsonian patients, and the results show that the amplitude modulation components of the tremor oscillation can be estimated with a signal-to-noise ratio close to 30 dB and small root-mean-square error for the estimates of the tremor instantaneous frequency. Additionally, comparisons with a large number of techniques, based on all the combinations of RBF, extreme learning machine, backpropagation and support vector machine in the first step of the algorithm, and IHT, empirical mode decomposition, multiband energy separation algorithm, periodic algebraic separation and energy demodulation in the second step, clearly show the effectiveness of our technique. These results show that the proposed approach is a potentially useful tool for advanced neurorehabilitation technologies that aim at tremor characterization and suppression.
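
    The full RBF-plus-multivariate-IHT pipeline is beyond a short example, but a heavily simplified single-channel analogue of the demodulation step, amplitude-envelope estimation with the ordinary Hilbert transform on a synthetic 5 Hz "tremor", looks like this:

    ```python
    # Amplitude demodulation via the analytic signal; the tremor carrier,
    # modulation and sampling rate are all synthetic assumptions.
    import numpy as np
    from scipy.signal import hilbert

    fs = 1000.0
    t = np.arange(0, 5, 1 / fs)
    envelope_true = 1.0 + 0.5 * np.sin(2 * np.pi * 0.3 * t)  # slow AM component
    signal = envelope_true * np.sin(2 * np.pi * 5.0 * t)     # 5 Hz tremor carrier

    analytic = hilbert(signal)
    envelope_est = np.abs(analytic)   # magnitude of the analytic signal

    rmse = np.sqrt(np.mean((envelope_est - envelope_true) ** 2))
    print(f"envelope RMSE: {rmse:.3f}")
    ```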

  7. Predicting radiotherapy outcomes using statistical learning techniques.

    PubMed

    El Naqa, Issam; Bradley, Jeffrey D; Lindsay, Patricia E; Hope, Andrew J; Deasy, Joseph O

    2009-09-21

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model

  8. Predicting radiotherapy outcomes using statistical learning techniques

    NASA Astrophysics Data System (ADS)

    El Naqa, Issam; Bradley, Jeffrey D.; Lindsay, Patricia E.; Hope, Andrew J.; Deasy, Joseph O.

    2009-09-01

    Radiotherapy outcomes are determined by complex interactions between treatment, anatomical and patient-related variables. A common obstacle to building maximally predictive outcome models for clinical practice is the failure to capture the potential complexity of heterogeneous variable interactions and applicability beyond institutional data. We describe a statistical learning methodology that can automatically screen for nonlinear relations among prognostic variables and generalize to data not seen before. In this work, several types of linear and nonlinear kernels for generating interaction terms and approximating the treatment-response function are evaluated. Examples of institutional datasets of esophagitis, pneumonitis and xerostomia endpoints were used. Furthermore, an independent RTOG dataset was used for 'generalizability' validation. We formulated the discrimination between risk groups as a supervised learning problem. The distribution of patient groups was initially analyzed using principal component analysis (PCA) to uncover potential nonlinear behavior. The performance of the different methods was evaluated using bivariate correlations and actuarial analysis. Over-fitting was controlled via cross-validation resampling. Our results suggest that a modified support vector machine (SVM) kernel method provided superior performance on leave-one-out testing compared to logistic regression and neural networks in cases where the data exhibited nonlinear behavior on PCA. For instance, in prediction of esophagitis and pneumonitis endpoints, which exhibited nonlinear behavior on PCA, the method provided 21% and 60% improvements, respectively. Furthermore, evaluation on the independent pneumonitis RTOG dataset demonstrated good generalizability beyond institutional data, in contrast with other models. This indicates that the prediction of treatment response can be improved by utilizing nonlinear kernel methods for discovering important nonlinear interactions among model
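
    A sketch of the evaluation set-up described in these two records: a nonlinear-kernel SVM scored by leave-one-out cross-validation. The features and risk rule are synthetic placeholders, not the institutional datasets.

    ```python
    # RBF-kernel SVM with leave-one-out cross-validation on synthetic
    # dose-volume/clinical features exhibiting a nonlinear risk rule.
    import numpy as np
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    X = rng.normal(size=(60, 8))   # hypothetical dose-volume + clinical variables
    y = (X[:, 0] * X[:, 1] + rng.normal(0, 0.5, 60) > 0).astype(int)

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
    print(f"leave-one-out accuracy: {acc:.2f}")
    ```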

  9. Hybrid mesh generation using advancing reduction technique

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This study presents an extension of the application of the advancing reduction technique to the hybrid mesh generation. The proposed algorithm is based on a pre-generated rectangle mesh (RM) with a certain orientation. The intersection points between the two sets of perpendicular mesh lines in RM an...

  10. Advanced techniques for future observations from space

    NASA Technical Reports Server (NTRS)

    Hinkley, E. D.

    1980-01-01

    Advanced remote sensing techniques for the study of global meteorology and the chemistry of the atmosphere are considered. Remote sensing from Spacelab/Shuttle and free-flying satellites will provide the platforms for instrumentation based on advanced technology. Several laser systems are being developed for the measurement of tropospheric winds and pressure, and trace species in the troposphere and stratosphere. In addition, a high-resolution passive infrared sensor shows promise for measuring temperature from sea level up through the stratosphere. Advanced optical and microwave instruments are being developed for wind measurements in the stratosphere and mesosphere. Microwave techniques are also useful for the study of meteorological parameters at the air-sea interface.

  11. Descriptive Statistical Techniques for Librarians. 2nd Edition.

    ERIC Educational Resources Information Center

    Hafner, Arthur W.

    A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…

  12. Advanced Computational Techniques in Regional Wave Studies

    DTIC Science & Technology

    1990-01-03

    the new GERESS data. The dissertation work emphasized the development and use of advanced computational techniques for studying regional seismic...hand, the possibility of new data sources at regional distances permits using previously ignored signals. Unfortunately, these regional signals will...the Green's function G_nk(x,t;r,t) around this new reference point contains the propagation effects, and V is the source volume

  13. Techniques in teaching statistics : linking research production and research use.

    SciTech Connect

    Martinez-Moyano, I .; Smith, A.

    2012-01-01

    In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.

  14. Advanced Tools and Techniques for Formal Techniques in Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    2005-01-01

    This is the final technical report for grant number NAG-1-02101. The title of this grant was "Advanced Tools and Techniques for Formal Techniques In Aerospace Systems". The principal investigator on this grant was Dr. John C. Knight of the Computer Science Department, University of Virginia, Charlottesville, Virginia 22904-4740. This report summarizes activities under the grant during the period 7/01/2002 to 9/30/2004. This report is organized as follows. In section 2, the technical background of the grant is summarized. Section 3 lists accomplishments and section 4 lists students funded under the grant. In section 5, we present a list of presentations given at various academic and research institutions about the research conducted. Finally, a list of publications generated under this grant is included in section 6.

  15. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  16. Advanced techniques in echocardiography in small animals.

    PubMed

    Chetboul, Valérie

    2010-07-01

    Transthoracic echocardiography has become a major imaging tool for the diagnosis and management of canine and feline cardiovascular diseases. During the last decade, more recent advances in ultrasound technology with the introduction of newer imaging modalities, such as tissue Doppler imaging, strain and strain rate imaging, and 2-dimensional speckle tracking echocardiography, have provided new parameters to assess myocardial performance, including regional myocardial velocities and deformation, ventricular twist, and mechanical synchrony. An outline of these 4 recent ultrasound techniques, their impact on the understanding of right and left ventricular function in small animals, and their application in research and clinical settings are given in this article.

  17. Basic concepts of advanced MRI techniques.

    PubMed

    Pagani, Elisabetta; Bizzi, Alberto; Di Salle, Francesco; De Stefano, Nicola; Filippi, Massimo

    2008-10-01

    An overview is given of magnetic resonance (MR) techniques sensitized to diffusion, flow, the magnetization transfer effect, and local field inhomogeneities induced by physiological changes, techniques that can be viewed in clinical practice as advanced because of their challenging implementation and interpretation. These techniques are known as diffusion-weighted, perfusion, magnetization transfer, and functional MRI, and MR spectroscopy. An important issue is that they can provide quantitative estimates of structural and functional characteristics that are below the voxel resolution. This review does not deal with the basic concepts of MR physics or the description of the available acquisition and postprocessing methods, but it hopefully provides an adequate background to readers and hence facilitates the understanding of the following clinical contributions.

  18. Advanced flow MRI: emerging techniques and applications.

    PubMed

    Markl, M; Schnell, S; Wu, C; Bollache, E; Jarvis, K; Barker, A J; Robinson, J D; Rigsby, C K

    2016-08-01

    Magnetic resonance imaging (MRI) techniques provide non-invasive and non-ionising methods for the highly accurate anatomical depiction of the heart and vessels throughout the cardiac cycle. In addition, the intrinsic sensitivity of MRI to motion offers the unique ability to acquire spatially registered blood flow simultaneously with the morphological data, within a single measurement. In clinical routine, flow MRI is typically accomplished using methods that resolve two spatial dimensions in individual planes and encode the time-resolved velocity in one principal direction, typically oriented perpendicular to the two-dimensional (2D) section. This review describes recently developed advanced MRI flow techniques, which allow for more comprehensive evaluation of blood flow characteristics, such as real-time flow imaging, 2D multiple-venc phase contrast MRI, four-dimensional (4D) flow MRI, quantification of complex haemodynamic properties, and highly accelerated flow imaging. Emerging techniques and novel applications are explored. In addition, applications of these new techniques for the improved evaluation of cardiovascular (aorta, pulmonary arteries, congenital heart disease, atrial fibrillation, coronary arteries) as well as cerebrovascular disease (intra-cranial arteries and veins) are presented.

  19. Advances in Statistical Methods for Substance Abuse Prevention Research

    PubMed Central

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467

  20. Statistical Techniques for Efficient Indexing and Retrieval of Document Images

    ERIC Educational Resources Information Center

    Bhardwaj, Anurag

    2010-01-01

    We have developed statistical techniques to improve the performance of document image search systems where the intermediate step of OCR based transcription is not used. Previous research in this area has largely focused on challenges pertaining to generation of small lexicons for processing handwritten documents and enhancement of poor quality…

  1. Advanced techniques in current signature analysis

    NASA Astrophysics Data System (ADS)

    Smith, S. F.; Castleberry, K. N.

    1992-02-01

    In general, both ac and dc motors can be characterized as weakly nonlinear systems, in which both linear and nonlinear effects occur simultaneously. Fortunately, the nonlinearities are generally well behaved and understood and can be handled via several standard mathematical techniques already well developed in the systems modeling area; examples are piecewise linear approximations and Volterra series representations. Field measurements of numerous motors and motor-driven systems confirm the rather complex nature of motor current spectra and illustrate both linear and nonlinear effects (including line harmonics and modulation components). Although previous current signature analysis (CSA) work at Oak Ridge and other sites has principally focused on the modulation mechanisms and detection methods (AM, PM, and FM), more recent studies have been conducted on linear spectral components (those appearing in the electric current at their actual frequencies and not as modulation sidebands). For example, large axial-flow compressors (approximately 3300 hp) in the US gaseous diffusion uranium enrichment plants exhibit running-speed (approximately 20 Hz) and high-frequency vibrational information (greater than 1 kHz) in their motor current spectra. Several signal-processing techniques developed to facilitate analysis of these components, including specialized filtering schemes, are presented. Finally, concepts for the designs of advanced digitally based CSA units are offered, which should serve to foster the development of much more computationally capable 'smart' CSA instrumentation in the next several years.
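    As a brief illustration of the linear and modulation components the abstract discusses, the sketch below builds a synthetic motor-current signal with a 20 Hz amplitude modulation on the 60 Hz line component and inspects the resulting sidebands; all signal parameters are illustrative, not values from the report.

```python
import numpy as np

# Synthetic motor current: a 60 Hz line component amplitude-modulated by a
# 20 Hz running-speed vibration (parameters are illustrative only).
fs = 10_000                                  # sample rate (Hz)
t = np.arange(0, 2.0, 1 / fs)
line = np.sin(2 * np.pi * 60 * t)
current = (1 + 0.02 * np.sin(2 * np.pi * 20 * t)) * line

# Amplitude spectrum; AM at f_m produces sidebands at 60 +/- f_m Hz.
spec = np.abs(np.fft.rfft(current * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for target in (40, 60, 80):   # lower sideband, carrier, upper sideband
    idx = np.argmin(np.abs(freqs - target))
    print(f"{freqs[idx]:6.1f} Hz : {spec[idx]:.1f}")
```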

  2. Lightweight and Statistical Techniques for Petascale Debugging

    SciTech Connect

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leaving a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis.

  3. Fitting multidimensional splines using statistical variable selection techniques

    NASA Technical Reports Server (NTRS)

    Smith, P. L.

    1982-01-01

    This report demonstrates the successful application of statistical variable selection techniques to fit splines. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs using the B-spline basis were developed, and the one for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
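    The backward-elimination idea can be sketched compactly: start from a dense interior-knot set and repeatedly drop the knot whose removal hurts the fit least. The report's programs were FORTRAN; the following is a minimal Python illustration with synthetic data and an ad hoc stopping rule, not a reconstruction of those programs.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

def sse(knots):
    spl = LSQUnivariateSpline(x, y, knots, k=3)   # cubic B-spline basis
    return np.sum((spl(x) - y) ** 2)

# Backward elimination over interior knots.
knots = list(np.linspace(0.5, 9.5, 15))
current = sse(knots)
while len(knots) > 1:
    trials = [(sse(knots[:i] + knots[i + 1:]), i) for i in range(len(knots))]
    best_sse, best_i = min(trials)
    if best_sse > 1.05 * current:   # stop once removal grows SSE by >5% (ad hoc)
        break
    del knots[best_i]
    current = best_sse

print(f"{len(knots)} knots retained: {np.round(knots, 2)}")
```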

  4. A Hierarchical Statistic Methodology for Advanced Memory System Evaluation

    SciTech Connect

    Sun, X.-J.; He, D.; Cameron, K.W.; Luo, Y.

    1999-04-12

    Advances in technology have resulted in a widening of the gap between computing speed and memory access time. Data access time has become increasingly important for computer system design. Various hierarchical memory architectures have been developed. The performance of these advanced memory systems, however, varies with applications and problem sizes. How to reach an optimal cost/performance design still eludes researchers. In this study, the authors introduce an evaluation methodology for advanced memory systems. This methodology is based on statistical factorial analysis and performance scalability analysis. It is twofold: it first determines the impact of memory systems and application programs on overall performance; it also identifies the bottleneck in a memory hierarchy and provides cost/performance comparisons via scalability analysis. Different memory systems can be compared in terms of mean performance or scalability over a range of codes and problem sizes. Experimental testing has been performed extensively on the Department of Energy's Accelerated Strategic Computing Initiative (ASCI) machines and benchmarks available at the Los Alamos National Laboratory to validate this newly proposed methodology. Experimental and analytical results show this methodology is simple and effective. It is a practical tool for memory system evaluation and design. Its extension to general architectural evaluation and parallel computer systems is possible and should be further explored.

  5. Region merging techniques using information theory statistical measures.

    PubMed

    Calderero, Felipe; Marques, Ferran

    2010-06-01

    The purpose of the current work is to propose, under a statistical framework, a family of unsupervised region merging techniques providing a set of the most relevant region-based explanations of an image at different levels of analysis. These techniques are characterized by general and nonparametric region models, with neither color nor texture homogeneity assumptions, and a set of innovative merging criteria, based on information theory statistical measures. The scale consistency of the partitions is assured through (i) a size regularization term in the merging criteria together with a classical merging order, or (ii) a novel scale-based merging order that avoids the region size homogeneity imposed by the use of a size regularization term. Moreover, a partition significance index is defined to automatically determine the subset of most representative partitions from the created hierarchy. The most significant automatically extracted partitions show the ability to represent the semantic content of the image from a human point of view. Finally, a complete and exhaustive evaluation of the proposed techniques is performed, using not only different databases for the two main addressed problems (object-oriented segmentation of generic images and texture image segmentation), but also specific evaluation features in each case: under- and oversegmentation error, and a large set of region-based, pixel-based and error consistency indicators, respectively. Results are promising, outperforming on most indicators both object-oriented and texture state-of-the-art segmentation techniques.

  6. Statistical Modeling of Photovoltaic Reliability Using Accelerated Degradation Techniques (Poster)

    SciTech Connect

    Lee, J.; Elmore, R.; Jones, W.

    2011-02-01

    We introduce a cutting-edge life-testing technique, accelerated degradation testing (ADT), for PV reliability testing. The ADT technique is a cost-effective and flexible reliability testing method with multiple (MADT) and Step-Stress (SSADT) variants. In an environment with limited resources, including equipment (chambers), test units, and testing time, these techniques can provide statistically rigorous predictions of lifetime and other parameters of interest, such as failure rate, warranty time, mean time to failure, degradation rate, activation energy, acceleration factor, and upper limit level of stress. J-V characterization can be used for degradation data, and the generalized Eyring model can be used for the thermal-humidity stress condition. The SSADT model can be constructed based on the cumulative exposure model (CEM), which assumes that the remaining test units fail according to the cumulative distribution function of the current stress level regardless of the history of previous stress levels.
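    To give a feel for the extrapolation step, here is a hedged sketch that fits a simple Arrhenius rate model (a temperature-only special case of the Eyring family, with no humidity term) to hypothetical degradation rates at three stress temperatures; all numbers are invented.

```python
import numpy as np

k_B = 8.617e-5                               # Boltzmann constant (eV/K)

# Hypothetical degradation rates (%/1000 h) at accelerated temperatures.
T = np.array([340.0, 360.0, 380.0])          # stress levels (K)
rate = np.array([0.8, 2.1, 5.0])

# Arrhenius model: rate = A * exp(-Ea / (k_B * T))  =>  linear in 1/T.
slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
Ea = -slope * k_B
print(f"activation energy ~ {Ea:.2f} eV")

# Extrapolate to a 300 K use condition; estimate time to a 20% power drop.
rate_use = np.exp(intercept + slope / 300.0)
print(f"predicted lifetime ~ {20.0 / rate_use:.0f} thousand hours")
```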

  7. Combining heuristic and statistical techniques in landslide hazard assessments

    NASA Astrophysics Data System (ADS)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
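    The landslide index method mentioned above reduces to a log ratio of class density to overall map density. A minimal sketch on an invented toy raster (class labels and inventory cells are made up):

```python
import numpy as np

# Toy raster flattened to 1D: a slope class per cell and a binary
# landslide inventory (1 = mapped landslide); values are invented.
slope_class = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2, 2])
landslide   = np.array([0, 0, 0, 1, 0, 1, 1, 0, 1, 0])

overall_density = landslide.mean()
for c in np.unique(slope_class):
    in_class = slope_class == c
    density = landslide[in_class].mean()
    # Landslide-index weight: log ratio of class density to map density.
    w = np.log(density / overall_density) if density > 0 else -np.inf
    print(f"class {c}: density = {density:.2f}, weight = {w:+.2f}")
```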

  8. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

    Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
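    A minimal Monte Carlo rendition of the WCS idea: count coincidences between an observed line list and a species' laboratory wavelengths, then estimate how many coincidences arise by chance from randomly shifted lists. Both wavelength lists below are synthetic, and the tolerance is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative wavelength lists (Angstroms): observed stellar lines and
# laboratory lines of one candidate species.
observed = np.sort(rng.uniform(4000, 4500, 300))
lab = np.sort(rng.uniform(4000, 4500, 60))
tol = 0.05                                   # coincidence tolerance (A)

def hits(obs, lab_lines, tol):
    # A lab line scores a hit if any observed line lies within +/- tol.
    idx = np.searchsorted(obs, lab_lines)
    left = np.abs(obs[np.clip(idx - 1, 0, obs.size - 1)] - lab_lines)
    right = np.abs(obs[np.clip(idx, 0, obs.size - 1)] - lab_lines)
    return int(np.sum(np.minimum(left, right) <= tol))

actual = hits(observed, lab, tol)

# Null distribution: random wavelength offsets applied to the lab list.
null = [hits(observed, np.sort(lab + rng.uniform(-5, 5, lab.size)), tol)
        for _ in range(2000)]
p = np.mean(np.array(null) >= actual)
print(f"{actual} coincidences, empirical p = {p:.3f}")
```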

  9. Statistical Techniques for Assessing water‐quality effects of BMPs

    USGS Publications Warehouse

    Walker, John F.

    1994-01-01

    Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water-quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm-mass-transport data as a means of improving the ability to detect BMP effects on stream-water quality. Statistical techniques were applied to suspended-sediment records from three rural watersheds in Illinois for the period 1981-84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm-mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
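    The flow-adjustment-plus-change-test logic can be sketched as a regression of log storm load on log peak flow with a BMP indicator; the data, coefficients and noise levels below are synthetic stand-ins, not the Illinois records.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)

# Synthetic storm records: log sediment load driven by log peak flow,
# with a small step reduction after a hypothetical BMP date.
n = 120
log_flow = rng.normal(2.0, 0.5, n)
post_bmp = (np.arange(n) >= 60).astype(float)
log_load = 0.5 + 1.2 * log_flow - 0.15 * post_bmp + rng.normal(0, 0.3, n)

# The flow term removes flow-driven variability; the indicator then
# tests for a discrete change in the flow-adjusted loads.
X = sm.add_constant(np.column_stack([log_flow, post_bmp]))
fit = sm.OLS(log_load, X).fit()
print(fit.params)                            # intercept, flow slope, BMP step
print(f"BMP step p-value: {fit.pvalues[2]:.3f}")
```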

  10. Statistical optimisation techniques in fatigue signal editing problem

    SciTech Connect

    Nopiah, Z. M.; Osman, M. H.; Baharin, N.; Abdullah, S.

    2015-02-03

    Success in fatigue signal editing is determined by the level of length reduction achieved without compromising statistical constraints. A great reduction rate can be achieved by removing small amplitude cycles from the recorded signal. The long recorded signal sometimes renders the cycle-to-cycle editing process daunting. This has encouraged researchers to focus on the segment-based approach. This paper discusses the joint application of the Running Damage Extraction (RDE) technique and a single constrained Genetic Algorithm (GA) in fatigue signal editing optimisation. In the first section, the RDE technique is used to restructure and summarise the fatigue strain. This technique combines the overlapping window and fatigue strain-life models. It is designed to identify and isolate the fatigue events that exist in the variable amplitude strain data into different segments, whereby the retention of statistical parameters and the vibration energy are considered. In the second section, the fatigue data editing problem is formulated as a constrained single optimisation problem that can be solved using the GA method. The GA produces the shortest edited fatigue signal by selecting appropriate segments from a pool of labelled segments. Challenges arise due to constraints on the segment selection by deviation level over three signal properties, namely cumulative fatigue damage, root mean square and kurtosis values. Experimental results over several case studies show that the idea of solving fatigue signal editing within a framework of optimisation is effective and automatic, and that the GA is robust for constrained segment selection.

  11. Seasonal drought predictability in Portugal using statistical-dynamical techniques

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. F. S.; Pires, C. A. L.

    2016-08-01

    Atmospheric forecasting and predictability are important to promote adaptation and mitigation measures in order to minimize drought impacts. This study estimates hybrid (statistical-dynamical) long-range forecasts of the regional drought index SPI (3-months) over homogeneous regions of mainland Portugal, based on forecasts from the UKMO operational forecasting system, with lead-times up to 6 months. ERA-Interim reanalysis data are used for the purpose of building a set of SPI predictors integrating recent past information prior to the forecast launch. Then, the advantage of combining predictors with both dynamical and statistical background in the prediction of drought conditions at different lags is evaluated. A two-step hybridization procedure is performed, in which both forecasted and observed 500 hPa geopotential height fields are subjected to a PCA in order to use forecasted PCs and persistent PCs as predictors. A second hybridization step consists of a statistical/hybrid downscaling to the regional SPI, based on regression techniques, after the pre-selection of the statistically significant predictors. The SPI forecasts and the added value of combining dynamical and statistical methods are evaluated in cross-validation mode, using the R2 and binary event scores. Results are obtained for the four seasons, and it was found that winter is the most predictable season and that most of the predictive power is in the large-scale fields from past observations. The hybridization improves the downscaling based on the forecasted PCs, since they provide complementary (though modest) information beyond that of the persistent PCs. These findings provide clues about the predictability of the SPI, particularly in Portugal, and may contribute to the predictability of crop yields and to some guidance for users (such as farmers) in their decision-making process.
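    In skeleton form, the two-step hybridization is a PCA of the large-scale field followed by a regression of the regional SPI on the leading PCs. The sketch below uses random stand-in arrays for the geopotential fields and the SPI series; in the real scheme the predictor pool would mix UKMO-forecasted PCs with persistent (observed) PCs.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)

# Stand-ins for the real inputs: monthly 500 hPa geopotential fields
# (time x grid points) and a regional 3-month SPI series.
n_months, n_grid = 240, 500
z500 = rng.standard_normal((n_months, n_grid))
spi = 0.8 * z500[:, 0] - 0.5 * z500[:, 1] + 0.3 * rng.standard_normal(n_months)

# Step 1: compress the large-scale field into a few leading PCs.
pcs = PCA(n_components=5).fit_transform(z500)

# Step 2: regress the SPI on the PCs and score out of sample.
train, test = slice(0, 180), slice(180, None)
model = LinearRegression().fit(pcs[train], spi[train])
print(f"out-of-sample R^2 = {model.score(pcs[test], spi[test]):.2f}")
```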

  12. Comparison of Three Statistical Classification Techniques for Maser Identification

    NASA Astrophysics Data System (ADS)

    Manning, Ellen M.; Holland, Barbara R.; Ellingsen, Simon P.; Breen, Shari L.; Chen, Xi; Humphries, Melissa

    2016-04-01

    We applied three statistical classification techniques (linear discriminant analysis (LDA), logistic regression, and random forests) to three astronomical datasets associated with searches for interstellar masers. We compared the performance of these methods in identifying whether specific mid-infrared or millimetre continuum sources are likely to have associated interstellar masers. We also discuss the interpretability of the results of each classification technique. Non-parametric methods have the potential to make accurate predictions when there are complex relationships between critical parameters. We found that for the small datasets the parametric methods, logistic regression and LDA, performed best; for the largest dataset the non-parametric method of random forests performed with accuracy comparable to the parametric techniques, rather than offering any significant improvement. This suggests that, at least for the specific examples investigated here, the accuracy of the predictions obtained is not being limited by the use of parametric models. We also found that for LDA, transformation of the data to match a normal distribution led to a significant improvement in accuracy. The different classification techniques had significant overlap in their predictions; further astronomical observations will enable the accuracy of these predictions to be tested.
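    A comparison in the spirit of the paper is easy to sketch with scikit-learn; the synthetic, class-imbalanced catalogue below stands in for the maser datasets, and the feature counts are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-in for a maser search catalogue: continuum-source features and a
# binary "maser detected" label, with class imbalance.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           weights=[0.7, 0.3], random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "logistic": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:14s} CV accuracy = {acc:.3f}")
```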

  13. Bringing Advanced Computational Techniques to Energy Research

    SciTech Connect

    Mitchell, Julie C

    2012-11-17

    Please find attached our final technical report for the BACTER Institute award. BACTER was created as a graduate and postdoctoral training program for the advancement of computational biology applied to questions of relevance to bioenergy research.

  14. Aesthetic Lateral Canthoplasty Using Tarso-Conjunctival Advancement Technique.

    PubMed

    Lee, Eun Jung; Lew, Dae Hyun; Song, Seung Han; Lee, Myung Chul

    2017-01-01

    Reduced horizontal length of the palpebral fissure is a distinctive characteristic of Asian eyelids, and aesthetic lateral canthal lengthening techniques have been performed for refinement. The aim of this study is to describe a novel lateral canthoplasty using tarso-conjunctival advancement with a lid margin splitting procedure on the upper eyelids and to report the postoperative results. From December 2011 to June 2014, patients who underwent lateral canthoplasty using the tarso-conjunctival advancement procedure for aesthetic purposes were reviewed retrospectively. The predictor variables were grouped into demographic and operative categories. The primary outcome variables were the distances from the mid-pupillary line to the lateral canthus and the horizontal length of the palpebral aperture (distance from the medial to lateral canthus). Data analyses were performed using descriptive and univariate statistics; increases in the objective measurements were considered significant. Aesthetic appearance was also evaluated based on pre- and postoperative clinical photographs. A total of 45 patients were enrolled in this study. Both the distance from the mid-pupil to the lateral canthus (ΔDpupil-lateral; 2.78 ± 0.54 mm, P <0.05) and the palpebral aperture horizontal length (ΔDmedial-lateral; 2.93 ± 0.81 mm, P <0.05) increased significantly from the pre- to postoperative state. All the patients demonstrated satisfactory aesthetic results during the follow-up. The tarso-conjunctival advancement technique for lateral canthoplasty produced satisfactory aesthetic results with an enlarged palpebral aperture. Future research is required to fully delineate the risk of possible complications, including injury to the eyelashes and meibomian glands.

  15. The statistical analysis techniques to support the NGNP fuel performance experiments

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.

  16. The statistical analysis techniques to support the NGNP fuel performance experiments

    NASA Astrophysics Data System (ADS)

    Pham, Binh T.; Einerson, Jeffrey J.

    2013-10-01

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
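    Of the three techniques named in these two records, control charting is the simplest to sketch. Below, a minimal Shewhart individuals chart flags an injected drift in simulated thermocouple readings; the limits, temperatures and drift are illustrative, not AGR data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical thermocouple readings (C): in-control noise around 1050 C,
# then a slow drift imitating sensor deterioration.
temps = np.concatenate([rng.normal(1050, 3, 80),
                        1050 + np.linspace(0, 15, 20) + rng.normal(0, 3, 20)])

# Shewhart individuals chart: limits from an in-control baseline period.
baseline = temps[:50]
center, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

alarms = np.where((temps > ucl) | (temps < lcl))[0]
print(f"control limits: [{lcl:.1f}, {ucl:.1f}] C")
print(f"first out-of-control reading at index {alarms[0]}" if alarms.size
      else "no alarms")
```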

  17. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity, an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of the postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (it is a binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive to estimate weak influences; it supports the simultaneous analysis of multiple influences; and it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by a neural network model of leaky integrate-and-fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains.

  18. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including design of test structures and test masks to gather electrical or optical data, techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.

  19. Advanced crystallization techniques of 'solar grade' silicon

    NASA Astrophysics Data System (ADS)

    Gasparini, M.; Calligarich, C.; Rava, P.; Sardi, L.; Alessandri, M.; Redaelli, F.; Pizzini, S.

    Microstructural, electrical and photovoltaic characteristics of polycrystalline silicon solar cells fabricated from silicon ingots containing 5, 100 and 500 ppmw iron are reported and discussed. All silicon ingots were grown by the directional solidification technique in graphite or special quartz molds and doped intentionally with iron, in order to evaluate the potential of the D.S. technique when employed with solar silicon feedstocks. Results indicate that structural breakdown limits the amount of the ingot which is usable for solar cell fabrication, but also that efficiencies in excess of 10 percent are obtained using the 'good' region of the ingot.

  20. Advances in laparoscopic urologic surgery techniques

    PubMed Central

    Abdul-Muhsin, Haidar M.; Humphreys, Mitchell R.

    2016-01-01

    The last two decades witnessed the inception and exponential implementation of key technological advancements in laparoscopic urology. While some of these technologies thrived and became part of daily practice, others are still hindered by major challenges. This review was conducted through a comprehensive literature search in order to highlight some of the most promising technologies in laparoscopic visualization, augmented reality, and insufflation. Additionally, this review will provide an update regarding the current status of single-site and natural orifice surgery in urology. PMID:27134743

  1. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing disease resistance data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The mixed linear model (MLM) is currently among the most advanced and flexible statistical modeling techniques and its use in tackling problems in plant pathology has begun surfacing in the literature. The longitudinal MLM is a multivariate extension that handles repeatedly measured data, such as r...

  2. [Advanced online search techniques and dedicated search engines for physicians].

    PubMed

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  3. Source apportionment advances using polar plots of bivariate correlation and regression statistics

    NASA Astrophysics Data System (ADS)

    Grange, Stuart K.; Lewis, Alastair C.; Carslaw, David C.

    2016-11-01

    This paper outlines the development of enhanced bivariate polar plots that allow the concentrations of two pollutants to be compared using pair-wise statistics for exploring the sources of atmospheric pollutants. The new method combines bivariate polar plots, which provide source characteristic information, with pair-wise statistics that provide information on how two pollutants are related to one another. The pair-wise statistics implemented include the weighted Pearson correlation and the slope from two linear regression methods. The development uses a Gaussian kernel to locally weight the statistical calculations on a wind speed-direction surface, together with variable scaling. Example applications of the enhanced polar plots are presented using routine air quality data for two monitoring sites in London, United Kingdom for a single year (2013). The London examples demonstrate that the combination of bivariate polar plots, correlation, and regression techniques can offer considerable insight into air pollution source characteristics, which would be missed if only scatter plots and mean polar plots were used for analysis. Specifically, using correlation and slopes as pair-wise statistics, long-range transport processes were isolated and black carbon (BC) contributions to PM2.5 for a kerbside monitoring location were quantified. Wider applications and future advancements are also discussed.
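    The core computation, a locally weighted pair-wise statistic on the wind speed-direction surface, can be sketched for a single surface cell; the pollutant series, kernel bandwidths and cell centre below are synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic hourly data: wind speed/direction and two pollutant series.
n = 5000
ws = rng.gamma(2.0, 2.0, n)                    # wind speed (m/s)
wd = rng.uniform(0, 360, n)                    # wind direction (deg)
bc = rng.lognormal(0.0, 0.5, n)                # black carbon
pm25 = 0.6 * bc + rng.lognormal(0.5, 0.4, n)   # PM2.5, partly BC-driven

def weighted_pearson(x, y, w):
    mx, my = np.average(x, weights=w), np.average(y, weights=w)
    cov = np.average((x - mx) * (y - my), weights=w)
    return cov / np.sqrt(np.average((x - mx) ** 2, weights=w) *
                         np.average((y - my) ** 2, weights=w))

# One surface cell: Gaussian kernel weights centred on (ws0, wd0).
ws0, wd0, h_ws, h_wd = 4.0, 90.0, 1.0, 20.0
dwd = (wd - wd0 + 180) % 360 - 180             # wrapped direction difference
w = np.exp(-0.5 * ((ws - ws0) / h_ws) ** 2 - 0.5 * (dwd / h_wd) ** 2)
print(f"local r(PM2.5, BC) = {weighted_pearson(pm25, bc, w):.2f}")
```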

  4. Metrology optical power budgeting in SIM using statistical analysis techniques

    NASA Astrophysics Data System (ADS)

    Kuan, Gary M.

    2008-07-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be book-kept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.

  5. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arc second resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be book-kept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
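    The budgeting approach lends itself to a Monte Carlo sketch: sum per-element losses in dB, each with an uncertainty spread, and read off the confidence of staying within a required margin. The loss values below are placeholders, not SIM budget numbers.

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical per-element losses (dB) with uncertainty spreads, e.g.
# coupling, material attenuation, misalignment, diffraction.
nominal = np.array([0.5, 1.2, 0.3, 2.0, 0.8])
spread  = np.array([0.1, 0.3, 0.1, 0.5, 0.2])

trials = 100_000
total_db = rng.normal(nominal, spread, (trials, nominal.size)).sum(axis=1)

required_db = 6.5    # maximum loss tolerated by the metrology requirement
confidence = np.mean(total_db <= required_db)
print(f"P(total loss <= {required_db} dB) = {confidence:.3f}")
```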

  6. Advanced analysis techniques for uranium assay

    SciTech Connect

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.; Beard, C. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and also on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  7. Recent advances in DNA sequencing techniques

    NASA Astrophysics Data System (ADS)

    Singh, Rama Shankar

    2013-06-01

    Successful mapping of the draft human genome in 2001 and the more recent mapping of the human microbiome genome in 2012 have relied heavily on the parallel processing of second-generation/Next Generation Sequencing (NGS) DNA machines, at a cost of several million dollars and long computer processing times. These have been mainly biochemical approaches. Here a system analysis approach is used to review these techniques by identifying the requirements, specifications, test methods, error estimates, repeatability, reliability and trends in cost reduction. The first-generation, NGS and third-generation Single Molecule Real Time (SMRT) detection sequencing methods are reviewed. Based on the National Human Genome Research Institute (NHGRI) data, the achieved cost reductions of 1.5 times per year from Sep. 2001 to July 2007, 7 times per year from Oct. 2007 to Apr. 2010, and 2.5 times per year from July 2010 to Jan. 2012 are discussed.

  8. Diagnostics of nonlocal plasmas: advanced techniques

    NASA Astrophysics Data System (ADS)

    Mustafaev, Alexander; Grabovskiy, Artiom; Strakhova, Anastasiya; Soukhomlinov, Vladimir

    2014-10-01

    This talk generalizes our recent results, obtained in different directions of plasma diagnostics. First, the method of the flat single-sided probe, based on expansion of the electron velocity distribution function (EVDF) in a series of Legendre polynomials. It will be demonstrated that a flat probe, oriented at different angles with respect to the discharge axis, allows determination of the full EVDF in nonlocal plasmas. It is also shown that a cylindrical probe is unable to determine the full EVDF. We propose a solution to this problem by combined use of the kinetic Boltzmann equation and experimental probe data. Second, magnetic diagnostics. This method is implemented in a Knudsen diode with surface ionization of atoms (KDSI) and is based on measurements of the magnetic characteristics of the KDSI in the presence of a transverse magnetic field. Using magnetic diagnostics we can investigate a wide range of plasma processes: from electron scattering cross-sections to plasma-surface interactions. Third, a noncontact diagnostic method for direct measurements of the EVDF in remote plasma objects by a combination of the flat single-sided probe technique and the magnetic polarization Hanle method.

  9. Innovative Tools Advance Revolutionary Weld Technique

    NASA Technical Reports Server (NTRS)

    2009-01-01

    The iconic, orange external tank of the space shuttle launch system not only contains the fuel used by the shuttle's main engines during liftoff but also comprises the shuttle's backbone, supporting the space shuttle orbiter and solid rocket boosters. Given the tank's structural importance and the extreme forces (7.8 million pounds of thrust load) and temperatures it encounters during launch, the welds used to construct the tank must be highly reliable. Variable polarity plasma arc welding, developed for manufacturing the external tank and later employed for building the International Space Station, was until 1994 the best process for joining the aluminum alloys used during construction. That year, Marshall Space Flight Center engineers began experimenting with a relatively new welding technique called friction stir welding (FSW), developed in 1991 by The Welding Institute, of Cambridge, England. FSW differs from traditional fusion welding in that it is a solid-state welding technique, using frictional heat and motion to join structural components without actually melting any of the material. The weld is created by a shouldered pin tool that is plunged into the seam of the materials to be joined. The tool traverses the line while rotating at high speeds, generating friction that heats and softens but does not melt the metal. (The heat produced approaches about 80 percent of the metal's melting temperature.) The pin tool's rotation crushes and stirs the plasticized metal, extruding it along the seam as the tool moves forward. The material cools and consolidates, resulting in a weld with superior mechanical properties as compared to those of fusion welds. The innovative FSW technology promises a number of attractive benefits. Because the welded materials are not melted, many of the undesirables associated with fusion welding (porosity, cracking, shrinkage, and distortion of the weld) are minimized or avoided. The process is more energy efficient, safe

  10. APPLICATION OF STATISTICAL SMOOTHING TECHNIQUES TO INERTIAL NAVIGATION.

    DTIC Science & Technology

    (INERTIAL NAVIGATION, CORRECTIONS), ERRORS, STATISTICAL ANALYSIS, NUMERICAL ANALYSIS, APPROXIMATION (MATHEMATICS), GYROSCOPES, ACCELEROMETERS, DRIFT, SHIPBOARD, VELOCITY, TIME, NOISE, MATRICES (MATHEMATICS).

  11. Advances in gamma titanium aluminides and their manufacturing techniques

    NASA Astrophysics Data System (ADS)

    Kothari, Kunal; Radhakrishnan, Ramachandran; Wereley, Norman M.

    2012-11-01

    Gamma titanium aluminides display attractive properties for high temperature applications. For over a decade beginning in the 1990s, the attractive properties of titanium aluminides were outweighed by difficulties encountered in processing and machining at room temperature. But advances in manufacturing technologies, a deeper understanding of titanium aluminide microstructure and deformation mechanisms, and advances in micro-alloying have led to the production of gamma titanium aluminide sheets. An in-depth review of key advances in gamma titanium aluminides is presented, including microstructure, deformation mechanisms, and alloy development. Traditional manufacturing techniques such as ingot metallurgy and investment casting are reviewed, and advances via powder metallurgy based manufacturing techniques are discussed. Finally, manufacturing challenges facing gamma titanium aluminides, as well as avenues to overcome them, are discussed.

  12. 75 FR 44015 - Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... COMMISSION Certain Semiconductor Products Made by Advanced Lithography Techniques and Products Containing... importation of certain semiconductor products made by advanced lithography techniques and products containing... certain semiconductor products made by advanced lithography techniques or products containing same...

  13. Advanced liner-cooling techniques for gas turbine combustors

    NASA Technical Reports Server (NTRS)

    Norgren, C. T.; Riddlebaugh, S. M.

    1985-01-01

    Component research for advanced small gas turbine engines is currently underway at the NASA Lewis Research Center. As part of this program, a basic reverse-flow combustor geometry was maintained while different advanced liner wall cooling techniques were investigated. Performance and liner cooling effectiveness of the experimental combustor configuration featuring counter-flow film-cooled panels are presented and compared with those of two previously reported combustors featuring splash film-cooled liner walls and transpiration-cooled liner walls (Lamilloy).

  14. Correlation techniques and measurements of wave-height statistics

    NASA Technical Reports Server (NTRS)

    Guthart, H.; Taylor, W. C.; Graf, K. A.; Douglas, D. G.

    1972-01-01

    Statistical measurements of wave height fluctuations have been made in a wind wave tank. The power spectral density function of temporal wave height fluctuations evidenced second-harmonic components and an f to the minus 5th power law decay beyond the second harmonic. The observations of second harmonic effects agreed very well with a theoretical prediction. From the wave statistics, surface drift currents were inferred and compared to experimental measurements with satisfactory agreement. Measurements were made of the two dimensional correlation coefficient at 15 deg increments in angle with respect to the wind vector. An estimate of the two-dimensional spatial power spectral density function was also made.
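    The reported f^-5 decay can be illustrated with a Welch PSD estimate on a synthetic wave-height record whose high-frequency tail is shaped to that law; every parameter below is invented, not a tank measurement.

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(7)

# Synthetic record: a 2 Hz fundamental, a weak second harmonic, and a
# broadband tail whose PSD falls off as f^-5.
fs, dur = 100.0, 600.0
t = np.arange(0, dur, 1 / fs)
freqs = np.fft.rfftfreq(t.size, 1 / fs)
amp = np.zeros_like(freqs)
mask = freqs > 4.0
amp[mask] = freqs[mask] ** -2.5              # |FFT| ~ f^-2.5 => PSD ~ f^-5
phases = rng.uniform(0, 2 * np.pi, freqs.size)
tail = np.fft.irfft(amp * np.exp(1j * phases), t.size)
eta = np.sin(2 * np.pi * 2 * t) + 0.15 * np.sin(2 * np.pi * 4 * t) + 40 * tail

# Welch PSD estimate, then a log-log slope fit above the second harmonic.
f, pxx = welch(eta, fs=fs, nperseg=4096)
band = (f > 5.0) & (f < 20.0)
slope = np.polyfit(np.log(f[band]), np.log(pxx[band]), 1)[0]
print(f"spectral slope above the 2nd harmonic: {slope:.1f} (expect ~ -5)")
```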

  15. Understanding Summary Statistics and Graphical Techniques to Compare Michael Jordan versus LeBron James

    ERIC Educational Resources Information Center

    Williams, Immanuel James; Williams, Kelley Kim

    2016-01-01

    Understanding summary statistics and graphical techniques are building blocks to comprehending concepts beyond basic statistics. It's known that motivated students perform better in school. Using examples that students find engaging allows them to understand the concepts at a deeper level.

  16. Statistical inference to advance network models in epidemiology.

    PubMed

    Welch, David; Bansal, Shweta; Hunter, David R

    2011-03-01

    Contact networks are playing an increasingly important role in the study of epidemiology. Most of the existing work in this area has focused on considering the effect of underlying network structure on epidemic dynamics by using tools from probability theory and computer simulation. This work has provided much insight on the role that heterogeneity in host contact patterns plays on infectious disease dynamics. Despite the important understanding afforded by the probability and simulation paradigm, this approach does not directly address important questions about the structure of contact networks such as what is the best network model for a particular mode of disease transmission, how parameter values of a given model should be estimated, or how precisely the data allow us to estimate these parameter values. We argue that these questions are best answered within a statistical framework and discuss the role of statistical inference in estimating contact networks from epidemiological data.

  17. Integration of Advanced Statistical Analysis Tools and Geophysical Modeling

    DTIC Science & Technology

    2010-12-01

    later in this section. 2) San Luis Obispo . Extracted features were also provided for MTADS EM61, MTADS magnetics, EM61 cart, and TEMTADS data sets from...subsequent training of statistical classifiers using these features. Results of discrimination studies at Camp Sibert and San Luis Obispo have shown...Comparison of classification performance Figures 10 through 13 show receiver operating characteristics for data sets acquired at San Luis Obispo . Subplot

  18. A statistical calibration technique for thermochromic liquid crystals

    NASA Astrophysics Data System (ADS)

    Roesgen, T.; Totaro, R.

    2002-09-01

    A novel approach is proposed for the color calibration of thermochromic liquid crystals. Based on a statistical interpretation, a linear transform of the native (R, G, B) values can replace the customary hue mapping. The transform coefficients are computed through a proper orthogonal decomposition, providing complete data decorrelation and optimal information compression.
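    A hedged sketch of the idea: take a proper orthogonal decomposition (here via SVD) of centred (R, G, B) calibration data, then fit temperature linearly on the dominant mode in place of the customary hue mapping. The colour-temperature relationship below is mocked up, not measured TLC data.

```python
import numpy as np

rng = np.random.default_rng(8)

# Mock calibration set: (R, G, B) readings at known temperatures (C).
T = np.linspace(30.0, 40.0, 60)
rgb = np.column_stack([120 + 8 * T + rng.normal(0, 5, T.size),
                       200 - 3 * T + rng.normal(0, 5, T.size),
                       90 + 1 * T + rng.normal(0, 5, T.size)])

# POD of the centred RGB data: the rotation decorrelates the channels
# and compresses the thermal signal into few modes.
mean = rgb.mean(axis=0)
u, s, vt = np.linalg.svd(rgb - mean, full_matrices=False)
scores = (rgb - mean) @ vt.T                 # decorrelated coordinates

# Linear fit of temperature on the dominant mode.
coef = np.polyfit(scores[:, 0], T, 1)
T_hat = np.polyval(coef, scores[:, 0])
print(f"RMS calibration error: {np.sqrt(np.mean((T_hat - T) ** 2)):.2f} C")
```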

  19. Wafer hot spot identification through advanced photomask characterization techniques

    NASA Astrophysics Data System (ADS)

    Choi, Yohan; Green, Michael; McMurran, Jeff; Ham, Young; Lin, Howard; Lan, Andy; Yang, Richer; Lung, Mike

    2016-10-01

    As device manufacturers progress through advanced technology nodes, limitations in standard 1-dimensional (1D) mask Critical Dimension (CD) metrics are becoming apparent. Historically, 1D metrics such as Mean to Target (MTT) and CD Uniformity (CDU) have been adequate for end users to evaluate and predict the mask impact on the wafer process. However, the wafer lithographer's process margin is shrinking at advanced nodes to a point that the classical mask CD metrics are no longer adequate to gauge the mask contribution to wafer process error. For example, wafer CDU error at advanced nodes is impacted by mask factors such as 3-dimensional (3D) effects and mask pattern fidelity on subresolution assist features (SRAFs) used in Optical Proximity Correction (OPC) models of ever-increasing complexity. These items are not quantifiable with the 1D metrology techniques of today. Likewise, the mask maker needs advanced characterization methods in order to optimize the mask process to meet the wafer lithographer's needs. These advanced characterization metrics are what is needed to harmonize mask and wafer processes for enhanced wafer hot spot analysis. In this paper, we study advanced mask pattern characterization techniques and their correlation with modeled wafer performance.

  20. Bringing The Web Down to Size: Advanced Search Techniques.

    ERIC Educational Resources Information Center

    Huber, Joe; Miley, Donna

    1997-01-01

    Examines advanced Internet search techniques, focusing on six search engines. Includes a chart comparison of nine search features: "include two words," "exclude one of two words," "exclude mature audience content," "two adjacent words," "exact match," "contains first and neither of two following…

  1. Advanced Marketing Core Curriculum. Test Items and Assessment Techniques.

    ERIC Educational Resources Information Center

    Smith, Clifton L.; And Others

    This document contains duties and tasks, multiple-choice test items, and other assessment techniques for Missouri's advanced marketing core curriculum. The core curriculum begins with a list of 13 suggested textbook resources. Next, nine duties with their associated tasks are given. Under each task appears one or more citations to appropriate…

  2. Genioglossus muscle advancement: A modification of the conventional technique.

    PubMed

    García Vega, José Ramón; de la Plata, María Mancha; Galindo, Néstor; Navarro, Miriam; Díez, Daniel; Láncara, Fernando

    2014-04-01

    Obstructive sleep apnoea syndrome (OSAS) is a pathophysiologic condition associated with fragmented sleep and arousals caused by nocturnal mechanical obstruction of the upper airway. This results in behavioural derangements, such as excessive daytime sleepiness and fatigue, and pathophysiologic derangements that cause morbidity and mortality, including hypertension, arrhythmias, myocardial infarction, stroke and sudden death. Genioglossus advancement is a proven technique for the treatment of mild to moderate obstructive sleep apnoea syndrome, relieving airway obstruction at the hypopharyngeal level. In this article, we report a modification of the conventional genioglossus advancement described by Riley and Powell. The modification we describe replaces the bone segment at the mandibular basal bone rather than at the mid area of the symphysis. This results in a linear movement that allows a greater advancement and avoids rotation of the genioglossus muscle. We also describe the advantages of the surgical technique, such as greater effectiveness, stability, a more pleasing aesthetic outcome and a reduction of potential complications.

  3. Statistical Analysis of speckle noise reduction techniques for echocardiographic Images

    NASA Astrophysics Data System (ADS)

    Saini, Kalpana; Dewal, M. L.; Rohit, Manojkumar

    2011-12-01

    Echocardiography is a safe, easy and fast technology for diagnosing cardiac diseases. As in other ultrasound images, these images also contain speckle noise. In some cases this speckle noise is useful, such as in motion detection, but in general noise removal is required for better analysis of the image and a proper diagnosis. Different adaptive and anisotropic filters are included in the statistical analysis. Statistical parameters such as Signal-to-Noise Ratio (SNR), Peak Signal-to-Noise Ratio (PSNR), and Root Mean Square Error (RMSE) are calculated for performance measurement. One more important aspect is that blurring may occur during speckle noise removal, so a filter should ideally preserve or enhance edges while removing noise.
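    The three performance figures mentioned in the abstract follow standard textbook definitions and are easy to reproduce; a minimal sketch follows, assuming 8-bit images (the function and variable names are illustrative, not from the paper).

    ```python
    import numpy as np

    def speckle_filter_metrics(reference, filtered, peak=255.0):
        """SNR, PSNR and RMSE between a reference frame and a despeckled one."""
        ref = np.asarray(reference, dtype=float)
        out = np.asarray(filtered, dtype=float)
        err = ref - out
        mse = np.mean(err ** 2)
        rmse = np.sqrt(mse)                                         # Root Mean Square Error
        snr = 10.0 * np.log10(np.sum(ref ** 2) / np.sum(err ** 2))  # signal vs. error power, dB
        psnr = 10.0 * np.log10(peak ** 2 / mse)                     # peak intensity vs. MSE, dB
        return snr, psnr, rmse
    ```

    Under the criterion stated above, a filter that raises SNR/PSNR while leaving edge regions sharp (for example, checked against an edge map) would be the preferred one.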

  4. The nano-mechanical signature of Ultra High Performance Concrete by statistical nanoindentation techniques

    SciTech Connect

    Sorelli, Luca; Constantinides, Georgios; Ulm, Franz-Josef; Toutlemonde, Francois

    2008-12-15

    Advances in engineering the microstructure of cementitious composites have led to the development of fiber reinforced Ultra High Performance Concretes (UHPC). The scope of this paper is twofold: first, to characterize the nano-mechanical properties of the phases governing the UHPC microstructure by means of a novel statistical nanoindentation technique; then, to upscale those nanoscale properties, by means of continuum micromechanics, to the macroscopic scale of engineering applications. In particular, a combined investigation of nanoindentation, scanning electron microscopy (SEM) and X-ray diffraction (XRD) indicates that the fiber-matrix transition zone is relatively defect free. On this basis, a four-level multiscale model with defect-free interfaces allows the composite stiffness to be accurately determined from the measured nano-mechanical properties. Besides evidencing the dominant role of high-density calcium silicate hydrates and the stiffening effect of residual clinker, the suggested model may become a useful tool for further optimizing cement-based engineered composites.

  5. Advanced techniques in safeguarding a conditioning facility for spent fuel

    SciTech Connect

    Rudolf, K.; Weh, R.

    1992-01-01

    Although reprocessing continues to be the main factor in the waste management of nuclear reactors, the alternative of direct final disposal is currently being developed to the level of industrial applications, based on an agreement between the heads of the federal government and the federal states of Germany. Thus, the Konrad and Gorleben sites are being studied as potential final repositories as is the pilot conditioning facility (PKA) under construction. Discussions on the application of safeguards measures have led to the drafting of an approach that will cover the entire back end of the fuel cycle. The conditioning of fuel prior to direct final disposal represents one element in the overall approach. A modern facility equipped with advanced technology, PKA is a pilot plant with regard to conditioning techniques as well as to safeguards. Therefore, the PKA safeguards approach is expected to facilitate future industrial applications of the conditioning procedure. This cannot be satisfactorily implemented without advanced safeguards techniques. The level of development of the safeguards techniques varies. While advanced camera and seal systems are basically available, the other techniques and methods still require research and development. Feasibility studies and equipment development are geared to providing applicable safeguards techniques in time for commissioning of the PKA.

  6. An overview of statistical decomposition techniques applied to complex systems

    PubMed Central

    Tuncer, Yalcin; Tanik, Murat M.; Allison, David B.

    2009-01-01

    The current state of the art in applied decomposition techniques is summarized within a comparative uniform framework. These techniques are classified by the parametric or information theoretic approaches they adopt. An underlying structural model common to all parametric approaches is outlined. The nature and premises of a typical information theoretic approach are stressed. Some possible application patterns for an information theoretic approach are illustrated. Composition is distinguished from decomposition by pointing out that the former is not a simple reversal of the latter. From the standpoint of application to complex systems, a general evaluation is provided. PMID:19724659

  7. Statistical techniques for the characterization of partially observed epidemics.

    SciTech Connect

    Safta, Cosmin; Ray, Jaideep; Crary, David; Cheng, Karen

    2010-11-01

    These techniques appear promising for constructing an automated detect-and-characterize capability for epidemics: working off biosurveillance data, they provide information on the particular ongoing outbreak. Potential uses include crisis management, planning and resource allocation; the parameter estimation capability is ideal for providing input parameters (index cases, time of infection, infection rate) to an agent-based model. Non-communicable diseases are easier to characterize than communicable ones: a small anthrax attack can be characterized well with 7-10 days of post-detection data, plague takes longer, and large attacks are the easiest to characterize.

  8. Using Classroom Assessment Techniques in an Introductory Statistics Class

    ERIC Educational Resources Information Center

    Goldstein, Gary S.

    2007-01-01

    College instructors often provide students with only summative evaluations of their work, typically in the form of exam scores or paper grades. Formative evaluation, such as classroom assessment techniques (CATs), are rarer in higher education and provide an ongoing evaluation of students' progress. In this article, the author summarizes the use…

  9. The statistical multifragmentation model: Origins and recent advances

    NASA Astrophysics Data System (ADS)

    Donangelo, R.; Souza, S. R.

    2016-07-01

    We review the Statistical Multifragmentation Model (SMM), which considers a generalization of the liquid-drop model for hot nuclei and allows one to calculate thermodynamic quantities characterizing the nuclear ensemble at the disassembly stage. We show how to determine probabilities of definite partitions of finite nuclei and how to determine, through Monte Carlo calculations, observables such as the caloric curve, multiplicity distributions, and heat capacity, among others. Some experimental measurements of the caloric curve confirmed SMM predictions made over 10 years earlier, leading to a surge of interest in the model. However, the experimental determination of the fragmentation temperatures relies on the yields of different isotopic species, which were not correctly calculated in the schematic liquid-drop picture employed in the SMM. This led to a series of improvements in the SMM, in particular to a more careful choice of nuclear masses and energy densities, especially for the lighter nuclei. With these improvements the SMM is able to make quantitative determinations of isotope production. We show the application of the SMM to the production of exotic nuclei through multifragmentation. These preliminary calculations demonstrate the need for a careful choice of the system size and excitation energy to attain maximum yields.

  10. Some Bayesian statistical techniques useful in estimating frequency and density

    USGS Publications Warehouse

    Johnson, D.H.

    1977-01-01

    This paper presents some elementary applications of Bayesian statistics to problems faced by wildlife biologists. Bayesian confidence limits for frequency of occurrence are shown to be generally superior to classical confidence limits. Population density can be estimated from frequency data if the species is sparsely distributed relative to the size of the sample plot. For other situations, limits are developed based on the normal distribution and prior knowledge that the density is non-negative, which ensures that the lower confidence limit is non-negative. Conditions are described under which Bayesian confidence limits are superior to those calculated with classical methods; examples are also given on how prior knowledge of the density can be used to sharpen inferences drawn from a new sample.
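    For the frequency-of-occurrence case, Bayesian limits can be sketched with a conjugate Beta-binomial model: with a Beta(a, b) prior on the occupancy probability and x occupied plots out of n, the posterior is Beta(a + x, b + n - x). The uniform prior below and the SciPy dependency are assumptions for illustration, not necessarily the paper's choices.

    ```python
    from scipy.stats import beta

    def bayesian_frequency_limits(occupied, total, level=0.95, a=1.0, b=1.0):
        """Equal-tailed Bayesian credible limits for frequency of occurrence."""
        tail = (1.0 - level) / 2.0
        posterior = beta(a + occupied, b + total - occupied)  # conjugate update
        return posterior.ppf(tail), posterior.ppf(1.0 - tail)

    # 7 occupied plots out of 50 sampled
    print(bayesian_frequency_limits(7, 50))  # roughly (0.06, 0.26) with a uniform prior
    ```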

  11. Advanced Packaging Materials and Techniques for High Power TR Module: Standard Flight vs. Advanced Packaging

    NASA Technical Reports Server (NTRS)

    Hoffman, James Patrick; Del Castillo, Linda; Miller, Jennifer; Jenabi, Masud; Hunter, Donald; Birur, Gajanana

    2011-01-01

    The higher output power densities required of modern radar architectures, such as the proposed DESDynI [Deformation, Ecosystem Structure, and Dynamics of Ice] SAR [Synthetic Aperture Radar] Instrument (or DSI), require increasingly dense high power electronics. Enabling these higher power densities, while maintaining or even improving hardware reliability, requires advances in integrating advanced thermal packaging technologies into radar transmit/receive (TR) modules. New materials and techniques have been studied and compared to standard technologies.

  12. Geochemical portray of the Pacific Ridge: New isotopic data and statistical techniques

    NASA Astrophysics Data System (ADS)

    Hamelin, Cédric; Dosso, Laure; Hanan, Barry B.; Moreira, Manuel; Kositsky, Andrew P.; Thomas, Marion Y.

    2011-02-01

    Samples collected during the PACANTARCTIC 2 cruise fill a sampling gap from 53°S to 41°S along the Pacific Antarctic Ridge (PAR). Analysis of Sr, Nd, Pb, Hf, and He isotope compositions of these new samples is shown together with published data from 66°S to 53°S and from the EPR. Recent advances in analytical mass spectrometry techniques have generated a spectacular increase in the number of multidimensional isotopic data for oceanic basalts. Working with such multidimensional datasets calls for a new approach to data interpretation, preferably based on statistical analysis techniques. Principal Component Analysis (PCA) is a powerful mathematical tool for studying datasets of this type. The purpose of PCA is to reduce the number of dimensions by keeping only those characteristics that contribute most to the variance. Using this technique, it becomes possible to have a statistical picture of the geochemical variations along the entire Pacific Ridge from 70°S to 10°S. The incomplete sampling of the ridge previously led to the identification of a large-scale division of the south Pacific mantle at the latitude of Easter Island. The PCA method applied here to the completed dataset reveals a different geochemical profile. Along the Pacific Ridge, a large-scale bell-shaped variation with an extremum at about 38°S latitude is interpreted as a progressive change in the geochemical characteristics of the depleted matrix of the mantle. This Pacific Isotopic Bump (PIB) is also noticeable in the along-axis variation of the He isotopic ratio. The linear correlation observed between He and heavy radiogenic isotopes, together with the result of the PCA calculation, suggests that the large-scale variation is unrelated to plume-ridge interactions in the area and should rather be attributed to the partial melting of a marble-cake assemblage.
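    A minimal PCA sketch in the spirit described above (dimension reduction by retaining the directions of largest variance). The standardisation choice and all names are assumptions for illustration, not the authors' exact procedure.

    ```python
    import numpy as np

    def pca(data, n_components=2):
        """PCA of a (samples x variables) matrix, e.g. basalts x isotope ratios.

        Centre and standardise each variable, then use the SVD: the right
        singular vectors are the principal components, and the squared
        singular values give each component's share of the total variance.
        """
        X = np.asarray(data, dtype=float)
        X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        explained = s ** 2 / np.sum(s ** 2)
        scores = U[:, :n_components] * s[:n_components]   # sample coordinates
        return scores, Vt[:n_components], explained[:n_components]
    ```

    Plotting the first component's scores against latitude would reveal the kind of large-scale along-axis variation reported above.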

  13. PREFACE: Advanced many-body and statistical methods in mesoscopic systems

    NASA Astrophysics Data System (ADS)

    Anghel, Dragos Victor; Sabin Delion, Doru; Sorin Paraoanu, Gheorghe

    2012-02-01

    It has increasingly been realized in recent times that the borders separating various subfields of physics are largely artificial. This is the case for nanoscale physics, physics of lower-dimensional systems and nuclear physics, where the advanced techniques of many-body theory developed in recent times could provide a unifying framework for these disciplines under the general name of mesoscopic physics. Other fields, such as quantum optics and quantum information, are increasingly using related methods. The 6-day conference 'Advanced many-body and statistical methods in mesoscopic systems' that took place in Constanta, Romania, between 27 June and 2 July 2011 was, we believe, a successful attempt at bridging an impressive list of topical research areas: foundations of quantum physics, equilibrium and non-equilibrium quantum statistics/fractional statistics, quantum transport, phases and phase transitions in mesoscopic systems/superfluidity and superconductivity, quantum electromechanical systems, quantum dissipation, dephasing, noise and decoherence, quantum information, spin systems and their dynamics, fundamental symmetries in mesoscopic systems, phase transitions, exactly solvable methods for mesoscopic systems, various extensions of the random phase approximation, open quantum systems, clustering, decay and fission modes and systematic versus random behaviour of nuclear spectra. This event brought together participants from seventeen countries and five continents. Each of the participants brought considerable expertise in his/her field of research and, at the same time, was exposed to the newest results and methods coming from the other, seemingly remote, disciplines. The talks touched on subjects that are at the forefront of topical research areas and we hope that the resulting cross-fertilization of ideas will lead to new, interesting results from which everybody will benefit. We are grateful for the financial and organizational support from IFIN-HH, Ovidius

  14. Computing aerodynamic sound using advanced statistical turbulence theories

    NASA Technical Reports Server (NTRS)

    Hecht, A. M.; Teske, M. E.; Bilanin, A. J.

    1981-01-01

    It is noted that the calculation of turbulence-generated aerodynamic sound requires knowledge of the spatial and temporal variation of Q sub ij (xi sub k, tau), the two-point, two-time turbulent velocity correlations. A technique is presented to obtain an approximate form of these correlations based on closure of the Reynolds stress equations by modeling of higher order terms. The governing equations for Q sub ij are first developed for a general flow. The case of homogeneous, stationary turbulence in a unidirectional constant shear mean flow is then assumed. The required closure form for Q sub ij is selected which is capable of qualitatively reproducing experimentally observed behavior. This form contains separation time dependent scale factors as parameters and depends explicitly on spatial separation. The approximate forms of Q sub ij are used in the differential equations and integral moments are taken over the spatial domain. The velocity correlations are used in the Lighthill theory of aerodynamic sound by assuming normal joint probability.

  15. Technology development of fabrication techniques for advanced solar dynamic concentrators

    NASA Technical Reports Server (NTRS)

    Richter, Scott W.

    1991-01-01

    The objective of the advanced concentrator program is to develop the technology that will lead to lightweight, highly reflective, accurate, scaleable, and long lived space solar dynamic concentrators. The advanced concentrator program encompasses new and innovative concepts, fabrication techniques, materials selection, and simulated space environmental testing. Fabrication techniques include methods of fabricating the substrates and coating substrate surfaces to produce high quality optical surfaces, acceptable for further coating with vapor deposited optical films. The selected materials to obtain a high quality optical surface include microsheet glass and Eccocoat EP-3 epoxy, with DC-93-500 selected as a candidate silicone adhesive and levelizing layer. The following procedures are defined: cutting, cleaning, forming, and bonding microsheet glass. Procedures are also defined for surface cleaning, and EP-3 epoxy application. The results and analyses from atomic oxygen and thermal cycling tests are used to determine the effects of orbital conditions in a space environment.

  16. Regularized versus non-regularized statistical reconstruction techniques

    NASA Astrophysics Data System (ADS)

    Denisova, N. V.

    2011-08-01

    An important feature of positron emission tomography (PET) and single photon emission computed tomography (SPECT) is the stochastic property of real clinical data. Statistical algorithms such as ordered subset-expectation maximization (OSEM) and maximum a posteriori (MAP) are a direct consequence of the stochastic nature of the data. The principal difference between these two algorithms is that OSEM is a non-regularized approach, while MAP is a regularized algorithm. From the theoretical point of view, reconstruction problems belong to the class of ill-posed problems and should be treated using regularization. Regularization introduces an additional unknown regularization parameter into the reconstruction procedure as compared with non-regularized algorithms. However, a comparison of the non-regularized OSEM and regularized MAP algorithms with fixed regularization parameters has shown very minor differences between the reconstructions. This problem is analyzed in the present paper. To improve the reconstruction quality, a method of local regularization is proposed, based on a spatially adaptive regularization parameter. The MAP algorithm with local regularization was tested in reconstruction of the Hoffman brain phantom.
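    For orientation, the core of the non-regularized approach can be written in a few lines as the MLEM multiplicative update (OSEM applies the same update to subsets of the data). This is a generic textbook sketch with a dense toy system matrix, not the paper's implementation; a MAP variant would add a penalty term derived from the prior to the denominator of the update.

    ```python
    import numpy as np

    def mlem(A, y, n_iter=200):
        """Maximum-likelihood EM reconstruction (OSEM with a single subset).

        A : (m, n) system matrix mapping image voxels to expected counts.
        y : (m,) measured projection counts.
        """
        x = np.ones(A.shape[1])                  # start from a flat image
        sens = A.T @ np.ones(A.shape[0])         # sensitivity image
        for _ in range(n_iter):
            proj = A @ x                         # forward projection
            ratio = y / np.maximum(proj, 1e-12)  # measured / estimated counts
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
        return x

    # toy 1-D example: 3-voxel image, 4 detector bins, Poisson counts
    rng = np.random.default_rng(0)
    A = rng.random((4, 3))
    x_true = np.array([1.0, 4.0, 2.0])
    y = rng.poisson(A @ x_true)
    print(mlem(A, y))
    ```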

  17. Advanced thermal management techniques for space power electronics

    NASA Astrophysics Data System (ADS)

    Reyes, Angel Samuel

    1992-01-01

    Modern electronic systems used in space must be reliable and efficient with thermal management unaffected by outer space constraints. Current thermal management techniques are not sufficient for the increasing waste heat dissipation of novel electronic technologies. Many advanced thermal management techniques have been developed in recent years that have application in high power electronic systems. The benefits and limitations of emerging cooling technologies are discussed. These technologies include: liquid pumped devices, mechanically pumped two-phase cooling, capillary pumped evaporative cooling, and thermoelectric devices. Currently, liquid pumped devices offer the most promising alternative for electronics thermal control.

  18. Data Compression Techniques for Advanced Space Transportation Systems

    NASA Technical Reports Server (NTRS)

    Bradley, William G.

    1998-01-01

    Advanced space transportation systems, including vehicle state of health systems, will produce large amounts of data which must be stored on board the vehicle and/or transmitted to the ground and stored. The cost of storage or transmission of the data could be reduced if the number of bits required to represent the data is reduced by the use of data compression techniques. Most of the work done in this study was rather generic and could apply to many data compression systems, but the first application area to be considered was launch vehicle state of health telemetry systems. Both lossless and lossy compression techniques were considered in this study.

  19. Advanced techniques for monitoring human tolerance to positive Gz accelerations

    NASA Technical Reports Server (NTRS)

    Pelligra, R.; Sandler, H.; Rositano, S.; Skrettingland, K.; Mancini, R.

    1973-01-01

    Tolerance to positive g accelerations was measured in ten normal male subjects using both standard and advanced techniques. In addition to routine electrocardiogram, heart rate, respiratory rate, and infrared television, monitoring techniques during acceleration exposure included measurement of peripheral vision loss, noninvasive temporal, brachial, and/or radial arterial blood flow, and automatic measurement of indirect systolic and diastolic blood pressure at 60-sec intervals. Although brachial and radial arterial flow measurements reflected significant cardiovascular changes during and after acceleration, they were inconsistent indices of the onset of grayout or blackout. Temporal arterial blood flow, however, showed a high correlation with subjective peripheral light loss.

  20. Three-dimensional hybrid grid generation using advancing front techniques

    NASA Technical Reports Server (NTRS)

    Steinbrenner, John P.; Noack, Ralph W.

    1995-01-01

    A new 3-dimensional hybrid grid generation technique has been developed, based on ideas of advancing fronts for both structured and unstructured grids. In this approach, structured grids are first generated independently around individual components of the geometry. Fronts are initialized on these structured grids and advanced outward so that new cells are extracted directly from the structured grids. Employing typical advancing front techniques, cells are rejected if they intersect the existing front or fail other criteria. When no more viable structured cells exist, further cells are advanced in an unstructured manner to close off the overall domain, resulting in a grid of 'hybrid' form. There are two primary advantages to the hybrid formulation. First, generating blocks with limited regard to topology eliminates the bottleneck encountered when a multiple block system is used to fully encapsulate a domain. Individual blocks may be generated free of external constraints, which will significantly reduce the generation time. Secondly, grid points near the body (presumably with high aspect ratio) will still maintain a structured (non-triangular or tetrahedral) character, thereby maximizing grid quality and solution accuracy near the surface.

  1. Strategies and advanced techniques for marine pollution studies

    SciTech Connect

    Giam, C.S.; Dou, H.J.M.

    1986-01-01

    Here is a review of strategies and techniques for evaluating marine pollution by hazardous organic compounds. Geochemical considerations, such as the relationship between the inputs (atmospheric and estuarine transport) and the outputs (sedimentation and degradation), guide the decision on appropriate approaches to pollution monitoring in the marine environment. The latest instrumental methods and standard protocols for analysis of organic compounds are presented, as well as advances in interpretation and correlation of data made possible by the accessibility of commercial databases.

  2. Interval versions of statistical techniques with applications to environmental analysis, bioinformatics, and privacy in statistical databases

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Starks, Scott A.; Xiang, Gang; Beck, Jan; Kandathi, Raj; Nayak, Asis; Ferson, Scott; Hajagos, Janos

    2007-02-01

    In many areas of science and engineering, it is desirable to estimate statistical characteristics (mean, variance, covariance, etc.) under interval uncertainty. For example, we may want to use the measured values x(t) of a pollution level in a lake at different moments of time to estimate the average pollution level; however, we do not know the exact values x(t)--e.g., if one of the measurement results is 0, this simply means that the actual (unknown) value of x(t) can be anywhere between 0 and the detection limit (DL). We must, therefore, modify the existing statistical algorithms to process such interval data. Such a modification is also necessary to process data from statistical databases, where, in order to maintain privacy, we only keep interval ranges instead of the actual numeric data (e.g., a salary range instead of the actual salary). Most resulting computational problems are NP-hard--which means, crudely speaking, that in general, no computationally efficient algorithm can solve all particular cases of the corresponding problem. In this paper, we overview practical situations in which computationally efficient algorithms exist: e.g., situations when measurements are very accurate, or when all the measurements are done with one (or few) instruments. As a case study, we consider a practical problem from bioinformatics: to discover the genetic difference between the cancer cells and the healthy cells, we must process the measurements results and find the concentrations c and h of a given gene in cancer and in healthy cells. This is a particular case of a general situation in which, to estimate states or parameters which are not directly accessible by measurements, we must solve a system of equations in which coefficients are only known with interval uncertainty. We show that in general, this problem is NP-hard, and we describe new efficient algorithms for solving this problem in practically important situations.
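    For the mean, the detection-limit example above has a simple, efficiently computable answer, since the extremes of the mean are reached at the interval endpoints (it is the variance and other characteristics that become NP-hard in general). A minimal sketch under that assumption, with illustrative names:

    ```python
    def interval_mean_bounds(measurements, detection_limit):
        """Bounds on the sample mean when non-detects are only known to lie
        in [0, detection_limit]; `None` marks a non-detect."""
        lo = [0.0 if m is None else m for m in measurements]
        hi = [detection_limit if m is None else m for m in measurements]
        n = len(measurements)
        return sum(lo) / n, sum(hi) / n

    # pollution levels with two readings below the detection limit of 0.1
    print(interval_mean_bounds([0.8, None, 1.2, None, 0.5], detection_limit=0.1))
    # -> (0.5, 0.54): the true average lies somewhere in this interval
    ```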

  3. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    PubMed

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining the geometrical, morphological and signature data and subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to be responsible for the best identification ability and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.

  4. Advanced endoscopic ultrasound management techniques for preneoplastic pancreatic cystic lesions

    PubMed Central

    Arshad, Hafiz Muhammad Sharjeel; Bharmal, Sheila; Duman, Deniz Guney; Liangpunsakul, Suthat; Turner, Brian G

    2017-01-01

    Pancreatic cystic lesions can be benign, premalignant or malignant. The recent increase in detection and tremendous clinical variability of pancreatic cysts has presented a significant therapeutic challenge to physicians. Mucinous cystic neoplasms are of particular interest given their known malignant potential. This review article provides a brief but comprehensive review of premalignant pancreatic cystic lesions with advanced endoscopic ultrasound (EUS) management approaches. A comprehensive literature search was performed using PubMed, Cochrane, OVID and EMBASE databases. Preneoplastic pancreatic cystic lesions include mucinous cystadenoma and intraductal papillary mucinous neoplasm. The 2012 International Sendai Guidelines guide physicians in their management of pancreatic cystic lesions. Some of the advanced EUS management techniques include ethanol ablation, chemotherapeutic (paclitaxel) ablation, radiofrequency ablation and cryotherapy. In the future, EUS-guided injections of drug-eluting beads and neodymium:yttrium aluminum garnet (Nd:YAG) laser ablation are predicted to be an integral part of EUS-guided management techniques. In summary, the International Sendai Consensus Guidelines should be used to make decisions regarding the management of pancreatic cystic lesions. Advanced EUS techniques are proving extremely beneficial in management, especially in those patients who are at high surgical risk. PMID:27574295

  5. Systematic and Statistical Errors Associated with Nuclear Decay Constant Measurements Using the Counting Technique

    NASA Astrophysics Data System (ADS)

    Koltick, David; Wang, Haoyu; Liu, Shih-Chieh; Heim, Jordan; Nistor, Jonathan

    2016-03-01

    Typical nuclear decay constants are measured at an accuracy level of 10^-2. Numerous applications, such as tests of unconventional theories, dating of materials, and long-term inventory evolution, require decay-constant accuracies at the level of 10^-4 to 10^-5. The statistical and systematic errors associated with precision measurements of decays using the counting technique are presented. Precision requires high count rates, which introduces time-dependent dead-time and pile-up corrections; an approach to overcome these issues by continuously recording the detector current is presented. Other systematic corrections include the time-dependent dead time due to background radiation, control of target motion and radiation flight-path variation due to environmental conditions, and the time-dependent effects caused by scattered events. The incorporation of blind experimental techniques can help make a measurement independent of past results. A spectrometer design and data analysis that can accomplish these goals are reviewed. The author would like to thank TechSource, Inc. and Advanced Physics Technologies, LLC. for their support in this work.
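    The dead-time correction mentioned above can be sketched with the standard non-paralyzable detector model (a textbook assumption here, and exactly the kind of rate-dependent correction the current-recording approach is designed to avoid):

    ```python
    def true_rate_nonparalyzable(measured_rate, dead_time):
        """Dead-time-corrected count rate for a non-paralyzable detector.

        Standard model: with measured rate m (counts/s) and dead time tau (s)
        per recorded event, the true rate is n = m / (1 - m * tau).
        """
        loss = 1.0 - measured_rate * dead_time
        if loss <= 0.0:
            raise ValueError("measured rate saturates the detector")
        return measured_rate / loss

    # 9.5e5 counts/s observed with 100 ns dead time -> ~1.05e6 counts/s true
    print(f"{true_rate_nonparalyzable(9.5e5, 100e-9):.3e}")
    ```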

  6. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  7. Advanced aeroservoelastic stabilization techniques for hypersonic flight vehicles

    NASA Technical Reports Server (NTRS)

    Chan, Samuel Y.; Cheng, Peter Y.; Myers, Thomas T.; Klyde, David H.; Magdaleno, Raymond E.; Mcruer, Duane T.

    1992-01-01

    Advanced high performance vehicles, including Single-Stage-To-Orbit (SSTO) hypersonic flight vehicles, that are statically unstable require higher-bandwidth flight control systems to compensate for the instability, resulting in interactions between the flight control system, the engine/propulsion dynamics, and the low frequency structural modes. Military specifications, such as MIL-F-9490D and MIL-F-87242, tend to limit treatment of structural modes to conventional gain stabilization techniques. The conventional gain stabilization techniques, however, introduce low frequency effective time delays which can be troublesome from a flying qualities standpoint. These time delays can be alleviated by appropriate blending of gain and phase stabilization techniques (referred to as Hybrid Phase Stabilization or HPS) for the low frequency structural modes. The potential of using HPS for compensating structural mode interaction was previously explored. It was shown that effective time delay was significantly reduced with the use of HPS; however, the HPS design was seen to have greater residual response than a conventional gain-stabilized design. Additional work performed to advance and refine the HPS design procedure, to further develop residual response metrics as a basis for alternative structural stability specifications, and to develop strategies for validating HPS design and specification concepts in manned simulation is presented. Stabilization design sensitivity to structural uncertainties and aircraft-centered requirements are also assessed.

  8. Testing aspects of advanced coherent electron cooling technique

    SciTech Connect

    Litvinenko, V.; Jing, Y.; Pinayev, I.; Wang, G.; Samulyak, R.; Ratner, D.

    2015-05-03

    An advanced version of Coherent-electron Cooling (CeC), based on the micro-bunching instability, has been proposed. This approach promises a significant increase in the bandwidth of the CeC system and, therefore, a significant shortening of cooling time in high-energy hadron colliders. In this paper we present our plans for simulating and testing the key aspects of this proposed technique using the set-up of the coherent-electron-cooling proof-of-principle experiment at BNL.

  9. Measuring the Attitudes of Nigerian and Ghanaian Teachers towards Statistical Techniques in Geography Teaching.

    ERIC Educational Resources Information Center

    Okunrotifa, P. O.

    1980-01-01

    The present paper reports data concerning the attitudes of geography teachers in Nigeria and Ghana towards the use of statistical techniques in university geography teaching, and suggests ways of influencing current feelings in a positive direction. (Author/BW)

  10. Recent Advances in Techniques for Hyperspectral Image Processing

    NASA Technical Reports Server (NTRS)

    Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; Marconcini, Mattia; Tilton, James C.; Trianni, Giovanna

    2009-01-01

    Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data, and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.

  11. Advanced IMCW Lidar Techniques for ASCENDS CO2 Column Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel; Lin, Bing; Nehrir, Amin; Harrison, Fenton; Obland, Michael

    2015-04-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques to extend the resolution of the lidar beyond the limit implied by the bandwidth of the modulation.
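    The ranging role of the code's sharp auto-correlation can be illustrated in a few lines. This toy sketch (not the flight processing chain; all names and parameters are illustrative) recovers the delay of a weak surface return, in chips, by circular cross-correlation against a +/-1 pseudo-noise code:

    ```python
    import numpy as np

    def bpsk_range_bin(pn_code, received):
        """Locate a return by correlating the received samples against the
        transmitted BPSK pseudo-noise code; the correlation peaks at the
        round-trip delay because the code's autocorrelation is sharp."""
        code = np.asarray(pn_code, dtype=float)
        corr = np.array([np.dot(np.roll(code, k), received)
                         for k in range(len(code))])
        return int(np.argmax(corr)), corr

    rng = np.random.default_rng(0)
    code = rng.choice([-1.0, 1.0], size=512)                    # BPSK chip sequence
    echo = 0.3 * np.roll(code, 37) + 0.1 * rng.standard_normal(512)
    delay, _ = bpsk_range_bin(code, echo)
    print(delay)  # expected: 37 chips
    ```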

  12. Reducing Anxiety and Increasing Self-Efficacy within an Advanced Graduate Psychology Statistics Course

    ERIC Educational Resources Information Center

    McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley

    2015-01-01

    In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…

  13. Using the Student Research Project to Integrate Macroeconomics and Statistics in an Advanced Cost Accounting Course

    ERIC Educational Resources Information Center

    Hassan, Mahamood M.; Schwartz, Bill N.

    2014-01-01

    This paper discusses a student research project that is part of an advanced cost accounting class. The project emphasizes active learning, integrates cost accounting with macroeconomics and statistics by "learning by doing" using real world data. Students analyze sales data for a publicly listed company by focusing on the company's…

  14. Measuring the microbiome: perspectives on advances in DNA-based techniques for exploring microbial life

    PubMed Central

    Bunge, John; Gilbert, Jack A.; Moore, Jason H.

    2012-01-01

    This article reviews recent advances in ‘microbiome studies’: molecular, statistical and graphical techniques to explore and quantify how microbial organisms affect our environments and ourselves given recent increases in sequencing technology. Microbiome studies are moving beyond mere inventories of specific ecosystems to quantifications of community diversity and descriptions of their ecological function. We review the last 24 months of progress in this sort of research, and anticipate where the next 2 years will take us. We hope that bioinformaticians will find this a helpful springboard for new collaborations with microbiologists. PMID:22308073

  15. Advanced Techniques for Removal of Retrievable Inferior Vena Cava Filters

    SciTech Connect

    Iliescu, Bogdan; Haskal, Ziv J.

    2012-08-15

    Inferior vena cava (IVC) filters have proven valuable for the prevention of primary or recurrent pulmonary embolism in selected patients with or at high risk for venous thromboembolic disease. Their use has become commonplace, and the numbers implanted increase annually. During the last 3 years, in the United States, the percentage of annually placed optional filters, i.e., filters that can remain as permanent filters or potentially be retrieved, has consistently exceeded that of permanent filters. In parallel, the complications of long- or short-term filtration have become increasingly evident to physicians, regulatory agencies, and the public. Most filter removals are uneventful, with a high degree of success. When routine filter-retrieval techniques prove unsuccessful, progressively more advanced tools and skill sets must be used to enhance filter-retrieval success. These techniques should be used with caution to avoid damage to the filter or cava during IVC retrieval. This review describes the complex techniques for filter retrieval, including the use of additional snares, guidewires, angioplasty balloons, and mechanical and thermal approaches, and illustrates their specific applications.

  16. Advanced Synchrotron Techniques at High Pressure Collaborative Access Team (HPCAT)

    NASA Astrophysics Data System (ADS)

    Shen, G.; Sinogeikin, S. V.; Chow, P.; Kono, Y.; Meng, Y.; Park, C.; Popov, D.; Rod, E.; Smith, J.; Xiao, Y.; Mao, H.

    2012-12-01

    High Pressure Collaborative Access Team (HPCAT) is dedicated to advancing cutting-edge, multidisciplinary, high-pressure science and technology using synchrotron radiation at Sector 16 of the Advanced Photon Source (APS) of Argonne National Laboratory. At HPCAT an array of novel x-ray diffraction and spectroscopic techniques has been integrated with high pressure and extreme temperature instrumentation for studies of structure and materials properties at extreme conditions. HPCAT consists of four active independent beamlines performing a large range of experiments at extreme conditions. The 16BM-B beamline is dedicated to energy dispersive and white Laue X-ray diffraction. The majority of experiments are performed with a Paris-Edinburgh large volume press (to 7 GPa and 2500 K) and include amorphous and liquid structure measurement, white beam radiography, and elastic sound wave velocity measurement of amorphous solid materials, with viscosity and density measurements of liquids under development. 16BM-D is a monochromatic diffraction beamline for powder and single crystal diffraction at high pressure and high (resistive heating) / low (cryostats) temperature. Additional capabilities include high-resolution powder diffraction and x-ray absorption near edge structure (XANES) spectroscopy. The insertion device beamline of HPCAT has two undulators in canted mode (operating independently) and LN-cooled Si monochromators capable of providing a large range of energies. 16ID-B is a microdiffraction beamline mainly focusing on high-pressure powder and single crystal diffraction in the DAC at high temperatures (double-sided laser heating and resistive heating) and low temperatures (various cryostats). The modern instrumentation allows high-quality diffraction at megabar pressures from light elements, fast experiments with pulsed laser heating, fast dynamic experiments with a Pilatus detector, and so on. The 16ID-D beamline is dedicated to x-ray scattering and spectroscopy research.

  17. Advanced Fibre Bragg Grating and Microfibre Bragg Grating Fabrication Techniques

    NASA Astrophysics Data System (ADS)

    Chung, Kit Man

    Fibre Bragg gratings (FBGs) have become a very important technology for communication systems and fibre optic sensing. Typically, FBGs are less than 10 mm long and are fabricated using fused silica uniform phase masks, which become more expensive for longer lengths or non-uniform pitch. Generally, interfering UV laser beams are employed to make long or complex FBGs, and this technique introduces critical precision and control issues. In this work, we demonstrate an advanced FBG fabrication system that enables the writing of long and complex gratings in optical fibres with virtually any apodisation profile, local phase and Bragg wavelength, using a novel optical design in which the incident angles of two UV beams onto an optical fibre can be adjusted simultaneously by moving just one optical component, instead of the two optics employed in earlier configurations, to vary the grating pitch. The key advantage of the grating fabrication system is that complex gratings can be fabricated by controlling the linear movements of two translation stages. In addition to the study of this advanced grating fabrication technique, we also focus on the inscription of FBGs written in optical fibres with a cladding diameter of several tens of microns. Fabrication of microfibres was investigated using a sophisticated tapering method. We also proposed a simple but practical technique to filter out the higher order modes reflected from the FBG written in microfibres via a linear taper region, while the fundamental mode re-couples to the core. By using this technique, reflection from the microfibre Bragg grating (MFBG) can be effectively single mode, simplifying the demultiplexing and demodulation processes. The MFBG exhibits high sensitivity to contact force, and an MFBG-based force sensor was also constructed and tested to investigate its suitability for use as an invasive surgery device. Performance of the contact force sensor packaged in a conforming elastomer material compares favourably to one

  18. Recalibration of CFS seasonal precipitation forecasts using statistical techniques for bias correction

    NASA Astrophysics Data System (ADS)

    Bliefernicht, Jan; Laux, Patrick; Siegmund, Jonatan; Kunstmann, Harald

    2013-04-01

    The development and application of statistical techniques with a special focus on the recalibration of meteorological or hydrological forecasts, to eliminate the bias between forecasts and observations, has received a great deal of attention in recent years. One reason is that retrospective forecasts are nowadays available, which allows for proper training and validation of these kinds of techniques. The objective of this presentation is to propose several statistical techniques with different degrees of complexity and to evaluate and compare their performance for a recalibration of seasonal ensemble forecasts of monthly precipitation. The techniques selected in this study range from straightforward normal-score and quantile-quantile transformations and local scaling to more sophisticated and novel statistical techniques, such as the copula-based methodology recently proposed by Laux et al. (2011). The seasonal forecasts are derived from the Climate Forecast System Version 2. This version is the current coupled ocean-atmosphere general circulation model of the U.S. National Centers for Environmental Prediction, used to provide forecasts up to nine months ahead. The CFS precipitation forecasts are compared to monthly precipitation observations from the Global Precipitation Climatology Centre. The statistical techniques are tested for semi-arid regions in West Africa and the Indian subcontinent, focusing on large-scale river basins such as the Ganges and the Volta basin. In both regions seasonal precipitation forecasts are a crucial source of information for the prediction of hydro-meteorological extremes, in particular droughts. The evaluation is done using retrospective CFS ensemble forecasts from 1982 to 2009. The training of the statistical techniques is done in cross-validation mode. The outcome of this investigation illustrates large systematic differences between forecasts and observations, in particular for the Volta basin in West Africa. The selection of straightforward
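    Of the straightforward techniques listed, quantile-quantile mapping is easy to sketch: a forecast value is replaced by the observed value occupying the same quantile in the training climatology. Names and the toy data below are illustrative assumptions, not from the study.

    ```python
    import numpy as np

    def quantile_mapping(forecast, train_fcst, train_obs):
        """Empirical quantile-quantile bias correction of one forecast value."""
        train_fcst = np.sort(np.asarray(train_fcst, dtype=float))
        train_obs = np.sort(np.asarray(train_obs, dtype=float))
        # non-exceedance probability of the forecast in the training forecasts
        p = np.searchsorted(train_fcst, forecast) / len(train_fcst)
        p = np.clip(p, 0.0, 1.0)
        # observation with the same empirical quantile
        return np.quantile(train_obs, p)

    # toy example: a model with a ~20 mm/month wet bias
    fcst_clim = 20.0 + 80.0 * np.random.default_rng(1).random(200)
    obs_clim = fcst_clim - 20.0
    print(quantile_mapping(75.0, fcst_clim, obs_clim))  # roughly 55
    ```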

  19. Advanced Cytologic Techniques for the Detection of Malignant Pancreatobiliary Strictures

    PubMed Central

    Moreno Luna, Laura E.; Kipp, Benjamin; Halling, Kevin C.; Sebo, Thomas J.; Kremers., Walter K.; Roberts, Lewis R.; Barr Fritcher, Emily G.; Levy, Michael J.; Gores, Gregory J.

    2006-01-01

    Background & Aims Two advanced cytologic techniques for detecting aneuploidy, digital image analysis (DIA) and fluorescence in situ hybridization (FISH) have recently been developed to help identify malignant pancreatobiliary strictures. The aim of this study was to assess the clinical utility of cytology, DIA, and FISH for the identification of malignant pancreatobiliary strictures. Methods Brush cytologic specimens from 233 consecutive patients undergoing ERCP for pancreatobiliary strictures were examined by all three techniques. Strictures were stratified as proximal (n=33) or distal (n=114) based on whether they occurred above or below the cystic duct, respectively. Strictures in patients with PSC (n=86) were analyzed separately. Results Despite the stratification, the performances of the tests were similar. Routine cytology has a low sensitivity (5–20%) but 100% specificity. Because of the high specificity for cytology, we assessed the performance of the other tests when routine cytology was negative. In this clinical context, FISH had an increased sensitivity (35–60%) when assessing for chromosomal gains (polysomy) while preserving the specificity of cytology. The sensitivity and specificity of DIA was intermediate as compared to routine cytology and FISH, but was additive to FISH values demonstrating only trisomy of chromosome 7 or chromosome 3. Conclusions These findings suggest that FISH and DIA increase the sensitivity for the diagnosis of malignant pancreatobiliary tract strictures over that obtained by conventional cytology while maintaining an acceptable specificity. PMID:17030177

  20. Lightweight and Statistical Techniques for Petascale Debugging: Correctness on Petascale Systems (CoPS) Preliminary Report

    SciTech Connect

    de Supinski, B R; Miller, B P; Liblit, B

    2011-09-13

    Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two

  1. Recent Advances in Seismic Wavefront Tracking Techniques and Their Applications

    NASA Astrophysics Data System (ADS)

    Sambridge, M.; Rawlinson, N.; Hauser, J.

    2007-12-01

    In observational seismology, wavefront tracking techniques are becoming increasingly popular as a means of predicting two point traveltimes and their associated paths. Possible applications include reflection migration, earthquake relocation and seismic tomography at a wide variety of scales. Compared with traditional ray based techniques such as shooting and bending, wavefront tracking has the advantage of locating traveltimes between the source and every point in the medium; in many cases, improved efficiency and robustness; and greater potential for tracking multiple arrivals. In this presentation, two wavefront tracking techniques will be considered: the so-called Fast Marching Method (FMM), and a wavefront construction (WFC) scheme. Over the last several years, FMM has become a mature technique in seismology, with a number of improvements to the underlying theory and the release of software tools that allow it to be used in a variety of applications. At its core, FMM is a grid based solver that implicitly tracks a propagating wavefront by seeking finite difference solutions to the eikonal equation along an evolving narrow band. Recent developments include the use of source grid refinement to improve accuracy, the introduction of a multi-stage scheme to allow reflections and refractions to be tracked in layered media, and extension to spherical coordinates. Implementation of these ideas has led to a number of different applications, including teleseismic tomography, wide-angle reflection and refraction tomography, earthquake relocation, and ambient noise imaging using surface waves. The WFC scheme represents the wavefront surface as a set of points in 6-D phase space; these points are advanced in time using local initial value ray tracing in order to form a sequence of wavefront surfaces that fill the model volume. Surface refinement and simplification techniques inspired by recent developments in computer graphics are used to maintain a fixed density of nodes
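    As an orientation to the FMM half of the presentation, the following sketch computes first-arrival traveltimes on a 2-D grid by solving the eikonal equation. It relies on the third-party scikit-fmm package (an assumed dependency, not the authors' code), and the two-layer velocity model is purely illustrative.

    ```python
    import numpy as np
    import skfmm  # third-party scikit-fmm package (assumed dependency)

    # First-arrival traveltimes from a point source on a 2-D velocity grid,
    # via the Fast Marching Method's upwind finite-difference eikonal solver.
    n = 201
    speed = np.full((n, n), 2.0)        # background velocity, km/s
    speed[100:, :] = 4.0                # faster lower layer

    phi = np.ones((n, n))               # zero contour of phi marks the source
    phi[20, 100] = -1.0                 # point source near the top

    dx = 0.1                            # grid spacing, km
    ttime = skfmm.travel_time(phi, speed, dx=dx)   # traveltime field, s
    print(ttime[180, 100])              # first arrival at a deep receiver
    ```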

  2. Basics, common errors and essentials of statistical tools and techniques in anesthesiology research.

    PubMed

    Bajwa, Sukhminder Jit Singh

    2015-01-01

    The statistical portion is a vital component of any research study. Research methodology and the application of statistical tools and techniques have evolved over the years and have significantly helped research activities throughout the globe. Accurate results and inferences are not possible without proper validation with various statistical tools and tests. Evidence-based anesthesia research and practice have to incorporate statistical tools in the methodology right from the planning stage of the study itself. Though the medical fraternity is well acquainted with the significance of statistics in research, there is a lack of in-depth knowledge about the various statistical concepts and principles among the majority of researchers. The clinical impact and consequences can be serious, as incorrect analyses, conclusions, and false results may construct an artificial platform on which future research activities are replicated. The present tutorial is an attempt to make anesthesiologists aware of the various aspects of statistical methods used in evidence-based research and also to highlight the common areas where the maximum number of statistical errors is committed, so as to adopt better statistical practices.

  3. A survey of image processing techniques and statistics for ballistic specimens in forensic science.

    PubMed

    Gerules, George; Bhatia, Sanjiv K; Jackson, Daniel E

    2013-06-01

    This paper provides a review of recent investigations on the image processing techniques used to match spent bullets and cartridge cases. It is also, to a lesser extent, a review of the statistical methods that are used to judge the uniqueness of fired bullets and spent cartridge cases. We review 2D and 3D imaging techniques as well as many of the algorithms used to match these images. We also provide a discussion of the strengths and weaknesses of these methods for both image matching and statistical uniqueness. The goal of this paper is to be a reference for investigators and scientists working in this field.

  4. A review of analytical techniques for gait data. Part 1: Fuzzy, statistical and fractal methods.

    PubMed

    Chau, T

    2001-02-01

    In recent years, several new approaches to gait data analysis have been explored, including fuzzy systems, multivariate statistical techniques and fractal dynamics. Through a critical survey of recent gait studies, this paper reviews the potential of these methods to strengthen the gait laboratory's analytical arsenal. It is found that time-honoured multivariate statistical methods are the most widely applied and understood. Although initially promising, fuzzy and fractal analyses of gait data remain largely unknown and their full potential is yet to be realized. The trend towards fusing multiple techniques in a given analysis means that additional research into the application of these two methods will benefit gait data analysis.

  5. Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?

    PubMed Central

    Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie

    2012-01-01

    A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regards to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
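    For readers wanting to do better than the participants described above, here is a minimal sketch of routine pre-test checks before an independent-samples t-test. The function names and threshold are illustrative assumptions; as the study itself suggests, formal tests of assumptions are debatable, and diagnostic plots are often preferable.

    ```python
    import numpy as np
    from scipy import stats

    def check_ttest_assumptions(a, b, alpha=0.05):
        """Shapiro-Wilk normality check per group, Levene check for equal
        variances: the assumptions underlying the classical t-procedure."""
        for name, x in (("group a", a), ("group b", b)):
            w, p = stats.shapiro(x)
            print(f"{name}: Shapiro-Wilk W={w:.3f}, p={p:.3f} "
                  f"({'ok' if p > alpha else 'non-normal?'})")
        w, p = stats.levene(a, b)
        print(f"equal variances: Levene W={w:.3f}, p={p:.3f} "
              f"({'ok' if p > alpha else 'heteroscedastic?'})")

    rng = np.random.default_rng(2)
    check_ttest_assumptions(rng.normal(0, 1, 40), rng.normal(0.3, 2, 40))
    ```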

  6. Accuracy Evaluation of a Mobile Mapping System with Advanced Statistical Methods

    NASA Astrophysics Data System (ADS)

    Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A.

    2015-02-01

    This paper discusses a methodology to evaluate the precision and accuracy of a commercial Mobile Mapping System (MMS) with advanced statistical methods. So far, the metric potential of this emerging mapping technology has been studied in only a few papers, which generally assume that errors follow a normal distribution. In fact, this hypothesis should be carefully verified in advance, in order to test how well classic Gaussian statistics can adapt to datasets that are usually affected by asymmetrical gross errors. The workflow adopted in this study relies on a Gaussian assessment, followed by an outlier filtering process. Finally, non-parametric statistical models are applied in order to achieve a robust estimation of the error dispersion. Among the different MMSs available on the market, the latest solution provided by RIEGL is tested here, i.e. the VMX-450 Mobile Laser Scanning System. The test area is the historic city centre of Trento (Italy), selected in order to assess the system's performance in dealing with a challenging historic urban scenario. Reference measures are derived from photogrammetric and Terrestrial Laser Scanning (TLS) surveys. All datasets show a large lack of symmetry, which leads to the conclusion that the standard normal parameters are not adequate to assess this type of data. The use of non-normal statistics thus gives a more appropriate description of the data and yields results that meet the quoted a priori errors.
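
    As a rough illustration of the contrast drawn above between classic Gaussian parameters and non-parametric estimators, the following Python (NumPy) sketch compares the mean/standard deviation with the median, the normalized MAD and a 68.3% absolute-deviation quantile on synthetic, heavy-tailed errors; the error values stand in for MMS checkpoint residuals and are purely hypothetical.

        # Minimal sketch: Gaussian vs robust, order-statistics dispersion estimates.
        import numpy as np

        errors = np.random.default_rng(0).standard_t(df=3, size=5000) * 0.02  # heavy-tailed residuals [m]

        mean, std = errors.mean(), errors.std(ddof=1)        # classic Gaussian parameters
        median = np.median(errors)                           # robust central tendency
        nmad = 1.4826 * np.median(np.abs(errors - median))   # normalized MAD, ~sigma for normal data
        p68 = np.percentile(np.abs(errors - median), 68.27)  # robust 1-sigma analogue

        print(f"mean={mean:.4f}  std={std:.4f}")
        print(f"median={median:.4f}  NMAD={nmad:.4f}  68.3% quantile={p68:.4f}")

    When the normalized MAD falls well below the standard deviation, asymmetrical gross errors are inflating the Gaussian estimate, which is exactly the situation the non-normal assessment is designed to detect.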

  7. Advances in Poly(4-aminodiphenylaniline) Nanofibers Preparation by Electrospinning Technique.

    PubMed

    Della Pina, C; Busacca, C; Frontera, P; Antonucci, P L; Scarpino, L A; Sironi, A; Falletta, E

    2016-05-01

    Polyaniline (PANI) nanofibers are drawing a great deal of interest from academia and industry due to their multiple applications, especially in the biomedical field. PANI nanofibers were successfully electrospun for the first time by MacDiarmid and co-workers at the beginning of the millennium, and since then many efforts have been addressed to improving their quality. However, traditional PANI prepared from the aniline monomer shows some drawbacks, such as the presence of toxic (i.e., benzidine) and inorganic (salts and metals) co-products, which complicate polymer post-treatment, and low solubility in common organic solvents, which makes it hard to process by the electrospinning technique. Some industrial sectors, such as the medical and biomedical ones, need to employ materials free from toxic and polluting species. In this regard, the oxidative polymerization of N-(4-aminophenyl)aniline, the aniline dimer, to produce poly(4-aminodiphenylaniline), P4ADA, a kind of PANI, represents an innovative alternative to the traditional synthesis, because the obtained polymer is free from carcinogenic and/or polluting co-products and is, moreover, more soluble than traditional PANI. This latter feature can be exploited to obtain P4ADA nanofibers by the electrospinning technique. In this paper we report the advances obtained in P4ADA nanofiber electrospinning. A comparison among polyethylene oxide (PEO), polymethyl methacrylate (PMMA) and polystyrene (PS) as the second polymer used to facilitate the electrospinning process is shown. In order to increase the conductivity of P4ADA nanofibers, two strategies were adopted and compared: selective removal of the insulating binder from the electrospun nanofibers by a rinsing treatment, and optimization of the minimum amount of binder necessary for the electrospinning process. Moreover, the effect of the PEO/P4ADA weight ratio on fiber morphology and conductivity was highlighted.

  8. A review of hemorheology: Measuring techniques and recent advances

    NASA Astrophysics Data System (ADS)

    Sousa, Patrícia C.; Pinho, Fernando T.; Alves, Manuel A.; Oliveira, Mónica S. N.

    2016-02-01

    Significant progress has been made over the years on the topic of hemorheology, not only in terms of the development of more accurate and sophisticated techniques, but also in terms of understanding the phenomena associated with blood components, their interactions and impact upon blood properties. The rheological properties of blood are strongly dependent on the interactions and mechanical properties of red blood cells, and a variation of these properties can bring further insight into the human health state and can be an important parameter in clinical diagnosis. In this article, we provide both a reference for hemorheological research and a resource regarding the fundamental concepts in hemorheology. This review is aimed at those starting in the field of hemodynamics, where blood rheology plays a significant role, but also at those in search of the most up-to-date findings (both qualitative and quantitative) in hemorheological measurements and novel techniques used in this context, including technical advances under more extreme conditions such as in large amplitude oscillatory shear flow or under extensional flow, which impose large deformations comparable to those found in the microcirculatory system and in diseased vessels. Given the impressive rate of increase in the available knowledge on blood flow, this review is also intended to identify areas where current knowledge is still incomplete, and which have the potential for new, exciting and useful research. We also discuss the most important parameters that can lead to an alteration of blood rheology, and which as a consequence can have a significant impact on the normal physiological behavior of blood.

  9. Advanced Techniques for Power System Identification from Measured Data

    SciTech Connect

    Pierre, John W.; Wies, Richard; Trudnowski, Daniel

    2008-11-25

    Time-synchronized measurements provide rich information for estimating a power-system's electromechanical modal properties via advanced signal processing. This information is becoming critical for the improved operational reliability of interconnected grids. A given mode's properties are described by its frequency, damping, and shape. Modal frequencies and damping are useful indicators of power-system stress, usually declining with increased load or reduced grid capacity. Mode shape provides critical information for operational control actions. This project investigated many advanced techniques for power system identification from measured data, focusing on mode frequency and damping ratio estimation. Investigators from the three universities coordinated their effort with Pacific Northwest National Laboratory (PNNL). Significant progress was made on developing appropriate techniques for system identification with confidence intervals and on testing those techniques on field-measured data and through simulation. Experimental data from the western area power system were provided by PNNL and the Bonneville Power Administration (BPA) for both ambient conditions and signal injection tests. Three large-scale tests were conducted for the western area in 2005 and 2006. Measured field PMU (Phasor Measurement Unit) data were provided to the three universities. A 19-machine simulation model was enhanced for testing the system identification algorithms. Extensive simulations were run with this model to test the performance of the algorithms. University of Wyoming researchers participated in four primary activities: (1) block and adaptive processing techniques for mode estimation from ambient signals and probing signals, (2) confidence interval estimation, (3) probing signal design and injection method analysis, and (4) performance assessment and validation from simulated and field-measured data. Subspace-based methods have been used to improve previous results from block processing.
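
    The record contains no code; as a hedged illustration of ringdown-based mode estimation of the kind discussed, the sketch below implements a basic least-squares Prony fit in Python/NumPy and recovers the frequency and damping ratio of a synthetic 0.4 Hz, 5%-damped mode. It is a toy stand-in, not the project's block, adaptive or subspace algorithms.

        import numpy as np

        def prony_modes(y, dt, order=2):
            """Least-squares linear-prediction (Prony) fit; returns (freq_hz, zeta)."""
            N = len(y)
            # Linear prediction: y[n] = -a1*y[n-1] - ... - a_p*y[n-p] for n = p..N-1
            A = np.column_stack([y[order - k - 1:N - k - 1] for k in range(order)])
            a = np.linalg.lstsq(A, -y[order:N], rcond=None)[0]
            z = np.roots(np.concatenate(([1.0], a)))   # discrete-time poles
            s = np.log(z) / dt                         # continuous-time poles
            return np.abs(s.imag) / (2 * np.pi), -s.real / np.abs(s)

        # Synthetic ringdown: a 0.4 Hz electromechanical mode with 5% damping,
        # sampled at a typical 30 samples/s PMU rate.
        dt = 1.0 / 30.0
        t = np.arange(0, 20, dt)
        wn = 2 * np.pi * 0.4
        y = np.exp(-0.05 * wn * t) * np.cos(wn * np.sqrt(1 - 0.05 ** 2) * t)

        freq, zeta = prony_modes(y, dt)
        print(freq, zeta)   # both conjugate poles report ~0.40 Hz, zeta ~0.05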

  10. Nuclear Technology. Course 26: Metrology. Module 27-7, Statistical Techniques in Metrology.

    ERIC Educational Resources Information Center

    Espy, John; Selleck, Ben

    This seventh in a series of eight modules for a course titled Metrology focuses on descriptive and inferential statistical techniques in metrology. The module follows a typical format that includes the following sections: (1) introduction, (2) module prerequisites, (3) objectives, (4) notes to instructor/student, (5) subject matter, (6) materials…

  11. Analysing Change in Learning Strategies over Time: A Comparison of Three Statistical Techniques

    ERIC Educational Resources Information Center

    Coertjens, Liesje; van Daal, Tine; Donche, Vincent; De Maeyer, Sven; Vanthournout, Gert; Van Petegem, Peter

    2013-01-01

    Change in learning strategies during higher education is an important topic of research when considering students' approaches to learning. Regarding the statistical techniques used to analyse this change, repeated measures ANOVA is mostly relied upon. Recently, multilevel and multi-indicator latent growth (MILG) analyses have been used as well.…

  12. Statistical Techniques Used in Published Articles: A Historical Review of Reviews

    ERIC Educational Resources Information Center

    Skidmore, Susan Troncoso; Thompson, Bruce

    2010-01-01

    The purpose of the present study is to provide a historical account and metasynthesis of which statistical techniques are most frequently used in the fields of education and psychology. Six articles reviewing the "American Educational Research Journal" from 1969 to 1997 and five articles reviewing the psychological literature from 1948 to 2001…

  13. Pediatric Cardiopulmonary Resuscitation: Advances in Science, Techniques, and Outcomes

    PubMed Central

    Topjian, Alexis A.; Berg, Robert A.; Nadkarni, Vinay M.

    2009-01-01

    More than 25% of children survive to hospital discharge after in-hospital cardiac arrests, and 5% to 10% survive after out-of-hospital cardiac arrests. This review of pediatric cardiopulmonary resuscitation addresses the epidemiology of pediatric cardiac arrests, mechanisms of coronary blood flow during cardiopulmonary resuscitation, the 4 phases of cardiac arrest resuscitation, appropriate interventions during each phase, special resuscitation circumstances, extracorporeal membrane oxygenation cardiopulmonary resuscitation, and quality of cardiopulmonary resuscitation. The key elements of pathophysiology that impact and match the timing, intensity, duration, and variability of the hypoxic-ischemic insult to evidence-based interventions are reviewed. Exciting discoveries in basic and applied-science laboratories are now relevant for specific subpopulations of pediatric cardiac arrest victims and circumstances (eg, ventricular fibrillation, neonates, congenital heart disease, extracorporeal cardiopulmonary resuscitation). Improving the quality of interventions is increasingly recognized as a key factor for improving outcomes. Evolving training strategies include simulation training, just-in-time and just-in-place training, and crisis-team training. The difficult issue of when to discontinue resuscitative efforts is addressed. Outcomes from pediatric cardiac arrests are improving. Advances in resuscitation science and state-of-the-art implementation techniques provide the opportunity for further improvement in outcomes among children after cardiac arrest. PMID:18977991

  14. Recommended advanced techniques for waterborne pathogen detection in developing countries.

    PubMed

    Alhamlan, Fatimah S; Al-Qahtani, Ahmed A; Al-Ahdal, Mohammed N

    2015-02-19

    The effect of human activities on water resources has expanded dramatically during the past few decades, leading to the spread of waterborne microbial pathogens. The total global health impact of human infectious diseases associated with pathogenic microorganisms from land-based wastewater pollution was estimated to be approximately three million disability-adjusted life years (DALY), with an estimated economic loss of nearly 12 billion US dollars per year. Although clean water is essential for healthy living, it is not equally granted to all humans. Indeed, people who live in developing countries are challenged every day by an inadequate supply of clean water. Polluted water can lead to health crises that in turn spread waterborne pathogens. Taking measures to assess the water quality can prevent these potential risks. Thus, a pressing need has emerged in developing countries for comprehensive and accurate assessments of water quality. This review presents current and emerging advanced techniques for assessing water quality that can be adopted by authorities in developing countries.

  15. Dissecting cell adhesion architecture using advanced imaging techniques

    PubMed Central

    Morton, Penny E

    2011-01-01

    Cell adhesion to extracellular matrix proteins or to other cells is essential for the control of embryonic development, tissue integrity, immune function and wound healing. Adhesions are tightly spatially regulated structures containing over one hundred different proteins that coordinate both dynamics and signaling events at these sites. Extensive biochemical and morphological analysis of adhesion types over the past three decades has greatly improved understanding of individual protein contributions to adhesion signaling and, in some cases, dynamics. However, it is becoming increasingly clear that these diverse macromolecular complexes contain a variety of protein sub-networks, as well as distinct sub-domains that likely play important roles in regulating adhesion behavior. Until recently, resolving these structures, which are often less than a micron in size, was hampered by the limitations of conventional light microscopy. However, recent advances in optical techniques and imaging methods have revealed exciting insight into the intricate control of adhesion structure and assembly. Here we provide an overview of the recent data arising from such studies of cell:matrix and cell:cell contact and an overview of the imaging strategies that have been applied to study the intricacies and hierarchy of proteins within adhesions. PMID:21785274

  16. Nanocrystalline materials: recent advances in crystallographic characterization techniques

    PubMed Central

    Ringe, Emilie

    2014-01-01

    Most properties of nanocrystalline materials are shape-dependent, providing their exquisite tunability in optical, mechanical, electronic and catalytic properties. An example of the former is localized surface plasmon resonance (LSPR), the coherent oscillation of conduction electrons in metals that can be excited by the electric field of light; this resonance frequency is highly dependent on both the size and shape of a nanocrystal. An example of the latter is the marked difference in catalytic activity observed for different Pd nanoparticles. Such examples highlight the importance of particle shape in nanocrystalline materials and their practical applications. However, one may ask ‘how are nanoshapes created?’, ‘how does the shape relate to the atomic packing and crystallography of the material?’, ‘how can we control and characterize the external shape and crystal structure of such small nanocrystals?’. This feature article aims to give the reader an overview of important techniques, concepts and recent advances related to these questions. Nucleation, growth and how seed crystallography influences the final synthesis product are discussed, followed by shape prediction models based on seed crystallography and thermodynamic or kinetic parameters. The crystallographic implications of epitaxy and orientation in multilayered, core-shell nanoparticles are overviewed, and, finally, the development and implications of novel, spatially resolved analysis tools are discussed. PMID:25485133

  17. REVIEW ARTICLE: Emission measurement techniques for advanced powertrains

    NASA Astrophysics Data System (ADS)

    Adachi, Masayuki

    2000-10-01

    Recent developments in high-efficiency, low-emission powertrains require emission measurement technologies able to detect regulated and unregulated compounds with very high sensitivity and a fast response. For example, the levels of a variety of nitrogen and sulphur compounds should be analysed in real time in order to develop aftertreatment systems that decrease NOx emissions from lean-burning powertrains. Also, real-time information on particulate matter emitted during transient operation of diesel engines and direct-injection gasoline engines is invaluable. The present paper reviews newly introduced instrumentation for the emission measurements demanded by developments in advanced powertrain systems, including Fourier transform infrared spectroscopy, mass spectrometry and fast-response flame ionization detection. In addition, the demands and applications of fuel-reformer development for fuel cell electric vehicles are discussed. Besides the detection methodologies, sample handling techniques are also described for measuring the emissions of low-emission vehicles, in which pollutant concentrations are significantly lower than those present in ambient air.

  18. Development of advanced strain diagnostic techniques for reactor environments.

    SciTech Connect

    Fleming, Darryn D.; Holschuh, Thomas Vernon; Miller, Timothy J.; Hall, Aaron Christopher; Urrea, David Anthony; Parma, Edward J.

    2013-02-01

    The following research is conducted as a Laboratory Directed Research and Development (LDRD) initiative at Sandia National Laboratories. The long-term goals of the program include sophisticated diagnostics of advanced fuels testing for nuclear reactors for the Department of Energy (DOE) Gen IV program, with the future capability to provide real-time, in situ measurement of strain in fuel rod cladding during operation at any research or power reactor in the United States. By quantifying the stress and strain in fuel rods, it is possible to significantly improve fuel rod design and, consequently, the performance and lifetime of the cladding. During the past year of this program, two sets of experiments were performed: small-scale tests to ensure the reliability of the gages, and reactor pulse experiments involving the most viable samples in the Annular Core Research Reactor (ACRR), located onsite at Sandia. Strain measurement techniques that can provide useful data in the extreme environment of a nuclear reactor core are needed to characterize nuclear fuel rods. This report documents the progression of solutions to this issue that were explored for feasibility in FY12 at Sandia National Laboratories, Albuquerque, NM.

  19. Nanocrystalline materials: recent advances in crystallographic characterization techniques.

    PubMed

    Ringe, Emilie

    2014-11-01

    Most properties of nanocrystalline materials are shape-dependent, providing their exquisite tunability in optical, mechanical, electronic and catalytic properties. An example of the former is localized surface plasmon resonance (LSPR), the coherent oscillation of conduction electrons in metals that can be excited by the electric field of light; this resonance frequency is highly dependent on both the size and shape of a nanocrystal. An example of the latter is the marked difference in catalytic activity observed for different Pd nanoparticles. Such examples highlight the importance of particle shape in nanocrystalline materials and their practical applications. However, one may ask 'how are nanoshapes created?', 'how does the shape relate to the atomic packing and crystallography of the material?', 'how can we control and characterize the external shape and crystal structure of such small nanocrystals?'. This feature article aims to give the reader an overview of important techniques, concepts and recent advances related to these questions. Nucleation, growth and how seed crystallography influences the final synthesis product are discussed, followed by shape prediction models based on seed crystallography and thermodynamic or kinetic parameters. The crystallographic implications of epitaxy and orientation in multilayered, core-shell nanoparticles are overviewed, and, finally, the development and implications of novel, spatially resolved analysis tools are discussed.

  20. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  1. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near-optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbit/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard-form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled-data applications such as imaging and high-quality audio, but the second-stage adaptive coder can also be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques, and of their recent high-speed implementations, should be equally broad outside of NASA.
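
    As a hedged, minimal illustration of the two-stage structure described above (a predictive preprocessor feeding an adaptive coder), the following Python sketch delta-predicts a sample stream, zigzag-maps the residuals into a 'standard form' of non-negative integers, and codes them with a Golomb-Rice coder whose parameter k is adapted per block. It is illustrative only, not the NASA/VLSI implementation.

        def preprocess(samples):
            """Delta-predict, then zigzag-map signed residuals to non-negative ints."""
            prev, out = 0, []
            for x in samples:
                d = x - prev
                prev = x
                out.append(2 * d if d >= 0 else -2 * d - 1)  # zigzag mapping
            return out

        def rice_encode(value, k):
            """Unary-coded quotient, then k binary bits of remainder."""
            bits = "1" * (value >> k) + "0"
            if k:
                bits += format(value & ((1 << k) - 1), f"0{k}b")
            return bits

        def choose_k(block):
            """Adapt k per block: pick the k minimizing the total coded length."""
            return min(range(8), key=lambda k: sum(len(rice_encode(v, k)) for v in block))

        samples = [100, 102, 101, 105, 104, 104, 107, 110]
        resid = preprocess(samples)
        k = choose_k(resid)
        print(k, "".join(rice_encode(v, k) for v in resid))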

  2. Use of statistical techniques to account for parameter uncertainty in landslide tsunami generation

    NASA Astrophysics Data System (ADS)

    Salmanidou, Dimitra; Guillas, Serge; Georgiopoulou, Aggeliki; Dias, Frederic

    2016-04-01

    Landslide tsunamis constitute complex phenomena, the nature of which is governed by varying rheological and geomorphological parameters. In an attempt to better understand these mechanisms, statistical methods can be used to quantify uncertainty and carry out sensitivity analyses. One such method is statistical emulation of the numerical code used to model a phenomenon. In comparison to numerical simulators, statistical emulators have the advantage of being faster and less expensive to run. In this study we implement a Bayesian calibration which allows us to build a statistical surrogate of the numerical simulators used to model submarine sliding and tsunami generation in the Rockall Bank Slide Complex, NE Atlantic Ocean. For the parameter selection and numerical simulations of the event we make use of a sophisticated sampling technique (Latin Hypercube Sampling). The posterior distributions of the parameters and the predictions made with the emulator are provided.
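
    A minimal sketch of this sampling-plus-surrogate structure, assuming SciPy >= 1.7 and scikit-learn: Latin Hypercube samples of two hypothetical slide parameters drive a toy stand-in for the expensive simulator, and a Gaussian-process emulator is fitted to the runs. The parameter names, ranges and toy response are illustrative, not those of the Rockall Bank study.

        import numpy as np
        from scipy.stats import qmc
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import Matern

        sampler = qmc.LatinHypercube(d=2, seed=1)
        unit = sampler.random(n=40)
        # Scale to hypothetical ranges: slide volume [km^3], basal friction coefficient.
        X = qmc.scale(unit, l_bounds=[0.5, 0.05], u_bounds=[20.0, 0.4])

        def toy_simulator(x):
            """Stand-in for the expensive tsunami code: a wave-amplitude proxy."""
            volume, friction = x
            return volume ** 0.6 * np.exp(-4.0 * friction)

        y = np.array([toy_simulator(x) for x in X])

        gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
        gp.fit(X, y)
        mu, sd = gp.predict([[5.0, 0.2]], return_std=True)
        print(f"emulated amplitude: {mu[0]:.2f} +/- {sd[0]:.2f}")

    Once fitted, the emulator can be queried thousands of times for sensitivity analysis at a negligible fraction of one simulator run's cost.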

  3. ADVANCED TECHNIQUES FOR RESERVOIR SIMULATION AND MODELING OF NONCONVENTIONAL WELLS

    SciTech Connect

    Louis J. Durlofsky; Khalid Aziz

    2004-08-20

    Nonconventional wells, which include horizontal, deviated, multilateral and ''smart'' wells, offer great potential for the efficient management of oil and gas reservoirs. These wells are able to contact larger regions of the reservoir than conventional wells and can also be used to target isolated hydrocarbon accumulations. The use of nonconventional wells instrumented with downhole inflow control devices allows for even greater flexibility in production. Because nonconventional wells can be very expensive to drill, complete and instrument, it is important to be able to optimize their deployment, which requires the accurate prediction of their performance. However, predictions of nonconventional well performance are often inaccurate. This is likely due to inadequacies in some of the reservoir engineering and reservoir simulation tools used to model and optimize nonconventional well performance. A number of new issues arise in the modeling and optimization of nonconventional wells. For example, the optimal use of downhole inflow control devices has not been addressed for practical problems. In addition, the impact of geological and engineering uncertainty (e.g., valve reliability) has not been previously considered. In order to model and optimize nonconventional wells in different settings, it is essential that the tools be implemented into a general reservoir simulator. This simulator must be sufficiently general and robust and must in addition be linked to a sophisticated well model. Our research under this five-year project addressed all of the key areas indicated above. The overall project was divided into three main categories: (1) advanced reservoir simulation techniques for modeling nonconventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and for coupling the well to the simulator (which includes the accurate calculation of well index and the modeling of multiphase flow in the wellbore).

  4. Weldability and joining techniques for advanced fossil energy system alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Liu, W.; Yang, D.; Zhou, G.; Morrison, M.

    1998-05-01

    The efforts address the basic understanding of the weldability and fabricability of the advanced high-temperature alloys necessary to effect increases in the efficiency of the next-generation Fossil Energy power plants. The effort was divided into three tasks: the first dealt with the welding and fabrication behavior of 310HCbN (HR3C); the second with studies aimed at understanding the weldability of a newly developed 310TaN high-temperature stainless steel (a modification of 310 stainless); and the third with the cladding of austenitic tubing with iron aluminide using the GTAW process. Task 1 consisted of microstructural studies on 310HCbN and the development of a Tube Weldability test which has applications to production welding techniques as well as laboratory weldability assessments. In addition, ex-service 310HCbN showing fireside erosion and cracking at the attachment weld locations was evaluated. Task 2 addressed the behavior of the newly developed 310TaN modification of standard 310 stainless steel and showed that the weldability was excellent and that the sensitization potential was minimal under normal welding and fabrication conditions. The microstructural evolution during elevated-temperature testing was characterized, and the second-phase particles evolved upon aging were identified. Task 3 details the investigation undertaken to clad 310HCbN tubing with iron aluminide and the development of the welding conditions necessary to provide a crack-free cladding. The work showed that both a preheat and a post-heat were necessary for crack-free deposits, and the effect of a third element on the cracking potential was defined, together with the effect of the aluminum level, for optimum weldability.

  5. Statistical Studies of Flux Transfer Events Using Unsupervised and Supervised Techniques

    NASA Astrophysics Data System (ADS)

    Driscoll, J.; Sipes, T. B.; Karimabadi, H.; Sibeck, D. G.; Korotova, G. I.

    2006-12-01

    We report preliminary results concerning the combined use of unsupervised and supervised techniques to classify Geotail FTEs. Currently, humans identify FTEs on the basis of clear isolated bipolar signatures normal to the nominal magnetopause, magnetic field strength enhancements, and sometimes east/west deflections of the magnetic field in the plane of the magnetopause (BM). However, events with decreases or crater-like structures in the magnetic field strength, no east/west deflection, and asymmetric or continuous variations normal to the magnetopause have also been identified as FTEs, making statistical studies of FTEs problematic. Data mining techniques are particularly useful in developing automated search algorithms and generating large event lists for statistical studies. Data mining techniques can be divided into two types, supervised and unsupervised. In supervised algorithms, one teaches the algorithm using examples from labeled data. In the case of FTEs, the user would provide examples of FTEs as well as examples of non-FTEs and label the data (as FTE or non-FTE). Since one has to start with a labeled data set, this may already introduce a user bias into the selection process. To avoid this issue, it can be useful to employ unsupervised techniques. Unsupervised techniques are analogous to training without a teacher: the data are not labeled. There is also hybrid modeling, where one builds several models using unsupervised and supervised techniques and then connects them into a hybrid model.
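
    A minimal Python (scikit-learn) sketch of the hybrid idea: cluster candidate events without labels, let an analyst name the clusters, then train a supervised classifier on those labels. The event features and all values are hypothetical illustrations, not Geotail measurements.

        import numpy as np
        from sklearn.mixture import GaussianMixture
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(2)
        # Hypothetical per-event features: bipolar-signature amplitude, field-strength
        # enhancement, east/west deflection.
        features = np.vstack([rng.normal([2.0, 1.5, 0.8], 0.3, (100, 3)),   # FTE-like
                              rng.normal([0.2, 0.1, 0.1], 0.3, (100, 3))])  # non-FTE-like

        gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
        cluster = gmm.predict(features)      # unsupervised grouping; no labels used

        # After a human inspects each cluster and names it (FTE vs non-FTE),
        # the cluster ids become training labels for a supervised model:
        clf = RandomForestClassifier(random_state=0).fit(features, cluster)
        print(clf.predict([[1.8, 1.4, 0.7]]))   # classify a new candidate event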

  6. Advanced statistical process control: controlling sub-0.18-μm lithography and other processes

    NASA Astrophysics Data System (ADS)

    Zeidler, Amit; Veenstra, Klaas-Jelle; Zavecz, Terrence E.

    2001-08-01

    Feed-forward, as a method to control the lithography process for critical dimensions and overlay, is well known in the semiconductor industry. However, the control provided by simple averaging feed-forward methodologies is not sufficient to support the complexity of a sub-0.18 μm lithography process. Also, simple feed-forward techniques are not applicable to logic and ASIC production because of the many different product and lithography-chemistry combinations and the short memory of the averaging method. In the semiconductor industry, feed-forward control applications are generally called APC, Advanced Process Control, applications. Today, there are as many APC methods as there are engineers involved. To meet the stringent requirements of 0.18 μm production, we selected a method described in SPIE 3998-48 (March 2000) by Terrence Zavecz and Rene Blanquies of Yield Dynamics Inc. This method is called PPC, Predictive Process Control, and employs a methodology of collecting measurement results and the modeled bias attributes of expose tools, reticles and the incoming process in a signatures database. With PPC, before each lot exposure, the signatures of the lithography tool, the reticle and the incoming process are used to predict the setup of the lot process and the expected lot results. The benefits derived from such an implementation are very clear: there is no limitation on the number of products or lithography-chemistry combinations, and the technique avoids the short memory of conventional APC techniques. ... and what's next? (Rob Morton, Philips assignee to International Sematech). The next part of the paper tries to answer this question. Observing that CMP and metal deposition significantly influence CD and overlay results, and that even contact etch can have a significant influence on Metal 5 overlay, we developed a more general PPC for lithography. Starting with the existing lithography PPC applications database, the authors extended the
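
    A loose Python sketch of the signatures-database mechanism described above: per-context bias signatures (tool, reticle, incoming process) are maintained with an exponentially weighted update and summed to predict the correction for the next lot. The class, the equal attribution of the residual error and the update rule are hypothetical simplifications, not the PPC product's logic.

        from collections import defaultdict

        class SignatureDB:
            def __init__(self, alpha=0.3):
                self.alpha = alpha
                self.sig = defaultdict(float)   # bias signature per context key

            def predict(self, tool, reticle, process):
                # Predicted setup correction = sum of the stored component signatures.
                return (self.sig[("tool", tool)] + self.sig[("ret", reticle)]
                        + self.sig[("proc", process)])

            def update(self, tool, reticle, process, measured_error):
                # Attribute the measured error equally and update each signature (EWMA).
                share = measured_error / 3.0
                for key in [("tool", tool), ("ret", reticle), ("proc", process)]:
                    self.sig[key] = (1 - self.alpha) * self.sig[key] + self.alpha * share

        db = SignatureDB()
        db.update("scanner_A", "reticle_17", "CMP_lot", measured_error=6.0)  # nm overlay error
        print(db.predict("scanner_A", "reticle_17", "CMP_lot"))              # correction for next lot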

  7. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  8. Investigation of joining techniques for advanced austenitic alloys

    SciTech Connect

    Lundin, C.D.; Qiao, C.Y.P.; Kikuchi, Y.; Shi, C.; Gill, T.P.S.

    1991-05-01

    Modified Alloys 316 and 800H, designed for high temperature service, have been developed at Oak Ridge National Laboratory. Assessment of the weldability of the advanced austenitic alloys has been conducted at the University of Tennessee. Four aspects of weldability of the advanced austenitic alloys were included in the investigation.

  9. Evaluation of spatial and temporal variations in marine sediments quality using multivariate statistical techniques.

    PubMed

    Alvarez, Odalys Quevedo; Tagle, Margarita Edelia Villanueva; Pascual, Jorge L Gómez; Marín, Ma Teresa Larrea; Clemente, Ana Catalina Nuñez; Medina, Miriam Odette Cora; Palau, Raiza Rey; Alfonso, Mario Simeón Pomares

    2014-10-01

    Spatial and temporal variations of sediment quality in Matanzas Bay (Cuba) were studied by determining a total of 12 variables (Zn, Cu, Pb, As, Ni, Co, Al, Fe, Mn, V, CO₃²⁻, and total hydrocarbons (THC)). Surface sediments were collected annually at eight stations during 2005-2008. Multivariate statistical techniques, such as principal component analysis (PCA), cluster analysis (CA), and linear discriminant analysis (LDA), were applied to identify the most significant variables influencing the environmental quality of the sediments. Heavy metals (Zn, Cu, Pb, V, and As) and THC were the most significant species contributing to sediment quality variations during the sampling period. Concentrations of V and As were determined in sediments of this ecosystem for the first time. The variation of sediment environmental quality with the sampling period and the differentiation of the samples into three groups along the bay were obtained. The usefulness of the multivariate statistical techniques employed for the environmental interpretation of a limited dataset was confirmed.
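
    A minimal scikit-learn sketch of the PCA step on a hypothetical stations-by-variables concentration matrix; because the variables carry very different units and scales, they are standardized first. The data are synthetic placeholders, not the Matanzas Bay measurements.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        # Rows: station-year samples; columns: e.g. Zn, Cu, Pb, As, V, THC (illustrative).
        X = rng.lognormal(mean=2.0, sigma=0.5, size=(32, 6))

        Z = StandardScaler().fit_transform(X)
        pca = PCA(n_components=2).fit(Z)
        scores = pca.transform(Z)          # sample coordinates on the first two PCs

        print("explained variance ratio:", pca.explained_variance_ratio_)
        print("PC1 loadings:", pca.components_[0])   # which variables drive the first axis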

  10. Data on electrical energy conservation using high efficiency motors for the confidence bounds using statistical techniques.

    PubMed

    Shaikh, Muhammad Mujtaba; Memon, Abdul Jabbar; Hussain, Manzoor

    2016-09-01

    In this article, we describe details of the data used in the research paper "Confidence bounds for energy conservation in electric motors: An economical solution using statistical techniques" [1]. The data presented in this paper is intended to show benefits of high efficiency electric motors over the standard efficiency motors of similar rating in the industrial sector of Pakistan. We explain how the data was collected and then processed by means of formulas to show cost effectiveness of energy efficient motors in terms of three important parameters: annual energy saving, cost saving and payback periods. This data can be further used to construct confidence bounds for the parameters using statistical techniques as described in [1].
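
    As a hedged illustration of the kind of confidence bound the companion paper [1] constructs, the following Python (SciPy) sketch computes a two-sided 95% t-interval for the mean annual energy saving from a small sample of per-motor savings; the figures are invented, not the survey data described above.

        import numpy as np
        from scipy import stats

        savings_kwh = np.array([4200, 3900, 4500, 4100, 3800, 4400, 4000])  # per motor, per year

        n = savings_kwh.size
        mean = savings_kwh.mean()
        sem = stats.sem(savings_kwh)                       # standard error of the mean
        lo, hi = stats.t.interval(0.95, n - 1, loc=mean, scale=sem)
        print(f"95% CI for mean annual saving: [{lo:.0f}, {hi:.0f}] kWh")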

  11. A computationally efficient order statistics based outlier detection technique for EEG signals.

    PubMed

    Giri, Bapun K; Sarkar, Soumajyoti; Mazumder, Satyaki; Das, Koel

    2015-01-01

    Detecting artifacts in EEG data produced by muscle activity, eye blinks and electrical noise is a common and important problem in EEG applications. We present a novel outlier detection method based on order statistics. We propose a two-step procedure comprising detection of noisy EEG channels followed by detection of noisy epochs within the outlier channels. The performance of our method is tested systematically using simulated and real EEG data. Our technique produces a significant improvement in detecting EEG artifacts over the state-of-the-art outlier detection techniques used in EEG applications. The proposed method can serve as a general outlier detection tool for different types of noisy signals.
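
    A minimal order-statistics sketch in the spirit of the two-step procedure: channels are flagged by an IQR rule on a per-channel spread statistic, then epochs are screened within the flagged channels. The thresholds and statistics are illustrative guesses, not the authors' exact choices.

        import numpy as np

        rng = np.random.default_rng(4)
        eeg = rng.normal(0, 10, size=(32, 50, 256))      # channels x epochs x samples [uV]
        eeg[7] += rng.normal(0, 80, size=(50, 256))      # inject one noisy channel

        # Step 1: channel-level screening via order statistics of per-channel spread.
        chan_stat = np.median(np.abs(eeg), axis=(1, 2))
        q1, q3 = np.percentile(chan_stat, [25, 75])
        bad_channels = np.where(chan_stat > q3 + 1.5 * (q3 - q1))[0]

        # Step 2: epoch-level screening inside each flagged channel.
        for ch in bad_channels:
            ep_stat = np.ptp(eeg[ch], axis=1)            # peak-to-peak amplitude per epoch
            q1e, q3e = np.percentile(ep_stat, [25, 75])
            bad_epochs = np.where(ep_stat > q3e + 1.5 * (q3e - q1e))[0]
            print(f"channel {ch}: {len(bad_epochs)} suspect epochs")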

  12. Hierarchical probabilistic regionalization of volcanism for Sengan region in Japan using multivariate statistical techniques and geostatistical interpolation techniques

    SciTech Connect

    Park, Jinyong; Balasingham, P; McKenna, Sean Andrew; Kulatilake, Pinnaduwa H.S.W.

    2004-09-01

    Sandia National Laboratories, under contract to the Nuclear Waste Management Organization of Japan (NUMO), is performing research on the regional classification of given sites in Japan with respect to potential volcanic disruption using multivariate statistical and geostatistical interpolation techniques. This report provides results obtained for the hierarchical probabilistic regionalization of volcanism for the Sengan region in Japan by applying multivariate statistical techniques and geostatistical interpolation techniques to the geologic data provided by NUMO. A workshop report produced in September 2003 by Sandia National Laboratories (Arnold et al., 2003) on volcanism lists a set of the most important geologic variables as well as some secondary information related to volcanism. Geologic data extracted for the Sengan region in Japan from the data provided by NUMO revealed that data are not available at the same locations for all the important geologic variables. In other words, the geologic variable vectors were found to be spatially incomplete. However, it is necessary to have complete geologic variable vectors to perform multivariate statistical analyses. As a first step towards constructing complete geologic variable vectors, the Universal Transverse Mercator (UTM) zone 54 projected coordinate system and a 1 km square regular grid system were selected. The data available for each geologic variable on a geographic coordinate system were transferred to the aforementioned grid system. The recorded data on volcanic activity for the Sengan region were also produced on the same grid system. Each geologic variable map was compared with the recorded volcanic activity map to determine the geologic variables that are most important for volcanism. In the regionalized classification procedure, this step is known as the variable selection step. The following variables were determined as most important for volcanism: geothermal gradient, groundwater temperature, heat discharge, groundwater

  13. A comparison of linear and nonlinear statistical techniques in performance attribution.

    PubMed

    Chan, N H; Genovese, C R

    2001-01-01

    Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to the performance attribution of a portfolio constructed from a fixed universe of stocks, using factors derived from some commonly used cross-sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on a standard linear multifactor model and three nonlinear techniques (model selection, additive models, and neural networks) are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.

  14. Evaluation of drug-polymer solubility curves through formal statistical analysis: comparison of preparation techniques.

    PubMed

    Knopp, Matthias Manne; Olesen, Niels Erik; Holm, Per; Löbmann, Korbinian; Holm, René; Langguth, Peter; Rades, Thomas

    2015-01-01

    In this study, the influence of the preparation technique (ball milling, spray drying, and film casting) of a supersaturated amorphous dispersion on the quality of solubility determinations of indomethacin in polyvinylpyrrolidone was investigated by means of statistical analysis. After annealing of the amorphous dispersions above the crystallization temperature for 2 h, the solubility curve was derived from the glass transition temperature of the demixed material using the Gordon-Taylor relationship and fitting with the Flory-Huggins model. The study showed that the predicted solubility from the ball-milled mixtures was not consistent with those from spray drying and film casting, indicating fundamental differences between the preparation techniques. Through formal statistical analysis, the best combination of fit to the Flory-Huggins model and reproducibility of the measurements was analyzed. Ball milling provided the best reproducibility of the three preparation techniques; however, an analysis of residuals revealed a systematic error. In contrast, film casting demonstrated a good fit to the model but poor reproducibility of the measurements. Therefore, this study recommends that techniques such as spray drying or potentially film casting (if experimental reproducibility can be improved) should be used to prepare the amorphous dispersions when performing solubility measurements of this kind.
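
    As a hedged illustration of the Gordon-Taylor step described above, the following Python (SciPy) sketch inverts the Gordon-Taylor equation to read the drug weight fraction of the demixed, drug-saturated phase off a measured glass transition temperature; the Tg and K values are rough stand-ins, not the study's data.

        from scipy.optimize import brentq

        Tg_drug, Tg_poly, K = 315.0, 441.0, 0.35     # Kelvin; K is the Gordon-Taylor constant

        def gordon_taylor(w_drug):
            """Gordon-Taylor mixture Tg for a drug weight fraction w_drug."""
            w_poly = 1.0 - w_drug
            return (w_drug * Tg_drug + K * w_poly * Tg_poly) / (w_drug + K * w_poly)

        Tg_measured = 390.0                          # Tg of the annealed, demixed dispersion
        w = brentq(lambda w: gordon_taylor(w) - Tg_measured, 1e-6, 1 - 1e-6)
        print(f"drug weight fraction at the annealing temperature: {w:.2f} w/w")

    Repeating this readout over several annealing temperatures yields the (temperature, composition) points to which the Flory-Huggins model is then fitted.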

  15. Comparison of statistical clustering techniques for the classification of modelled atmospheric trajectories

    NASA Astrophysics Data System (ADS)

    Kassomenos, P.; Vardoulakis, S.; Borge, R.; Lumbreras, J.; Papaloukas, C.; Karakitsios, S.

    2010-10-01

    In this study, we used and compared three different statistical clustering methods: a hierarchical technique, a non-hierarchical technique (K-means) and an artificial neural network technique (self-organizing maps, SOM). These classification methods were applied to a 4-year dataset of 5-day kinematic back trajectories of air masses arriving in Athens, Greece at 12.00 UTC at three different heights above the ground. The atmospheric back trajectories were simulated with the HYSPLIT Version 4.7 model of the National Oceanic and Atmospheric Administration (NOAA). The meteorological data used for the computation of trajectories were obtained from the NOAA reanalysis database. A comparison of the three statistical clustering methods through statistical indices was attempted. It was found that all three statistical methods depend on the arrival height of the trajectories, but the degree of dependence differs substantially. Hierarchical clustering showed the highest level of dependence on arrival height for fast-moving trajectories, followed by SOM; K-means was found to be the clustering technique least dependent on arrival height. The air quality management applications of these results in relation to PM10 concentrations recorded in Athens, Greece, were also discussed. Differences in PM10 concentrations during certain clusters were found to be statistically significant (at the 95% confidence level), indicating that these clusters appear to be associated with long-range transport of particulates. This study can improve the interpretation of modelled atmospheric trajectories, leading to a more reliable analysis of synoptic weather circulation patterns and their impacts on urban air quality.
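
    A minimal scikit-learn sketch comparing two of the three clustering families on flattened trajectory vectors (SOM would require an additional package, so it is omitted here); the trajectories are synthetic placeholders.

        import numpy as np
        from sklearn.cluster import KMeans, AgglomerativeClustering
        from sklearn.metrics import silhouette_score

        rng = np.random.default_rng(5)
        # 200 back-trajectories, 40 (lat, lon) points each, flattened to 80-vectors.
        traj = np.concatenate([
            rng.normal(loc=0.0, scale=1.0, size=(100, 80)),   # e.g. a westerly cluster
            rng.normal(loc=4.0, scale=1.0, size=(100, 80)),   # e.g. a southerly cluster
        ])

        for name, model in [("k-means", KMeans(n_clusters=2, n_init=10, random_state=0)),
                            ("hierarchical", AgglomerativeClustering(n_clusters=2, linkage="ward"))]:
            labels = model.fit_predict(traj)
            print(name, "silhouette:", round(silhouette_score(traj, labels), 3))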

  16. [Statistical study of the wavelet-based lossy medical image compression technique].

    PubMed

    Puniene, Jūrate; Navickas, Ramūnas; Punys, Vytenis; Jurkevicius, Renaldas

    2002-01-01

    Medical digital images have informational redundancy. Both the amount of memory for image storage and their transmission time could be reduced if image compression techniques are applied. The techniques are divided into two groups: lossless (compression ratio does not exceed 3 times) and lossy ones. Compression ratio of lossy techniques depends on visibility of distortions. It is a variable parameter and it can exceed 20 times. A compression study was performed to evaluate the compression schemes, which were based on the wavelet transform. The goal was to develop a set of recommendations for an acceptable compression ratio for different medical image modalities: ultrasound cardiac images and X-ray angiographic images. The acceptable image quality after compression was evaluated by physicians. Statistical analysis of the evaluation results was used to form a set of recommendations.
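
    A minimal sketch of wavelet-based lossy compression of the kind evaluated, assuming the PyWavelets package: decompose, hard-threshold the smallest coefficients, reconstruct, and measure the distortion. The wavelet choice, the threshold and the random stand-in image are illustrative only.

        import numpy as np
        import pywt

        rng = np.random.default_rng(6)
        image = rng.normal(size=(256, 256))              # stand-in for an ultrasound frame

        coeffs = pywt.wavedec2(image, "db2", level=3)
        arr, slices = pywt.coeffs_to_array(coeffs)

        t = np.percentile(np.abs(arr), 90)               # keep only the largest 10% of coefficients
        arr_t = pywt.threshold(arr, t, mode="hard")

        recon = pywt.waverec2(pywt.array_to_coeffs(arr_t, slices, output_format="wavedec2"), "db2")
        mse = np.mean((image - recon[:256, :256]) ** 2)  # distortion at ~10:1 coefficient sparsity
        print(f"MSE after thresholding: {mse:.4f}")

    In the study itself the acceptable threshold was set not by MSE but by physicians' visual assessment, which is what the statistical analysis summarizes.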

  17. Recent advances in sample preparation techniques for effective bioanalytical methods.

    PubMed

    Kole, Prashant Laxman; Venkatesh, Gantala; Kotecha, Jignesh; Sheshala, Ravi

    2011-01-01

    This paper reviews the recent developments in bioanalysis sample preparation techniques and gives an update on basic principles, theory, applications and possibilities for automation, together with a comparative discussion of the advantages and limitations of each technique. Conventional liquid-liquid extraction (LLE), protein precipitation (PP) and solid-phase extraction (SPE) techniques are now considered methods of the past. The last decade has witnessed a rapid development of novel sample preparation techniques in bioanalysis. Developments in SPE techniques, such as selective sorbents and new overall approaches to SPE such as hybrid SPE and molecularly imprinted polymer SPE, have been addressed. Considerable literature has been published in the area of solid-phase micro-extraction and its different versions, e.g. stir bar sorptive extraction, and their application in the development of selective and sensitive bioanalytical methods. Techniques such as dispersive solid-phase extraction, disposable pipette extraction and micro-extraction by packed sorbent offer a variety of extraction phases and provide unique advantages to bioanalytical methods. On-line SPE utilizing column-switching techniques is rapidly gaining acceptance in bioanalytical applications. PP sample preparation techniques such as PP filter plates/tubes offer many advantages, such as the removal of phospholipids and proteins in plasma/serum. Newer approaches to conventional LLE techniques (salting-out LLE) are also covered in this review article.

  18. Live-site UXO classification studies using advanced EMI and statistical models

    NASA Astrophysics Data System (ADS)

    Shamatava, I.; Shubitidze, F.; Fernandez, J. P.; Bijamov, A.; Barrowes, B. E.; O'Neill, K.

    2011-06-01

    In this paper we present the inversion and classification performance of the advanced EMI inversion, processing and discrimination schemes developed by our group when applied to the ESTCP Live-Site UXO Discrimination Study carried out at the former Camp Butner in North Carolina. The advanced models combine: 1) the joint diagonalization (JD) algorithm to estimate the number of potential anomalies from the measured data without inversion, 2) the ortho-normalized volume magnetic source (ONVMS) to represent targets' EMI responses and extract their intrinsic "feature vectors," and 3) the Gaussian mixture algorithm to classify buried objects as targets of interest or not starting from the extracted discrimination features. The studies are conducted using cued datasets collected with the next-generation TEMTADS and MetalMapper (MM) sensor systems. For the cued TEMTADS datasets we first estimate the data quality and the number of targets contributing to each signal using the JD technique. Once we know the number of targets we proceed to invert the data using a standard non-linear optimization technique in order to determine intrinsic parameters such as the total ONVMS for each potential target. Finally we classify the targets using a library-matching technique. The MetalMapper data are all inverted as multi-target scenarios, and the resulting intrinsic parameters are grouped using an unsupervised Gaussian mixture approach. The potential targets of interest are a 37-mm projectile, an M48 fuze, and a 105-mm projectile. During the analysis we requested the ground truth for a few selected anomalies to assist in the classification task. Our results were scored independently by the Institute for Defense Analyses, who revealed that our advanced models produce superb classification when starting from either TEMTADS or MM cued datasets.
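
    A minimal Python/NumPy sketch of the library-matching step mentioned above: an inverted feature vector is compared to a small library of targets of interest with a normalized distance, and anything far from every entry is treated as clutter. The library values and the tolerance are hypothetical placeholders, not extracted ONVMS parameters.

        import numpy as np

        library = {
            "37-mm projectile":  np.array([1.00, 0.35, 0.33]),
            "M48 fuze":          np.array([0.42, 0.20, 0.19]),
            "105-mm projectile": np.array([3.10, 1.10, 1.05]),
        }

        def match(feature, library, tol=0.25):
            """Return the best library match, or None (clutter) if nothing is close."""
            scores = {name: np.linalg.norm(feature - ref) / np.linalg.norm(ref)
                      for name, ref in library.items()}
            best = min(scores, key=scores.get)
            return (best, scores[best]) if scores[best] < tol else (None, scores[best])

        print(match(np.array([0.95, 0.37, 0.31]), library))   # likely the 37-mm projectile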

  19. Selecting statistical or machine learning techniques for regional landslide susceptibility modelling by evaluating spatial prediction

    NASA Astrophysics Data System (ADS)

    Goetz, Jason; Brenning, Alexander; Petschko, Helene; Leopold, Philip

    2015-04-01

    With so many techniques now available for landslide susceptibility modelling, it can be challenging to decide which technique to apply. Generally speaking, the criteria for model selection should be tied closely to the end users' purpose, which could be spatial prediction, spatial analysis or both. In our research, we focus on comparing the spatial predictive abilities of landslide susceptibility models. We illustrate how spatial cross-validation, a statistical approach for assessing spatial prediction performance, can be applied with the area under the receiver operating characteristic curve (AUROC) as a prediction measure for model comparison. Several machine learning and statistical techniques are evaluated for prediction in Lower Austria: support vector machine, random forest, bundling with penalized linear discriminant analysis, logistic regression, weights of evidence, and the generalized additive model. In addition to predictive performance, the importance of the predictor variables in each model was estimated using spatial cross-validation by calculating the change in AUROC performance when variables are randomly permuted. The susceptibility modelling techniques were tested in three areas of interest in Lower Austria, which have unique geologic conditions associated with landslide occurrence. Overall, we found for the majority of comparisons that there was little practical or even statistically significant difference in AUROC. That is, the models' prediction performances were very similar. Therefore, in addition to prediction, the ability to interpret models for spatial analysis and the qualitative qualities of the prediction surface (map) are considered and discussed. The measure of variable importance provided some insight into model behaviour for prediction, in particular for "black-box" models. However, there were no clear patterns across the areas of interest as to why certain variables were given more importance than others.
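
    A minimal scikit-learn sketch of spatial cross-validation with AUROC as the prediction measure: observations are assigned to spatial blocks, and GroupKFold keeps the training and test folds geographically disjoint. The predictors, labels and block assignments are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import GroupKFold
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)
        n = 600
        X = rng.normal(size=(n, 4))                       # e.g. slope, aspect, lithology score, ...
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 0).astype(int)  # landslide yes/no
        blocks = rng.uniform(0, 4, n).astype(int)         # crude spatial block id per point

        aucs = []
        for tr, te in GroupKFold(n_splits=4).split(X, y, groups=blocks):
            model = LogisticRegression().fit(X[tr], y[tr])
            aucs.append(roc_auc_score(y[te], model.predict_proba(X[te])[:, 1]))
        print("spatial-CV AUROC per fold:", np.round(aucs, 3), "mean:", round(np.mean(aucs), 3))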

  20. High-resolution climate simulations for Central Europe: An assessment of dynamical and statistical downscaling techniques

    NASA Astrophysics Data System (ADS)

    Miksovsky, J.; Huth, R.; Halenka, T.; Belda, M.; Farda, A.; Skalak, P.; Stepanek, P.

    2009-12-01

    To bridge the resolution gap between the outputs of global climate models (GCMs) and the finer-scale data needed for studies of climate change impacts, two approaches are widely used: dynamical downscaling, based on the application of regional climate models (RCMs) embedded into the domain of the GCM simulation, and statistical downscaling (SDS), using empirical transfer functions between the large-scale data generated by the GCM and local measurements. In our contribution, we compare the performance of different variants of both techniques for the region of Central Europe. The dynamical downscaling is represented by the outputs of two regional models run in a 10 km horizontal grid, ALADIN-CLIMATE/CZ (co-developed by the Czech Hydrometeorological Institute and Meteo-France) and RegCM3 (developed by the Abdus Salam Centre for Theoretical Physics). The applied statistical methods were based on multiple linear regression, as well as on several of its nonlinear alternatives, including techniques employing artificial neural networks. Validation of the downscaling outputs was carried out using measured data gathered from weather stations in the Czech Republic, Slovakia, Austria and Hungary for the end of the 20th century; series of daily values of maximum and minimum temperature, precipitation and relative humidity were analyzed. None of the regional models or statistical downscaling techniques could be identified as the universally best one. For instance, while most statistical methods misrepresented the shape of the statistical distribution of the target variables (especially in the more challenging cases, such as the estimation of daily precipitation), RCM-generated data often suffered from severe biases. It is also shown that further enhancement of the simulated fields of climate variables can be achieved through a combination of dynamical downscaling and statistical postprocessing. This can not only be used to reduce biases and other systematic flaws in the generated time series.
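
    A minimal sketch of the transfer-function (statistical downscaling) route, assuming scikit-learn: station observations are regressed on coarse-scale predictors over a calibration period and the fitted mapping is then applied to a validation period. All series are synthetic placeholders, not the station data analyzed above.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(9)
        # Large-scale predictors at the nearest grid box: e.g. T850, SLP, specific humidity.
        gcm = rng.normal(size=(3650, 3))
        station_tmax = 12 + 5 * gcm[:, 0] - 2 * gcm[:, 1] + rng.normal(0, 1.5, 3650)

        train, test = slice(0, 2920), slice(2920, 3650)   # calibrate / validate split
        sds = LinearRegression().fit(gcm[train], station_tmax[train])
        pred = sds.predict(gcm[test])
        bias = (pred - station_tmax[test]).mean()
        print(f"validation bias: {bias:+.2f} deg C")

    The nonlinear variants mentioned above replace the linear regressor with, e.g., a neural network, while keeping the same calibrate/validate structure.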

  1. Advanced rehabilitation techniques for the multi-limb amputee.

    PubMed

    Harvey, Zach T; Loomis, Gregory A; Mitsch, Sarah; Murphy, Ian C; Griffin, Sarah C; Potter, Benjamin K; Pasquina, Paul

    2012-01-01

    Advances in combat casualty care have contributed to unprecedented survival rates of battlefield injuries, challenging the field of rehabilitation to help injured service members achieve maximal functional recovery and independence. Nowhere is this better illustrated than in the care of the multiple-limb amputee. Specialized medical, surgical, and rehabilitative interventions are needed to optimize the care of this unique patient population. This article describes lessons learned at Walter Reed National Military Medical Center Bethesda in providing advanced therapy and prosthetics for combat casualties, but provides guidelines for all providers involved in the care of individuals with amputation.

  2. Advanced froth flotation techniques for fine coal cleaning

    SciTech Connect

    Yoon, R.H.; Luttrell, G.H.

    1994-12-31

    Advanced column flotation cells offer many potential advantages for the treatment of fine coal. The most important of these is the ability to achieve high separation efficiencies using only a single stage of processing. Unfortunately, industrial flotation columns often suffer from poor recovery, low throughput and high maintenance requirements as compared to mechanically-agitated conventional cells. These problems can usually be attributed to poorly-designed air sparging systems. This article examines the problems of air sparging in greater detail and offers useful guidelines for designing bubble generators for industrial flotation columns. The application of these principles in the design of a successful advanced fine coal flotation circuit is also presented.

  3. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Petrova, Irina; Loew, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-11-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP)/maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results allow us to pinpoint that CN analysis can complement classical eigen techniques and provides additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.
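
    A minimal NumPy sketch of the formal parallel drawn above: both routes start from the same correlation matrix; the eigen route takes its leading eigenvector (EOF-1), while the network route thresholds the matrix into an adjacency matrix and reads off a node measure such as degree. The gridded series and the threshold are synthetic illustrations.

        import numpy as np

        rng = np.random.default_rng(8)
        field = rng.normal(size=(500, 100))               # time steps x grid points
        field[:, :50] += rng.normal(size=(500, 1))        # a shared mode over half the grid

        C = np.corrcoef(field, rowvar=False)              # common starting point of both methods

        # EOF route: leading eigenvector of the correlation matrix.
        eigvals, eigvecs = np.linalg.eigh(C)
        eof1 = eigvecs[:, -1]

        # Network route: threshold |C| into an adjacency matrix, then node degree.
        A = (np.abs(C) > 0.5) & ~np.eye(C.shape[0], dtype=bool)
        degree = A.sum(axis=1)

        print("EOF1 weights:", eof1[:50].mean().round(3), "vs", eof1[50:].mean().round(3))
        print("mean degree:", degree[:50].mean(), "vs", degree[50:].mean())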

  4. How complex climate networks complement eigen techniques for the statistical analysis of climatological data

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Petrova, Irina; Löw, Alexander; Marwan, Norbert; Kurths, Jürgen

    2015-04-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP)/maximum covariance analysis have been frequently used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network (CN) analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships as well as conceptual differences between both eigen and network approaches are derived and illustrated using global precipitation, evaporation and surface air temperature data sets. These results show that CN analysis can complement classical eigen techniques, providing additional information on the higher-order structure of statistical interrelationships in climatological data. Hence, CNs are a valuable supplement to the statistical toolbox of the climatologist, particularly for making sense out of very large data sets such as those generated by satellite observations and climate model intercomparison exercises.

  5. Analytic Syntax: A Technique for Advanced Level Reading

    ERIC Educational Resources Information Center

    Berman, Ruth

    1975-01-01

    The technique explained here can increase a foreign student's awareness of English grammatical and rhetorical structures. Structural paraphrase is a syntactic reformulation of difficult phrases with minimal vocabulary changes. The technique is illustrated and suggestions are given for class presentation. (CHK)

  6. Probabilistic and numerical techniques in the study of statistical theories of turbulence. Final Technical Report

    SciTech Connect

    Ellis, Richard S.; Turkington, B.

    2003-06-09

    In this research project we made fundamental advances in a number of problems arising in statistical equilibrium theories of turbulence. Here are the highlights. In most cases the mathematical analysis was supplemented by numerical calculations. (a) Maximum entropy principles. We analyzed in a unified framework the Miller-Robert continuum model of equilibrium states in an ideal fluid and a modification of that model due to Turkington. (b) Equivalence and nonequivalence of ensembles. We gave a complete analysis of the equivalence and nonequivalence of the microcanonical, canonical, and mixed ensembles at the level of equilibrium macrostates for a large class of models of turbulence. (c) Nonlinear stability of flows. We refined the well known Arnold stability theorems by proving the nonlinear stability of steady mean flows for the quasi-geostrophic potential vorticity equation in the case when the ensembles are nonequivalent. (d) Geophysical application. The theories developed in items (a), (b), and (c) were applied to predict the emergence and persistence of coherent structures in the active weather layer of the Jovian atmosphere. This is the first work in which sophisticated statistical theories are synthesized with actual observational data from the Voyager and Galileo space missions. (e) Nonlinear dispersive waves. For a class of nonlinear Schroedinger equations we demonstrated that the self-organization of solutions into a ground-state solitary wave immersed in fine-scale fluctuations is a relaxation into statistical equilibrium.

  7. A semiautomated electron backscatter diffraction technique for extracting reliable twin statistics

    NASA Astrophysics Data System (ADS)

    Henrie, B. L.; Mason, T. A.; Hansen, B. L.

    2004-12-01

    A framework has been developed for extracting reliable twin statistics from a deformed microstructure using crystallographic twin identification techniques with spatially correlated electron backscatter diffraction (EBSD) data. The key features of this analysis are the use of the mathematical definition of twin relationships, the inclination of the common K1 plane at a twin boundary, and the correct identification of the parent orientation in a parent/twin pair. Methods for identifying the parent in a parent/twin pair will be briefly discussed and compared. Twin area fractions are then categorized by operative twin systems, number of active twin variants in each system, and corrected twin widths. These statistics are reported here for α-zirconium samples deformed in quasi-static four-point bend beams and in a 100 m/s Taylor cylinder impact test. Analysis of the statistics also begins to reveal the roles that deformation rate and relative orientation of the boundary conditions to the material's symmetry axes play in determining the twinning activity that accommodates the imposed boundary conditions. These improved twin statistics can help quantify the deformation processes in materials that deform by twinning as well as serve to provide better validation of proposed models of the deformation processes.
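
    The twin-identification step rests on checking whether a boundary misorientation matches the mathematical twin relationship. Below is a toy numpy sketch of such a check; it uses only the identity symmetry operator and a stand-in axis/angle pair, whereas a real EBSD analysis would loop over the full hexagonal symmetry group and the published twin relationships for α-zirconium.

```python
import numpy as np

def rotation_about_axis(axis, angle_deg):
    """Rodrigues formula: rotation matrix about a unit axis."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    a = np.radians(angle_deg)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

def misorientation_angle(g1, g2, symmetry_ops):
    """Smallest rotation angle (deg) relating two orientation matrices,
    minimized over the crystal symmetry operators."""
    best = 180.0
    delta = g2 @ g1.T
    for s in symmetry_ops:
        c = np.clip((np.trace(s @ delta) - 1.0) / 2.0, -1.0, 1.0)
        best = min(best, np.degrees(np.arccos(c)))
    return best

# Illustration only: identity symmetry. A real analysis would use the
# 12 proper rotations of the hexagonal group for alpha-zirconium.
sym_ops = [np.eye(3)]

# Stand-in ideal twin relationship: 85.2 deg about a fixed axis plays the
# role of the tensile-twin axis/angle pair.
g_parent = np.eye(3)
g_twin = rotation_about_axis([1.0, 0.0, 0.0], 85.2) @ g_parent

angle = misorientation_angle(g_parent, g_twin, sym_ops)
print(f"misorientation: {angle:.1f} deg (twin if within ~5 deg of ideal)")
```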

  8. Recent advances in microscopic techniques for visualizing leukocytes in vivo

    PubMed Central

    Jain, Rohit; Tikoo, Shweta; Weninger, Wolfgang

    2016-01-01

    Leukocytes are inherently motile and interactive cells. Recent advances in intravital microscopy approaches have enabled a new vista of their behavior within intact tissues in real time. This brief review summarizes the developments enabling the tracking of immune responses in vivo. PMID:27239292

  9. Bricklaying Curriculum: Advanced Bricklaying Techniques. Instructional Materials. Revised.

    ERIC Educational Resources Information Center

    Turcotte, Raymond J.; Hendrix, Laborn J.

    This curriculum guide is designed to assist bricklaying instructors in providing performance-based instruction in advanced bricklaying. Included in the first section of the guide are units on customized or architectural masonry units; glass block; sills, lintels, and copings; and control (expansion) joints. The next two units deal with cut,…

  10. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  11. Application of statistical downscaling technique for the production of wine grapes (Vitis vinifera L.) in Spain

    NASA Astrophysics Data System (ADS)

    Gaitán Fernández, E.; García Moreno, R.; Pino Otín, M. R.; Ribalaygua Batalla, J.

    2012-04-01

    Climate and soil are two of the most important limiting factors for agricultural production. Climate change has now been documented in many geographical locations, affecting different cropping systems. General circulation models (GCMs) have become important tools for simulating the most relevant aspects of the climate expected for the 21st century under climate change. These models are able to reproduce the general features of atmospheric dynamics, but their low resolution (about 200 km) prevents proper simulation of lower-scale meteorological effects. Downscaling techniques overcome this problem by adapting the model outcomes to the local scale. In this context, FIC (Fundación para la Investigación del Clima) has developed a statistical downscaling technique based on a two-step analogue method. This methodology has been widely tested in national and international settings, yielding excellent results for future climate scenarios. In a collaborative project, this statistical downscaling technique was applied to predict future scenarios for grape growing systems in Spain. The application of such a model is very important for predicting the expected climate for the different crops, mainly grapes, where the success of different varieties is highly related to climate and soil. The model allowed the implementation of agricultural conservation practices in crop production, the detection of areas highly sensitive to negative impacts produced by any modification of climate in the different regions, particularly those under a protected designation of origin, and the definition of new production areas with optimal edaphoclimatic conditions for the different varieties.

  12. Multivariate statistical techniques for the assessment of seasonal variations in surface water quality of pasture ecosystems.

    PubMed

    Ajorlo, Majid; Abdullah, Ramdzani B; Yusoff, Mohd Kamil; Halim, Ridzwan Abd; Hanif, Ahmad Husni Mohd; Willms, Walter D; Ebrahimian, Mahboubeh

    2013-10-01

    This study investigates the applicability of multivariate statistical techniques including cluster analysis (CA), discriminant analysis (DA), and factor analysis (FA) for the assessment of seasonal variations in the surface water quality of tropical pastures. The study was carried out in the TPU catchment, Kuala Lumpur, Malaysia. The dataset consisted of one year of monitoring of 14 parameters at six sampling sites. The CA yielded two groups of similar sampling sites, i.e., less polluted (LP) and moderately polluted (MP), at the temporal scale. Fecal coliform (FC), NO3, DO, and pH were significantly related to the stream grouping in the dry season, whereas NH3, BOD, Escherichia coli, and FC were significantly related to the stream grouping in the rainy season. The best predictors for distinguishing clusters at the temporal scale were FC, NH3, and E. coli. FC, E. coli, and BOD, with strong positive loadings, were introduced as the first varifactors in the dry season, indicating a biological source of variability. EC, with a strong positive loading, and DO, with a strong negative loading, were introduced as the first varifactors in the rainy season, representing a physicochemical source of variability. Multivariate statistical techniques thus proved effective for classifying and processing large water quality datasets and for identifying the major sources of water pollution in tropical pastures.
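
    A compact sketch of the CA/DA/FA workflow on a synthetic stand-in for such a dataset is given below; it assumes scipy and scikit-learn, and all sizes, groupings and values are hypothetical.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)

# Hypothetical monitoring matrix: 72 samples (6 sites x 12 months) x 14 parameters.
X = rng.normal(size=(72, 14))
X = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize before CA/FA

# Cluster analysis: Ward linkage, cut into two groups (e.g., LP vs MP).
groups = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")

# Discriminant analysis: which parameters separate the clusters?
lda = LinearDiscriminantAnalysis().fit(X, groups)
print("classification accuracy:", lda.score(X, groups))

# Factor analysis: extract latent sources of variability (varifactors).
fa = FactorAnalysis(n_components=3, random_state=0).fit(X)
print("loadings of factor 1:", np.round(fa.components_[0], 2))
```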

  13. Field Penetration in a Rectangular Box Using Numerical Techniques: An Effort to Obtain Statistical Shielding Effectiveness

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.; Yu, Shih-Pin

    2006-01-01

    This paper emphasizes the application of numerical methods to explore the ideas related to shielding effectiveness from a statistical view. An empty rectangular box is examined using a hybrid modal/moment method. The basic computational method is presented, followed by results for single and multiple observation points within the over-moded empty structure. The statistics of the field are obtained by frequency stirring, an idea borrowed from reverberation chamber techniques, which extends the notion of shielding effectiveness well into the multiple-resonance region. The study presented in this paper addresses the average shielding effectiveness over a broad spatial sample within the enclosure as the frequency is varied.
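
    The frequency-stirring statistics can be imitated with a toy calculation. Assuming, purely for illustration, Rayleigh-distributed internal field magnitudes in the over-moded regime, the numpy sketch below forms shielding-effectiveness samples over space and frequency and summarizes their statistics; none of the numbers come from the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble: |E| sampled at 25 points inside the enclosure,
# for 200 closely spaced frequencies in an over-moded band (frequency stirring).
n_points, n_freqs = 25, 200
e_incident = 1.0
# In an over-moded cavity the field magnitude is roughly Rayleigh distributed.
e_internal = rng.rayleigh(scale=0.05, size=(n_freqs, n_points))

# Shielding effectiveness per sample, in dB.
se_db = 20.0 * np.log10(e_incident / e_internal)

# Average SE over the spatial sample and the stirring band, plus spread.
print(f"mean SE: {se_db.mean():.1f} dB")
print(f"std  SE: {se_db.std():.1f} dB")
print(f"5th percentile: {np.percentile(se_db, 5):.1f} dB")
```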

  14. Raman spectroscopy coupled with advanced statistics for differentiating menstrual and peripheral blood.

    PubMed

    Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K

    2014-01-01

    Body fluids are a common and important type of forensic evidence. In particular, the identification of menstrual blood stains is often a key step during the investigation of rape cases. Here, we report on the application of near-infrared Raman microspectroscopy for differentiating menstrual blood from peripheral blood. We observed that the menstrual and peripheral blood samples have similar but distinct Raman spectra. Advanced statistical analysis of the multiple Raman spectra that were automatically (Raman mapping) acquired from the 40 dried blood stains (20 donors for each group) allowed us to build a classification model with maximum (100%) sensitivity and specificity. We also demonstrated that despite certain common constituents, menstrual blood can be readily distinguished from vaginal fluid. All of the classification models were verified using cross-validation methods. The proposed method overcomes the problems associated with currently used biochemical methods, which are destructive, time-consuming and expensive.
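
    A hedged sketch of this kind of cross-validated spectral classification pipeline is shown below on synthetic stand-in spectra, combining PCA with an SVM classifier; the SVM is one reasonable choice for illustration, not necessarily the statistical model used in the paper.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Hypothetical stand-in for Raman mapping data: 2000 spectra with
# 1000 wavenumber channels; labels 0 = peripheral, 1 = menstrual blood.
n_spectra, n_channels = 2000, 1000
X = rng.normal(size=(n_spectra, n_channels))
y = np.repeat([0, 1], n_spectra // 2)
X[y == 1, 100:110] += 0.5          # a small synthetic band difference

# PCA for dimension reduction followed by an SVM, with cross-validation
# standing in for the paper's model verification step.
clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC())
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.mean())
```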

  15. Backscattered Electron Microscopy as an Advanced Technique in Petrography.

    ERIC Educational Resources Information Center

    Krinsley, David Henry; Manley, Curtis Robert

    1989-01-01

    Three uses of this method with sandstone, desert varnish, and granite weathering are described. Background information on this technique is provided. Advantages of this type of microscopy are stressed. (CW)

  16. Assessing Fire Weather Index using statistical downscaling and spatial interpolation techniques in Greece

    NASA Astrophysics Data System (ADS)

    Karali, Anna; Giannakopoulos, Christos; Frias, Maria Dolores; Hatzaki, Maria; Roussos, Anargyros; Casanueva, Ana

    2013-04-01

    Forest fires have always been present in Mediterranean ecosystems and thus constitute a major ecological and socio-economic issue. Over the last few decades, though, the number of forest fires has increased significantly, as have their severity and impact on the environment. Local fire danger projections are often required in wildfire research. In the present study, statistical downscaling and spatial interpolation methods were applied to the Canadian Fire Weather Index (FWI) in order to assess forest fire risk in Greece. The FWI is used worldwide (including the Mediterranean basin) to estimate fire danger in a generalized fuel type, based solely on weather observations. The meteorological inputs to the FWI System are noon values of dry-bulb temperature, air relative humidity, 10 m wind speed and precipitation during the previous 24 hours. The statistical downscaling methods are based on a statistical model that takes into account empirical relationships between large-scale variables (used as predictors) and local-scale variables. In the framework of the current study, the statistical downscaling portal developed by the Santander Meteorology Group (https://www.meteo.unican.es/downscaling) within the EU project CLIMRUN (www.climrun.eu) was used to downscale non-standard parameters related to forest fire risk. Two different approaches were adopted: first, the analogue downscaling technique was applied directly to the FWI values; second, the same technique was applied indirectly through the meteorological inputs of the index. In both cases, the statistical downscaling portal was used with the ERA-Interim reanalysis as predictand data, owing to the lack of observations at noon. Additionally, a three-dimensional (3D) interpolation method in position and elevation, based on thin plate splines (TPS), was used to interpolate the ERA-Interim data used to calculate the index.
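
    The analogue technique itself is simple to sketch: for each day to be downscaled, find the most similar large-scale fields in a historical archive and average the corresponding local values. The toy numpy implementation below uses random stand-in data and plain Euclidean distance; the operational two-step method is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical data: flattened large-scale predictor fields for a historical
# archive and for the days to be downscaled, plus local FWI values on the
# archive days (all synthetic).
n_archive, n_target, n_features = 5000, 100, 50
archive_fields = rng.normal(size=(n_archive, n_features))
archive_fwi = rng.gamma(shape=2.0, scale=5.0, size=n_archive)
target_fields = rng.normal(size=(n_target, n_features))

def analogue_downscale(target, archive_X, archive_y, k=10):
    """For each target day, average the local value over its k nearest
    large-scale analogues (Euclidean distance in predictor space)."""
    out = np.empty(len(target))
    for i, day in enumerate(target):
        dist = np.linalg.norm(archive_X - day, axis=1)
        nearest = np.argsort(dist)[:k]
        out[i] = archive_y[nearest].mean()
    return out

fwi_downscaled = analogue_downscale(target_fields, archive_fields, archive_fwi)
print("downscaled FWI for first 5 days:", np.round(fwi_downscaled[:5], 1))
```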

  17. Electroextraction and electromembrane extraction: Advances in hyphenation to analytical techniques

    PubMed Central

    Oedit, Amar; Ramautar, Rawi; Hankemeier, Thomas

    2016-01-01

    Electroextraction (EE) and electromembrane extraction (EME) are sample preparation techniques that both require an electric field that is applied over a liquid‐liquid system, which enables the migration of charged analytes. Furthermore, both techniques are often used to pre‐concentrate analytes prior to analysis. In this review an overview is provided of the body of literature spanning April 2012–November 2015 concerning EE and EME, focused on hyphenation to analytical techniques. First, the theoretical aspects of concentration enhancement in EE and EME are discussed to explain extraction recovery and enrichment factor. Next, overviews are provided of the techniques based on their hyphenation to LC, GC, CE, and direct detection. These overviews cover the compounds and matrices, experimental aspects (i.e. donor volume, acceptor volume, extraction time, extraction voltage, and separation time) and the analytical aspects (i.e. limit of detection, enrichment factor, and extraction recovery). Techniques that were either hyphenated online to analytical techniques or show high potential with respect to online hyphenation are highlighted. Finally, the potential future directions of EE and EME are discussed. PMID:26864699

  18. Advanced millimeter-wave security portal imaging techniques

    NASA Astrophysics Data System (ADS)

    Sheen, David M.; Bernacki, Bruce E.; McMakin, Douglas L.

    2012-03-01

    Millimeter-wave (mm-wave) imaging is rapidly gaining acceptance as a security tool to augment conventional metal detectors and baggage x-ray systems for passenger screening at airports and other secured facilities. This acceptance indicates that the technology has matured; however, many potential improvements can yet be realized. The authors have developed a number of techniques over the last several years including novel image reconstruction and display techniques, polarimetric imaging techniques, array switching schemes, and high-frequency high-bandwidth techniques. All of these may improve the performance of new systems; however, some of these techniques will increase the cost and complexity of the mm-wave security portal imaging systems. Reducing this cost may require the development of novel array designs. In particular, RF photonic methods may provide new solutions to the design and development of the sequentially switched linear mm-wave arrays that are the key element in the mm-wave portal imaging systems. High-frequency, high-bandwidth designs are difficult to achieve with conventional mm-wave electronic devices, and RF photonic devices may be a practical alternative. In this paper, the mm-wave imaging techniques developed at PNNL are reviewed and the potential for implementing RF photonic mm-wave array designs is explored.

  19. Coal and char studies by advanced EMR techniques

    SciTech Connect

    Belford, R.L.; Clarkson, R.B.; Odintsov, B.M.

    1999-03-31

    Advanced magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. During this grant period, further progress was made on proton NMR and low-frequency dynamic nuclear polarization (DNP) to examine the interaction between fluids such as water and the surface of suspended char particles. Effects of char particle size and type on water nuclear spin relaxation, T2, were measured and modeled.

  20. Coal and char studies by advanced EMR techniques

    SciTech Connect

    Belford, R.L.; Clarkson, R.B.; Odintsov, B.M.

    1998-09-30

    Advanced magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. During this grant period, further progress was made on proton NMR and low-frequency dynamic nuclear polarization (DNP) to examine the interaction between fluids such as water and the surface of suspended char particles. Effects of char particle size on water nuclear spin relaxation, T2, were measured.

  1. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
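
    The core comparison can be imitated in a few lines: simulate many test campaigns under a clustered and a distributed design of equal size, fit a quadratic response surface to each, and compare how well the fitted surfaces recover a known truth. The numpy sketch below is a bare-bones single-factor version of that idea with made-up noise levels, not the study's actual simulation.

```python
import numpy as np

rng = np.random.default_rng(6)

def truth(x):
    """Hypothetical single-factor response with curvature."""
    return 1.0 + 2.0 * x - 1.5 * x**2

def fit_quadratic(x, y):
    # Least-squares fit of a quadratic response surface model.
    A = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

noise = 0.2
x_eval = np.linspace(0, 1, 101)

# Clustered: many repeats at 3 conditions. Distributed: one point at each of
# 30 different conditions. Same total number of test points.
designs = {"clustered": np.repeat([0.0, 0.5, 1.0], 10),
           "distributed": np.linspace(0, 1, 30)}

errs = {name: [] for name in designs}
for _ in range(2000):                      # many simulated test campaigns
    for name, x in designs.items():
        y = truth(x) + rng.normal(scale=noise, size=x.size)
        c = fit_quadratic(x, y)
        pred = c[0] + c[1] * x_eval + c[2] * x_eval**2
        errs[name].append(np.sqrt(np.mean((pred - truth(x_eval)) ** 2)))

for name, e in errs.items():
    print(f"{name}: mean RMSE of fitted surface = {np.mean(e):.4f}")
```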

  2. Monte Carlo Simulations in Statistical Physics -- From Basic Principles to Advanced Applications

    NASA Astrophysics Data System (ADS)

    Janke, Wolfhard

    2013-08-01

    This chapter starts with an overview of Monte Carlo computer simulation methodologies which are illustrated for the simple case of the Ising model. After reviewing importance sampling schemes based on Markov chains and standard local update rules (Metropolis, Glauber, heat-bath), nonlocal cluster-update algorithms are explained which drastically reduce the problem of critical slowing down at second-order phase transitions and thus improve the performance of simulations. How this can be quantified is explained in the section on statistical error analyses of simulation data including the effect of temporal correlations and autocorrelation times. Histogram reweighting methods are explained in the next section. Eventually, more advanced generalized ensemble methods (simulated and parallel tempering, multicanonical ensemble, Wang-Landau method) are discussed which are particularly important for simulations of first-order phase transitions and, in general, of systems with rare-event states. The setup of scaling and finite-size scaling analyses is the content of the following section. The chapter concludes with two advanced applications to complex physical systems. The first example deals with a quenched, diluted ferromagnet, and in the second application we consider the adsorption properties of macromolecules such as polymers and proteins to solid substrates. Such systems often require especially tailored algorithms for their efficient and successful simulation.
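
    As a concrete starting point, the standard Metropolis local-update algorithm for the 2D Ising model, the first method the chapter reviews, can be sketched as follows; lattice size, temperature and sweep counts are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

L = 16                                   # lattice size
beta = 0.4                               # inverse temperature (J = kB = 1)
spins = rng.choice([-1, 1], size=(L, L))

def sweep(spins, beta):
    """One Metropolis sweep: L*L randomly chosen single-spin flip proposals."""
    for _ in range(spins.size):
        i, j = rng.integers(L, size=2)
        # Sum of the four nearest neighbours with periodic boundaries.
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * nb      # energy change of flipping spin (i, j)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

for _ in range(100):                     # equilibration
    sweep(spins, beta)

mags = []
for _ in range(200):                     # measurement
    sweep(spins, beta)
    mags.append(abs(spins.mean()))
print("mean |magnetization|:", np.mean(mags))
```

    Cluster updates, reweighting and generalized ensembles then build on exactly this kind of Markov chain while attacking its slow modes.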

  3. A statistical technique for processing radio interferometer data. [using maximum likelihood algorithm

    NASA Technical Reports Server (NTRS)

    Papadopoulos, G. D.

    1975-01-01

    The output of a radio interferometer is the Fourier transform of the object under investigation. Due to the limited coverage of the Fourier plane, the reconstruction of the image of the source is blurred by the beam of the synthesized array. A maximum-likelihood processing technique is described which uses the statistical properties of the received noise-like signals. This technique has been used extensively in the processing of large-aperture seismic arrays. This inversion method results in a synthesized beam that is more uniform, has lower sidelobes, and higher resolution than the normal Fourier transform methods. The maximum-likelihood method algorithm was applied successfully to very long baseline and short baseline interferometric data.
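
    The maximum-likelihood method from large-aperture seismic array processing is commonly associated with Capon's estimator, P(θ) = 1/(aᴴ R⁻¹ a). The numpy sketch below applies it to a simulated uniform linear array as a stand-in for interferometer baselines; it illustrates the general technique, not the paper's exact algorithm, and all parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical uniform linear array standing in for interferometer baselines.
n_ant, n_snap = 16, 400
d = 0.5                                      # element spacing in wavelengths
true_dirs = np.radians([-20.0, 15.0])        # two point sources

def steering(theta):
    k = np.arange(n_ant)
    return np.exp(2j * np.pi * d * k * np.sin(theta))

# Simulated noise-like signals received by the array.
A = np.column_stack([steering(t) for t in true_dirs])
s = rng.normal(size=(2, n_snap)) + 1j * rng.normal(size=(2, n_snap))
noise = 0.1 * (rng.normal(size=(n_ant, n_snap))
               + 1j * rng.normal(size=(n_ant, n_snap)))
x = A @ s + noise

# Maximum-likelihood (Capon) spatial spectrum: lower sidelobes and higher
# resolution than the plain Fourier beamformer a^H R a.
R = x @ x.conj().T / n_snap
R_inv = np.linalg.inv(R + 1e-6 * np.eye(n_ant))  # small diagonal loading

thetas = np.radians(np.linspace(-90, 90, 361))
p = np.array([1.0 / np.real(steering(t).conj() @ R_inv @ steering(t))
              for t in thetas])

peaks = [round(float(np.degrees(thetas[i])), 1) for i in range(1, 360)
         if p[i] > p[i - 1] and p[i] > p[i + 1] and p[i] > 0.5 * p.max()]
print("estimated source directions (deg):", peaks)
```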

  4. Advanced Weapon System (AWS) Sensor Prediction Techniques Study. Volume I

    DTIC Science & Technology

    1981-09-01

    of the study. The material that was researched generally fell within the following topics or categories: perceptual psychology in visual training ... the first stage of the process. Bp appears to offer a reasonable summary of the statistics of the second stage. ... 2.2.4 Bombing Models (Random ...) ... INTRODUCTION: The principal object of this research was to study efficient methods of representation, storage, and display of sensor scene simulation

  5. Performance of Statistical Temporal Downscaling Techniques of Wind Speed Data Over Aegean Sea

    NASA Astrophysics Data System (ADS)

    Gokhan Guler, Hasan; Baykal, Cuneyt; Ozyurt, Gulizar; Kisacik, Dogan

    2016-04-01

    Wind speed data is a key input for many meteorological and engineering applications. Many institutions provide wind speed data with temporal resolutions ranging from one hour to twenty-four hours. Higher temporal resolution is generally required for some applications, such as reliable wave hindcasting studies. One solution to generate wind data at high sampling frequencies is to use statistical downscaling techniques to interpolate values at the finer sampling intervals from the available data. In this study, the major aim is to assess the temporal downscaling performance of nine statistical interpolation techniques by quantifying the inherent uncertainty due to the selection of different techniques. For this purpose, hourly 10-m wind speed data taken from 227 data points over the Aegean Sea between 1979 and 2010, with a spatial resolution of approximately 0.3 degrees, are analyzed from the National Centers for Environmental Prediction (NCEP) Climate Forecast System Reanalysis database. Additionally, hourly 10-m wind speed data from two in-situ measurement stations between June 2014 and June 2015 are considered to understand the effect of dataset properties on the uncertainty generated by the interpolation technique. The nine statistical interpolation techniques are: w0 (left constant) interpolation, w6 (right constant) interpolation, averaging step function interpolation, linear interpolation, 1D Fast Fourier Transform interpolation, 2nd and 3rd degree Lagrange polynomial interpolation, cubic spline interpolation, and piecewise cubic Hermite interpolating polynomials. The original data are downsampled to 6-hourly values (i.e., the wind speeds at the 0th, 6th, 12th and 18th hours of each day are selected), the 6-hourly data are then temporally downscaled to hourly data (i.e., the wind speeds at each hour between the intervals are computed) using the nine interpolation techniques, and finally the original data are compared with the temporally downscaled data. A penalty point system based on these comparisons is then used to rank the interpolation techniques.
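
    Several of the nine techniques can be reproduced directly with scipy interpolators. The sketch below downsamples a synthetic hourly series to 6-hourly values, downscales back to hourly with four of the listed methods, and compares each against the original, mirroring the study's validation loop on made-up data.

```python
import numpy as np
from scipy.interpolate import interp1d, CubicSpline, PchipInterpolator

rng = np.random.default_rng(9)

# Hypothetical hourly wind speed series (30 days): a smooth diurnal signal
# plus noise stands in for the reanalysis data.
hours = np.arange(24 * 30 + 1)
wind = (6 + 2 * np.sin(2 * np.pi * hours / 24)
        + rng.normal(scale=0.5, size=hours.size))

# Down-sample to 6-hourly (keep hours 0, 6, 12, 18 of each day)...
coarse_t = hours[::6]
coarse_w = wind[::6]

# ...then downscale back to hourly with several techniques and compare.
methods = {
    "left constant":   interp1d(coarse_t, coarse_w, kind="previous"),
    "linear":          interp1d(coarse_t, coarse_w, kind="linear"),
    "cubic spline":    CubicSpline(coarse_t, coarse_w),
    "pchip (Hermite)": PchipInterpolator(coarse_t, coarse_w),
}

for name, f in methods.items():
    rmse = np.sqrt(np.mean((f(hours) - wind) ** 2))
    print(f"{name:15s} RMSE = {rmse:.3f} m/s")
```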

  6. Pilot-scale investigation of drinking water ultrafiltration membrane fouling rates using advanced data analysis techniques.

    PubMed

    Chen, Fei; Peldszus, Sigrid; Peiris, Ramila H; Ruhl, Aki S; Mehrez, Renata; Jekel, Martin; Legge, Raymond L; Huck, Peter M

    2014-01-01

    A pilot-scale investigation of the performance of biofiltration as a pre-treatment to ultrafiltration for drinking water treatment was conducted between 2008 and 2010. The objective of this study was to further understand the fouling behaviour of ultrafiltration at pilot scale and assess the utility of different foulant monitoring tools. Various fractions of natural organic matter (NOM) and colloidal/particulate matter of raw water, biofilter effluents, and membrane permeate were characterized by employing two advanced NOM characterization techniques: liquid chromatography - organic carbon detection (LC-OCD) and fluorescence excitation-emission matrices (FEEM) combined with principal component analysis (PCA). A framework of fouling rate quantification and classification was also developed and utilized in this study. In cases such as the present one where raw water quality and therefore fouling potential vary substantially, such classification can be considered essential for proper data interpretation. The individual and combined contributions of various NOM fractions and colloidal/particulate matter to hydraulically reversible and irreversible fouling were investigated using various multivariate statistical analysis techniques. Protein-like substances and biopolymers were identified as major contributors to both reversible and irreversible fouling, whereas colloidal/particulate matter can alleviate the extent of irreversible fouling. Humic-like substances contributed little to either reversible or irreversible fouling at low level fouling rates. The complementary nature of FEEM-PCA and LC-OCD for assessing the fouling potential of complex water matrices was also illustrated by this pilot-scale study.

  7. Application of Active Learning Techniques to an Advanced Course

    NASA Astrophysics Data System (ADS)

    Knop, R. A.

    2004-05-01

    The New Faculty Workshop provided a wealth of techniques as well as an overriding philosophy for the teaching of undergraduate Physics and Astronomy courses. The focus of the workshop was active learning, summarized in "Learner-Centered Astronomy Teaching" by Slater & Adams: it's not what you do in class that matters, it's what the students do. Much of the specific focus of the New Faculty Workshop is on teaching the large, introductory Physics classes that many of the faculty present are sure to teach, both algebra-based and calculus-based. Many of these techniques apply directly and with little modification to introductory Astronomy courses. However, little direct attention is given to upper-division undergraduate, or even graduate, courses. In this presentation, I will share my experience in attempting to apply some of the techniques discussed at the New Faculty Workshop to an upper-division course in Galactic Astrophysics at Vanderbilt University during the Spring semester of 2004.

  8. The bumper technique for advancing a large profile microcatheter.

    PubMed

    Kellner, Christopher P; Chartrain, Alexander G; Schwegel, Claire; Oxley, Thomas J; Shoirah, Hazem; Mocco, J

    2017-03-09

    Operators commonly encounter difficulty maneuvering a microcatheter beyond the distal lip of wide neck aneurysms and aneurysms in challenging locations. Few techniques have been described to guide operators in these particular situations. In this case report of a 56-year-old woman with a 16 mm ophthalmic artery aneurysm, the microcatheter continually snagged the distal aneurysm lip, preventing delivery of a flow diverter into the distal parent vessel. In troubleshooting this obstacle, a second microguidewire was introduced alongside the microcatheter and was used to cover the distal lip of the aneurysm to prevent further snagging. The second guidewire successfully deflected the microcatheter into the distal vessel, a technique that we have aptly dubbed the 'bumper technique'.

  9. Nondestructive Evaluation of Thick Concrete Using Advanced Signal Processing Techniques

    SciTech Connect

    Clayton, Dwight A; Barker, Alan M; Santos-Villalobos, Hector J; Albright, Austin P; Hoegh, Kyle; Khazanovich, Lev

    2015-09-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years [1]. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations.

  10. A Superposition Technique for Deriving Photon Scattering Statistics in Plane-Parallel Cloudy Atmospheres

    NASA Technical Reports Server (NTRS)

    Platnick, S.

    1999-01-01

    Photon transport in a multiple scattering medium is critically dependent on scattering statistics, in particular the average number of scatterings. A superposition technique is derived to accurately determine the average number of scatterings encountered by reflected and transmitted photons within arbitrary layers in plane-parallel, vertically inhomogeneous clouds. As expected, the resulting scattering number profiles are highly dependent on cloud particle absorption and solar/viewing geometry. The technique uses efficient adding and doubling radiative transfer procedures, avoiding traditional time-intensive Monte Carlo methods. Derived superposition formulae are applied to a variety of geometries and cloud models, and selected results are compared with Monte Carlo calculations. Cloud remote sensing techniques that use solar reflectance or transmittance measurements generally assume a homogeneous plane-parallel cloud structure. The scales over which this assumption is relevant, in both the vertical and horizontal, can be obtained from the superposition calculations. Though the emphasis is on photon transport in clouds, the derived technique is applicable to any scattering plane-parallel radiative transfer problem, including arbitrary combinations of cloud, aerosol, and gas layers in the atmosphere.

  11. Transcranial Doppler: Techniques and advanced applications: Part 2

    PubMed Central

    Sharma, Arvind K.; Bathala, Lokesh; Batra, Amit; Mehndiratta, Man Mohan; Sharma, Vijay K.

    2016-01-01

    Transcranial Doppler (TCD) is the only diagnostic tool that can provide continuous information about cerebral hemodynamics in real time and over extended periods. In the previous paper (Part 1), we have already presented the basic ultrasound physics pertaining to TCD, insonation methods, and various flow patterns. This article describes various advanced applications of TCD such as detection of right-to-left shunt, emboli monitoring, vasomotor reactivity (VMR), monitoring of vasospasm in subarachnoid hemorrhage (SAH), monitoring of intracranial pressure, its role in stoke prevention in sickle cell disease, and as a supplementary test for confirmation of brain death. PMID:27011639

  12. Brain development in preterm infants assessed using advanced MRI techniques.

    PubMed

    Tusor, Nora; Arichi, Tomoki; Counsell, Serena J; Edwards, A David

    2014-03-01

    Infants who are born preterm have a high incidence of neurocognitive and neurobehavioral abnormalities, which may be associated with impaired brain development. Advanced magnetic resonance imaging (MRI) approaches, such as diffusion MRI (d-MRI) and functional MRI (fMRI), provide objective and reproducible measures of brain development. Indices derived from d-MRI can be used to provide quantitative measures of preterm brain injury. Although fMRI of the neonatal brain is currently a research tool, future studies combining d-MRI and fMRI have the potential to assess the structural and functional properties of the developing brain and its response to injury.

  13. Application of advanced coating techniques to rocket engine components

    NASA Technical Reports Server (NTRS)

    Verma, S. K.

    1988-01-01

    The materials problem in the space shuttle main engine (SSME) is reviewed. Potential coatings and the method of their application for improved life of SSME components are discussed. A number of advanced coatings for turbine blade components and disks are being developed and tested in a multispecimen thermal fatigue fluidized bed facility at IIT Research Institute. This facility is capable of producing severe strains of the degree present in blades and disk components of the SSME. The potential coating systems and current efforts at IITRI being taken for life extension of the SSME components are summarized.

  14. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
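
    A representative SPC calculation is the X-bar control chart used to judge whether a measurement process is in statistical control. The sketch below computes the center line and control limits from hypothetical subgroups of force readings, using the standard A2 chart constant for subgroups of four; all data are made up.

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical calibration data: 25 subgroups of 4 repeated force readings.
n_subgroups, n = 25, 4
data = rng.normal(loc=100.0, scale=0.8, size=(n_subgroups, n))

xbar = data.mean(axis=1)                              # subgroup means
rbar = np.mean(data.max(axis=1) - data.min(axis=1))   # average range

# Standard X-bar chart constant for subgroup size n = 4.
A2 = 0.729
center = xbar.mean()
ucl = center + A2 * rbar
lcl = center - A2 * rbar

print(f"center line: {center:.2f}, UCL: {ucl:.2f}, LCL: {lcl:.2f}")
out = np.where((xbar > ucl) | (xbar < lcl))[0]
print("out-of-control subgroups:", out if out.size else "none")
```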

  15. Magnetic entropy change calculated from first principles based statistical sampling technique: Ni2 MnGa

    NASA Astrophysics Data System (ADS)

    Odbadrakh, Khorgolkhuu; Nicholson, Don; Eisenbach, Markus; Brown, Gregory; Rusanu, Aurelian; Materials Theory Group Team

    2014-03-01

    Magnetic entropy change in Magneto-caloric Effect materials is one of the key parameters in choosing materials appropriate for magnetic cooling and offers insight into the coupling between the materials' thermodynamic and magnetic degrees of freedom. We present a computational workflow to calculate the change of magnetic entropy due to a magnetic field using DFT-based statistical sampling of the energy landscape of Ni2MnGa. The statistical density of magnetic states is calculated with Wang-Landau sampling, and energies are calculated with the Locally Self-consistent Multiple Scattering technique. The high computational cost of calculating energies of each state from first principles is tempered by exploiting a model Hamiltonian fitted to the DFT based sampling. The workflow is described and justified. The magnetic adiabatic temperature change calculated from the statistical density of states agrees with the experimentally obtained value in the absence of structural transformation. The study also reveals that the magnetic subsystem alone cannot explain the large MCE observed in Ni2MnGa alloys. This work was performed at the ORNL, which is managed by UT-Battelle for the U.S. DOE. It was sponsored by the Division of Material Sciences and Engineering, OBES. This research used resources of the OLCF at ORNL, which is supported by the Office of Science of the U.S. DOE under Contract DE-AC05-00OR22725.
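
    Wang-Landau sampling estimates the density of states g(E) by a random walk accepted with probability min(1, g(E)/g(E')) while continually updating ln g; the entropy then follows as S(E) proportional to ln g(E). The compact sketch below runs the algorithm on a small 2D Ising lattice as a cheap classical stand-in for the first-principles Ni2MnGa energies, which in the paper require the LSMS code.

```python
import numpy as np

rng = np.random.default_rng(11)

L = 8
spins = rng.choice([-1, 1], size=(L, L))

def energy(s):
    # Periodic nearest-neighbour Ising energy (J = 1); each bond counted once.
    return -int(np.sum(s * (np.roll(s, 1, axis=0) + np.roll(s, 1, axis=1))))

e_min = -2 * L * L
bin_of = lambda e: (e - e_min) // 4          # energies lie on a grid of 4
n_bins = bin_of(2 * L * L) + 1

ln_g = np.zeros(n_bins)                      # running estimate of ln g(E)
hist = np.zeros(n_bins)
ln_f = 1.0                                   # modification factor ln f
E = energy(spins)

while ln_f > 1e-3:
    for _ in range(20000):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        e_new = E + 2 * spins[i, j] * nb
        # Accept with min(1, g(E)/g(E_new)) to flatten the energy histogram.
        diff = ln_g[bin_of(E)] - ln_g[bin_of(e_new)]
        if diff >= 0 or rng.random() < np.exp(diff):
            spins[i, j] *= -1
            E = e_new
        ln_g[bin_of(E)] += ln_f
        hist[bin_of(E)] += 1
    visited = hist > 0
    if hist[visited].min() > 0.8 * hist[visited].mean():   # flatness test
        hist[:] = 0
        ln_f /= 2.0                          # refine the modification factor

print("ln g(E) estimated over", int((ln_g > 0).sum()), "energy bins")
```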

  16. Application of a statistical bootstrapping technique to calculate growth rate variance for modelling psychrotrophic pathogen growth.

    PubMed

    Schaffner, D W

    1994-12-01

    The inherent variability or 'variance' of growth rate measurements is critical to the development of accurate predictive models in food microbiology. A large number of measurements are typically needed to estimate variance. To make these measurements requires a significant investment of time and effort. If a single growth rate determination is based on a series of independent measurements, then a statistical bootstrapping technique can be used to simulate multiple growth rate measurements from a single set of experiments. Growth rate variances were calculated for three large datasets (Listeria monocytogenes, Listeria innocua, and Yersinia enterocolitica) from our laboratory using this technique. This analysis revealed that the population of growth rate measurements at any given condition are not normally distributed, but instead follow a distribution that is between normal and Poisson. The relationship between growth rate and temperature was modeled by response surface models using generalized linear regression. It was found that the assumed distribution (i.e. normal, Poisson, gamma or inverse normal) of the growth rates influenced the prediction of each of the models used. This research demonstrates the importance of variance and assumptions about the statistical distribution of growth rates on the results of predictive microbiological models.
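
    The bootstrapping idea is easy to sketch: resample the independent measurements underlying one growth-rate determination, refit the growth rate each time, and read the variance off the resulting distribution. The code below does this for a hypothetical log-linear growth curve with numpy; the data and rate are illustrative, not from the paper's datasets.

```python
import numpy as np

rng = np.random.default_rng(12)

# Hypothetical growth experiment: log counts measured at 10 time points;
# the growth rate is the slope of a straight-line fit.
t = np.linspace(0, 9, 10)                    # hours
log_counts = 0.35 * t + 2.0 + rng.normal(scale=0.1, size=t.size)

def growth_rate(t, y):
    return np.polyfit(t, y, 1)[0]

# Bootstrap: resample the individual measurements (with replacement) and
# refit, simulating many growth-rate determinations from one experiment.
n_boot = 5000
rates = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, t.size, size=t.size)
    rates[b] = growth_rate(t[idx], log_counts[idx])

print(f"growth rate: {growth_rate(t, log_counts):.4f} per hour")
print(f"bootstrap variance: {rates.var(ddof=1):.6f}")
```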

  17. Mathematical and Statistical Techniques for Systems Medicine: The Wnt Signaling Pathway as a Case Study.

    PubMed

    MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M

    2016-01-01

    The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
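
    A minimal example of the ODE workflow, steady-state computation followed by local sensitivity analysis, is sketched below for a generic two-variable signalling module using scipy; the equations, rate constants and names are illustrative stand-ins, not the actual Wnt pathway model.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical module: production of a, reversible-like binding into b,
# and first-order degradation of both species.
def rhs(t, x, k_prod, k_bind, k_deg):
    a, b = x
    return [k_prod - k_bind * a * b - k_deg * a,
            k_bind * a * b - k_deg * b]

params = dict(k_prod=1.0, k_bind=0.5, k_deg=0.2)

def steady_state(**p):
    # Integrate long enough that transients have decayed.
    sol = solve_ivp(rhs, (0, 500), [0.1, 0.1],
                    args=(p["k_prod"], p["k_bind"], p["k_deg"]),
                    rtol=1e-8, atol=1e-10)
    return sol.y[:, -1]

base = steady_state(**params)
print("steady state (a, b):", np.round(base, 4))

# Local sensitivity: relative change in the steady state per 1% change
# in each parameter (forward finite differences).
for name in params:
    bumped = dict(params)
    bumped[name] *= 1.01
    rel = (steady_state(**bumped) - base) / base / 0.01
    print(f"d ln(ss) / d ln({name}) ~ {np.round(rel, 3)}")
```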

  18. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    PubMed Central

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  19. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model, is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model), that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  20. Benefits of advanced software techniques for mission planning systems

    NASA Technical Reports Server (NTRS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-01-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  1. Advances in reduction techniques for tire contact problems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1995-01-01

    Some recent developments in reduction techniques, as applied to predicting the tire contact response and evaluating the sensitivity coefficients of the different response quantities, are reviewed. The sensitivity coefficients measure the sensitivity of the contact response to variations in the geometric and material parameters of the tire. The tire is modeled using a two-dimensional laminated anisotropic shell theory with the effects of variation in geometric and material parameters, transverse shear deformation, and geometric nonlinearities included. The contact conditions are incorporated into the formulation by using a perturbed Lagrangian approach with the fundamental unknowns consisting of the stress resultants, the generalized displacements, and the Lagrange multipliers associated with the contact conditions. The elemental arrays are obtained by using a modified two-field, mixed variational principle. For the application of reduction techniques, the tire finite element model is partitioned into two regions. The first region consists of the nodes that are likely to come in contact with the pavement, and the second region includes all the remaining nodes. The reduction technique is used to significantly reduce the degrees of freedom in the second region. The effectiveness of the computational procedure is demonstrated by a numerical example of the frictionless contact response of the space shuttle nose-gear tire, inflated and pressed against a rigid flat surface. Also, the research topics which have high potential for enhancing the effectiveness of reduction techniques are outlined.

  2. In Situ Techniques for Monitoring Electrochromism: An Advanced Laboratory Experiment

    ERIC Educational Resources Information Center

    Saricayir, Hakan; Uce, Musa; Koca, Atif

    2010-01-01

    This experiment employs current technology to enhance and extend existing lab content. The basic principles of spectroscopic and electroanalytical techniques and their use in determining material properties are covered in some detail in many undergraduate chemistry programs. However, there are limited examples of laboratory experiments with in…

  3. Advances in High-Fidelity Multi-Physics Simulation Techniques

    DTIC Science & Technology

    2008-01-01

    fluid dynamics with other disciplines also yields a large and typically stiff equation set whose numerical solution mandates the development and ... 2.1 Governing Equations ... 2.2 Numerical Technique ... discrete equivalent of the governing equations. Thus, the values of the solution vector are localized in a pointwise sense at each node of the mesh.

  4. Application of the Statistical ICA Technique in the DANCE Data Analysis

    NASA Astrophysics Data System (ADS)

    Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration

    2015-10-01

    The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4 π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the sum energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify the contributions to the Esum spectra from different isotopes with similar Q-values. Recently we have tested the applicability of modern statistical methods such as Independent Component Analysis (ICA) to identify and separate different (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present some results of the application of ICA algorithms and their modification for the DANCE experimental data analysis. This research is supported by the U. S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
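
    The separation task can be imitated with scikit-learn's FastICA on synthetic spectra: several observed mixtures of two overlapping "isotope" peaks plus a smooth background are unmixed into statistically independent components. Everything below is a made-up stand-in for the DANCE Esum spectra.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(13)

# Hypothetical sources: two nearby Gaussian "isotope" peaks with similar
# Q-values, plus a smooth exponential background, on 512 energy channels.
channels = np.arange(512)
comp_a = np.exp(-0.5 * ((channels - 300) / 15) ** 2)
comp_b = np.exp(-0.5 * ((channels - 320) / 15) ** 2)
background = np.exp(-channels / 400.0)

sources = np.vstack([comp_a, comp_b, background])        # (3, 512)
mixing = rng.uniform(0.2, 1.0, size=(8, 3))              # 8 observed spectra
observed = mixing @ sources + 0.01 * rng.normal(size=(8, 512))

# ICA treats the channels as samples and unmixes statistically
# independent additive subcomponents.
ica = FastICA(n_components=3, random_state=0, max_iter=2000)
recovered = ica.fit_transform(observed.T).T              # (3, 512)
print("recovered component array shape:", recovered.shape)
```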

  5. Error Evaluation of Planetary Atmospheric Motion Vectors by Statistical Presumption Technique.

    NASA Astrophysics Data System (ADS)

    MURACHI, T.; IMAMURA, T.; HIGUCHI, T.; NAKAMURA, M.

    2002-05-01

    The error of atmospheric motion vectors (AMVs) depends largely on the image spatial resolution and on changes in cloud shape, rather than on the spatial and temporal fluctuation of the AMVs themselves. Changes in cloud shape prevent precise tracking of cloud motion, and the finite image resolution attaches an error of one pixel to the AMVs. In this study, an error of AMVs that reflects the influence of image spatial resolution and cloud-shape change is evaluated by a statistical presumption technique. To verify this method, test patterns were constructed and their AMVs and errors were calculated using the statistical error evaluation. The result is that the error of AMVs obtained by this method is nearly equivalent to the error expected from the change of cloud shape, and that it depends on the rate of change of cloud shape. In addition, using Venus cloud images, AMVs and their errors were calculated with both the statistical presumption technique and a previous error evaluation based on the standard deviation of neighboring AMVs; the error from the statistical presumption technique is smaller than the previous one. In this study, an error evaluation of AMVs by statistical presumption that reflects the influence of image spatial resolution and cloud-shape change is thus established.

  6. Single Molecule Techniques for Advanced in situ Hybridization

    SciTech Connect

    Hollars, C W; Stubbs, L; Carlson, K; Lu, X; Wehri, E

    2003-02-03

    One of the most significant achievements of modern science is the completion of the human genome sequence in the year 2000. Despite this monumental accomplishment, researchers have only begun to understand the relationships between this three-billion-nucleotide genetic code and the regulation and control of gene and protein expression within each of the millions of different types of highly specialized cells. Several methodologies have been developed for the analysis of gene and protein expression in situ, yet despite these advancements, the pace of such analyses is extremely limited. Because information regarding the precise timing and location of gene expression is a crucial component in the discovery of new pharmacological agents for the treatment of disease, there is an enormous incentive to develop technologies that accelerate the analytical process. Here we report on the use of plasmon resonant particles as advanced probes for in situ hybridization. These probes are used for the detection of low levels of gene-probe response and demonstrate a detection method that enables precise, simultaneous localization within a cell of the points of expression of multiple genes or proteins in a single sample.

  7. A Modified Moore Approach to Teaching Mathematical Statistics: An Inquiry Based Learning Technique to Teaching Mathematical Statistics

    ERIC Educational Resources Information Center

    McLoughlin, M. Padraig M. M.

    2008-01-01

    The author of this paper submits the thesis that learning requires doing; only through inquiry is learning achieved, and hence this paper proposes a programme of use of a modified Moore method in a Probability and Mathematical Statistics (PAMS) course sequence to teach students PAMS. Furthermore, the author of this paper opines that set theory…

  8. Advanced optical techniques for monitoring dosimetric parameters in photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Li, Buhong; Qiu, Zhihai; Huang, Zheng

    2012-12-01

    Photodynamic therapy (PDT) is based on the generation of highly reactive singlet oxygen through interactions of photosensitizer, light and molecular oxygen. PDT has become a clinically approved, minimally invasive therapeutic modality for a wide variety of malignant and nonmalignant diseases. The main dosimetric parameters for predicting the PDT efficacy include the delivered light dose, the quantification and photobleaching of the administered photosensitizer, the tissue oxygen concentration, the amount of singlet oxygen generated and the resulting biological responses. This review article presents the emerging optical techniques that are in use or under development for monitoring dosimetric parameters during PDT treatment. Moreover, the main challenges in developing real-time and noninvasive optical techniques for monitoring dosimetric parameters in PDT will be described.

  9. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from -125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to -170 °C. The specific heat tests were conducted into the fully molten region up to 370 °C.
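
    The final step noted above is a one-line calculation: thermal conductivity is the product of thermal diffusivity, density, and specific heat, lambda = alpha * rho * c_p. A tiny sketch with placeholder values (plausible orders of magnitude, not the paper's measured PTFE data):

```python
# Thermal conductivity from laser-flash diffusivity, DSC specific heat,
# and dilatometry-derived density: lambda = alpha * rho * c_p.
# The numbers below are illustrative placeholders, not measured values.
alpha = 1.1e-7      # thermal diffusivity, m^2/s
rho = 2160.0        # density, kg/m^3
c_p = 1050.0        # specific heat, J/(kg K)

thermal_conductivity = alpha * rho * c_p   # W/(m K)
print(f"lambda = {thermal_conductivity:.3f} W/(m K)")
```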

  10. Advance techniques for monitoring human tolerance to +Gz accelerations.

    NASA Technical Reports Server (NTRS)

    Pelligra, R.; Sandler, H.; Rositano, S.; Skrettingland, K.; Mancini, R.

    1972-01-01

    Standard techniques for monitoring the acceleration-stressed human subject have been augmented by measuring (1) temporal, brachial and/or radial arterial blood flow, and (2) indirect systolic and diastolic blood pressure at 60-sec intervals. Results show that the response of blood pressure to positive accelerations is complex and dependent on an interplay of hydrostatic forces, diminishing venous return, redistribution of blood, and other poorly defined compensatory reflexes.

  11. Advanced techniques for characterization of ion beam modified materials

    DOE PAGES

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; ...

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  12. Advanced techniques for characterization of ion beam modified materials

    SciTech Connect

    Zhang, Yanwen; Debelle, Aurélien; Boulle, Alexandre; Kluth, Patrick; Tuomisto, Filip

    2014-10-30

    Understanding the mechanisms of damage formation in materials irradiated with energetic ions is essential for the field of ion-beam materials modification and engineering. Utilizing incident ions, electrons, photons, and positrons, various analysis techniques, including Rutherford backscattering spectrometry (RBS), electron RBS, Raman spectroscopy, high-resolution X-ray diffraction, small-angle X-ray scattering, and positron annihilation spectroscopy, are routinely used or gaining increasing attention in characterizing ion beam modified materials. The distinctive information, recent developments, and some perspectives in these techniques are reviewed in this paper. Applications of these techniques are discussed to demonstrate their unique ability for studying ion-solid interactions and the corresponding radiation effects in modified depths ranging from a few nm to a few tens of μm, and to provide information on electronic and atomic structure of the materials, defect configuration and concentration, as well as phase stability, amorphization and recrystallization processes. Finally, such knowledge contributes to our fundamental understanding over a wide range of extreme conditions essential for enhancing material performance and also for design and synthesis of new materials to address a broad variety of future energy applications.

  13. Development of processing techniques for advanced thermal protection materials

    NASA Technical Reports Server (NTRS)

    Selvaduray, Guna S.

    1994-01-01

    The effort, which was focused on the research and development of advanced materials for use in Thermal Protection Systems (TPS), has involved chemical and physical testing of refractory ceramic tiles, fabrics, threads and fibers. This testing has included determination of the optical properties, thermal shock resistance, high temperature dimensional stability, and tolerance to environmental stresses. Materials have also been tested in the Arc Jet 2 x 9 Turbulent Duct Facility (TDF), the 1 atmosphere Radiant Heat Cycler, and the Mini-Wind Tunnel Facility (MWTF). A significant part of the effort hitherto has gone towards modifying and upgrading the test facilities so that meaningful tests can be carried out. Another important effort during this period has been the creation of a materials database. Computer systems administration and support have also been provided. These are described in greater detail below.

  14. Recent advances in bioprinting techniques: approaches, applications and future prospects.

    PubMed

    Li, Jipeng; Chen, Mingjiao; Fan, Xianqun; Zhou, Huifang

    2016-09-20

    Bioprinting technology shows potential in tissue engineering for the fabrication of scaffolds, cells, tissues and organs reproducibly and with high accuracy. Bioprinting technologies are mainly divided into three categories, inkjet-based bioprinting, pressure-assisted bioprinting and laser-assisted bioprinting, based on their underlying printing principles. These various printing technologies have their advantages and limitations. Bioprinting utilizes biomaterials, cells or cell factors as a "bioink" to fabricate prospective tissue structures. Biomaterial parameters such as biocompatibility, cell viability and the cellular microenvironment strongly influence the printed product. Various printing technologies have been investigated, and great progress has been made in printing various types of tissue, including vasculature, heart, bone, cartilage, skin and liver. This review introduces basic principles and key aspects of some frequently used printing technologies. We focus on recent advances in three-dimensional printing applications, current challenges and future directions.

  15. Advanced materials and techniques for fibre-optic sensing

    NASA Astrophysics Data System (ADS)

    Henderson, Philip J.

    2014-06-01

    Fibre-optic monitoring systems came of age in about 1999 upon the emergence of the world's first significant commercialising company - a spin-out from the UK's collaborative MAST project. By using embedded fibre-optic technology, the MAST project successfully measured transient strain within high-performance composite yacht masts. Since then, applications have extended from smart composites into civil engineering, energy, military, aerospace, medicine and other sectors. Fibre-optic sensors come in various forms, and may be subject to embedment, retrofitting, and remote interrogation. The unique challenges presented by each implementation require careful scrutiny before widespread adoption can take place. Accordingly, various aspects of design and reliability are discussed spanning a range of representative technologies that include resonant microsilicon structures, MEMS, Bragg gratings, advanced forms of spectroscopy, and modern trends in nanotechnology. Keywords: Fibre-optic sensors, fibre Bragg gratings, MEMS, MOEMS, nanotechnology, plasmon.

  16. Advanced Techniques for Constrained Internal Coordinate Molecular Dynamics

    PubMed Central

    Wagner, Jeffrey R.; Balaraman, Gouthaman S.; Niesen, Michiel J. M.; Larsen, Adrien B.; Jain, Abhinandan; Vaidehi, Nagarajan

    2013-01-01

    Internal coordinate molecular dynamics (ICMD) methods provide a more natural description of a protein by using bond, angle and torsional coordinates instead of a Cartesian coordinate representation. Freezing high frequency bonds and angles in the ICMD model gives rise to constrained ICMD (CICMD) models. There are several theoretical aspects that need to be developed in order to make the CICMD method robust and widely usable. In this paper we have designed a new framework for 1) initializing velocities for non-independent CICMD coordinates, 2) efficient computation of center of mass velocity during CICMD simulations, 3) using advanced integrators such as Runge-Kutta, Lobatto and adaptive CVODE for CICMD simulations, and 4) cancelling out the “flying ice cube effect” that sometimes arises in Nosé-Hoover dynamics. The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an implementation of a CICMD method that we have developed to study protein dynamics. GNEIMO allows for a hierarchy of coarse-grained simulation models based on the ability to rigidly constrain any group of atoms. In this paper, we perform tests on the Lobatto and Runge-Kutta integrators to determine optimal simulation parameters. We also implement an adaptive coarse graining tool using the GNEIMO Python interface. This tool enables the secondary structure-guided “freezing and thawing” of degrees of freedom in the molecule on the fly during MD simulations, and is shown to fold four proteins to their native topologies. With these advancements we envision the use of the GNEIMO method in protein structure prediction, structure refinement, and in studying domain motion. PMID:23345138
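
    As a point of reference for the integrator families named above, the sketch below implements a classical fourth-order Runge-Kutta step on a one-dimensional harmonic oscillator. This is a generic illustration of the method under assumed test dynamics; it says nothing about GNEIMO's internal API.

```python
# Generic sketch of one integrator class named above: a classical 4th-order
# Runge-Kutta step, demonstrated on a unit-mass harmonic oscillator.
import numpy as np

def rk4_step(f, t, y, h):
    # one RK4 step for dy/dt = f(t, y)
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)

# state y = (position, momentum); dq/dt = p, dp/dt = -q
f = lambda t, y: np.array([y[1], -y[0]])
y = np.array([1.0, 0.0])
for step in range(1000):
    y = rk4_step(f, step * 0.01, y, 0.01)
print("position at t = 10:", y[0], "(exact:", np.cos(10.0), ")")
```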

  17. Advanced techniques for constrained internal coordinate molecular dynamics.

    PubMed

    Wagner, Jeffrey R; Balaraman, Gouthaman S; Niesen, Michiel J M; Larsen, Adrien B; Jain, Abhinandan; Vaidehi, Nagarajan

    2013-04-30

    Internal coordinate molecular dynamics (ICMD) methods provide a more natural description of a protein by using bond, angle, and torsional coordinates instead of a Cartesian coordinate representation. Freezing high-frequency bonds and angles in the ICMD model gives rise to constrained ICMD (CICMD) models. There are several theoretical aspects that need to be developed to make the CICMD method robust and widely usable. In this article, we have designed a new framework for (1) initializing velocities for nonindependent CICMD coordinates, (2) efficient computation of center of mass velocity during CICMD simulations, (3) using advanced integrators such as Runge-Kutta, Lobatto, and adaptive CVODE for CICMD simulations, and (4) cancelling out the "flying ice cube effect" that sometimes arises in Nosé-Hoover dynamics. The Generalized Newton-Euler Inverse Mass Operator (GNEIMO) method is an implementation of a CICMD method that we have developed to study protein dynamics. GNEIMO allows for a hierarchy of coarse-grained simulation models based on the ability to rigidly constrain any group of atoms. In this article, we perform tests on the Lobatto and Runge-Kutta integrators to determine optimal simulation parameters. We also implement an adaptive coarse-graining tool using the GNEIMO Python interface. This tool enables the secondary structure-guided "freezing and thawing" of degrees of freedom in the molecule on the fly during molecular dynamics simulations and is shown to fold four proteins to their native topologies. With these advancements, we envision the use of the GNEIMO method in protein structure prediction, structure refinement, and in studying domain motion.

  18. Performance evaluation of a hybrid-passive landfill leachate treatment system using multivariate statistical techniques

    SciTech Connect

    Wallace, Jack; Champagne, Pascale; Monnier, Anne-Charlotte

    2015-01-15

    Highlights:
    • Performance of a hybrid passive landfill leachate treatment system was evaluated.
    • 33 water chemistry parameters were sampled for 21 months and statistically analyzed.
    • Parameters were strongly linked and explained most (>40%) of the variation in the data.
    • Alkalinity, ammonia, COD, heavy metals, and iron were criteria for performance.
    • Eight other parameters were key in modeling system dynamics and criteria.

    Abstract: A pilot-scale hybrid-passive treatment system operated at the Merrick Landfill in North Bay, Ontario, Canada, treats municipal landfill leachate and provides for subsequent natural attenuation. Collected leachate is directed to a hybrid-passive treatment system, followed by controlled release to a natural attenuation zone before entering the nearby Little Sturgeon River. The study presents a comprehensive evaluation of the performance of the system using multivariate statistical techniques to determine the interactions between parameters, major pollutants in the leachate, and the biological and chemical processes occurring in the system. Five parameters (ammonia, alkalinity, chemical oxygen demand (COD), “heavy” metals of interest, with atomic weights above calcium, and iron) were set as criteria for the evaluation of system performance based on their toxicity to aquatic ecosystems and importance in treatment with respect to discharge regulations. System data for a full range of water quality parameters over a 21-month period were analyzed using principal components analysis (PCA), as well as principal components (PC) and partial least squares (PLS) regressions. PCA indicated a high degree of association for most parameters with the first PC, which explained a high percentage (>40%) of the variation in the data, suggesting strong statistical relationships among most of the parameters in the system. Regression analyses identified 8 parameters (set as independent variables) that were most frequently retained for modeling…
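
    A minimal sketch of the statistical workflow this abstract describes: standardize the mixed-unit water-chemistry data, run PCA to find the dominant modes of variation, then fit a PLS regression against a performance criterion. The data, column counts, and response variable below are synthetic stand-ins, not the Merrick Landfill dataset.

```python
# Sketch of the multivariate workflow: PCA for dominant variation, then PLS
# regression of a treatment-performance criterion on the parameters.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 12))  # e.g. 90 sampling dates x 12 water-quality parameters
y = 0.8 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.3, size=90)  # performance criterion

Xs = StandardScaler().fit_transform(X)  # parameters carry different units

pca = PCA().fit(Xs)
print(f"variance explained by PC1: {100 * pca.explained_variance_ratio_[0]:.1f}%")

pls = PLSRegression(n_components=3).fit(Xs, y)
print(f"PLS regression R^2: {pls.score(Xs, y):.2f}")
```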

  19. Advances in statistical methods to map quantitative trait loci in outbred populations.

    PubMed

    Hoeschele, I; Uimari, P; Grignola, F E; Zhang, Q; Gage, K M

    1997-11-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown.
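
    A minimal sketch of the simplest method listed above, assuming synthetic data: multiple linear regression of phenotype on marker genotypes, with the null distribution of the test statistic obtained by data permutation as the abstract suggests. The marker coding and effect size are invented for illustration.

```python
# Regression of phenotype on marker genotypes plus a permutation test.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
markers = rng.integers(0, 3, size=(n, 5)).astype(float)  # genotypes coded 0/1/2 at 5 loci
phenotype = 0.6 * markers[:, 2] + rng.normal(size=n)     # a QTL linked to the third marker

obs_r2 = LinearRegression().fit(markers, phenotype).score(markers, phenotype)

# permutation test: shuffle phenotypes to build the null distribution of R^2
null_r2 = []
for _ in range(999):
    y_perm = rng.permutation(phenotype)
    null_r2.append(LinearRegression().fit(markers, y_perm).score(markers, y_perm))
p = (1 + sum(r >= obs_r2 for r in null_r2)) / (1 + len(null_r2))
print(f"R^2 = {obs_r2:.3f}, permutation p = {p:.3f}")
```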

  20. Advances in Statistical Methods to Map Quantitative Trait Loci in Outbred Populations

    PubMed Central

    Hoeschele, I.; Uimari, P.; Grignola, F. E.; Zhang, Q.; Gage, K. M.

    1997-01-01

    Statistical methods to map quantitative trait loci (QTL) in outbred populations are reviewed, extensions and applications to human and plant genetic data are indicated, and areas for further research are identified. Simple and computationally inexpensive methods include (multiple) linear regression of phenotype on marker genotypes and regression of squared phenotypic differences among relative pairs on estimated proportions of identity-by-descent at a locus. These methods are less suited for genetic parameter estimation in outbred populations but allow the determination of test statistic distributions via simulation or data permutation; however, further inferences including confidence intervals of QTL location require the use of Monte Carlo or bootstrap sampling techniques. A method which is intermediate in computational requirements is residual maximum likelihood (REML) with a covariance matrix of random QTL effects conditional on information from multiple linked markers. Testing for the number of QTLs on a chromosome is difficult in a classical framework. The computationally most demanding methods are maximum likelihood and Bayesian analysis, which take account of the distribution of multilocus marker-QTL genotypes on a pedigree and permit investigators to fit different models of variation at the QTL. The Bayesian analysis includes the number of QTLs on a chromosome as an unknown. PMID:9383084

  1. Statistical assessment of soil surface roughness for environmental applications using photogrammetric imaging techniques

    NASA Astrophysics Data System (ADS)

    Marzahn, Philip; Rieke-Zapp, Dirk; Ludwig, Ralf

    2010-05-01

    Micro-scale soil surface roughness is a crucial parameter in many environmental applications. Recent soil erosion studies have shown the impact of micro topography on soil erosion rates as well as on overland flow generation due to soil crusting effects. Besides the above, it is widely recognized that the backscattered signal in SAR remote sensing is strongly influenced by soil surface roughness and by regular higher-order tillage patterns. However, there is an ambiguity in the appropriate measurement technique and scale for roughness studies and SAR backscatter model parametrization. While different roughness indices depend on their measurement length, no satisfying roughness parametrization and measurement technique has been found yet, introducing large uncertainty in the interpretation of the radar backscatter. In the presented study, we computed high-resolution digital elevation models (DEM) using a consumer-grade digital camera in the frame of photogrammetric imaging techniques to represent soil micro topography for different soil surfaces (ploughed, harrowed, seedbed and crusted). The retrieved DEMs showed sufficient accuracy, with an RMSE of 1.64 mm compared to highly accurate reference points. For roughness characterization, we calculated different roughness indices (RMS height (s), autocorrelation length (l), tortuosity index (TB)). In an extensive statistical investigation we show the behaviour of the roughness indices for different acquisition sizes. Compared to results from profile measurements taken from the literature and profiles generated from the dataset, the results indicate that, by using a three-dimensional measuring device, the calculated roughness indices are more robust against outliers and saturate faster with increasing acquisition size. Depending on the roughness condition, the calculated values of the RMS height saturate at 2.3 m for ploughed fields, 2.0 m for harrowed fields and 1.2 m for crusted fields. Results also…
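
    To make the two most common indices concrete, the sketch below computes the RMS height s and the autocorrelation length l on a synthetic one-dimensional profile. It illustrates the definitions only; it is not the authors' DEM processing chain.

```python
# RMS height and autocorrelation length of a synthetic surface profile.
import numpy as np

rng = np.random.default_rng(2)
# white noise smoothed with a moving average to introduce spatial correlation
z = np.convolve(rng.normal(size=1224), np.ones(25) / 25, mode="valid")

s = np.sqrt(np.mean((z - z.mean()) ** 2))  # RMS height

zc = z - z.mean()
acf = np.correlate(zc, zc, mode="full")[zc.size - 1:]
acf /= acf[0]                        # normalized autocorrelation function
l = int(np.argmax(acf < 1 / np.e))   # first lag where the ACF drops below 1/e
print(f"RMS height s = {s:.3f} (profile units), autocorrelation length l = {l} samples")
```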

  2. Advances in parameter estimation techniques applied to flexible structures

    NASA Technical Reports Server (NTRS)

    Maben, Egbert; Zimmerman, David C.

    1994-01-01

    In this work, various parameter estimation techniques are investigated in the context of structural system identification utilizing distributed parameter models and 'measured' time-domain data. Distributed parameter models are formulated using the PDEMOD software developed by Taylor. Enhancements made to PDEMOD for this work include the following: (1) a Wittrick-Williams based root solving algorithm; (2) a time simulation capability; and (3) various parameter estimation algorithms. The parameter estimation schemes will be contrasted using the NASA Mini-Mast as the focus structure.

  3. Advances in dental veneers: materials, applications, and techniques

    PubMed Central

    Pini, Núbia Pavesi; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite; Lovadino, José Roberto; Terada, Raquel Sano Suga; Pascotto, Renata Corrêa

    2012-01-01

    Laminate veneers are a conservative treatment of unaesthetic anterior teeth. The continued development of dental ceramics offers clinicians many options for creating highly aesthetic and functional porcelain veneers. This evolution of materials, ceramics, and adhesive systems permits improvement of the aesthetic of the smile and the self-esteem of the patient. Clinicians should understand the latest ceramic materials in order to be able to recommend them and their applications and techniques, and to ensure the success of the clinical case. The current literature was reviewed to search for the most important parameters determining the long-term success, correct application, and clinical limitations of porcelain veneers. PMID:23674920

  4. Statistically Optimal Approximations of Astronomical Signals: Implications to Classification and Advanced Study of Variable Stars

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.; Chinarova, L. L.; Kudashkina, L. S.; Marsakova, V. I.; Tkachenko, M. G.

    2016-06-01

    We have elaborated a set of new algorithms and programs for advanced time series analysis of (generally) multi-component multi-channel observations with irregularly spaced times of observations, which is a common case for large photometric surveys. These methods for periodogram, scalegram, wavelet, and autocorrelation analysis, as well as for "running" or "sub-interval" local approximations, were previously self-reviewed in (2003ASPC..292..391A). For an approximation of the phase light curves of nearly-periodic pulsating stars, we use a Trigonometric Polynomial (TP) fit of the statistically optimal degree and initial period improvement using differential corrections (1994OAP.....7...49A). For the determination of parameters of "characteristic points" (minima, maxima, crossings of some constant value, etc.) we use a set of methods self-reviewed in 2005ASPC..335...37A. Results of the analysis of the catalogs compiled using these programs are presented in 2014AASP....4....3A. For more complicated signals, we use "phenomenological approximations" with "special shapes" based on functions defined on sub-intervals rather than on the complete interval. E.g., for the Algol-type stars we developed the NAV ("New Algol Variable") algorithm (2012Ap.....55..536A, 2012arXiv1212.6707A, 2015JASS...32..127A), which was compared to common methods of Trigonometric Polynomial fit (TP) or local Algebraic Polynomial (A) fit of a fixed or (alternately) statistically optimal degree. The method allows determining the minimal set of parameters required for the "General Catalogue of Variable Stars", as well as an extended set of phenomenological and astrophysical parameters which may be used for classification. In total, more than 1900 variable stars were studied by our group using these methods in the frame of the "Inter-Longitude Astronomy" campaign (2010OAP....23....8A) and the "Ukrainian Virtual Observatory" project (2012KPCB...28...85V).
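
    The core of the TP approach can be sketched as an ordinary least-squares fit of harmonics with the degree chosen by an information criterion. The BIC used below is one reasonable stand-in for the authors' statistical optimality test, and all data are synthetic.

```python
# Trigonometric polynomial fit of a phase light curve; the degree is picked
# by BIC as a simple proxy for "statistically optimal".
import numpy as np

rng = np.random.default_rng(3)
phase = rng.uniform(0, 1, 120)  # irregularly sampled phases
mag = (10 + 0.3 * np.sin(2 * np.pi * phase)
       + 0.1 * np.cos(4 * np.pi * phase)
       + rng.normal(scale=0.02, size=phase.size))

def tp_design(phi, degree):
    # design matrix: constant term plus `degree` harmonics
    cols = [np.ones_like(phi)]
    for k in range(1, degree + 1):
        cols += [np.cos(2 * np.pi * k * phi), np.sin(2 * np.pi * k * phi)]
    return np.column_stack(cols)

n, best = phase.size, None
for degree in range(1, 8):
    A = tp_design(phase, degree)
    coef, *_ = np.linalg.lstsq(A, mag, rcond=None)
    rss = np.sum((mag - A @ coef) ** 2)
    bic = n * np.log(rss / n) + (2 * degree + 1) * np.log(n)  # penalize extra harmonics
    if best is None or bic < best[0]:
        best = (bic, degree)
print("statistically optimal TP degree:", best[1])
```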

  5. Advanced terahertz techniques for quality control and counterfeit detection

    NASA Astrophysics Data System (ADS)

    Ahi, Kiarash; Anwar, Mehdi

    2016-04-01

    This paper reports our methods for the detection of counterfeit electronics. These versatile techniques are also useful in quality control applications. Terahertz pulsed laser systems are capable of giving the material characteristics and thus make it possible to distinguish between the materials used in authentic components and their counterfeit clones. Components with material defects can also be identified in this manner. In this work, different refractive indices and absorption coefficients were observed for counterfeit components compared to their authentic counterparts. The existence of unexpected ingredient materials was detected in counterfeit components by Fourier transform analysis of the transmitted terahertz pulse. Thicknesses of different layers are obtainable by analyzing the reflected terahertz pulse. The existence of unexpected layers is also detectable in this manner. Recycled, sanded and blacktopped counterfeit electronic components were detected as a result of these analyses. Counterfeit ICs with die dislocations were detected by depicting the terahertz raster-scanning data in a coordinate plane, which gives terahertz images. In the same manner, raster scanning of the reflected pulse gives terahertz images of the surfaces of the components, which were used to investigate contaminant materials and sanded points on the surfaces. The results of the latter technique reveal recycled counterfeit components.

  6. Advanced techniques in reliability model representation and solution

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Nicol, David M.

    1992-01-01

    The current tendency of flight control system designs is towards increased integration of applications and increased distribution of computational elements. The reliability analysis of such systems is difficult because subsystem interactions are increasingly interdependent. Researchers at NASA Langley Research Center have been working for several years to extend the capability of Markov modeling techniques to address these problems. This effort has been focused in the areas of increased model abstraction and increased computational capability. The reliability model generator (RMG) is a software tool that uses as input a graphical object-oriented block diagram of the system. RMG uses a failure-effects algorithm to produce the reliability model from the graphical description. The ASSURE software tool is a parallel processing program that uses the semi-Markov unreliability range evaluator (SURE) solution technique and the abstract semi-Markov specification interface to the SURE tool (ASSIST) modeling language. A failure modes-effects simulation is used by ASSURE. These tools were used to analyze a significant portion of a complex flight control system. The successful combination of the power of graphical representation, automated model generation, and parallel computation leads to the conclusion that distributed fault-tolerant system architectures can now be analyzed.
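
    The underlying computation such tools automate is the transient solution of a (semi-)Markov reliability model. As a toy illustration only, under assumed failure and reconfiguration rates, the sketch below solves a three-state Markov unreliability model with a matrix exponential; production tools like SURE/ASSURE handle far larger semi-Markov models.

```python
# Transient unreliability of a duplex system: states are
# 0 = both units good, 1 = one failed (recoverable), 2 = system failed.
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-4, 1e-2  # per-hour failure and reconfiguration rates (assumed)
Q = np.array([[-2 * lam, 2 * lam, 0.0],
              [mu, -(mu + lam), lam],
              [0.0, 0.0, 0.0]])  # generator matrix; state 2 is absorbing

p0 = np.array([1.0, 0.0, 0.0])
p_t = p0 @ expm(Q * 10.0)        # state probabilities after a 10-hour mission
print("unreliability at t = 10 h:", p_t[2])
```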

  7. Comparison of three advanced chromatographic techniques for cannabis identification.

    PubMed

    Debruyne, D; Albessard, F; Bigot, M C; Moulin, M

    1994-01-01

    The development of chromatography technology has made it easier, quicker, and more reliable to identify the cannabis samples that standard analytical laboratories are occasionally required to analyze in the effort to combat drug addiction. Key advances include the increasing availability of easier-to-use mass spectrometers combined with gas chromatography (GC), the use of diode-array or programmable variable-wavelength ultraviolet absorption detectors in conjunction with high-performance liquid chromatography (HPLC), and the availability of scanners capable of reading thin-layer chromatography (TLC) plates in the ultraviolet and visible regions. At laboratories that do not possess GC combined with mass spectrometry, which provides an irrefutable identification, the following procedure involving HPLC or TLC techniques may be used: identification of the chromatographic peaks corresponding to each of the three main cannabis constituents, cannabidiol (CBD), delta-9-tetrahydrocannabinol (delta-9-THC) and cannabinol (CBN), by comparison with published data in conjunction with a specific absorption spectrum for each of those constituents obtained between 200 and 300 nm. The collection of the fractions corresponding to the three major cannabinoids at the HPLC system outlet and the cross-checking of their identity by GC with flame ionization detection can further corroborate the identification and minimize possible errors due to interference.

  8. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

    November 2008 will be a few months after the official start of the LHC, when the highest collision energy ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. The LHC will open a new era in physics research and push the frontier of knowledge further. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made the whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation and theoretical interpretation are all computing-based HEP activities, but they also occur in many other research fields. Computing is everywhere and forms the common link between all the scientists and engineers involved. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has covered the tremendous evolution of computing in its most advanced topics, trying to build bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  9. Coal and Coal Constituent Studies by Advanced EMR Techniques

    SciTech Connect

    Alex I. Smirnov; Mark J. Nilges; R. Linn Belford; Robert B. Clarkson

    1998-03-31

    Advanced electronic magnetic resonance (EMR) methods are used to examine properties of coals, chars, and molecular species related to constituents of coal. We have achieved substantial progress on upgrading the high field (HF) EMR (W-band, 95 GHz) spectrometers that are especially advantageous for such studies. Particularly, we have built a new second W-band instrument (Mark II) in addition to our Mark I. Briefly, Mark II features: (i) an Oxford custom-built 7 T superconducting magnet which is scannable from 0 to 7 T at up to 0.5 T/min; (ii) water-cooled coaxial solenoid with up to ±550 G scan under digital (15 bits resolution) computer control; (iii) custom-engineered precision feed-back circuit, which is used to drive this solenoid, is based on an Ultrastab 860R sensor that has linearity better than 5 ppm and resolution of 0.05 ppm; (iv) an Oxford CF 1200 cryostat for variable temperature studies from 1.8 to 340 K. During this grant period we have completed several key upgrades of both Mark I and II, particularly microwave bridge, W-band probehead, and computer interfaces. We utilize these improved instruments for HF EMR studies of spin-spin interaction and existence of different paramagnetic species in carbonaceous solids.

  10. Evaluating machine learning and statistical prediction techniques for landslide susceptibility modeling

    NASA Astrophysics Data System (ADS)

    Goetz, J. N.; Brenning, A.; Petschko, H.; Leopold, P.

    2015-08-01

    Statistical and, more recently, machine learning prediction methods have been gaining popularity in the field of landslide susceptibility modeling. These data-driven approaches show particular promise when tackling the challenge of mapping landslide-prone areas for large regions, which may not have sufficient geotechnical data for physically-based methods. Currently, there is no single best method for empirical susceptibility modeling. Therefore, this study presents a comparison of traditional statistical and novel machine learning models applied to regional-scale landslide susceptibility modeling. These methods were evaluated by spatial k-fold cross-validation estimation of the predictive performance, by assessment of variable importance for gaining insights into model behavior, and by the appearance of the prediction (i.e. susceptibility) map. The modeling techniques applied were logistic regression (GLM), generalized additive models (GAM), weights of evidence (WOE), the support vector machine (SVM), random forest classification (RF), and bootstrap-aggregated classification trees (bundling) with penalized discriminant analysis (BPLDA). These modeling methods were tested for three areas in the province of Lower Austria, Austria, characterized by different geological and morphological settings. Random forest and bundling classification techniques had the best overall predictive performances. However, the performances of all modeling techniques were for the most part not significantly different from each other; depending on the area of interest, the overall median estimated area under the receiver operating characteristic curve (AUROC) differences ranged from 2.9 to 8.9 percentage points, and the overall median estimated true positive rate (TPR) differences, measured at a 10% false positive rate (FPR), ranged from 11 to 15 percentage points. The relative importance of each predictor was generally different between the modeling methods. However, slope angle, surface roughness and plan…
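
    The comparison protocol can be sketched in a few lines: estimate AUROC for a statistical model (logistic regression, the GLM above) and a machine learning model (random forest) by k-fold cross-validation. The study used spatial cross-validation partitions; the plain random folds and synthetic data below are simplifications to keep the sketch short.

```python
# Cross-validated AUROC comparison of a GLM and a random forest.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# synthetic presence/absence data standing in for landslide inventory points
X, y = make_classification(n_samples=500, n_features=8, random_state=4)

models = [("GLM", LogisticRegression(max_iter=1000)),
          ("RF", RandomForestClassifier(n_estimators=200, random_state=4))]
for name, model in models:
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: median AUROC = {np.median(auc):.3f}")
```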

  11. Preparing High School Students for Success in Advanced Placement Statistics: An Investigation of Pedagogies and Strategies Used in an Online Advanced Placement Statistics Course

    ERIC Educational Resources Information Center

    Potter, James Thomson, III

    2012-01-01

    Research into teaching practices and strategies has been performed separately in AP Statistics and in K-12 online learning (Garfield, 2002; Ferdig, DiPietro, Black & Dawson, 2009). This study seeks to combine the two and to build on the need for more investigation into online teaching and learning in specific content (Ferdig et al, 2009; DiPietro,…

  12. Bioactive glass thin films synthesized by advanced pulsed laser techniques

    NASA Astrophysics Data System (ADS)

    Mihailescu, N.; Stan, George E.; Ristoscu, C.; Sopronyi, M.; Mihailescu, Ion N.

    2016-10-01

    Bioactive materials play an increasingly important role in the biomaterials industry, and are extensively used in a range of applications, including biodegradable metallic implants. We report on the deposition of bioactive glass (BG) films onto biodegradable substrates by pulsed laser techniques. The BG coatings were obtained using a KrF* excimer laser source (λ = 248 nm, τFWHM ≤ 25 ns). Their thickness has been determined by profilometry measurements, whilst their morphology has been analysed by Scanning Electron Microscopy (SEM). The obtained coatings fairly preserved the targets' composition and structure, as revealed by Energy Dispersive X-Ray Spectroscopy, Grazing Incidence X-Ray Diffraction, and Fourier Transform Infra-Red Spectroscopy analyses.

  13. Advanced Techniques in Musculoskeletal Oncology: Perfusion, Diffusion, and Spectroscopy.

    PubMed

    Teixeira, Pedro A Gondim; Beaumont, Marine; Gabriela, Hossu; Bailiang, Chen; Verhaeghe, Jean-luc; Sirveaux, François; Blum, Alain

    2015-12-01

    The imaging characterization of musculoskeletal tumors can be challenging, and a significant number of lesions remain indeterminate when conventional imaging protocols are used. In recent years, clinical availability of functional imaging methods has increased. Functional imaging has the potential to improve tumor detection, characterization, and follow-up. The most frequently used functional methods are perfusion imaging, diffusion-weighted imaging (DWI), and MR proton spectroscopy (MRS). Each of these techniques has specific protocol requirements and diagnostic pitfalls that need to be acknowledged to avoid misdiagnoses. Additionally, the application of functional methods in the MSK system has various technical issues that need to be addressed to ensure data quality and comparability. In this article, the application of contrast-enhanced perfusion imaging, DWI, and MRS for the evaluation of bone and soft tissue tumors is discussed, with emphasis on acquisition protocols, technical difficulties, and current clinical indications.

  14. Advanced fabrication techniques for hydrogen-cooled engine structures

    NASA Technical Reports Server (NTRS)

    Buchmann, O. A.; Arefian, V. V.; Warren, H. A.; Vuigner, A. A.; Pohlman, M. J.

    1985-01-01

    Described is a program for development of coolant passage geometries, material systems, and joining processes that will produce long-life hydrogen-cooled structures for scramjet applications. Tests were performed to establish basic material properties, and samples constructed and evaluated to substantiate fabrication processes and inspection techniques. Results of the study show that the basic goal of increasing the life of hydrogen-cooled structures two orders of magnitude relative to that of the Hypersonic Research Engine can be reached with available means. Estimated life is 19000 cycles for the channels and 16000 cycles for pin-fin coolant passage configurations using Nickel 201. Additional research is required to establish the fatigue characteristics of dissimilar-metal coolant passages (Nickel 201/Inconel 718) and to investigate the embrittling effects of the hydrogen coolant.

  15. Advances in techniques for assessment of microalgal lipids.

    PubMed

    Challagulla, Vineela; Nayar, Sasi; Walsh, Kerry; Fabbro, Larelle

    2016-07-15

    Microalgae are a varied group of organisms with considerable commercial potential as sources of various biochemicals, storage molecules and metabolites such as lipids, sugars, amino acids, pigments and toxins. Algal lipids can be processed to bio-oils and biodiesel. The conventional method to estimate algal lipids is based on extraction using solvents and quantification by gravimetry or chromatography. Such methods are time consuming, use hazardous chemicals and are labor intensive. For rapid screening of prospective algae or for management decisions (e.g. decision on timing of harvest), a rapid, high throughput, reliable, accurate, cost effective and preferably nondestructive analytical technique is desirable. This manuscript reviews the application of fluorescent lipid soluble dyes (Nile Red and BODIPY 505/515), nuclear magnetic resonance (NMR), Raman, Fourier transform infrared (FTIR) and near infrared (NIR) spectroscopy for the assessment of lipids in microalgae.

  16. Assessing the performance of dynamical and statistical downscaling techniques to simulate crop yield in West Africa

    NASA Astrophysics Data System (ADS)

    Sultan, B.; Oettli, P.; Vrac, M.; Baron, C.

    2010-12-01

    Global circulation models (GCM) are increasingly capable of making relevant predictions of seasonal and long-term climate variability, thus improving prospects of predicting impacts on crop yields. This is particularly important for semi-arid West Africa, where climate variability and drought threaten food security. Translating GCM outputs into attainable crop yields is difficult because GCM grid boxes are of larger scale than the processes governing yield, which involve the partitioning of rain among runoff, evaporation, transpiration, drainage and storage at plot scale. It therefore requires the use of downscaling methods. This study analyzes the performance of both dynamical and statistical downscaling techniques in simulating crop yield at local scale. A detailed case study is conducted using historical weather data for Senegal, applied to the crop model SARRAH for simulating several tropical cereals (sorghum, millet, maize) at local scale. This control simulation is used as a benchmark to evaluate a set of Regional Climate Model (RCM) simulations, forced by ERA-Interim, from the ENSEMBLES project, and a statistical downscaling method, the CDF-transform, used to correct biases in RCM outputs. We first evaluate each climate variable that drives the simulated yield in the control simulation (radiation, rainfall, temperatures). We then simulate crop yields with RCM outputs (with or without applying the CDF-transform) and evaluate the performance of each RCM with regard to crop yield simulations.
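
    The bias-correction idea behind the CDF-transform can be sketched as quantile mapping: pass each model value through the model's empirical CDF, then invert the observed CDF at the same probability. The static mapping below is only the simplest variant (a full CDF-t also models how the CDF evolves between periods), and the rainfall distributions are synthetic.

```python
# Empirical quantile mapping of biased "model" rainfall onto "observations".
import numpy as np

rng = np.random.default_rng(5)
obs = rng.gamma(shape=2.0, scale=5.0, size=3000)        # "observed" daily rainfall
rcm = rng.gamma(shape=2.0, scale=7.0, size=3000) + 1.0  # biased model rainfall

def quantile_map(x, model_ref, obs_ref):
    # model CDF value of each x, then the observed quantile at that probability
    probs = np.searchsorted(np.sort(model_ref), x) / model_ref.size
    return np.quantile(obs_ref, np.clip(probs, 0.0, 1.0))

corrected = quantile_map(rcm, rcm, obs)
print(f"mean bias before: {rcm.mean() - obs.mean():.2f}, "
      f"after: {corrected.mean() - obs.mean():.2f}")
```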

  17. Comparative evaluation of pattern recognition algorithms: statistical, neural, fuzzy, and neuro-fuzzy techniques

    NASA Astrophysics Data System (ADS)

    Mitra, Sunanda; Castellanos, Ramiro

    1998-10-01

    Pattern recognition by fuzzy, neural, and neuro-fuzzy approaches has gained popularity partly because of the intelligent decision processes involved in some of these techniques, which provide better classification, and partly because of the simplicity of computation required by these methods as opposed to traditional statistical approaches for complex data structures. However, the accuracy of pattern classification by the various methods is often not considered. This paper considers the performance of major fuzzy, neural, and neuro-fuzzy pattern recognition algorithms and compares their performance with common statistical methods on the same data sets. For the specific data sets chosen, namely the Iris data set and the small Soybean data set, two neuro-fuzzy algorithms, AFLC and IAFC, outperform other well-known fuzzy, neural, and neuro-fuzzy algorithms in minimizing the classification error and equal the performance of Bayesian classification. AFLC and IAFC also demonstrate excellent learning vector quantization capability in generating optimal code books for coding and decoding of large color images at very low bit rates with exceptionally high visual fidelity.
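
    The Bayesian baseline mentioned above is easy to reproduce on the same Iris data set. The sketch below uses scikit-learn's Gaussian naive Bayes as an approximate stand-in for the Bayes classifier; the AFLC and IAFC algorithms have no widely packaged reference implementation, so they are not reproduced here.

```python
# Cross-validated Bayesian baseline on the Iris data set.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
acc = cross_val_score(GaussianNB(), X, y, cv=5)
print(f"Bayesian baseline accuracy on Iris: {acc.mean():.1%}")
```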

  18. An Analysis of Research Methods and Statistical Techniques Used by Doctoral Dissertation at the Education Sciences in Turkey

    ERIC Educational Resources Information Center

    Karadag, Engin

    2010-01-01

    To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…

  19. Identification of fungal phytopathogens using Fourier transform infrared-attenuated total reflection spectroscopy and advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Salman, Ahmad; Lapidot, Itshak; Pomerantz, Ami; Tsror, Leah; Shufan, Elad; Moreh, Raymond; Mordechai, Shaul; Huleihel, Mahmoud

    2012-01-01

    The early diagnosis of phytopathogens is of great importance; it could save large economic losses due to crops damaged by fungal diseases, and prevent unnecessary soil fumigation or the use of fungicides and bactericides, thus preventing considerable environmental pollution. In this study, 18 isolates of three different fungal genera were investigated: six isolates of Colletotrichum coccodes, six isolates of Verticillium dahliae and six isolates of Fusarium oxysporum. Our main goal was to differentiate these fungal samples at the level of isolates, based on their infrared absorption spectra obtained using the Fourier transform infrared-attenuated total reflection (FTIR-ATR) sampling technique. Advanced statistical and mathematical methods, namely principal component analysis (PCA), linear discriminant analysis (LDA), and k-means, were applied to the spectra after manipulation. Our results showed significant spectral differences between the various fungal genera examined. The use of k-means enabled classification between the genera with 94.5% accuracy, whereas the use of PCA [3 principal components (PCs)] and LDA achieved a 99.7% success rate. However, at the level of isolates, the best differentiation results were obtained using PCA (9 PCs) and LDA for the lower wavenumber region (800-1775 cm-1), with identification success rates of 87%, 85.5%, and 94.5% for Colletotrichum, Fusarium, and Verticillium strains, respectively.
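
    A minimal sketch of the PCA + LDA classification step, assuming synthetic spectra: the 9-component choice mirrors the isolate-level analysis reported above, but the data generation and cross-validation setup are illustrative assumptions.

```python
# PCA dimensionality reduction followed by LDA classification of "spectra".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
n_per, n_wavenumbers, n_classes = 40, 500, 3
# each "genus" gets a slightly shifted mean spectrum plus noise
X = np.vstack([rng.normal(loc=c * 0.15, size=(n_per, n_wavenumbers))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per)

clf = make_pipeline(PCA(n_components=9), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {acc.mean():.1%}")
```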

  20. Advanced Infusion Techniques with 3-D Printed Tooling

    SciTech Connect

    Nuttall, David; Elliott, Amy; Post, Brian K.; Love, Lonnie J.

    2016-05-10

    The manufacturing of tooling for large, contoured surfaces for fiber-layup applications requires significant effort to understand the geometry and then to subtractively manufacture the tool. Traditional methods in the auto industry use clay that is hand sculpted. In the marine pleasure craft industry, the exterior of the model is formed from a foam lay-up that is either hand cut or machined to create smooth lines. Engineers and researchers at Oak Ridge National Laboratory's Manufacturing Demonstration Facility (ORNL MDF) collaborated with Magnum Venus Products (MVP) in the development of a process for reproducing legacy whitewater adventure craft via digital scanning and large-scale 3-D printed layup molds. The process entailed 3-D scanning a legacy canoe form, converting that form to a CAD model, additively manufacturing (3-D printing) the mold tool, and subtractively finishing the mold's transfer surfaces. Future work will include applying a gelcoat to the mold transfer surface and infusing, using vacuum-assisted resin transfer molding (VARTM) principles, to create a watertight vessel. The outlined steps were performed on a specific canoe geometry found by MVP's principal participant. The intent of utilizing this geometry is to develop an energy-efficient and marketable process for replicating complex shapes, specifically focusing on this particular watercraft, and to provide a finished product for demonstration to the composites industry. The culminating part produced through this agreement has been slated for public presentation and potential demonstration at the 2016 CAMX (Composites and Advanced Materials eXpo) exposition in Anaheim, CA. Phase I of this collaborative research and development agreement (MDF-15-68) was conducted under CRADA NFE-15-05575 and was initiated on May 7, 2015, with an introduction to the MVP product line, and concluded in March of 2016 with the printing and processing of a canoe mold. The project partner Magnum Venus Products (MVP) is…

  1. Indonesia: statistical sampling technique in validation of radiation sterilisation dose of biological tissue.

    PubMed

    Hilmy, N; Basril, A; Febrida, A

    2003-01-01

    The aim of this work is to find the best statistical sampling technique for the validation of the radiation sterilization dose (RSD) for biological tissues, according to the ISO standard. The sampling model comprises biological tissues retrieved from one cadaver donor, consisting of frozen bone grafts (18 packets), lyophilized allografts (68 packets) and demineralized bone powder grafts (40 packets). The size and type of products vary from long bones and cancellous chips to bone powders, tendons and fascia lata, so the bioburden per product cannot be treated equally. Frozen samples cannot be considered part of the same production batch as lyophilized samples, because of different processing and irradiation temperatures. The minimum number of uniform samples needed for validation per production batch size, according to ISO 13409, is from 20 to 79, of which 20 are used for the test sample size, i.e. 10 for bioburden determination and the remaining 10 for the verification dose. Based on the number of uniform grafts, statistical sampling can be carried out on lyophilized and demineralized bone grafts, but not on frozen bone grafts. Bioburden determinations were carried out and validated according to ISO 11737-1. The average bioburden (cfu per packet), using a sample item portion (SIP) of 1, was 5 cfu/packet for lyophilized bone grafts and 4 cfu/packet for demineralized bone powder grafts. The verification doses obtained were 2.40 kGy for lyophilized grafts and 2.24 kGy for demineralized bone powder grafts. The results of the verification dose were accepted and the RSD of 25 kGy is substantiated. It can be concluded that a statistical sampling technique can be applied if all the grafts produced by the same process, whether lyophilized, demineralized or frozen, are assumed to be in one production batch regardless of sample uniformity in size, type and weight; for this, ISO 13409 can be applied for the validation of the RSD.

  2. Recent advances in techniques for tsetse-fly control*

    PubMed Central

    MacLennan, K. J. R.

    1967-01-01

    With the advent of modern persistent insecticides, it has become possible to utilize some of the knowledge that has accumulated on the ecology and bionomics of Glossina and to devise more effective techniques for the control and eventual extermination of these species. The present article, based on experience of the tsetse fly problem in Northern Nigeria, points out that the disadvantages of control techniques—heavy expenditure of money and manpower and undue damage to the biosystem—can now largely be overcome by basing the application of insecticides on knowledge of the habits of the particular species of Glossina in a particular environment. Two factors are essential to the success of a control project: the proper selection of sites for spraying (the concept of restricted application) and the degree of persistence of the insecticide used. Reinfestation from within or outside the project area must also be taken into account. These and other aspects are discussed in relation to experience gained from a successful extermination project carried out in the Sudan vegetation zone and from present control activities in the Northern Guinea vegetation zone. PMID:5301739

  3. Advances in Current Rating Techniques for Flexible Printed Circuits

    NASA Technical Reports Server (NTRS)

    Hayes, Ron

    2014-01-01

    Twist capsule assemblies are power transfer devices commonly used in spacecraft mechanisms that require electrical signals to be passed across a rotating interface. Flexible printed circuits (flex tapes, see Figure 2) are used to carry the electrical signals in these devices. Determining the current rating for a given trace (conductor) size can be challenging. Because of the thermal conditions present in this environment, the most appropriate approach is to assume that the only means by which heat is removed from the trace is through the conductor itself, so that when the flex tape is long the temperature rise in the trace can be extreme. While this technique represents a worst-case thermal situation that yields conservative current ratings, this conservatism may lead to overly cautious designs when not all traces are used at their full rated capacity. A better understanding of how individual traces behave when they are not all in use is the goal of this research. In the testing done in support of this paper, a representative flex tape used for a flight Solar Array Drive Assembly (SADA) application was tested by energizing individual traces (conductors in the tape) in a vacuum chamber, and the temperatures of the tape were measured using both fine-gauge thermocouples and infrared thermographic imaging. We find that traditional derating schemes used for bundles of wires do not apply for the configuration tested. We also determine that single active traces located in the center of a flex tape operate at lower temperatures than those on the outside edges.
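
    The worst-case assumption described above (heat leaves the trace only by conduction along the conductor to its ends) leads to a closed-form estimate: with uniform Joule heating q' = I²ρe/A per unit length and both ends held at a fixed temperature, the peak rise at mid-length is ΔT = q'L²/(8kA). The sketch below works through that estimate; every dimension and current in it is an assumed, illustrative value, not the tested SADA tape.

```python
# Worst-case temperature rise of a flex-tape trace cooled only by
# conduction along its own copper cross-section.
I = 1.0                # trace current, A (assumed)
rho_e = 1.7e-8         # copper electrical resistivity, ohm m
k = 400.0              # copper thermal conductivity, W/(m K)
w, t = 0.5e-3, 35e-6   # trace width and thickness, m (assumed)
L = 0.1                # heated trace length between fixed-temperature ends, m (assumed)

A = w * t                            # conductor cross-section, m^2
q_prime = I ** 2 * rho_e / A         # Joule heat per unit length, W/m
dT = q_prime * L ** 2 / (8 * k * A)  # peak rise at mid-length
print(f"peak temperature rise = {dT:.0f} K")  # large even for modest currents
```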

  4. Advanced Manufacturing Techniques Demonstrated for Fabricating Developmental Hardware

    NASA Technical Reports Server (NTRS)

    Redding, Chip

    2004-01-01

    NASA Glenn Research Center's Engineering Development Division has been working in support of innovative gas turbine engine systems under development by Glenn's Combustion Branch. These one-of-a-kind components require operation under extreme conditions, so high-temperature ceramics were chosen for fabrication because of the hostile operating environment. During the design process, it became apparent that traditional machining techniques would not be adequate to produce the small, intricate features of the conceptual design, which was to be produced by stacking over a dozen thin layers with many small features that would then be aligned and bonded together into a one-piece unit. Instead of using traditional machining, we produced computer models in Pro/ENGINEER (Parametric Technology Corporation (PTC), Needham, MA) to the specifications of the research engineer. The computer models were exported in stereolithography standard (STL) format and used to produce full-size rapid prototype polymer models. These semi-opaque plastic models were used for visualization and design verification. The computer models also were exported in International Graphics Exchange Specification (IGES) format and sent to Glenn's Thermal/Fluids Design & Analysis Branch and Applied Structural Mechanics Branch for profiling heat transfer and mechanical strength analysis.

  5. Simulation of an advanced techniques of ion propulsion Rocket system

    NASA Astrophysics Data System (ADS)

    Bakkiyaraj, R.

    2016-07-01

    The ion propulsion rocket (IPR) system is expected to become popular with the development of deuterium and argon gas techniques and of hexagonal-shape magnetohydrodynamic (MHD) generators, because power is generated indirectly from the ionization chamber; the design thrust is 1.2 N at 40 kW of electric power with high efficiency. The proposed work studies MHD power generation through the ionization of deuterium gas and the combination of two gaseous ion species (deuterium ions and argon ions) at the acceleration stage. The IPR consists of three parts: (1) a hexagonal-shape MHD-based power generator fed by the ionization chamber, (2) an ion accelerator, and (3) an exhaust nozzle. Initially, an energy of around 1312 kJ/mol is required to bring the deuterium gas to the ionization level. The ionized deuterium gas passes with enhanced velocity from the RF ionization chamber to the nozzle through the MHD generator, where a voltage is generated across the two pairs of electrodes; thrust is then produced by mixing deuterium and argon ions at the acceleration stage. The simulation of the IPR system has been carried out in MATLAB. Comparison of the simulation results with theoretical and previous results shows that the proposed method achieves the design thrust at 40 kW power for the simulated IPR system.

  6. Advances in low energy neutral atom imaging techniques

    SciTech Connect

    Scime, E.E.; Funsten, H.O.; McComas, D.J.; Moore, K.R. ); Gruntman, M. . Space Sciences Center)

    1993-01-01

    Recently proposed low energy neutral atom (LENA) imaging techniques use a collisional process to convert the low energy neutrals into ions before detection. At low energies, collisional processes limit the angular resolution and conversion efficiencies of these devices. However, if the intense ultraviolet light background can be suppressed, direct LENA detection is possible. We present results from a series of experiments designed to develop a novel filtering structure based on free-standing transmission gratings. If the grating period is sufficiently small, free-standing transmission gratings can be employed to substantially polarize ultraviolet (UV) light in the wavelength range 300 Å to 1500 Å. If a second grating is placed behind the first with its axis of polarization oriented at a right angle to the first's, a substantial attenuation of UV radiation is achievable. The neutrals will pass through the remaining open area of the two gratings and be detected without UV background complications. We have obtained nominal 2000 Å period (1000 Å bars with 1000 Å slits) free-standing, gold transmission gratings and measured their UV and atomic transmission characteristics. The geometric factor of a LENA imager based on this technology is comparable to that of other proposed LENA imagers. In addition, this type of imager does not distort the neutral trajectories, allowing for high angular resolution.

  7. Advanced signal processing technique for damage detection in steel tubes

    NASA Astrophysics Data System (ADS)

    Amjad, Umar; Yadav, Susheel Kumar; Dao, Cac Minh; Dao, Kiet; Kundu, Tribikram

    2016-04-01

    In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (lead zirconate titanate) transducers in either transmission or reflection mode. In this study, guided waves are excited and detected in the transmission mode, and the phase changes of the propagating wave modes are recorded. In most other studies reported in the literature, the change in received signal strength (amplitude) is investigated with varying degrees of damage, while in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used for extracting phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase, while it can affect the strength of the recorded signal. Therefore, if the specimen is not damaged but the transducer-specimen bonding has deteriorated, the received signal strength is altered but the phase remains the same, and thus false positive predictions of damage can be avoided.
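
    One common way to extract the phase feature discussed above is through the analytic signal. The sketch below recovers the instantaneous phase with a Hilbert transform and compares a baseline to a "damaged" signal modeled as a pure phase delay; this is an assumed illustration of the feature-extraction step, not the authors' processing chain.

```python
# Instantaneous phase comparison between baseline and "damaged" signals.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 1e6, 50e3                          # sampling and excitation frequency (assumed)
t = np.arange(0, 2e-3, 1 / fs)
baseline = np.sin(2 * np.pi * f0 * t)
damaged = np.sin(2 * np.pi * f0 * t + 0.3)  # damage modeled as a 0.3 rad phase delay

def inst_phase(x):
    # instantaneous phase from the analytic (Hilbert) signal
    return np.unwrap(np.angle(hilbert(x)))

dphi = np.mean(inst_phase(damaged) - inst_phase(baseline))
print(f"mean phase shift: {dphi:.2f} rad")
```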

  8. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  9. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss "small-group apprenticeships (SGAs)" as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments…

  10. Advanced Techniques for Reservoir Simulation and Modeling of Non-Conventional Wells

    SciTech Connect

    Durlofsky, Louis J.

    2000-08-28

    This project targets the development of (1) advanced reservoir simulation techniques for modeling non-conventional wells; (2) improved techniques for computing well productivity (for use in reservoir engineering calculations) and well index (for use in simulation models), including the effects of wellbore flow; and (3) accurate approaches to account for heterogeneity in the near-well region.

  11. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research.

    ERIC Educational Resources Information Center

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    Discusses small-group apprenticeships (SGAs) as a method for introducing cell culture techniques to high school participants. Teaches cell culture practices and introduces advanced imaging techniques to solve various biomedical engineering problems. Clarifies and illuminates the value of small-group laboratory apprenticeships. (Author/KHR)

  12. The updated statistical inversion technique to the evaluation of Umkehr observations

    NASA Technical Reports Server (NTRS)

    Frolov, Alexander D.; Obrazcov, Sergey P.

    1994-01-01

    In the present study the standard Umkehr retrieval method for estimating the vertical distribution of ozone was updated using a statistical approach to the mathematical inversion scheme. The vertical ozone profile covariance matrix was used as a priori information for the inverse problem. A new method of organizing the ozonesonde data according to air mass types helped to improve the covariance matrix quality. A retrieval method was developed using an eigenvector technique. An optimal vertical ozone profile resolution was determined from an analysis of the mathematical inversion scheme based on the same technique. The solar radiation transfer calculation accounted for multiple scattering and atmospheric sphericity. Retrievals using actual Umkehr Dobson spectrophotometer observations were also performed to compare the standard and updated methods against concurrent ozonesonde data at Boulder, USA. The comparison revealed that the present method has some advantages in both resolution and accuracy over the standard one, especially for the atmospheric layers below the ozone maximum.

  13. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs

    PubMed Central

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2015-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012. PMID:26347393
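
    The Hotelling-observer approach mentioned in this record reduces, in its simplest linear form, to a matched filter prewhitened by the data covariance. The sketch below applies it to synthetic image vectors and summarizes detectability with an empirical area under the ROC curve; the pixel count, planet signal, and noise model are assumptions for illustration, not the coronagraph pipeline.

      import numpy as np

      rng = np.random.default_rng(9)
      npix, ntrain = 64, 500

      # Synthetic training sets: speckle background vs. background + faint planet
      signal = np.zeros(npix)
      signal[30] = 0.5                                   # assumed planet signal
      bg = rng.normal(0.0, 1.0, size=(ntrain, npix))
      sp = rng.normal(0.0, 1.0, size=(ntrain, npix)) + signal

      # Hotelling observer: w = S^-1 (mean_signal - mean_background)
      S = np.cov(np.vstack([bg, sp]).T)                  # pooled data covariance
      w = np.linalg.solve(S + 1e-6 * np.eye(npix), sp.mean(0) - bg.mean(0))

      # Scalar test statistic t = w . g; sweeping a threshold over t traces the ROC
      t_bg, t_sp = bg @ w, sp @ w
      auc = (t_sp[:, None] > t_bg[None, :]).mean()       # empirical area under ROC
      print(f"empirical AUC = {auc:.3f}")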

  14. On Advanced Estimation Techniques for Exoplanet Detection and Characterization Using Ground-based Coronagraphs.

    PubMed

    Lawson, Peter R; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  15. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-Based Coronagraphs

    NASA Technical Reports Server (NTRS)

    Lawson, Peter R.; Frazin, Richard; Barrett, Harrison; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gladysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jerome; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Perrin, Marshall; Poyneer, Lisa; Pueyo, Laurent; Savransky, Dmitry; Soummer, Remi

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We provide a formal comparison of techniques through a blind data challenge and evaluate performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  16. On Advanced Estimation Techniques for Exoplanet Detection and Characterization using Ground-based Coronagraphs

    NASA Technical Reports Server (NTRS)

    Lawson, Peter; Frazin, Richard

    2012-01-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  17. On advanced estimation techniques for exoplanet detection and characterization using ground-based coronagraphs

    NASA Astrophysics Data System (ADS)

    Lawson, Peter R.; Poyneer, Lisa; Barrett, Harrison; Frazin, Richard; Caucci, Luca; Devaney, Nicholas; Furenlid, Lars; Gładysz, Szymon; Guyon, Olivier; Krist, John; Maire, Jérôme; Marois, Christian; Mawet, Dimitri; Mouillet, David; Mugnier, Laurent; Pearson, Iain; Perrin, Marshall; Pueyo, Laurent; Savransky, Dmitry

    2012-07-01

    The direct imaging of planets around nearby stars is exceedingly difficult. Only about 14 exoplanets have been imaged to date that have masses less than 13 times that of Jupiter. The next generation of planet-finding coronagraphs, including VLT-SPHERE, the Gemini Planet Imager, Palomar P1640, and Subaru HiCIAO have predicted contrast performance of roughly a thousand times less than would be needed to detect Earth-like planets. In this paper we review the state of the art in exoplanet imaging, most notably the method of Locally Optimized Combination of Images (LOCI), and we investigate the potential of improving the detectability of faint exoplanets through the use of advanced statistical methods based on the concepts of the ideal observer and the Hotelling observer. We propose a formal comparison of techniques using a blind data challenge with an evaluation of performance using the Receiver Operating Characteristic (ROC) and Localization ROC (LROC) curves. We place particular emphasis on the understanding and modeling of realistic sources of measurement noise in ground-based AO-corrected coronagraphs. The work reported in this paper is the result of interactions between the co-authors during a week-long workshop on exoplanet imaging that was held in Squaw Valley, California, in March of 2012.

  18. Multivariate mixed linear model analysis of longitudinal data: an information-rich statistical technique for analyzing plant disease resistance.

    PubMed

    Veturi, Yogasudha; Kump, Kristen; Walsh, Ellie; Ott, Oliver; Poland, Jesse; Kolkman, Judith M; Balint-Kurti, Peter J; Holland, James B; Wisser, Randall J

    2012-11-01

    The mixed linear model (MLM) is an advanced statistical technique applicable to many fields of science. The multivariate MLM can be used to model longitudinal data, such as repeated ratings of disease resistance taken across time. In this study, using an example data set from a multi-environment trial of northern leaf blight disease on 290 maize lines with diverse levels of resistance, multivariate MLM analysis was performed and its utility was examined. In the population and environments tested, genotypic effects were highly correlated across disease ratings and followed an autoregressive pattern of correlation decay. Because longitudinal data are often converted to the univariate measure of area under the disease progress curve (AUDPC), comparisons between univariate MLM analysis of AUDPC and multivariate MLM analysis of longitudinal data were made. Univariate analysis had the advantage of simplicity and reduced computational demand, whereas multivariate analysis enabled a comprehensive perspective on disease development, providing the opportunity for unique insights into disease resistance. To aid in the application of multivariate MLM analysis of longitudinal data on disease resistance, annotated program syntax for model fitting is provided for the software ASReml.
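
    Since the record contrasts multivariate analysis of longitudinal ratings with the univariate AUDPC summary, a minimal sketch of the AUDPC computation (trapezoidal integration of repeated disease ratings over time) may be useful; the rating values and time points below are invented for illustration, not the northern leaf blight data.

      import numpy as np

      # Hypothetical repeated disease ratings for one maize line
      days = np.array([0.0, 7.0, 14.0, 21.0, 28.0])     # days after first rating
      rating = np.array([1.0, 2.5, 4.0, 6.5, 8.0])      # severity scores

      # AUDPC: area under the disease progress curve (trapezoidal rule)
      audpc = np.sum((rating[:-1] + rating[1:]) / 2.0 * np.diff(days))
      print(f"AUDPC = {audpc:.1f} rating-days")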

  19. Coping, Stress, and Job Satisfaction as Predictors of Advanced Placement Statistics Teachers' Intention to Leave the Field

    ERIC Educational Resources Information Center

    McCarthy, Christopher J.; Lambert, Richard G.; Crowe, Elizabeth W.; McCarthy, Colleen J.

    2010-01-01

    This study examined the relationship of teachers' perceptions of coping resources and demands to job satisfaction factors. Participants were 158 Advanced Placement Statistics high school teachers who completed measures of personal resources for stress prevention, classroom demands and resources, job satisfaction, and intention to leave the field…

  20. "I am Not a Statistic": Identities of African American Males in Advanced Science Courses

    NASA Astrophysics Data System (ADS)

    Johnson, Diane Wynn

    The United States Bureau of Labor Statistics (2010) expects new industries to generate approximately 2.7 million jobs in science and technology by the year 2018, and there is concern as to whether there will be enough trained individuals to fill these positions. A tremendous resource remains untapped: African American students, especially African American males (National Science Foundation, 2009). Historically, African American males have been omitted from the so-called science pipeline. Fewer African American males pursue a science discipline due, in part, to limiting factors they experience in school and at home (Ogbu, 2004). This is a case study of African American males who are enrolled in advanced science courses at a predominantly African American (84%) urban high school. Guided by expectancy-value theory (EVT) of achievement-related results (Eccles, 2009; Eccles et al., 1983), twelve African American male students in two advanced science courses were observed in their science classrooms weekly, participated in an in-depth interview, developed a presentation to share with students enrolled in a tenth grade science course, responded to an open-ended identity questionnaire, and were surveyed about their perceptions of school. Additionally, the students' teachers and seven of the students' parents were interviewed. The interview data analyses highlighted the important role of supportive parents (key socializers) who had high expectations for their sons and who pushed them academically. The students clearly attributed their enrollment in advanced science courses to their high regard for their science teachers, which included positive relationships, hands-on learning in class, and an inviting and encouraging learning environment. Additionally, other family members and coaches played important roles in these young men's lives. Students' PowerPoint(c) presentations to younger high school students on why they should take advanced science courses highlighted these

  1. Source of statistical noises in the Monte Carlo sampling techniques for coherently scattered photons.

    PubMed

    Muhammad, Wazir; Lee, Sang Hoon

    2013-01-01

    Detailed comparisons of the predictions of the Relativistic Form Factors (RFFs) and Modified Form Factors (MFFs), and their advantages and shortcomings in calculating elastic scattering cross sections, can be found in the literature. However, the issues related to their implementation in the Monte Carlo (MC) sampling for coherently scattered photons are still under discussion. Secondly, the linear interpolation technique (LIT) is a popular method to draw the integrated values of squared RFFs/MFFs (i.e. A(Z, v(i)²)) over squared momentum transfer (v(i)² = v(1)², ..., v(59)²). In the current study, the roles and issues of RFFs/MFFs and LIT in the MC sampling for coherent scattering were analyzed. The results showed that the relative probability density curves sampled on the basis of MFFs are unable to reveal any extra scientific information, as both the RFFs and MFFs produced the same MC sampled curves. Furthermore, no relationship was established between the multiple small peaks and irregular step shapes (i.e. statistical noise) in the PDFs and either RFFs or MFFs. In fact, the noise in the PDFs appeared due to the use of LIT. The density of the noise depends upon the interval length between two consecutive points in the input data table of A(Z, v(i)²) and has no scientific background. The probability density function curves became smoother as the interval lengths were decreased. In conclusion, these statistical noises can be efficiently removed by introducing more data points in the A(Z, v(i)²) data tables.
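
    The noise mechanism described in this record, interpolation-induced steps in the sampled distribution, can be illustrated with a generic inverse-CDF sampler built on a linearly interpolated cumulative table. The tabulated distribution below is invented, not the RFF/MFF data of the study.

      import numpy as np

      # Hypothetical coarse table of cumulative values over squared momentum transfer
      v2 = np.linspace(0.0, 1.0, 12)           # coarse v^2 grid (illustrative)
      cdf = np.sqrt(v2)                        # invented monotone cumulative values
      cdf /= cdf[-1]                           # normalize to a proper CDF

      # Inverse-CDF sampling with linear interpolation between table points;
      # a coarse table imprints step/peak artifacts on the sampled density.
      u = np.random.default_rng(0).uniform(size=100_000)
      samples = np.interp(u, cdf, v2)
      density, _ = np.histogram(samples, bins=100, density=True)

      # Refining the grid (more table points) smooths these artifacts, matching
      # the record's conclusion about adding data points to the A(Z, v^2) tables.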

  2. Analyzing Planck and low redshift data sets with advanced statistical methods

    NASA Astrophysics Data System (ADS)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi
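
    The Approximate Bayesian Computation mentioned above can be made concrete with a minimal rejection-ABC sketch: draw parameters from the prior, simulate data forward, and keep draws whose summary statistic lands close to the observed one, with no explicit likelihood evaluated. The toy model (a Gaussian mean), prior, summary, and tolerance below are illustrative assumptions, not the Planck analysis.

      import numpy as np

      rng = np.random.default_rng(1)
      observed = rng.normal(0.5, 1.0, size=200)     # toy "data"
      s_obs = observed.mean()                       # summary statistic

      # Rejection ABC: no multivariate-Gaussian likelihood is ever evaluated
      accepted = []
      for _ in range(20_000):
          theta = rng.uniform(-2.0, 2.0)            # draw from the prior
          sim = rng.normal(theta, 1.0, size=200)    # forward simulation
          if abs(sim.mean() - s_obs) < 0.05:        # tolerance (assumed)
              accepted.append(theta)

      print(f"approximate posterior mean = {np.mean(accepted):.2f} "
            f"from {len(accepted)} accepted draws")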

  3. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of morphology and evolution of the microstructure during processing, and their relation to properties, requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and the matrix of a layered structure or a functionally gradient material, and their variation, are among the parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials, including the stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength, and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. These techniques are compared with the more conventional x-ray diffraction and indentation techniques.

  4. Statistical techniques for modeling extreme price dynamics in the energy market

    NASA Astrophysics Data System (ADS)

    Mbugua, L. N.; Mwita, P. N.

    2013-02-01

    Extreme events have a large impact throughout engineering, science, and economics, because they often lead to failure and losses due to the unobservable nature of extraordinary occurrences. In this context, this paper focuses on appropriate statistical methods that combine a quantile regression approach with extreme value theory to model the excesses. This plays a vital role in risk management. Locally, nonparametric quantile regression is used, a method that is flexible and best suited when one knows little about the functional form of the object being estimated. The conditions are derived in order to estimate the extreme value distribution function. The threshold model of extreme values is used to circumvent the problem of inadequate observations at the tail of the distribution function. The application of a selection of these techniques is demonstrated on the volatile fuel market. The results indicate that the method used can extract the maximum possible reliable information from the data. The key attraction of this method is that it offers a set of ready-made approaches to the most difficult problem of risk modeling.
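
    A minimal sketch of the peaks-over-threshold step described above: exceedances over a high threshold are fitted with a Generalized Pareto distribution using scipy. The synthetic heavy-tailed returns and the 95th-percentile threshold choice are assumptions for illustration, not the fuel-market data.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      returns = rng.standard_t(df=4, size=5000)   # heavy-tailed toy price changes

      # Model exceedances above a high quantile with a Generalized Pareto tail
      u = np.quantile(returns, 0.95)              # threshold (assumed choice)
      excess = returns[returns > u] - u

      shape, loc, scale = stats.genpareto.fit(excess, floc=0.0)
      print(f"GPD shape = {shape:.3f}, scale = {scale:.3f}, threshold = {u:.3f}")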

  5. Modelling and analysing track cycling Omnium performances using statistical and machine learning techniques.

    PubMed

    Ofoghi, Bahadorreza; Zeleznikow, John; Dwyer, Dan; Macmahon, Clare

    2013-01-01

    This article describes the utilisation of an unsupervised machine learning technique and statistical approaches (e.g., the Kolmogorov-Smirnov test) that assist cycling experts in the crucial decision-making processes for athlete selection, training, and strategic planning in the track cycling Omnium. The Omnium is a multi-event competition that will be included in the summer Olympic Games for the first time in 2012. Presently, selectors and cycling coaches make decisions based on experience and intuition. They rarely have access to objective data. We analysed both the old five-event (first raced internationally in 2007) and new six-event (first raced internationally in 2011) Omniums and found that the addition of the elimination race component to the Omnium has, contrary to expectations, not favoured track endurance riders. We analysed the Omnium data and also determined the inter-relationships between different individual events as well as between those events and the final standings of riders. In further analysis, we found that there is no maximum ranking (poorest performance) in each individual event that riders can afford whilst still winning a medal. We also found the required times for riders to finish the timed components that are necessary for medal winning. The results of this study consider the scoring system of the Omnium and inform decision-making toward successful participation in future major Omnium competitions.
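
    As a pointer to the kind of distributional comparison the record mentions, the sketch below runs a two-sample Kolmogorov-Smirnov test on final point totals from the two Omnium formats; the numbers are invented placeholders, not the competition data.

      import numpy as np
      from scipy import stats

      # Hypothetical final point totals under the old and new Omnium formats
      old_format = np.array([12, 15, 19, 22, 25, 27, 30, 33, 38, 41])
      new_format = np.array([14, 18, 21, 26, 29, 31, 35, 37, 42, 45])

      # Two-sample KS test: could both samples come from the same distribution?
      stat, p_value = stats.ks_2samp(old_format, new_format)
      print(f"KS statistic = {stat:.3f}, p = {p_value:.3f}")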

  6. Modulation/demodulation techniques for satellite communications. Part 2: Advanced techniques. The linear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory is presented for deducing and predicting the performance of transmitter/receivers for bandwidth efficient modulations suitable for use on the linear satellite channel. The underlying principle used is the development of receiver structures based on the maximum-likelihood decision rule. The performance prediction tools, e.g., the channel cutoff rate and bit error probability transfer function bounds, are applied to these modulation/demodulation techniques.

  7. Advanced combustion techniques for controlling NO sub x emissions of high altitude cruise aircraft

    NASA Technical Reports Server (NTRS)

    Rudey, R. A.; Reck, G. M.

    1976-01-01

    An array of experiments designed to explore the potential of advanced combustion techniques for controlling the emissions of aircraft into the upper atmosphere was discussed. Of particular concern are the oxides of nitrogen (NOx) emissions into the stratosphere. The experiments utilize a wide variety of approaches varying from advanced combustor concepts to fundamental flame tube experiments. Results are presented which indicate that substantial reductions in cruise NOx emissions should be achievable in future aircraft engines. A major NASA program is described which focuses the many fundamental experiments into a planned evolution and demonstration of the prevaporized-premixed combustion technique in a full-scale engine.

  8. POC-Scale Testing of an Advanced Fine Coal Dewatering Equipment/Technique

    SciTech Connect

    Karekh, B K; Tao, D; Groppo, J G

    1998-08-28

    The froth flotation technique is an effective and efficient process for recovering ultra-fine (minus 74 μm) clean coal. Economical dewatering of an ultra-fine clean coal product to a moisture level of 20% will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal could be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 45 months beginning September 30, 1994. This report discusses technical progress made during the quarter from January 1 - March 31, 1998.

  9. Evaluation of river water quality variations using multivariate statistical techniques: Sava River (Croatia): a case study.

    PubMed

    Marinović Ruždjak, Andrea; Ruždjak, Domagoj

    2015-04-01

    For the evaluation of seasonal and spatial variations and the interpretation of a large and complex water quality dataset obtained during a 7-year monitoring program of the Sava River in Croatia, different multivariate statistical techniques were applied in this study. Basic statistical properties and correlations of 18 water quality parameters (variables) measured at 18 sampling sites (a total of 56,952 values) were examined. Correlations between air temperature and some water quality parameters were found, in agreement with previous studies of the relationship between climatic and hydrological parameters. Principal component analysis (PCA) was used to explore the most important factors determining the spatiotemporal dynamics of the Sava River. PCA determined a reduced set of seven principal components that explain over 75 % of the data set variance. The results revealed that parameters related to temperature and organic pollutants (CODMn and TSS) were the most important contributors to water quality variation. PCA of seasonal subsets confirmed this result and showed that the importance of parameters changes from season to season. PCA of the four seasonal data subsets yielded six PCs with eigenvalues greater than one, explaining 73.6 % (spring), 71.4 % (summer), 70.3 % (autumn), and 71.3 % (winter) of the total variance. To check the influence of outliers in the data set, whose distribution strongly deviates from the normal one, two robust estimates of the covariance matrix were calculated and subjected to PCA in addition to the standard principal component analysis algorithm. PCA in both cases yielded seven principal components explaining 75 % of the total variance, and the results do not differ significantly from those obtained by the standard PCA algorithm. With the implementation of the robust PCA algorithm, it is demonstrated that the usage of the standard algorithm is justified for data sets with small numbers of missing data.
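
    A minimal sketch of the PCA step described in this record, standardizing the parameters and keeping components with eigenvalues greater than one; the water-quality matrix is random placeholder data, not the Sava River measurements.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(3)
      X = rng.normal(size=(500, 18))     # placeholder: 500 samples x 18 parameters

      # Standardize each parameter, then apply PCA
      Z = StandardScaler().fit_transform(X)
      pca = PCA().fit(Z)

      # Kaiser criterion: retain components with eigenvalue > 1
      keep = pca.explained_variance_ > 1.0
      print(f"{keep.sum()} components retained, explaining "
            f"{pca.explained_variance_ratio_[keep].sum():.1%} of the variance")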

  10. Design and contents of an advanced distance-based statistics course for a PhD in nursing program.

    PubMed

    Azuero, Andres; Wilbanks, Bryan; Pryor, Erica

    2013-01-01

    Doctoral nursing students and researchers are expected to understand, critique, and conduct research that uses advanced quantitative methodology. The authors describe the design and contents of a distance-based course in multivariate statistics for PhD students in nursing and health administration, compare the design to recommendations found in the literature for distance-based statistics education, and compare the course contents to a tabulation of the methodologies used in a sample of recently published quantitative dissertations in nursing. The authors conclude with a discussion based on these comparisons as well as with experiences in course implementation and directions for future course development.

  11. Modulation/demodulation techniques for satellite communications. Part 3: Advanced techniques. The nonlinear channel

    NASA Technical Reports Server (NTRS)

    Omura, J. K.; Simon, M. K.

    1982-01-01

    A theory for deducing and predicting the performance of transmitter/receivers for bandwidth efficient modulations suitable for use on the nonlinear satellite channel is presented. The underlying principle used throughout is the development of receiver structures based on the maximum-likelihood decision rule and approximations to it. The bit error probability transfer function bounds developed in great detail in Part 4 are applied to these modulation/demodulation techniques. The effects of various degrees of receiver mismatch are considered both theoretically and by numerous illustrative examples.

  12. Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996. Statistics in Brief.

    ERIC Educational Resources Information Center

    Heaviside, Sheila; And Others

    The "Survey of Advanced Telecommunications in U.S. Public Elementary and Secondary Schools, Fall 1996" collected information from 911 regular United States public elementary and secondary schools regarding the availability and use of advanced telecommunications, and in particular, access to the Internet, plans to obtain Internet access, use of…

  13. Application of Advanced Magnetic Resonance Imaging Techniques in Evaluation of the Lower Extremity

    PubMed Central

    Braun, Hillary J.; Dragoo, Jason L.; Hargreaves, Brian A.; Levenston, Marc E.; Gold, Garry E.

    2012-01-01

    This article reviews current magnetic resonance imaging techniques for imaging the lower extremity, focusing on imaging of the knee, ankle, and hip joints. Recent advancements in MRI include imaging at 7 Tesla, using multiple receiver channels, T2* imaging, and metal suppression techniques, allowing more detailed visualization of complex anatomy, evaluation of morphological changes within articular cartilage, and imaging around orthopedic hardware. PMID:23622097

  14. Advances in neutron radiographic techniques and applications: a method for nondestructive testing.

    PubMed

    Berger, Harold

    2004-10-01

    A brief history of neutron radiography is presented to set the stage for a discussion of significant neutron radiographic developments and an assessment of future directions for neutron radiography. Specific advances are seen in the use of modern, high dynamic range imaging methods (image plates and flat panels) and in high contrast techniques such as phase contrast and phase-sensitive imaging. Competition for neutron radiographic inspection may develop as these techniques offer application prospects for X-ray methods.

  15. Enhancing Local Climate Projections of Precipitation: Assets and Limitations of Quantile Mapping Techniques for Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Ivanov, Martin; Kotlarski, Sven; Schär, Christoph

    2015-04-01

    The Swiss CH2011 scenarios provide a portfolio of climate change scenarios for the region of Switzerland, specifically tailored for use in climate impact research. Although widely applied by a variety of end-users, these scenarios are subject to several limitations related to the underlying delta change methodology. Examples are difficulties to appropriately account for changes in the spatio-temporal variability of meteorological fields and for changes in extreme events. The recently launched ELAPSE project (Enhancing local and regional climate change projections for Switzerland) is connected to the EU COST Action VALUE (www.value-cost.eu) and aims at complementing CH2011 by further scenario products, including a bias-corrected version of daily scenarios at the site scale. For this purpose the well-established empirical quantile mapping (QM) methodology is employed. Here, daily temperature and precipitation output of 15 GCM-RCM model chains of the ENSEMBLES project is downscaled and bias-corrected to match observations at weather stations in Switzerland. We consider established QM techniques based on all empirical quantiles or linear interpolation between the empirical percentiles. In an attempt to improve the downscaling of extreme precipitation events, we also apply a parametric approximation of the daily precipitation distribution by a dynamically weighted mixture of a Gamma distribution for the bulk and a Pareto distribution for the right tail for the first time in the context of QM. All techniques are evaluated and intercompared in a cross-validation framework. The statistical downscaling substantially improves virtually all considered distributional and temporal characteristics as well as their spatial distribution. The empirical methods have in general very similar performances. The parametric method does not show an improvement over the empirical ones. Critical sites and seasons are highlighted and discussed. Special emphasis is placed on investigating the
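
    A minimal sketch of the empirical quantile mapping described above: each model value is mapped through the model's empirical quantiles onto the observed quantiles, with linear interpolation between percentiles. The series are synthetic placeholders, not the ENSEMBLES chains or station observations.

      import numpy as np

      rng = np.random.default_rng(4)
      obs = rng.gamma(2.0, 3.0, size=3000)     # placeholder station precipitation
      model = rng.gamma(2.0, 4.0, size=3000)   # placeholder biased model output

      # Empirical quantile mapping: model quantiles -> observed quantiles
      q = np.linspace(0.01, 0.99, 99)
      model_q = np.quantile(model, q)
      obs_q = np.quantile(obs, q)

      def qm_correct(x):
          """Bias-correct model values by interpolation between percentiles."""
          return np.interp(x, model_q, obs_q)

      corrected = qm_correct(model)
      print(f"mean bias before: {model.mean() - obs.mean():+.2f}, "
            f"after: {corrected.mean() - obs.mean():+.2f}")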

  16. Application of multivariate statistical technique for hydrogeochemical assessment of groundwater within the Lower Pra Basin, Ghana

    NASA Astrophysics Data System (ADS)

    Tay, C. K.; Hayford, E. K.; Hodgson, I. O. A.

    2017-02-01

    A multivariate statistical technique and a hydrogeochemical approach were employed for groundwater assessment within the Lower Pra Basin. The main objective was to delineate the main processes that are responsible for the water chemistry and pollution of groundwater within the basin. Fifty-four (54) boreholes were sampled in January 2012 for quality assessment. PCA using Varimax with Kaiser Normalization as the method of extraction, for both the rotated space and the component matrix, was applied to the data. Results show that Spearman's correlation matrix of major ions revealed expected process-based relationships derived mainly from geochemical processes, such as ion exchange and silicate/aluminosilicate weathering within the aquifer. Three main principal components influence the water chemistry and pollution of groundwater within the basin. The three principal components account for approximately 79% of the total variance in the hydrochemical data. Component 1 delineates the main natural processes (water-soil-rock interactions) through which groundwater within the basin acquires its chemical characteristics, Component 2 delineates the incongruent dissolution of silicates/aluminosilicates, while Component 3 delineates the prevalence of pollution, principally from agricultural inputs, as well as trace metal mobilization in groundwater within the basin. The loadings and score plots of the first two PCs show a grouping pattern which indicates the strength of the mutual relation among the hydrochemical variables. In terms of proper management and development of groundwater within the basin, communities where intense agriculture is taking place should be monitored and protected from agricultural activities, especially where inorganic fertilizers are used, by creating buffer zones. Monitoring of the water quality, especially the water pH, is recommended to ensure the acid-neutralizing potential of groundwater within the basin, thereby curtailing further trace metal

  17. Assessment of arsenic and heavy metal contents in cockles (Anadara granosa) using multivariate statistical techniques.

    PubMed

    Abbas Alkarkhi, F M; Ismail, Norli; Easa, Azhar Mat

    2008-02-11

    Cockle (Anadara granosa) samples obtained from two rivers in the Penang State of Malaysia were analyzed for the content of arsenic (As) and heavy metals (Cr, Cd, Zn, Cu, Pb, and Hg) using a graphite furnace atomic absorption spectrometer (GF-AAS) for Cr, Cd, Zn, Cu, Pb, and As, and a cold vapor atomic absorption spectrometer (CV-AAS) for Hg. The two locations of interest, with 20 sampling points at each location, were Kuala Juru (Juru River) and Bukit Tambun (Jejawi River). Multivariate statistical techniques such as multivariate analysis of variance (MANOVA) and discriminant analysis (DA) were applied for analyzing the data. MANOVA showed a strong significant difference between the two rivers in terms of As and heavy metal contents in cockles. DA gave the best result in identifying the relative contribution of all parameters in discriminating (distinguishing) the two rivers. It provided an important data reduction, as it used only two parameters (Zn and Cd) affording more than 72% correct assignations. Results indicated that the two rivers differ in terms of As and heavy metal contents in cockles, and the major difference was due to the contribution of Zn and Cd. A positive correlation was found between the discriminant functions (DF) and Zn, Cd, and Cr, whereas a negative correlation was exhibited with the other heavy metals. Therefore, DA allowed a reduction in the dimensionality of the data set, delineating a few indicator parameters responsible for large variations in heavy metal and arsenic content. Taking these results into account, it can be suggested that continuous monitoring of As and heavy metals in cockles be performed in these two rivers.
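
    A minimal sketch of the discriminant-analysis step described above, classifying samples by river from the two indicator metals (Zn and Cd) with scikit-learn; the concentration values are invented, not the Juru/Jejawi measurements.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(5)
      # Placeholder Zn and Cd concentrations, 20 samples per river
      river_a = rng.normal([60.0, 1.2], [8.0, 0.3], size=(20, 2))
      river_b = rng.normal([75.0, 0.7], [8.0, 0.3], size=(20, 2))
      X = np.vstack([river_a, river_b])
      y = np.array([0] * 20 + [1] * 20)    # 0 = river A, 1 = river B (assumed labels)

      # Linear discriminant analysis on the two indicator metals
      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(f"correct assignations: {lda.score(X, y):.0%}")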

  18. Advanced techniques for high resolution spectroscopic observations of cosmic gamma-ray sources

    NASA Technical Reports Server (NTRS)

    Matteson, J. L.; Pelling, M. R.; Peterson, L. E.; Lin, R. P.; Anderson, K. A.; Pehl, R. H.; Hurley, K. C.; Vedrenne, G.; Sniel, M.; Durouchoux, P.

    1985-01-01

    An advanced gamma-ray spectrometer that is currently in development is described. It will obtain a sensitivity of 0.0001 ph/sq cm/sec in a 6-hour balloon observation and uses innovative techniques for background reduction and source imaging.

  19. Recognizing and Managing Complexity: Teaching Advanced Programming Concepts and Techniques Using the Zebra Puzzle

    ERIC Educational Resources Information Center

    Crabtree, John; Zhang, Xihui

    2015-01-01

    Teaching advanced programming can be a challenge, especially when the students are pursuing different majors with diverse analytical and problem-solving capabilities. The purpose of this paper is to explore the efficacy of using a particular problem as a vehicle for imparting a broad set of programming concepts and problem-solving techniques. We…

  20. Real-time application of advanced three-dimensional graphic techniques for research aircraft simulation

    NASA Technical Reports Server (NTRS)

    Davis, Steven B.

    1990-01-01

    Visual aids are valuable assets to engineers for design, demonstration, and evaluation. Discussed here are a variety of advanced three-dimensional graphic techniques used to enhance the displays of test aircraft dynamics. The new software's capabilities are examined and possible future uses are considered.

  1. Fabrication of advanced electrochemical energy materials using sol-gel processing techniques

    NASA Technical Reports Server (NTRS)

    Chu, C. T.; Chu, Jay; Zheng, Haixing

    1995-01-01

    Advanced materials play an important role in electrochemical energy devices such as batteries, fuel cells, and electrochemical capacitors. They are being used as both electrodes and electrolytes. Sol-gel processing is a versatile solution technique used in the fabrication of ceramic materials with tailored stoichiometry, microstructure, and properties. The application of sol-gel processing in the fabrication of advanced electrochemical energy materials will be presented. The potential of sol-gel derived materials for electrochemical energy applications will be discussed along with some examples of successful applications. Sol-gel derived metal oxide electrode materials such as V2O5 cathodes have been demonstrated in solid-state thin film batteries; solid electrolyte materials such as beta-alumina for advanced secondary batteries were prepared by the sol-gel technique long ago; and high surface area transition metal compounds for capacitive energy storage applications can also be synthesized with this method.

  2. An investigation of the feasibility of improving oculometer data analysis through application of advanced statistical techniques

    NASA Technical Reports Server (NTRS)

    Rana, D. S.

    1980-01-01

    The data reduction capabilities of the current data reduction programs were assessed and a search for a more comprehensive system with higher data analytic capabilities was made. Results of the investigation are presented.

  3. Parameter estimation techniques based on optimizing goodness-of-fit statistics for structural reliability

    NASA Technical Reports Server (NTRS)

    Starlinger, Alois; Duffy, Stephen F.; Palko, Joseph L.

    1993-01-01

    New methods are presented that utilize the optimization of goodness-of-fit statistics in order to estimate Weibull parameters from failure data. It is assumed that the underlying population is characterized by a three-parameter Weibull distribution. Goodness-of-fit tests are based on the empirical distribution function (EDF). The EDF is a step function, calculated using failure data, and represents an approximation of the cumulative distribution function for the underlying population. Statistics (such as the Kolmogorov-Smirnov statistic and the Anderson-Darling statistic) measure the discrepancy between the EDF and the cumulative distribution function (CDF). These statistics are minimized with respect to the three Weibull parameters. Due to nonlinearities encountered in the minimization process, Powell's numerical optimization procedure is applied to locate the minimum of each statistic. Numerical examples show the applicability of these new estimation methods. The results are compared to the estimates obtained with Cooper's nonlinear regression algorithm.
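
    A minimal sketch of the estimation idea in this record: choose three-parameter Weibull parameters that minimize the Kolmogorov-Smirnov distance between the EDF of the failure data and the candidate CDF. It uses Nelder-Mead rather than Powell's method, and the failure data are synthetic.

      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(6)
      data = stats.weibull_min.rvs(1.8, loc=10.0, scale=50.0, size=80,
                                   random_state=rng)   # synthetic failure times

      def ks_distance(params):
          """KS statistic between the data EDF and a 3-parameter Weibull CDF."""
          shape, loc, scale = params
          if shape <= 0 or scale <= 0 or loc >= data.min():
              return np.inf                            # keep parameters feasible
          return stats.kstest(data, stats.weibull_min(shape, loc, scale).cdf).statistic

      res = optimize.minimize(ks_distance, x0=[1.0, 0.0, float(np.median(data))],
                              method="Nelder-Mead")
      print("shape, location, scale:", np.round(res.x, 3))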

  4. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.

    1972-01-01

    The search for advanced measurement techniques for determining the long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques that were determined to meet, or could be adapted to meet, the requirements. Areas of refinement or change were recommended for the improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, the volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.

  5. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1].
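
    A minimal sketch of the kernel idea reviewed here: build a Gram (similarity) matrix from genotype vectors with a linear kernel and verify that it is positive semidefinite, the defining requirement named in the record. The 0/1/2 marker coding and dimensions are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(7)
      # Placeholder genotypes: 30 subjects x 100 markers coded 0/1/2
      G = rng.integers(0, 3, size=(30, 100)).astype(float)

      # Linear kernel: K[i, j] measures genomic similarity of subjects i and j
      Gc = G - G.mean(axis=0)            # center each marker
      K = Gc @ Gc.T / Gc.shape[1]

      # A valid kernel must produce a positive semidefinite matrix
      eigvals = np.linalg.eigvalsh(K)
      print(f"min eigenvalue = {eigvals.min():.2e} "
            f"(PSD up to rounding: {eigvals.min() >= -1e-10})")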

  6. Recent advances in 3D computed tomography techniques for simulation and navigation in hepatobiliary pancreatic surgery.

    PubMed

    Uchida, Masafumi

    2014-04-01

    A few years ago it could take several hours to complete a 3D image using a 3D workstation. Thanks to advances in computer science, obtaining results of interest now requires only a few minutes. Many recent 3D workstations or multimedia computers are equipped with onboard 3D virtual patient modeling software, which enables patient-specific preoperative assessment and virtual planning, navigation, and tool positioning. Although medical 3D imaging can now be conducted using various modalities, including computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), and ultrasonography (US) among others, the highest quality images are obtained using CT data, and CT images are now the most commonly used source of data for 3D simulation and navigation images. If the 2D source image is bad, no amount of 3D image manipulation in software will provide a quality 3D image. In this exhibition, recent advances in CT imaging technique and 3D visualization of hepatobiliary and pancreatic abnormalities are featured, including scan and image reconstruction techniques, contrast-enhanced techniques, new applications of advanced CT scan techniques, and new virtual reality simulation and navigation imaging.

  7. Statistical analyses of the magnet data for the advanced photon source storage ring magnets

    SciTech Connect

    Kim, S.H.; Carnegie, D.W.; Doose, C.; Hogrefe, R.; Kim, K.; Merl, R.

    1995-05-01

    The statistics of the measured magnetic data for the 80 dipole, 400 quadrupole, and 280 sextupole magnets of conventional resistive design for the APS storage ring are summarized. In order to accommodate the vacuum chamber, the curved dipole has a C-type cross section, and the quadrupole and sextupole cross sections have 180° and 120° symmetries, respectively. The data statistics include the integrated main fields, multipole coefficients, magnetic and mechanical axes, and roll angles of the main fields. The average and rms values of the measured magnet data meet the storage ring requirements.

  8. Flipping the Classroom and Student Performance in Advanced Statistics: Evidence from a Quasi-Experiment

    ERIC Educational Resources Information Center

    Touchton, Michael

    2015-01-01

    I administer a quasi-experiment using undergraduate political science majors in statistics classes to evaluate whether "flipping the classroom" (the treatment) alters students' applied problem-solving performance and satisfaction relative to students in a traditional classroom environment (the control). I also assess whether general…

  9. An analytic technique for statistically modeling random atomic clock errors in estimation

    NASA Technical Reports Server (NTRS)

    Fell, P. J.

    1981-01-01

    Minimum variance estimation requires that the statistics of random observation errors be modeled properly. If measurements are derived through the use of atomic frequency standards, then one source of error affecting the observable is random fluctuation in frequency. This is the case, for example, with range and integrated Doppler measurements from satellites of the Global Positioning System, and with baseline determination for geodynamic applications. An analytic method is presented which approximates the statistics of this random process. The procedure starts with a model of the Allan variance for a particular oscillator and develops the statistics of range and integrated Doppler measurements. A series of five first order Markov processes is used to approximate the power spectral density obtained from the Allan variance.
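
    The record's approximation of clock noise by a series of first-order Markov processes can be sketched as follows: each component is an exponentially correlated (Gauss-Markov) sequence, and the sum of five approximates a target power spectral density. The time constants and variances below are illustrative assumptions, not values derived from a specific Allan variance model.

      import numpy as np

      rng = np.random.default_rng(8)
      dt, n = 1.0, 10_000                     # time step (s), number of samples

      def gauss_markov(tau, sigma2):
          """First-order Gauss-Markov process: x[k] = phi*x[k-1] + w[k]."""
          phi = np.exp(-dt / tau)
          w = rng.normal(0.0, np.sqrt(sigma2 * (1.0 - phi**2)), size=n)
          x = np.zeros(n)
          for k in range(1, n):
              x[k] = phi * x[k - 1] + w[k]
          return x

      # Sum of five processes with spread time constants and variances (assumed)
      taus = [1e1, 1e2, 1e3, 1e4, 1e5]
      sigma2s = [1.0, 0.5, 0.25, 0.125, 0.0625]
      clock_error = sum(gauss_markov(t, s) for t, s in zip(taus, sigma2s))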

  10. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.

  11. Recent Advances in Techniques for Starch Esters and the Applications: A Review

    PubMed Central

    Hong, Jing; Zeng, Xin-An; Brennan, Charles S.; Brennan, Margaret; Han, Zhong

    2016-01-01

    Esterification is one of the most important methods of altering the structure of starch granules and improving its applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring reagent overdoses, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary factor, in view of its role in estimating the substituted groups of starch esters. In order to improve detection accuracy and production efficiency, different detection techniques, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR) and headspace gas chromatography (HS-GC), have been developed for DS. This paper gives a comprehensive overview of the recent advances in DS analysis and starch esterification techniques. Additionally, the advantages, limitations, some perspectives on future trends of these techniques, and the applications of their derivatives in the food industry are also presented. PMID:28231145

  12. Advances in the surface modification techniques of bone-related implants for last 10 years

    PubMed Central

    Qiu, Zhi-Ye; Chen, Cen; Wang, Xiu-Mei; Lee, In-Seop

    2014-01-01

    At the time of implanting bone-related implants into the human body, a variety of biological responses to the material surface occur with respect to surface chemistry and physical state. The commonly used biomaterials (e.g. titanium and its alloys, Co–Cr alloy, stainless steel, polyetheretherketone, ultra-high molecular weight polyethylene and various calcium phosphates) have many drawbacks, such as a lack of biocompatibility and improper mechanical properties. As surface modification is a very promising technology to overcome such problems, a variety of surface modification techniques have been investigated. This review paper covers recent advances in surface modification techniques of bone-related materials, including physicochemical coating, radiation grafting, plasma surface engineering, ion beam processing and surface patterning techniques. The contents are organized by the applicability of the different types of techniques to materials, and typical examples are also described. PMID:26816626

  13. Recent Advances in Techniques for Starch Esters and the Applications: A Review.

    PubMed

    Hong, Jing; Zeng, Xin-An; Brennan, Charles S; Brennan, Margaret; Han, Zhong

    2016-07-09

    Esterification is one of the most important methods of altering the structure of starch granules and improving its applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring reagent overdoses, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary factor, in view of its role in estimating the substituted groups of starch esters. In order to improve detection accuracy and production efficiency, different detection techniques, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR) and headspace gas chromatography (HS-GC), have been developed for DS. This paper gives a comprehensive overview of the recent advances in DS analysis and starch esterification techniques. Additionally, the advantages, limitations, some perspectives on future trends of these techniques, and the applications of their derivatives in the food industry are also presented.

  14. Unified Instrumentation: Examining the Simultaneous Application of Advanced Measurement Techniques for Increased Wind Tunnel Testing Capability

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Editor); Bartram, Scott M.; Humphreys, William M., Jr.; Jenkins, Luther N.; Jordan, Jeffrey D.; Lee, Joseph W.; Leighty, Bradley D.; Meyers, James F.; South, Bruce W.; Cavone, Angelo A.; Ingram, JoAnne L.

    2002-01-01

    A Unified Instrumentation Test examining the combined application of Pressure Sensitive Paint, Projection Moire Interferometry, Digital Particle Image Velocimetry, Doppler Global Velocimetry, and Acoustic Microphone Array has been conducted at the NASA Langley Research Center. The fundamental purposes of conducting the test were to: (a) identify and solve compatibility issues among the techniques that would inhibit their simultaneous application in a wind tunnel, and (b) demonstrate that simultaneous use of advanced instrumentation techniques is feasible for increasing tunnel efficiency and identifying control surface actuation / aerodynamic reaction phenomena. This paper provides summary descriptions of each measurement technique used during the Unified Instrumentation Test, their implementation for testing in a unified fashion, and example results identifying areas of instrument compatibility and incompatibility. Conclusions are drawn regarding the conditions under which the measurement techniques can be operated simultaneously on a non-interference basis. Finally, areas requiring improvement for successfully applying unified instrumentation in future wind tunnel tests are addressed.

  15. Advanced combustion techniques for controlling NO/x/ emissions of high altitude cruise aircraft

    NASA Technical Reports Server (NTRS)

    Rudey, R. A.; Reck, G. M.

    1976-01-01

    An array of experiments has been, and continues to be, sponsored and conducted by NASA to explore the potential of advanced combustion techniques for controlling the emissions of aircraft into the upper atmosphere. Of particular concern are the oxides of nitrogen (NO/x/) emissions into the stratosphere. The experiments utilize a wide variety of approaches, ranging from advanced combustor concepts to fundamental flame tube experiments. Results are presented which indicate that substantial reductions in cruise NO/x/ emissions should be achievable in future aircraft engines. A major NASA program is described which focuses the many fundamental experiments into a planned evolution and demonstration of the prevaporized-premixed combustion technique in a full-scale engine.

  16. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    1998-09-01

    The froth flotation technique is an effective and efficient process for recovering ultra-fine (minus 74 µm) clean coal. Economical dewatering of an ultra-fine clean-coal product to 20% moisture will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20% or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 36 months beginning September 30, 1994. This report discusses technical progress made during the quarter from July 1 - September 30, 1997.

  17. Imaging of skull base pathologies: Role of advanced magnetic resonance imaging techniques

    PubMed Central

    Mathur, Ankit; Kesavadas, C; Thomas, Bejoy; Kapilamoorthy, TR

    2015-01-01

    Imaging plays a vital role in the evaluation of skull base pathologies, as this region is not directly accessible for clinical evaluation. Computerized tomography (CT) and magnetic resonance imaging (MRI) have played complementary roles in the diagnosis of the various neoplastic and non-neoplastic lesions of the skull base. However, CT and conventional MRI may at times be insufficient to pinpoint the diagnosis. Advanced MRI techniques, though difficult to apply in the skull base region, can however, in conjunction with CT and conventional MRI, help improve diagnostic accuracy. This article aims to highlight the importance of advanced MRI techniques like diffusion-weighted imaging, susceptibility-weighted imaging, perfusion-weighted imaging, and MR spectroscopy in the differentiation of various lesions involving the skull base. PMID:26427895

  18. New test techniques and analytical procedures for understanding the behavior of advanced propellers

    NASA Technical Reports Server (NTRS)

    Stefko, G. L.; Bober, L. J.; Neumann, H. E.

    1983-01-01

    Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.

  19. Advanced digital modulation: Communication techniques and monolithic GaAs technology

    NASA Technical Reports Server (NTRS)

    Wilson, S. G.; Oliver, J. D., Jr.; Kot, R. C.; Richards, C. R.

    1983-01-01

    Communications theory and practice are merged with state-of-the-art technology in IC fabrication, especially monolithic GaAs technology, to examine the general feasibility of a number of advanced technology digital transmission systems. Satellite-channel models with (1) superior throughput, perhaps 2 Gbps; (2) attractive weight and cost; and (3) high RF power and spectrum efficiency are discussed. Transmission techniques possessing reasonably simple architectures capable of monolithic fabrication at high speeds were surveyed. This included a review of amplitude/phase shift keying (APSK) techniques and the continuous-phase-modulation (CPM) methods, of which MSK represents the simplest case.

  20. Development of advanced electron holographic techniques and application to industrial materials and devices.

    PubMed

    Yamamoto, Kazuo; Hirayama, Tsukasa; Tanji, Takayoshi

    2013-06-01

    The development of a transmission electron microscope equipped with a field emission gun paved the way for electron holography to be put to practical use in various fields. In this paper, we review three advanced electron holography techniques: on-line real-time electron holography, three-dimensional (3D) tomographic holography and phase-shifting electron holography, which are becoming important techniques for materials science and device engineering. We also describe some applications of electron holography to the analysis of industrial materials and devices: GaAs compound semiconductors, solid oxide fuel cells and all-solid-state lithium ion batteries.

  1. Combined preputial advancement and phallopexy as a revision technique for treating paraphimosis in a dog.

    PubMed

    Wasik, S M; Wallace, A M

    2014-11-01

    A 7-year-old neutered male Jack Russell terrier-cross was presented for signs of recurrent paraphimosis, despite previous surgical enlargement of the preputial ostium. Revision surgery was performed using a combination of preputial advancement and phallopexy, which resulted in complete and permanent coverage of the glans penis by the prepuce, and at 1 year postoperatively, no recurrence of paraphimosis had been observed. The combined techniques allow preservation of the normal penile anatomy, are relatively simple to perform and provide a cosmetic result. We recommend this combination for the treatment of paraphimosis in the dog, particularly when other techniques have failed.

  2. [Principles and advanced techniques for better internetpresentations in obstetrics and gynecology].

    PubMed

    Seufert, R; Molitor, N; Pollow, K; Woernle, F; Hawighorst-Knapstein, S

    2001-08-01

    Internet presentations are common tools for better medical communication and better scientific work. A great number of gynecological and obstetrical institutions now present data via the world wide web, with a wide range of quality and performance. Specific HTML editors enable quick and easy presentations, but only advanced internet techniques enable engaging multimedia presentations. N-tier applications are the emerging standard and must be integrated into general medical information systems. New concepts, current tools and general problems are discussed, and principles similar to current e-commerce techniques can address these specific medical demands.

  3. Noncompaction cardiomyopathy: The role of advanced multimodality imaging techniques in diagnosis and assessment.

    PubMed

    Chebrolu, Lakshmi H; Mehta, Anjlee M; Nanda, Navin C

    2017-02-01

    Noncompaction cardiomyopathy (NCCM) is a unique cardiomyopathy with a diverse array of genotypic and phenotypic manifestations. Its hallmark morphology consists of a bilayered myocardium with a compact epicardial layer and prominent trabeculations that comprise the noncompacted endocardial layer. The controversial diagnostic criteria for NCCM have been frequently discussed in the literature. This review touches on those diagnostic criteria, delves further into the evolving use of advanced imaging techniques within the major imaging modalities (echocardiography, computed tomography, and cardiac magnetic resonance imaging), and proposes an alternative algorithm incorporating these techniques for aiding with the diagnosis of NCCM.

  4. The investigation of advanced remote sensing techniques for the measurement of aerosol characteristics

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Becher, J.

    1979-01-01

    Advanced remote sensing techniques and inversion methods for the measurement of characteristics of aerosol and gaseous species in the atmosphere were investigated. Of particular interest were the physical and chemical properties of aerosols, such as their size distribution, number concentration, and complex refractive index, and the vertical distribution of these properties on a local as well as global scale. Remote sensing techniques for monitoring of tropospheric aerosols were developed as well as satellite monitoring of upper tropospheric and stratospheric aerosols. Computer programs were developed for solving multiple scattering and radiative transfer problems, as well as inversion/retrieval problems. A necessary aspect of these efforts was to develop models of aerosol properties.

  5. An example of requirements for Advanced Subsonic Civil Transport (ASCT) flight control system using structured techniques

    NASA Technical Reports Server (NTRS)

    Mclees, Robert E.; Cohen, Gerald C.

    1991-01-01

    The requirements are presented for an Advanced Subsonic Civil Transport (ASCT) flight control system, generated using structured techniques. The requirements definition starts from a mission analysis performed to identify the high-level control system requirements and the functions necessary to satisfy the mission. The result of the study is an example set of control system requirements partially represented using a derivative of Yourdon's structured techniques. Also provided is a research focus for studying structured design methodologies, and in particular design-for-validation philosophies.

  6. Study of advanced techniques for determining the long term performance of components

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The application of existing and new technology to the problem of determining the long-term performance capability of liquid rocket propulsion feed systems is discussed. The long-term performance of metal-to-metal valve seats in a liquid propellant fuel system is stressed. The approaches taken in conducting the analysis are: (1) advancing the technology of characterizing components through the development of new or more sensitive techniques, and (2) improving the understanding of the physics of degradation.

  7. STATISTICAL TECHNIQUES FOR DETERMINATION AND PREDICTION OF FUNDAMENTAL FISH ASSEMBLAGES OF THE MID-ATLANTIC HIGHLANDS

    EPA Science Inventory

    A statistical software tool, Stream Fish Community Predictor (SFCP), based on EMAP stream sampling in the mid-Atlantic Highlands, was developed to predict stream fish communities using stream and watershed characteristics. Step one in the tool development was a cluster analysis t...

  8. Statistical Techniques for Criterion-Referenced Tests. Final Report. October, 1976-October, 1977.

    ERIC Educational Resources Information Center

    Wilcox, Rand R.

    Three statistical problems related to criterion-referenced testing are investigated: estimation of the likelihood of a false-positive or false-negative decision with a mastery test, estimation of true scores in the Compound Binomial Error Model, and comparison of the examinees to a control. Two methods for estimating the likelihood of…
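
    For readers unfamiliar with the mastery-test decision problem mentioned above, below is a minimal sketch of how false-positive and false-negative probabilities can be computed under a simple binomial error model. The test length, cutoff, and true scores are hypothetical, and Wilcox's actual estimators differ in detail.

    ```python
    # Minimal sketch: misclassification probabilities for a mastery decision
    # under a binomial error model. All numbers are hypothetical.
    from scipy.stats import binom

    n_items, cutoff = 20, 15      # test length and passing score (hypothetical)
    # false positive: an examinee with true score 0.70 (a non-master here) passes
    p_fp = binom.sf(cutoff - 1, n_items, 0.70)
    # false negative: an examinee with true score 0.85 (a master here) fails
    p_fn = binom.cdf(cutoff - 1, n_items, 0.85)
    print(f"P(false positive) = {p_fp:.3f}, P(false negative) = {p_fn:.3f}")
    ```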

  9. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  10. The Random Forests Statistical Technique: An Examination of Its Value for the Study of Reading

    ERIC Educational Resources Information Center

    Matsuki, Kazunaga; Kuperman, Victor; Van Dyke, Julie A.

    2016-01-01

    Studies investigating individual differences in reading ability often involve data sets containing a large number of collinear predictors and a small number of observations. In this article, we discuss the method of Random Forests and demonstrate its suitability for addressing the statistical concerns raised by such data sets. The method is…
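
    As context for the abstract above, here is a minimal sketch of a Random Forest applied to a small-sample, collinear-predictor design of the kind described; the data are simulated and all variable names are ours, not the article's.

    ```python
    # Minimal sketch: Random Forest regression with many collinear predictors
    # and few observations. Data are simulated; nothing here is from the article.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(7)
    n = 60                                   # small sample, as in reading research
    base = rng.normal(size=(n, 3))           # 3 latent abilities
    X = np.hstack([base + 0.3 * rng.normal(size=(n, 3)) for _ in range(4)])  # 12 collinear measures
    y = base[:, 0] - 0.5 * base[:, 1] + rng.normal(0, 0.5, n)

    rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0).fit(X, y)
    print("out-of-bag R^2:", round(rf.oob_score_, 2))
    imp = permutation_importance(rf, X, y, n_repeats=20, random_state=0)
    print("three most important predictors:", np.argsort(imp.importances_mean)[::-1][:3])
    ```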

  11. Theory and analysis of statistical discriminant techniques as applied to remote sensing data

    NASA Technical Reports Server (NTRS)

    Odell, P. L.

    1973-01-01

    Classification of remote earth resources sensing data according to normed exponential density statistics is reported. The use of density models appropriate for several physical situations provides an exact solution for the probabilities of classifications associated with the Bayes discriminant procedure even when the covariance matrices are unequal.

  12. ADVANCING THE FUNDAMENTAL UNDERSTANDING AND SCALE-UP OF TRISO FUEL COATERS VIA ADVANCED MEASUREMENT AND COMPUTATIONAL TECHNIQUES

    SciTech Connect

    Biswas, Pratim; Al-Dahhan, Muthanna

    2012-11-01

    The objectives of this project are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for the TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line non-invasive measurement technique based on gamma ray densitometry (i.e. Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spouted diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information. The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two-fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains

  13. Integrating Organic Matter Structure with Ecosystem Function using Advanced Analytical Chemistry Techniques

    NASA Astrophysics Data System (ADS)

    Boot, C. M.

    2012-12-01

    Microorganisms are the primary transformers of organic matter in terrestrial and aquatic ecosystems. The structure of organic matter controls its bioavailability, and researchers have long sought to link the chemical characteristics of the organic matter pool to its lability. To date this effort has relied primarily on low-resolution descriptive characteristics (e.g. organic matter content, carbon-to-nitrogen ratio, aromaticity). Recently, however, progress in linking these two important ecosystem components has been made using high-resolution tools (e.g. nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS)-based techniques). A series of experiments will be presented that highlight the application of high-resolution techniques in a variety of terrestrial and aquatic ecosystems, with a focus on how these data explicitly provide the foundation for integrating organic matter structure into our concept of ecosystem function. The talk will highlight results from a series of experiments including: an MS-based metabolomics and fluorescence excitation-emission matrix approach evaluating seasonal and vegetation-based changes in dissolved organic matter (DOM) composition from arctic soils; Fourier transform ion cyclotron resonance (FTICR) MS and MS metabolomics analysis of DOM from three lakes in an alpine watershed; and the transformation of 13C-labeled glucose tracked with NMR during a rewetting experiment in Colorado grassland soils. These data will be synthesized to illustrate how the application of advanced analytical techniques provides novel insight into our understanding of organic matter processing in a wide range of ecosystems.

  14. Potential of advanced MR imaging techniques in the differential diagnosis of parkinsonism.

    PubMed

    Hotter, Anna; Esterhammer, Regina; Schocke, Michael F H; Seppi, Klaus

    2009-01-01

    The clinical differentiation of parkinsonian syndromes remains challenging not only for neurologists but also for movement disorder specialists. Conventional magnetic resonance imaging (cMRI) with the visual assessment of T2- and T1-weighted imaging as well as different advanced MRI techniques offer objective measures, which may be a useful tool in the diagnostic work-up of Parkinson's disease and atypical parkinsonian disorders (APDs). In clinical practice, cMRI is a well-established method for the exclusion of symptomatic parkinsonism due to other pathologies. Over the past two decades, abnormalities in the basal ganglia and infratentorial structures have been shown especially in APDs not only by cMRI but also by different advanced MRI techniques, including methods to assess regional cerebral atrophy quantitatively such as magnetic resonance volumetry, proton magnetic resonance spectroscopy, diffusion-weighted imaging, and magnetization transfer imaging. This article aims to review recent research findings on the role of advanced MRI techniques in the differential diagnosis of neurodegenerative parkinsonian disorders.

  15. Advanced in situ spectroscopic techniques and their applications in environmental biogeochemistry: introduction to the special section.

    PubMed

    Lombi, Enzo; Hettiarachchi, Ganga M; Scheckel, Kirk G

    2011-01-01

    Understanding the molecular-scale complexities and interplay of chemical and biological processes of contaminants at solid, liquid, and gas interfaces is a fundamental and crucial element in enhancing our understanding of anthropogenic environmental impacts. The ability to describe the complexity of environmental biogeochemical reaction mechanisms relies on our analytical ability through the application and development of advanced spectroscopic techniques. Accompanying this introductory article are nine papers that either review advanced in situ spectroscopic methods or present original research utilizing these techniques. This collection of articles summarizes the challenges facing environmental biogeochemistry, highlights the recent advances and scientific gaps, and provides an outlook into future research that may benefit from the use of in situ spectroscopic approaches. The use of synchrotron-based techniques and other methods is discussed in detail, as is the importance of integrating multiple analytical approaches to confirm results of complementary procedures or to fill data gaps. We also argue that future research directions will be driven, in addition to recent analytical developments, by emerging factors such as the need for risk assessment of new materials (i.e., nanotechnologies) and the realization that biogeochemical processes need to be investigated in situ under environmentally relevant conditions.

  16. POC-scale testing of an advanced fine coal dewatering equipment/technique

    SciTech Connect

    Groppo, J.G.; Parekh, B.K.; Rawls, P.

    1995-11-01

    The froth flotation technique is an effective and efficient process for recovering ultra-fine (minus 74 µm) clean coal. Economical dewatering of an ultra-fine clean coal product to 20 percent moisture will be an important step in successful implementation of the advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20 percent or lower moisture using either conventional or advanced dewatering techniques. As the contract title suggests, the main focus of the program is on proof-of-concept testing of a dewatering technique for a fine clean coal product. The coal industry is reluctant to use the advanced fine coal recovery technology due to the non-availability of an economical dewatering process. In fact, in a recent survey conducted by U.S. DOE and Battelle, dewatering of fine clean coal was identified as the number one priority for the coal industry. This project will attempt to demonstrate an efficient and economic fine clean coal slurry dewatering process.

  17. A Kernel of Truth: Statistical Advances in Polygenic Variance Component Models for Complex Human Pedigrees

    PubMed Central

    Blangero, John; Diego, Vincent P.; Dyer, Thomas D.; Almeida, Marcio; Peralta, Juan; Kent, Jack W.; Williams, Jeff T.; Almasy, Laura; Göring, Harald H. H.

    2014-01-01

    Statistical genetic analysis of quantitative traits in large pedigrees is a formidable computational task due to the necessity of taking the non-independence among relatives into account. With the growing awareness that rare sequence variants may be important in human quantitative variation, heritability and association study designs involving large pedigrees will increase in frequency due to the greater chance of observing multiple copies of rare variants amongst related individuals. Therefore, it is important to have statistical genetic test procedures that utilize all available information for extracting evidence regarding genetic association. Optimal testing for marker/phenotype association involves the exact calculation of the likelihood ratio statistic, which requires the repeated inversion of potentially large matrices. In a whole genome sequence association context, such computation may be prohibitive. Toward this end, we have developed a rapid and efficient eigensimplification of the likelihood that makes analysis of family data commensurate with the analysis of a comparable sample of unrelated individuals. Our theoretical results, which are based on a spectral representation of the likelihood, yield simple exact expressions for the expected likelihood ratio test statistic (ELRT) for pedigrees of arbitrary size and complexity. For heritability, the ELRT is $-\sum_i \ln[1+\hat{h}^2(\lambda_{g_i}-1)]$, where $\hat{h}^2$ and $\lambda_{g_i}$ are, respectively, the heritability and the eigenvalues of the pedigree-derived genetic relationship kernel (GRK). For association analysis of sequence variants, the ELRT is given by $\mathrm{ELRT}[h_q^2>0:\text{unrelateds}] - (\mathrm{ELRT}[h_t^2>0:\text{pedigrees}] - \mathrm{ELRT}[h_r^2>0:\text{pedigrees}])$, where $h_t^2$, $h_q^2$, and $h_r^2$ are the total, quantitative trait nucleotide, and residual heritabilities, respectively. Using these results, fast and accurate analytical power analyses are possible, eliminating the need for computer simulation. Additional benefits of eigensimplification include a simple method for
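
    As an illustration of the closed-form ELRT for heritability quoted above, the sketch below evaluates $-\sum_i \ln[1+\hat{h}^2(\lambda_{g_i}-1)]$ for a set of GRK eigenvalues; the eigenvalues are invented for illustration, not taken from the paper.

    ```python
    # Minimal sketch: expected LRT for heritability from GRK eigenvalues, using
    # the closed-form expression quoted in the abstract. Eigenvalues are hypothetical.
    import numpy as np

    def expected_lrt_heritability(h2, grk_eigenvalues):
        """ELRT = -sum_i ln(1 + h2 * (lambda_gi - 1))."""
        lam = np.asarray(grk_eigenvalues, dtype=float)
        return -np.sum(np.log1p(h2 * (lam - 1.0)))

    lam_g = [2.8, 1.9, 1.4, 1.0, 0.7, 0.5, 0.4, 0.3]   # invented pedigree GRK spectrum
    for h2 in (0.2, 0.5, 0.8):
        print(f"h2 = {h2}: ELRT = {expected_lrt_heritability(h2, lam_g):.3f}")
    ```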

  18. Statistical Techniques for Analyzing Process or "Similarity" Data in TID Hardness Assurance

    NASA Technical Reports Server (NTRS)

    Ladbury, R.

    2010-01-01

    We investigate techniques for estimating the contributions to TID hardness variability for families of linear bipolar technologies, determining how part-to-part and lot-to-lot variability change for different part types in the process.

  19. Advanced Time-Resolved Fluorescence Microscopy Techniques for the Investigation of Peptide Self-Assembly

    NASA Astrophysics Data System (ADS)

    Anthony, Neil R.

    The ubiquitous cross-beta sheet peptide motif is implicated in numerous neurodegenerative diseases, while at the same time it offers remarkable potential for constructing isomorphic high-performance bionanomaterials. Despite an emerging understanding of the complex folding landscape of cross-beta structures in determining disease etiology and final structure, we lack knowledge of the critical initial stages of nucleation and growth. In this dissertation, I advance our understanding of these key stages in the cross-beta nucleation and growth pathways using cutting-edge microscopy techniques. In addition, I present a new combined time-resolved fluorescence analysis technique with the potential to advance our current understanding of subtle molecular-level interactions that play a pivotal role in peptide self-assembly. Using the central nucleating core of Alzheimer's amyloid-beta protein, Abeta(16-22), as a model system, and utilizing electron, time-resolved, and non-linear microscopy, I capture the initial and transient nucleation stages of peptide assembly into the cross-beta motif. In addition, I have characterized the nucleation pathway, from monomer to paracrystalline nanotubes, in terms of morphology and fluorescence lifetime, corroborating the predicted desolvation process that occurs prior to cross-beta nucleation. Concurrently, I have identified unique heterogeneous cross-beta domains contained within individual nanotube structures, which have potential bionanomaterials applications. Finally, I describe a combined fluorescence theory and analysis technique that dramatically increases the sensitivity of current time-resolved techniques. Together these studies demonstrate the potential of advanced microscopy techniques for the identification and characterization of the cross-beta folding pathway, which will further our understanding of both amyloidogenesis and bionanomaterials.

  20. Application of multivariate statistical techniques for differentiation of ripe banana flour based on the composition of elements.

    PubMed

    Alkarkhi, Abbas F M; Ramli, Saifullah Bin; Easa, Azhar Mat

    2009-01-01

    Major (sodium, potassium, calcium, magnesium) and minor elements (iron, copper, zinc, manganese) and one heavy metal (lead) of Cavendish banana flour and Dream banana flour were determined, and data were analyzed using multivariate statistical techniques of factor analysis and discriminant analysis. Factor analysis yielded four factors explaining more than 81% of the total variance: the first factor explained 28.73%, comprising magnesium, sodium, and iron; the second factor explained 21.47%, comprising only manganese and copper; the third factor explained 15.66%, comprising zinc and lead; while the fourth factor explained 15.50%, comprising potassium. Discriminant analysis showed that magnesium and sodium exhibited a strong contribution in discriminating the two types of banana flour, affording 100% correct assignation. This study presents the usefulness of multivariate statistical techniques for analysis and interpretation of complex mineral content data from banana flour of different varieties.
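
    A minimal sketch of the factor analysis plus discriminant analysis workflow described above, using scikit-learn on synthetic element-composition data (the sample sizes, labels, and values are stand-ins, not the study's data):

    ```python
    # Minimal sketch: factor analysis + linear discriminant analysis on synthetic
    # element data; 9 columns stand for Na, K, Ca, Mg, Fe, Cu, Zn, Mn, Pb.
    import numpy as np
    from sklearn.decomposition import FactorAnalysis
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (20, 9)),    # 20 "Cavendish" samples
                   rng.normal(0.8, 1.0, (20, 9))])   # 20 "Dream" samples
    y = np.array([0] * 20 + [1] * 20)

    fa = FactorAnalysis(n_components=4).fit(X)       # extract four latent factors
    print("factor loadings shape:", fa.components_.shape)

    lda = LinearDiscriminantAnalysis().fit(X, y)     # discriminate the two flours
    print("correct assignation rate:", lda.score(X, y))
    ```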

  1. Statistical Technique for Intermediate and Long-Range Estimation of 13-Month Smoothed Solar Flux and Geomagnetic Index

    NASA Technical Reports Server (NTRS)

    Niehuss, K. O.; Euler, H. C., Jr.; Vaughan, W. W.

    1996-01-01

    This report documents the Marshall Space Flight Center (MSFC) 13-month smoothed solar flux (F(sub 10.7)) and geomagnetic index (A(sub p)) intermediate (months) and long-range (years) statistical estimation technique, referred to as the MSFC Lagrangian Linear Regression Technique (MLLRT). Estimates of future solar activity are needed as updated input to upper atmosphere density models used for satellite and spacecraft orbital lifetime predictions. An assessment of the MLLRT computer program's products is provided for 5-year periods from the date estimates were made. This was accomplished for a number of past solar cycles.

  2. Advancing the Science of Spatial Neglect Rehabilitation: An Improved Statistical Approach with Mixed Linear Modeling

    PubMed Central

    Goedert, Kelly M.; Boston, Raymond C.; Barrett, A. M.

    2013-01-01

    Valid research on neglect rehabilitation demands a statistical approach commensurate with the characteristics of neglect rehabilitation data: neglect arises from impairment in distinct brain networks, leading to large between-subject variability in baseline symptoms and recovery trajectories. Studies enrolling medically ill, disabled patients may suffer from missing, unbalanced data and small sample sizes. Finally, assessment of rehabilitation requires a description of continuous recovery trajectories. Unfortunately, the statistical method currently employed in most studies of neglect treatment [repeated measures analysis of variance (ANOVA), rANOVA] does not address these issues well. Here we review an alternative, mixed linear modeling (MLM), that is more appropriate for assessing change over time. MLM better accounts for between-subject heterogeneity in baseline neglect severity and in recovery trajectory. MLM does not require complete or balanced data, nor does it make strict assumptions regarding the data structure. Furthermore, because MLM better models between-subject heterogeneity, it often results in increased power to observe treatment effects with smaller samples. After reviewing current practices in the field and the assumptions of rANOVA, we provide an introduction to MLM. We review its assumptions, uses, advantages, and disadvantages. Using real and simulated data, we illustrate how MLM may improve the ability to detect effects of treatment over ANOVA, particularly with the small samples typical of neglect research. Furthermore, our simulation analyses result in recommendations for the design of future rehabilitation studies. Because between-subject heterogeneity is one important reason why studies of neglect treatments often yield conflicting results, employing statistical procedures that model this heterogeneity more accurately will increase the efficiency of our efforts to find treatments to improve the lives of individuals with neglect. PMID
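
    To make the rANOVA-versus-MLM point concrete, here is a minimal mixed-linear-model sketch using statsmodels, with a random intercept and slope per patient; the simulated recovery data and variable names are ours, not the review's.

    ```python
    # Minimal sketch: mixed linear model with per-patient random intercept and
    # slope, as an alternative to rANOVA. All data are simulated.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n_pat, n_weeks = 20, 6
    df = pd.DataFrame({"patient": np.repeat(np.arange(n_pat), n_weeks),
                       "week": np.tile(np.arange(n_weeks), n_pat)})
    b0 = rng.normal(50, 10, n_pat)[df["patient"]]    # heterogeneous baselines
    b1 = rng.normal(-3, 1, n_pat)[df["patient"]]     # heterogeneous recovery slopes
    df["neglect_score"] = b0 + b1 * df["week"] + rng.normal(0, 2, len(df))

    result = smf.mixedlm("neglect_score ~ week", df, groups=df["patient"],
                         re_formula="~week").fit()
    print(result.summary())
    ```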

  3. New advances in methodology for statistical tests useful in geostatistical studies

    SciTech Connect

    Borgman, L.E.

    1988-05-01

    Methodology for statistical procedures to perform tests of hypotheses pertaining to various aspects of geostatistical investigations has been slow to develop. The correlated nature of the data precludes most classical tests and makes the design of new tests difficult. Recent studies have led to modifications of the classical t test which allow for the intercorrelation. In addition, results for certain nonparametric tests have been obtained. The conclusions of these studies provide a variety of new tools for the geostatistician in deciding questions on significant differences and magnitudes.

  4. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    (Report excerpt) …assume that the NSMS can be approximated by a series of expansion functions $F_m$, i.e., a weighted sum over $m = 1, \dots, M$ (Eq. 31). The quantity measured by a receiver coil is the electromotive force, given by the negative of the time derivative of the secondary magnetic flux through the coil. A support vector machine learns from data: when fed a series…

  5. Generic Techniques for the Calibration of Robots with Application of the 3-D Fixtures and Statistical Technique on the PUMA 500 and ARID Robots

    NASA Technical Reports Server (NTRS)

    Tawfik, Hazem

    1991-01-01

    A relatively simple, inexpensive, and generic technique that could be used in both laboratories and some operation site environments is introduced at the Robotics Applications and Development Laboratory (RADL) at Kennedy Space Center (KSC). In addition, this report gives a detailed explanation of the set up procedure, data collection, and analysis using this new technique that was developed at the State University of New York at Farmingdale. The technique was used to evaluate the repeatability, accuracy, and overshoot of the Unimate Industrial Robot, PUMA 500. The data were statistically analyzed to provide an insight into the performance of the systems and components of the robot. Also, the same technique was used to check the forward kinematics against the inverse kinematics of RADL's PUMA robot. Recommendations were made for RADL to use this technique for laboratory calibration of the currently existing robots such as the ASEA, high speed controller, Automated Radiator Inspection Device (ARID) etc. Also, recommendations were made to develop and establish other calibration techniques that will be more suitable for site calibration environment and robot certification.
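
    As a rough illustration of the kind of statistical analysis described (repeatability and accuracy from repeated robot moves), the sketch below computes the standard metrics from simulated pose data; the target coordinates and error magnitudes are hypothetical.

    ```python
    # Minimal sketch: accuracy and repeatability from repeated moves to a
    # commanded position. Target and error magnitudes are hypothetical.
    import numpy as np

    rng = np.random.default_rng(2)
    target = np.array([500.0, 200.0, 300.0])                       # commanded (mm)
    poses = target + rng.normal([0.4, -0.2, 0.1], 0.15, (30, 3))   # 30 repeated moves

    errors = poses - target
    accuracy = np.linalg.norm(errors.mean(axis=0))   # magnitude of the mean offset
    repeatability = errors.std(axis=0, ddof=1)       # per-axis scatter
    print(f"accuracy: {accuracy:.3f} mm; repeatability (x, y, z): {repeatability}")
    ```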

  6. Applications of Advanced Nondestructive Measurement Techniques to Address Safety of Flight Issues on NASA Spacecraft

    NASA Technical Reports Server (NTRS)

    Prosser, Bill

    2016-01-01

    Advanced nondestructive measurement techniques are critical for ensuring the reliability and safety of NASA spacecraft. Techniques such as infrared thermography, THz imaging, X-ray computed tomography and backscatter X-ray are used to detect indications of damage in spacecraft components and structures. Additionally, sensor and measurement systems are integrated into spacecraft to provide structural health monitoring to detect damaging events that occur during flight, such as debris impacts during launch and ascent or from micrometeoroids and orbital debris, or excessive loading due to anomalous flight conditions. A number of examples are provided of how these nondestructive measurement techniques have been applied to resolve safety-critical inspection concerns for the Space Shuttle, International Space Station (ISS), and a variety of launch vehicles and unmanned spacecraft.

  7. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    NASA Technical Reports Server (NTRS)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    A preloading technique was used as an accelerated testing methodology in constant stress-rate ('dynamic fatigue') testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing of glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates, in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, SiC whisker-reinforced composite silicon nitride and 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism associated with such a phenomenon of considerable strength increase or decrease.

  9. Deriving Criteria-supporting Benchmark Values from Empirical Response Relationships: Comparison of Statistical Techniques and Effect of Log-transforming the Nutrient Variable

    EPA Science Inventory

    In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...

  10. Recent advances in statistical methods for the estimation of sediment and nutrient transport in rivers

    NASA Astrophysics Data System (ADS)

    Colin, T. A.

    1995-07-01

    This paper reviews advances in methods for estimating fluvial transport of suspended sediment and nutrients. Research from the past four years, mostly dealing with estimating monthly and annual loads, is emphasized. However, because this topic has not appeared in previous IUGG reports, some research prior to 1990 is included. The motivation for studying sediment transport has shifted during the past few decades. In addition to its role in filling reservoirs and channels, sediment is increasingly recognized as an important part of fluvial ecosystems and estuarine wetlands. Many groups want information about sediment transport [Bollman, 1992]: Scientists trying to understand benthic biology and catchment hydrology; citizens and policy-makers concerned about environmental impacts (e.g. impacts of logging [Beschta, 1978] or snow-fences [Sturges, 1992]); government regulators considering the effectiveness of programs to protect in-stream habitat and downstream waterbodies; and resource managers seeking to restore wetlands.

  11. Comparison of machine learning techniques with classical statistical models in predicting health outcomes.

    PubMed

    Song, Xiaowei; Mitnitski, Arnold; Cox, Jafna; Rockwood, Kenneth

    2004-01-01

    Several machine learning techniques (multilayer and single layer perceptron, logistic regression, least square linear separation and support vector machines) are applied to calculate the risk of death from two biomedical data sets, one from patient care records, and another from a population survey. Each dataset contained multiple sources of information: history of related symptoms and other illnesses, physical examination findings, laboratory tests, medications (patient records dataset), health attitudes, and disabilities in activities of daily living (survey dataset). Each technique showed very good mortality prediction in the acute patients data sample (AUC up to 0.89) and fair prediction accuracy for six year mortality (AUC from 0.70 to 0.76) in individuals from epidemiological database surveys. The results suggest that the nature of data is of primary importance rather than the learning technique. However, the consistently superior performance of the artificial neural network (multi-layer perceptron) indicates that nonlinear relationships (which cannot be discerned by linear separation techniques) can provide additional improvement in correctly predicting health outcomes.
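
    A minimal sketch of the kind of comparison reported above, contrasting logistic regression with a multilayer perceptron by AUC on synthetic data (the dataset, features, and settings are stand-ins for the study's clinical data):

    ```python
    # Minimal sketch: classical logistic regression vs. a multilayer perceptron,
    # compared by AUC on synthetic data standing in for the clinical datasets.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    models = {"logistic regression": LogisticRegression(max_iter=1000),
              "multilayer perceptron": MLPClassifier(hidden_layer_sizes=(16,),
                                                     max_iter=2000, random_state=0)}
    for name, clf in models.items():
        clf.fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")
    ```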

  12. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogenous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process, which will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. The systematic use of statistical tools was employed to analyze historical data, including linear and partial correlations applied to the data and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

  13. Dynamic statistical optimization of GNSS radio occultation bending angles: advanced algorithm and performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-08-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS)-based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAllenging Minisatellite Payload (CHAMP) and Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction of random errors (standard deviations) of optimized bending angles down to about half of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.
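
    The core operation of statistical optimization, blending an observed profile with a background profile by inverse-covariance weighting, can be sketched as follows; the profiles and covariances are toy values, and this is a generic illustration, not the OPS algorithm itself.

    ```python
    # Minimal sketch: minimum-variance blend of observed and background profiles
    # with background (B) and observation (O) error covariances. Toy values only.
    import numpy as np

    n = 5
    alpha_obs = np.array([1.00, 0.80, 0.55, 0.30, 0.20])   # observed bending angles
    alpha_bg  = np.array([0.95, 0.78, 0.50, 0.33, 0.18])   # background profile
    O = np.diag([0.05, 0.05, 0.08, 0.12, 0.20]) ** 2       # uncorrelated obs errors
    idx = np.arange(n)
    B = 0.10**2 * np.exp(-np.abs(idx[:, None] - idx[None, :]) / 2.0)  # correlated bg errors

    K = B @ np.linalg.inv(B + O)                 # optimal gain
    alpha_opt = alpha_bg + K @ (alpha_obs - alpha_bg)
    print(alpha_opt)
    ```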

  14. Dynamic statistical optimization of GNSS radio occultation bending angles: an advanced algorithm and its performance analysis

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kirchengast, G.; Scherllin-Pirscher, B.; Norman, R.; Yuan, Y. B.; Fritzer, J.; Schwaerz, M.; Zhang, K.

    2015-01-01

    We introduce a new dynamic statistical optimization algorithm to initialize ionosphere-corrected bending angles of Global Navigation Satellite System (GNSS) based radio occultation (RO) measurements. The new algorithm estimates background and observation error covariance matrices with geographically varying uncertainty profiles and realistic global-mean correlation matrices. The error covariance matrices estimated by the new approach are more accurate and realistic than in simplified existing approaches and can therefore be used in statistical optimization to provide optimal bending angle profiles for high-altitude initialization of the subsequent Abel transform retrieval of refractivity. The new algorithm is evaluated against the existing Wegener Center Occultation Processing System version 5.6 (OPSv5.6) algorithm, using simulated data on two test days from January and July 2008 and real observed CHAMP and COSMIC measurements from the complete months of January and July 2008. The following is achieved for the new method's performance compared to OPSv5.6: (1) significant reduction in random errors (standard deviations) of optimized bending angles down to about two-thirds of their size or more; (2) reduction of the systematic differences in optimized bending angles for simulated MetOp data; (3) improved retrieval of refractivity and temperature profiles; and (4) realistically estimated global-mean correlation matrices and realistic uncertainty fields for the background and observations. Overall the results indicate high suitability for employing the new dynamic approach in the processing of long-term RO data into a reference climate record, leading to well-characterized and high-quality atmospheric profiles over the entire stratosphere.

  15. An assessment of water quality in the Coruh Basin (Turkey) using multivariate statistical techniques.

    PubMed

    Bilgin, Ayla

    2015-11-01

    The purpose of this study was to assess water quality in the Coruh Basin (Turkey) using 24 parameters measured semi-annually between 2011 and 2013. The study utilised analysis of variance (ANOVA), principal component analysis (PCA) and factor analysis (FA) methods. The water-quality data were obtained from a total of four sites by the 26th Regional Directorate of the State Hydraulic Works (DSI). ANOVA was carried out to identify the differences between the parameters at the different measuring sites. The variables were classified using factor analysis, and the ANOVA test established a statistically significant difference in water quality between the downstream and upstream waste waters released by the Black Sea copper companies, while no statistically significant difference was observed between the Murgul and Borcka Dams. It was determined through factor analysis that five factors explained 81.3% of the total variance. It was concluded that domestic, industrial and agricultural activities, in combination with physicochemical properties, were factors affecting the quality of the water in the Coruh Basin.
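
    A minimal sketch of the ANOVA and PCA steps described above, on synthetic multi-site water-quality data (four sites and 24 parameters mirror the study's design, but all values and effect sizes are invented):

    ```python
    # Minimal sketch: per-parameter one-way ANOVA across sites, then PCA on the
    # standardized pooled data. Four sites x 24 parameters, all values simulated.
    import numpy as np
    from scipy.stats import f_oneway
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    sites = [rng.normal(mu, 1.0, (12, 24)) for mu in (0.0, 0.2, 1.0, 1.1)]

    pvals = [f_oneway(*(s[:, j] for s in sites)).pvalue for j in range(24)]
    print("parameters differing across sites (p < 0.05):", sum(p < 0.05 for p in pvals))

    X = StandardScaler().fit_transform(np.vstack(sites))
    pca = PCA(n_components=5).fit(X)
    print("variance explained by 5 components:", pca.explained_variance_ratio_.sum().round(2))
    ```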

  16. System engineering techniques for establishing balanced design and performance guidelines for the advanced telerobotic testbed

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Matijevic, J. R.

    1987-01-01

    Novel system engineering techniques have been developed and applied to establish structured design and performance objectives for the Telerobotics Testbed that reduce technical risk while still allowing the testbed to demonstrate an advancement in state-of-the-art robotic technologies. To establish the appropriate tradeoff structure and balance of technology performance against technical risk, an analytical database was developed which drew on: (1) automation/robot-technology availability projections, (2) typical or potential application mission task sets, (3) performance simulations, (4) project schedule constraints, and (5) project funding constraints. Design tradeoffs and configuration/performance iterations were conducted by comparing feasible technology/task set configurations against schedule/budget constraints as well as original program target technology objectives. The final system configuration, task set, and technology set reflected a balanced advancement in state-of-the-art robotic technologies, while meeting programmatic objectives and schedule/cost constraints.

  17. Mini-dental implant insertion with the auto-advance technique for ongoing applications.

    PubMed

    Balkin, B E; Steflik, D E; Naval, F

    2001-01-01

    The clinical and histological results of two cases of retrieved Sendax mini-dental implants in two different patients are the focus of this report. The mini-dental implants were inserted using the auto-advance technique and loaded immediately. The implants were retrieved at 4 months and at 5 months following insertion and were prepared and reviewed histologically. Clinically, the implants had no mobility, with no apparent exudate or bleeding upon probing, prior to removal. At the time explant procedures were performed, the mini-dental implants had provided immediate support for prostheses during the integration of traditional root-form endosteal implants. Upon explantation, the mini-dental implants were in a state of health and functioning in their intended purpose. Histologically, the bone appeared to be integrated to the surface of the implant at the light microscope level, and the bone appeared to be relatively mature and healthy in the areas observed, more so than one would expect in this amount of time from insertion of mini-dental implants with immediate loading. The purposes and the technique used for insertion and removal of these mini-dental implants are discussed. This is the first human histological report on the auto-advance technique with immediate loading of mini-dental implants, demonstrating feasibility in ongoing applications.

  18. State-of-the-art characterization techniques for advanced lithium-ion batteries

    NASA Astrophysics Data System (ADS)

    Lu, Jun; Wu, Tianpin; Amine, Khalil

    2017-03-01

    To meet future needs for industries from personal devices to automobiles, state-of-the-art rechargeable lithium-ion batteries will require both improved durability and lowered costs. To enhance battery performance and lifetime, understanding electrode degradation mechanisms is of critical importance. Various advanced in situ and operando characterization tools developed during the past few years have proven indispensable for optimizing battery materials, understanding cell degradation mechanisms, and ultimately improving the overall battery performance. Here we review recent progress in the development and application of advanced characterization techniques such as in situ transmission electron microscopy for high-performance lithium-ion batteries. Using three representative electrode systems—layered metal oxides, Li-rich layered oxides and Si-based or Sn-based alloys—we discuss how these tools help researchers understand the battery process and design better battery systems. We also summarize the application of the characterization techniques to lithium-sulfur and lithium-air batteries and highlight the importance of those techniques in the development of next-generation batteries.

  19. Advanced Molecular Diagnostic Techniques for Detection of Food-borne Pathogens; Current Applications and Future Challenges.

    PubMed

    Umesha, S; Manukumar, H M

    2016-01-08

    The elimination of disease-causing microbes from the food supply is a primary goal, and this review deals with the overall techniques available for the detection of food-borne pathogens. Nowadays, conventional methods are being replaced by advanced methods such as biosensors, nucleic acid-based tests (NAT) and different PCR-based techniques used in molecular biology to identify specific pathogens. Bacillus cereus, Staphylococcus aureus, Proteus vulgaris, Escherichia coli, Campylobacter, Listeria monocytogenes, Salmonella spp., Aspergillus spp., Fusarium spp. and Penicillium spp. are among the pathogens detected in contaminated food items that cause disease in humans. Identification of food-borne pathogens in a short period of time is still a challenge for the scientific field in general and food technology in particular. The low level of food contamination by major pathogens requires specific, sensitive detection platforms, and current research is looking toward new nanomolecular techniques, since the properties of nanomaterials make them suitable for the development of assays with high sensitivity, fast response time and portability. With this in mind, we attempt to provide a comprehensive overview of food-borne pathogen detection by rapid, sensitive, accurate and affordable in situ analytical methods, from conventional methods to recent molecular approaches, for advanced food and microbiology research.

  20. Discrimination of nylon polymers using attenuated total reflection mid-infrared spectra and multivariate statistical techniques.

    PubMed

    Enlow, Elizabeth M; Kennedy, Jennifer L; Nieuwland, Alexander A; Hendrix, James E; Morgan, Stephen L

    2005-08-01

    Nylons are an important class of synthetic polymers, from an industrial as well as a forensic perspective. A spectroscopic method, such as Fourier transform infrared (FT-IR) spectroscopy, is necessary to determine the nylon subclass (e.g., nylon 6 or nylon 6,6). Library searching using absolute difference and absolute derivative difference algorithms gives inconsistent results for identifying nylon subclasses. The objective of this study was to evaluate the usefulness of peak ratio analysis and multivariate statistics for the identification of nylon subclasses using attenuated total reflection (ATR) spectral data. Many nylon subclasses could not be distinguished by the peak ratio of the N-H vibrational stretch to the sp(3) C-H(2) vibrational stretch intensities. Linear discriminant analysis, however, provided a graphical visualization of differences between nylon subclasses and was able to correctly classify a set of 270 spectra from eight different subclasses with 98.5% cross-validated accuracy.

  1. A Novel Statistical Technique for Determining the Properties of Extrasolar Planets

    NASA Astrophysics Data System (ADS)

    Starr Henderson, Cassandra; Skemer, Andrew; Morley, Caroline; Fortney, Jonathan J.

    2017-01-01

    By detecting light from extrasolar planets, we can measure their compositions and bulk physical properties. The technologies used to make these measurements are still in their infancy, and a lack of self-consistency suggests that previous observations have underestimated their systematic errors. We demonstrate a statistical method, newly applied to exoplanet characterization, which allows some fraction of the data to have underestimated error bars. This method compares the photometry of the substellar companion GJ 758b to custom atmospheric models to determine the exoplanet's atmospheric properties. It also demonstrates that some of the data are inconsistent with the models, and it produces a probability distribution of atmospheric properties, including temperature, gravity, cloud thickness, and chemical abundance, for GJ 758b that automatically weights the photometry by the probability that it is correct at each wavelength.
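
    One common statistical device for tolerating underestimated error bars is to inflate each quoted uncertainty with a fitted "jitter" term in the likelihood; the sketch below shows that generic approach on toy photometry, and the paper's exact formulation may differ.

    ```python
    # Minimal sketch: Gaussian likelihood with a fitted extra-scatter ("jitter")
    # term that absorbs underestimated error bars. Toy photometry and model values.
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_like(params, y, y_model, sigma):
        s2 = sigma**2 + np.exp(params[0])**2         # inflated variance
        return 0.5 * np.sum((y - y_model)**2 / s2 + np.log(2 * np.pi * s2))

    y_obs   = np.array([1.00, 0.92, 1.31, 0.97, 1.05])   # e.g., photometric fluxes
    y_model = np.array([1.01, 0.95, 1.00, 0.98, 1.02])   # model prediction
    sigma   = np.full(5, 0.03)                           # quoted, possibly too small

    res = minimize(neg_log_like, x0=[np.log(0.01)], args=(y_obs, y_model, sigma))
    print("fitted extra scatter:", float(np.exp(res.x[0])))
    ```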

  2. A statistical technique for determining rainfall over land employing Nimbus-6 ESMR measurements

    NASA Technical Reports Server (NTRS)

    Rodgers, E.; Siddalingaiah, H.; Chang, A. T. C.; Wilheit, T. T.

    1978-01-01

    At 37 GHz, the frequency at which the Nimbus 6 Electrically Scanning Microwave Radiometer (ESMR 6) measures upwelling radiance, it was shown theoretically that the atmospheric scattering and the relative polarization independence of the radiances emerging from hydrometeors make it possible to remotely monitor active rainfall over land. In order to verify these theoretical findings experimentally and to develop an algorithm to monitor rainfall over land, the digitized ESMR 6 measurements were examined statistically. Horizontally and vertically polarized brightness temperature pairs (TH, TV) from ESMR 6 were sampled for areas of rainfall over land (as determined from rain-recording stations and the WSR 57 radar) and for areas of wet and dry ground (whose thermodynamic temperatures were greater than 5 C) over the southeastern United States. These three categories of brightness temperatures were found to be significantly different, in the sense that the chances that the mean vectors of any two populations coincided were less than 1 in 100.
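
    The statement that the mean vectors of the (TH, TV) populations differ can be tested with a two-sample Hotelling T² statistic, one standard way to compare mean vectors; the sketch below uses synthetic brightness temperatures, not ESMR 6 data.

    ```python
    # Minimal sketch: two-sample Hotelling T^2 test for equal mean vectors of
    # (TH, TV) pairs. Brightness temperatures below are synthetic.
    import numpy as np
    from scipy.stats import f

    def hotelling_t2(X1, X2):
        n1, n2, p = len(X1), len(X2), X1.shape[1]
        d = X1.mean(axis=0) - X2.mean(axis=0)
        S = ((n1 - 1) * np.cov(X1.T) + (n2 - 1) * np.cov(X2.T)) / (n1 + n2 - 2)
        t2 = n1 * n2 / (n1 + n2) * d @ np.linalg.solve(S, d)
        F = t2 * (n1 + n2 - p - 1) / (p * (n1 + n2 - 2))
        return t2, f.sf(F, p, n1 + n2 - p - 1)

    rng = np.random.default_rng(4)
    cov = [[25.0, 10.0], [10.0, 25.0]]
    rain = rng.multivariate_normal([220.0, 230.0], cov, 50)   # raining pixels (K)
    dry  = rng.multivariate_normal([260.0, 275.0], cov, 50)   # dry ground (K)
    t2, p_val = hotelling_t2(rain, dry)
    print(f"T^2 = {t2:.1f}, p = {p_val:.2e}")
    ```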

  3. A statistical comparison of two carbon fiber/epoxy fabrication techniques

    NASA Technical Reports Server (NTRS)

    Hodge, A. J.

    1991-01-01

    A statistical comparison of the compression strengths of specimens fabricated by either a platen press or an autoclave was performed on IM6/3501-6 carbon/epoxy composites of 16-ply (0,+45,90,-45)(sub S2) lay-up configuration. The samples were cured with the same parameters and processing materials. It was found that the autoclaved panels were thicker than the platen-press-cured samples. Two hundred samples of each type of cure process were compression tested. The autoclaved samples had an average strength of 450 MPa (65.5 ksi), while the press-cured samples had an average strength of 370 MPa (54.0 ksi). A Weibull analysis of the data showed that there is only a 30% probability that the two types of cure systems yield specimens that can be considered from the same family.
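
    The Weibull analysis of the strength data can be sketched with scipy; the placeholder samples below are assumptions chosen only to mirror the reported mean strengths:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        autoclave = 450.0 * rng.weibull(20.0, 200)  # placeholder strengths, MPa
        press = 370.0 * rng.weibull(20.0, 200)

        # Fit two-parameter Weibull distributions (location fixed at zero).
        shape_a, _, scale_a = stats.weibull_min.fit(autoclave, floc=0.0)
        shape_p, _, scale_p = stats.weibull_min.fit(press, floc=0.0)
        print(f"autoclave: shape={shape_a:.1f}, scale={scale_a:.0f} MPa")
        print(f"press:     shape={shape_p:.1f}, scale={scale_p:.0f} MPa")

    Comparing the fitted shape and scale parameters (or the overlap of the fitted distributions) is one way to assess whether the two cure processes plausibly produce specimens from the same family.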

  4. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques

    PubMed Central

    Benson, Nsikak U.; Asuquo, Francis E.; Williams, Akan B.; Essien, Joseph P.; Ekong, Cyril I.; Akpabio, Otobong; Olajire, Abaas A.

    2016-01-01

    Concentrations of trace metals (Cd, Cr, Cu, Ni and Pb) in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk assessment by ICF showed significant potential mobility and bioavailability for Cu, Cu and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources. PMID:27257934
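
    A minimal sketch of the multivariate step, standardized PCA plus Ward agglomerative clustering on a site-by-metal concentration matrix; the matrix dimensions and placeholder values are assumptions:

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        # X: (n_sites, 5) concentrations of Cd, Cr, Cu, Ni, Pb (placeholder).
        rng = np.random.default_rng(2)
        X = rng.lognormal(size=(30, 5))

        Z = StandardScaler().fit_transform(X)        # standardize before PCA
        scores = PCA(n_components=2).fit_transform(Z)
        clusters = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

    Sites falling in the same cluster with similar component scores would be interpreted as sharing a pollution source, as in the multi-source interpretation above.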

  5. Improved equilibrium reconstructions by advanced statistical weighting of the internal magnetic measurements.

    PubMed

    Murari, A; Gelfusa, M; Peluso, E; Gaudio, P; Mazon, D; Hawkes, N; Point, G; Alper, B; Eich, T

    2014-12-01

    In a Tokamak, the configuration of the magnetic fields remains the key element in improving performance and maximising the scientific exploitation of the device. On the other hand, the quality of the reconstructed fields depends crucially on the measurements available. Traditionally, in the least-squares minimisation phase of the algorithms used to obtain the magnetic field topology, all the diagnostics are given the same weights, apart from a corrective factor taking into account the error bars. This assumption unduly penalises complex diagnostics, such as polarimetry, which have a limited number of highly significant measurements. A completely new method of choosing the weights to be given to the internal measurements of the magnetic fields for improved equilibrium reconstructions is presented in this paper. The approach is based on various statistical indicators applied to the residuals, the differences between the actual measurements and their estimates from the reconstructed equilibrium. The potential of the method is exemplified using the measurements of the Faraday rotation derived from the JET polarimeter. The results indicate quite clearly that the weights have to be determined carefully, since an inappropriate choice can have significant repercussions on the quality of the magnetic reconstruction both in the edge and in the core. These results confirm the limitations of the assumption that all the diagnostics have to be given the same weight, irrespective of the number of measurements they provide and the region of the plasma they probe.
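
    The paper derives the weights from statistical indicators on the residuals; as one hedged stand-in for that idea, the sketch below iteratively reweights each diagnostic group by the inverse variance of its current residuals within a linear least-squares reconstruction (the linearized operator G and the grouping are assumptions):

        import numpy as np

        def reweighted_lsq(G, d, groups, n_iter=5):
            """Solve d ~ G x, giving each diagnostic group a weight inversely
            proportional to the variance of its current residuals."""
            w = np.ones(len(d))
            for _ in range(n_iter):
                sw = np.sqrt(w)
                x, *_ = np.linalg.lstsq(G * sw[:, None], d * sw, rcond=None)
                r = d - G @ x
                for g in np.unique(groups):
                    m = groups == g
                    w[m] = 1.0 / max(np.var(r[m]), 1e-12)  # down-weight noisy groups
            return x, w

    In this scheme a diagnostic like polarimetry, with few but highly significant measurements, is no longer penalized simply for contributing few data points.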

  6. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for Column CO2 Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, J. F.; Lin, B.; Nehrir, A. R.; Harrison, F. W.; Obland, M. D.; Ismail, S.; Meadows, B.; Browell, E. V.

    2014-12-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper.
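
    The sidelobe-free autocorrelation property claimed for the BPSK waveform can be illustrated with a maximal-length PN sequence; the LFSR polynomial below (x^7 + x^6 + 1) is one standard choice, not necessarily the one flown:

        import numpy as np

        def mseq7():
            """127-chip maximal-length sequence from a 7-bit Fibonacci LFSR."""
            reg = [1] * 7
            seq = []
            for _ in range(127):
                seq.append(reg[-1])
                fb = reg[6] ^ reg[5]      # taps for x^7 + x^6 + 1
                reg = [fb] + reg[:-1]
            return np.array(seq)

        chips = 1 - 2 * mseq7()           # BPSK mapping {0,1} -> {+1,-1}
        # Circular autocorrelation: 127 at zero lag, -1 at every other lag.
        ac = np.array([np.dot(chips, np.roll(chips, k)) for k in range(127)])
        print(ac[:5])

    The flat off-peak level is what suppresses range sidelobes from intermediate cloud and aerosol returns.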

  7. Extrusion based rapid prototyping technique: an advanced platform for tissue engineering scaffold fabrication.

    PubMed

    Hoque, M Enamul; Chuan, Y Leng; Pashby, Ian

    2012-02-01

    Advances in scaffold design and fabrication technology have brought the tissue engineering field into a new era. Conventional techniques used to develop scaffolds have inherent limitations, such as a lack of control over pore morphology and architecture as well as reproducibility. Rapid prototyping (RP) technology, a layer-by-layer additive approach, offers a unique opportunity to build complex 3D architectures that overcome those limitations and could ultimately be tailored for patient-specific applications. Using RP methods, researchers have been able to customize scaffolds to closely mimic the biomechanical properties (in terms of structural integrity, strength, and microenvironment) of the organ or tissue to be repaired or replaced. This article provides an intensive description of various extrusion-based scaffold fabrication techniques and reviews their potential utility for tissue engineering (TE) applications. The extrusion-based technique extrudes the molten polymer as a thin filament through a nozzle onto a platform layer by layer, thus building a 3D scaffold. The technique allows full control over pore architecture and dimension in the x- and y-planes. However, the pore height in the z-direction is predetermined by the extruding nozzle diameter rather than by the technique itself. This review attempts to assess the current state and future prospects of this technology.

  8. Investigating statistical techniques to infer interwell connectivity from production and injection rate fluctuations

    NASA Astrophysics Data System (ADS)

    Al-Yousef, Ali Abdallah

    Reservoir characterization is one of the most important factors in successful reservoir management. In water injection projects, knowledge of reservoir heterogeneities and discontinuities is particularly important to maximize oil recovery. This research project presents a new technique to quantify communication between injection and production wells in a reservoir based on temporal fluctuations in rates. The technique combines a nonlinear signal processing model and multiple linear regression (MLR) to provide information about permeability trends and the presence of flow barriers. The method was tested on synthetic fields using rates generated by a numerical simulator and then applied to producing fields in Argentina, the North Sea, Texas, and Wyoming. Results indicate that the model coefficients (weights) between wells are consistent with the known geology and the relative locations of the wells; they are independent of injection/production rates. The developed procedure provides parameters (time constants) that explicitly indicate the attenuation and time lag between injector and producer pairs. The new procedure allows better insight into well-to-well connectivities than MLR alone. Complex geological conditions are often not easily identified using the weights and time constant values individually. However, combining both sets of parameters in certain representations enhances the inference about the geological features. Applications of the new representations to numerically simulated fields and then to real fields indicate that these representations are capable of identifying whether the connectivity of an injector-producer well pair is through fractures, a high-permeability layer, or partially completed wells. The technique may produce negative weights for some well pairs. Because there is no physical explanation in waterfloods for negative weights, these are also investigated. The negative weights have at least three causes
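
    The MLR building block of the technique can be sketched as a regression of a producer's rate on the injector rates, with the fitted coefficients serving as connectivity weights; the rates below are synthetic stand-ins:

        import numpy as np
        from sklearn.linear_model import LinearRegression

        # I: (n_months, n_injectors) injection rates; q: producer rate.
        rng = np.random.default_rng(3)
        I = rng.random((120, 5))
        true_w = np.array([0.5, 0.3, 0.0, 0.1, 0.1])
        q = I @ true_w + 0.05 * rng.standard_normal(120)

        weights = LinearRegression().fit(I, q).coef_  # interwell weights
        print(np.round(weights, 2))

    A near-zero weight suggests a flow barrier between the pair, while a large weight suggests a high-permeability path; the full technique adds the nonlinear signal processing model and the time constants described above.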

  9. Techniques for measurement of the thermal expansion of advanced composite materials

    NASA Technical Reports Server (NTRS)

    Tompkins, Stephen S.

    1989-01-01

    Techniques available to measure small thermal displacements in flat laminates and structural tubular elements of advanced composite materials are described. Emphasis is placed on laser interferometry and the laser interferometric dilatometer system used at the National Aeronautics and Space Administration (NASA) Langley Research Center. Thermal expansion data are presented for graphite-fiber reinforced 6061 and 2024 aluminum laminates and for graphite-fiber reinforced AZ91C and QH21A magnesium laminates before and after processing to minimize or eliminate thermal strain hysteresis. Data are also presented on the effects of reinforcement volume content on thermal expansion of silicon-carbide whisker and particulate reinforced aluminum.

  10. Advanced techniques in IR thermography as a tool for the pest management professional

    NASA Astrophysics Data System (ADS)

    Grossman, Jon L.

    2006-04-01

    Within the past five years, the pest management industry has become aware that IR thermography can aid in the detection of pest infestations and locate other conditions that are within the purview of the industry. This paper reviews the applications that can be utilized by the pest management professional and discusses the advanced techniques that may be required, in conjunction with thermal imaging, to locate insect and other pest infestations and moisture within structures, as well as the verification of data and the special challenges associated with the inspection process.

  11. Reliable Welding of HSLA Steels by Square Wave Pulsing Using an Advanced Sensing (EDAP) Technique.

    DTIC Science & Technology

    1986-04-30

    Preliminary report on reliable welding of HSLA steels by square wave pulsing using an advanced sensing (EDAP) technique. The available excerpt discusses welding on A710 steel, a possibly similar effect on HY80, and cites Woods and Milner (Ref. 12).

  12. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances

    PubMed Central

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume per minute during intense exercise. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in the related literature in terms of two well-known metrics, namely the multiple correlation coefficient (R) and the standard error of estimate. The survey results reveal that, with respect to the regression methods used to develop prediction models, support vector machine generally shows better performance than other methods, whereas multiple linear regression exhibits the worst performance. PMID:26346869
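
    As a hedged sketch of the kind of model the surveyed studies build (support vector regression evaluated by R and the standard error of estimate), with synthetic predictors standing in for the real questionnaire or exercise-test variables:

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_predict

        # X: predictors (e.g., age, BMI, heart rate); y: measured VO2max.
        rng = np.random.default_rng(4)
        X = rng.random((100, 4))
        y = 30.0 + 20.0 * X[:, 0] - 10.0 * X[:, 1] + rng.standard_normal(100)

        model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.5))
        pred = cross_val_predict(model, X, y, cv=10)
        R = np.corrcoef(y, pred)[0, 1]            # multiple correlation coefficient
        see = np.sqrt(np.mean((y - pred) ** 2))   # standard error of estimate
        print(f"R = {R:.2f}, SEE = {see:.2f} ml/kg/min")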

  13. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances.

    PubMed

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume per minute during intense exercise. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, many studies have been conducted in recent years to predict the VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, and cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview of the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in the related literature in terms of two well-known metrics, namely the multiple correlation coefficient (R) and the standard error of estimate. The survey results reveal that, with respect to the regression methods used to develop prediction models, support vector machine generally shows better performance than other methods, whereas multiple linear regression exhibits the worst performance.

  14. Statistical signal processing technique for identification of different infected sites of the diseased lungs.

    PubMed

    Abbas, Ali

    2012-06-01

    Accurate diagnosis of lung disease depends on understanding the sounds emanating from the lungs and their locations. Lung sounds are significant because they supply precise and important information on the health of the respiratory system. In addition, correct interpretation of breath sounds depends on a systematic approach to auscultation; it also requires the ability to describe the location of an abnormal finding in relation to bony structures and anatomic landmark lines. The lungs consist of a number of lobes, and each lung lobe is further subdivided into smaller segments that are attached to each other. Knowledge of the positions of the lung segments is useful and important during auscultation and the diagnosis of lung diseases; medical doctors usually describe the location of an infection with reference to a segmental position. Breath sounds are auscultated over the anterior, lateral, and posterior chest wall surfaces, and adventitious sounds from different locations can be detected. It is common to seek confirmation of a detected sound and its location using invasive and potentially harmful imaging techniques such as X-rays. To overcome this limitation, and to provide fast, reliable, accurate, and inexpensive diagnosis, a technique for identifying the location of infection through a computerized auscultation system was developed in this research.
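
    The abstract does not specify the localization algorithm; one common ingredient for locating a sound source from multi-sensor auscultation is time-delay estimation by cross-correlation, sketched here under that assumption:

        import numpy as np

        def time_delay(x, y, fs):
            """Delay (seconds) between two chest-wall microphone channels,
            estimated from the peak of their cross-correlation."""
            c = np.correlate(x - x.mean(), y - y.mean(), mode="full")
            lag = int(np.argmax(c)) - (len(y) - 1)
            return lag / fs

    Delays between pairs of sensors placed at known anatomical landmarks constrain which lung segment an adventitious sound originates from.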

  15. Advancing IM-CW Lidar Modulation Techniques for ASCENDS CO2 Column Measurements from Space

    NASA Astrophysics Data System (ADS)

    Campbell, J. F.; Lin, B.; Nehrir, A. R.; Harrison, F. W.; Chen, S.; Obland, M. D.

    2013-12-01

    Global atmospheric carbon dioxide (CO2) measurements through the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) decadal survey recommended space mission are critical for improving our understanding of CO2 sources and sinks. IM-CW (Intensity Modulated Continuous Wave) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS science requirements. In previous laboratory and flight experiments we have successfully used linear swept frequency modulation to discriminate surface lidar returns from intermediate aerosol and cloud contamination. Furthermore, high accuracy and precision ranging to the surface as well as to the top of intermediate clouds, which is a requirement for the inversion of the CO2 column mixing ratio from the instrument optical depth measurements, has been demonstrated with the linear swept frequency modulation technique. We are concurrently investigating advanced techniques to help improve the auto-correlation properties of the transmitted waveform implemented through physical hardware to make cloud rejection more robust in special restricted scenarios. Several different modulation techniques are compared including orthogonal linear swept, orthogonal non-linear swept, time shifted PN, sine wave modulated PN, and sine wave pulsed PN. Different PN code techniques are presented that are appropriate for different types of lidar hardware, including our current ASCENDS IM-CW concept space hardware. These techniques have excellent auto-correlation properties without sidelobes while possessing a finite bandwidth (by way of a new cyclic digital filter), which will reduce bias error in the presence of multiple scatterers. Our analyses show that the studied modulation techniques can increase the accuracy of CO2 column measurements from space.

  16. Arthroscopically assisted Sauvé-Kapandji procedure: an advanced technique for distal radioulnar joint arthritis.

    PubMed

    Luchetti, Riccardo; Khanchandani, Prakash; Da Rin, Ferdinando; Borelli, Pierpaolo P; Mathoulin, Christophe; Atzei, Andrea

    2008-12-01

    Osteoarthritis of the distal radioulnar joint (DRUJ) leads to chronic wrist pain, weakness of grip strength, and limitation of motion, all of which affect the patient's quality of life. Over the years, several procedures have been used for the treatment of this condition; however, it remains a therapeutic challenge for hand surgeons. Many procedures, such as the Darrach procedure, the Bowers procedure, the Sauvé-Kapandji procedure, and ulnar head replacement, have been used. Despite many advances in wrist arthroscopy, arthroscopy has not been used for the treatment of arthritis of the DRUJ. We describe a novel technique of arthroscopically assisted Sauvé-Kapandji procedure for arthritis of the DRUJ. The advantages of this technique are its less invasive nature, preservation of the extensor retinaculum, a more anatomical position of the DRUJ, faster rehabilitation, and better cosmesis.

  17. The search for neuroimaging biomarkers of Alzheimer's disease with advanced MRI techniques.

    PubMed

    Li, Tie-Qiang; Wahlund, Lars-Olof

    2011-03-01

    The aim of this review is to examine the recent literature on using advanced magnetic resonance imaging (MRI) techniques for finding neuroimaging biomarkers that are sensitive to the detection of risks for Alzheimer's disease (AD). Since structural MRI techniques, such as brain structural volumetry and voxel-based morphometry (VBM), have been widely used for AD studies and extensively reviewed, we will only briefly touch on the topics of volumetry and morphometry. The focus of the current review is about the more recent developments in the search for AD neuroimaging biomarkers with functional MRI (fMRI), resting-state functional connectivity MRI (fcMRI), diffusion tensor imaging (DTI), arterial spin-labeling (ASL), and magnetic resonance spectroscopy (MRS).

  18. Benign Spine Lesions: Advances in Techniques for Minimally Invasive Percutaneous Treatment.

    PubMed

    Tomasian, A; Wallace, A N; Jennings, J W

    2017-02-09

    Minimally invasive percutaneous imaging-guided techniques have been shown to be safe and effective for the treatment of benign tumors of the spine. Techniques available include a variety of tumor ablation technologies, including radiofrequency ablation, cryoablation, microwave ablation, alcohol ablation, and laser photocoagulation. Vertebral augmentation may be performed after ablation as part of the same procedure for fracture stabilization or prevention. Typically, the treatment goal in benign spine lesions is definitive cure. Painful benign spine lesions commonly encountered in daily practice include osteoid osteoma, osteoblastoma, vertebral hemangioma, aneurysmal bone cyst, Paget disease, and subacute/chronic Schmorl node. This review discusses the most recent advancements in, and the use of, minimally invasive percutaneous therapeutic options for the management of benign spine lesions.

  19. Quality control of herbal medicines by using spectroscopic techniques and multivariate statistical analysis.

    PubMed

    Singh, Sunil Kumar; Jha, Sunil Kumar; Chaudhary, Anand; Yadava, R D S; Rai, S B

    2010-02-01

    Herbal medicines play an important role in modern human life and have significant effects in treating diseases; however, the quality and safety of these herbal products have become a serious issue due to increasing pollution of air, water, and soil. The present study proposes Fourier transform infrared (FTIR) spectroscopy, along with the statistical method of principal component analysis (PCA), to identify and discriminate herbal medicines for quality control. Herbal plants were characterized using FTIR spectroscopy, and characteristic peaks (strong and weak) were marked for each herbal sample in the fingerprint region (400-2000 cm(-1)). The ratio of the areas of any two marked characteristic peaks was found to be nearly constant for the same plant from different regions; thus, the present approach offers an additional discrimination method for herbal medicines. PCA clusters the herbal medicines into different groups, clearly showing that this method can adequately discriminate different herbal medicines using FTIR data. Toxic metal contents (Cd, Pb, Cr, and As) were determined, and the results were compared with the permissible daily intake limits for heavy metals proposed by the World Health Organization (WHO).
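
    The peak-area ratio check can be sketched directly; the band limits are hypothetical and would be chosen from the marked characteristic peaks of each plant:

        import numpy as np
        from scipy.integrate import trapezoid

        def peak_area_ratio(wavenumber, absorbance, band1, band2):
            """Ratio of integrated areas of two characteristic FTIR bands,
            each band given as (low, high) in cm^-1."""
            def area(band):
                m = (wavenumber >= band[0]) & (wavenumber <= band[1])
                return trapezoid(absorbance[m], wavenumber[m])
            return area(band1) / area(band2)

    A ratio that stays nearly constant for the same plant collected from different regions supports its use as a fingerprint, as reported above; PCA on the full spectra then separates the different medicines.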

  20. Source and distribution of metals in urban soil of Bombay, India, using multivariate statistical techniques

    NASA Astrophysics Data System (ADS)

    Ratha, D. S.; Sahu, B. K.

    1993-11-01

    Simplification of a complex system of geochemical variables obtained from the soils of an industrialized area of Bombay is attempted by means of R-mode factor analysis. Prior to factor analysis, discriminant analysis was carried out on rock and soil chemical data to establish the anthropogenic contribution of metals in the soil. Trace elements (Cd, Co, Cr, Cu, Fe, Mn, Ni, Pb, and Zn) are expressed in terms of three rotated factors. The factors mostly indicate anthropogenic sources of metals, such as atmospheric fallout, emissions from different industrial chimneys, crushing operations in quarries, and sewage sludges. Major elements (Na, Mg, Al, Si, P, K, Ca, Ti, Mn, and Fe) are also expressed in terms of three rotated factors, indicating natural processes such as chemical weathering, the presence of clay minerals, and contributions from sewage sludges and municipal refuse. Summary statistics (mean, standard deviation, skewness, and kurtosis) for the particle size distribution were interpreted as indicating a moderate dominance of fine particles. Mineralogical studies revealed the presence of montmorillonite, kaolinite, and illite types of clay minerals. Thus the present study provides information about the metal content entering the soil and its levels, sources, and distribution in the area.
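
    R-mode factor analysis with rotation can be sketched with scikit-learn (version 0.24 or later for the varimax option); the geochemical matrix here is a placeholder:

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import FactorAnalysis

        # X: (n_samples, n_elements) soil geochemistry matrix (placeholder).
        rng = np.random.default_rng(5)
        X = rng.lognormal(size=(60, 9))

        fa = FactorAnalysis(n_components=3, rotation="varimax")
        fa.fit(StandardScaler().fit_transform(X))
        loadings = fa.components_.T   # one row of loadings per element
        print(np.round(loadings, 2))

    Elements loading strongly on the same rotated factor are interpreted as sharing a source, which is how the anthropogenic and natural factors above were identified.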

  1. Quantitative evaluation of ASiR image quality: an adaptive statistical iterative reconstruction technique

    NASA Astrophysics Data System (ADS)

    Van de Casteele, Elke; Parizel, Paul; Sijbers, Jan

    2012-03-01

    Adaptive statistical iterative reconstruction (ASiR) is a new reconstruction algorithm used in the field of medical X-ray imaging. This reconstruction method combines the idealized system representation, as known from the standard filtered back projection (FBP) algorithm, with the strength of iterative reconstruction by including a noise model in the reconstruction scheme. The algorithm models how noise propagates through the reconstruction steps, feeds this model back into the loop, and iteratively reduces noise in the reconstructed image without affecting spatial resolution. In this paper, the effect of ASiR on the contrast-to-noise ratio is studied using the low-contrast module of the Catphan phantom. The experiments were performed on a GE LightSpeed VCT system at different voltages and currents. The results show reduced noise and increased contrast for the ASiR reconstructions compared to the standard FBP method. For the same contrast-to-noise ratio, the ASiR images can be obtained using 60% less current, leading to a dose reduction of the same proportion.
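
    The figure of merit used above is straightforward to compute; a minimal sketch, with the regions of interest assumed to be boolean masks drawn on the phantom image:

        import numpy as np

        def cnr(image, roi_signal, roi_background):
            """Contrast-to-noise ratio between two regions of interest."""
            s = image[roi_signal]
            b = image[roi_background]
            return abs(s.mean() - b.mean()) / b.std()

    Evaluating cnr() on ASiR and FBP reconstructions of the same low-contrast module at matched tube current quantifies the improvement reported above.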

  2. Recent Advances and New Techniques in Visualization of Ultra-short Relativistic Electron Bunches

    SciTech Connect

    Xiang, Dao; /SLAC

    2012-06-05

    Ultrashort electron bunches with an rms length of ~1 femtosecond (fs) can be used to generate ultrashort X-ray pulses in FELs that may open up many new regimes in ultrafast science. It is also envisioned that ultrashort electron bunches may excite ~TeV/m wake fields for plasma wakefield acceleration and high-field physics studies. The recent success of using a 20 pC electron beam to drive an X-ray FEL at LCLS has stimulated worldwide interest in using low-charge beams (1-20 pC) to generate ultrashort X-ray pulses (0.1-10 fs) in FELs. Accurate measurement of the length (preferably the temporal profile) of the ultrashort electron bunch is essential for understanding the physics associated with bunch compression and transportation; however, ever shorter electron bunches greatly challenge present beam diagnostic methods. In this paper we review recent advances in the measurement of ultrashort relativistic electron bunches, focusing on several techniques and their variants that provide state-of-the-art temporal resolution and are capable of breaking the femtosecond time barrier, along with methods to further improve their resolution. Techniques for measuring the beam longitudinal phase space as well as the X-ray pulse shape in an X-ray FEL are also discussed.

  3. Individual Particle Analysis of Ambient PM 2.5 Using Advanced Electron Microscopy Techniques

    SciTech Connect

    Gerald J. Keeler; Masako Morishita

    2006-12-31

    The overall goal of this project was to demonstrate a combination of advanced electron microscopy techniques that can be effectively used to identify and characterize individual particles and their sources. Specific techniques used include high-angle annular dark field scanning transmission electron microscopy (HAADF-STEM), STEM energy dispersive X-ray spectrometry (EDX), and energy-filtered TEM (EFTEM). A series of ambient PM2.5 samples were collected in communities in southwestern Detroit, MI (close to multiple combustion sources) and Steubenville, OH (close to several coal-fired utility boilers). High-resolution TEM (HRTEM) imaging showed a series of nano-metal particles, including transition metals, and the elemental composition of individual particles in detail. Submicron and nano-particles with Al, Fe, Ti, Ca, U, V, Cr, Si, Ba, Mn, Ni, K and S were observed and characterized in the samples. Among the identified nano-particles, combinations of Al, Fe, Si, Ca and Ti nano-particles embedded in carbonaceous particles were observed most frequently. These particles showed characteristics very similar to those of ultrafine coal fly ash particles reported previously. By utilizing HAADF-STEM, STEM-EDX, and EF-TEM, this investigation was able to gain information on the size, morphology, structure, and elemental composition of individual nano-particles collected in Detroit and Steubenville. The results showed that the contributions of local combustion sources, including coal-fired utilities, to ultrafine particle levels were significant. Although this combination of advanced electron microscopy techniques cannot by itself identify source categories, these techniques can be utilized as complementary analytical tools capable of providing detailed information on individual particles.

  4. STEAM - Statistical Template Estimation for Abnormality Mapping: A personalized DTI analysis technique with applications to the screening of preterm infants.

    PubMed

    Booth, Brian G; Miller, Steven P; Brown, Colin J; Poskitt, Kenneth J; Chau, Vann; Grunau, Ruth E; Synnes, Anne R; Hamarneh, Ghassan

    2016-01-15

    We introduce the STEAM DTI analysis engine: a whole brain voxel-based analysis technique for the examination of diffusion tensor images (DTIs). Our STEAM analysis technique consists of two parts. First, we introduce a collection of statistical templates that represent the distribution of DTIs for a normative population. These templates include various diffusion measures, from the full tensor, to fractional anisotropy, to 12 other tensor features. Second, we propose a voxel-based analysis (VBA) pipeline that is reliable enough to identify areas in individual DTI scans that differ significantly from the normative group represented in the STEAM statistical templates. We identify and justify choices in the VBA pipeline relating to multiple comparison correction, image smoothing, and dealing with non-normally distributed data. Finally, we provide a proof of concept for the utility of STEAM on a cohort of 134 very preterm infants. We generated templates from scans of 55 very preterm infants whose T1 MRI scans show no abnormalities and who have normal neurodevelopmental outcome. The remaining 79 infants were then compared to the templates using our VBA technique. We show: (a) that our statistical templates display the white matter development expected over the modeled time period, and (b) that our VBA results detect abnormalities in the diffusion measurements that relate significantly to both the presence of white matter lesions and neurodevelopmental outcomes at 18 months. Most notably, we show that STEAM produces personalized results while also being able to highlight abnormalities across the whole brain and at the scale of individual voxels. While we show the value of STEAM on DTI scans from a preterm infant cohort, STEAM can be equally applied to other cohorts as well. To facilitate this whole-brain personalized DTI analysis, we made STEAM publicly available at http://www.sfu.ca/bgb2/steam.
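
    The core voxel-based comparison can be sketched as a z-score of an individual scan against the normative templates, followed by a multiple-comparison correction; Benjamini-Hochberg FDR is used here as one plausible choice, with placeholder volumes:

        import numpy as np
        from scipy import stats
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(6)
        template_mean = rng.random((40, 40, 25))          # normative mean map
        template_std = 0.1 + 0.05 * rng.random((40, 40, 25))
        scan = template_mean + 0.1 * rng.standard_normal((40, 40, 25))

        z = (scan - template_mean) / template_std         # per-voxel z-scores
        p = 2.0 * stats.norm.sf(np.abs(z)).ravel()
        reject, p_adj = multipletests(p, alpha=0.05, method="fdr_bh")[:2]
        abnormal = reject.reshape(z.shape)                # flagged voxels

    STEAM's published pipeline makes its own justified choices about smoothing, non-normality, and correction; this sketch only shows the template-comparison idea.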

  5. A data-derived forecast model of surface circulation based on statistical forcing-response decomposition techniques

    NASA Astrophysics Data System (ADS)

    Kim, Sung Yong

    2016-04-01

    This paper presents a data-derived surface current forecast model based on statistical decomposition techniques [Kim et al., 2010] applied to observations of high-frequency radar-derived surface currents, local winds, and sea surface height anomalies (SSHA) off southern San Diego. The regional surface circulation mainly consists of tide-, wind-, and low-frequency pressure-gradient-coherent components, which leads us to use tidal harmonic analysis, response functions based on wind stress and pressure gradients, and autoregressive analysis of the residual components in the forecast model. These basis functions are added consecutively, and the performance of the corresponding forecast models is evaluated.
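
    A hedged sketch of two of the named components, a least-squares tidal harmonic fit followed by an AR(1) forecast of the residual, using standard constituent periods (M2, S2, O1) and synthetic hourly currents:

        import numpy as np

        def harmonic_design(t, periods_h=(12.42, 12.00, 25.82)):
            """Design matrix for a mean plus cosine/sine pair per constituent."""
            cols = [np.ones_like(t)]
            for P in periods_h:
                w = 2.0 * np.pi / P
                cols += [np.cos(w * t), np.sin(w * t)]
            return np.column_stack(cols)

        t = np.arange(0.0, 24 * 30, 1.0)   # 30 days of hourly samples
        rng = np.random.default_rng(7)
        u = 0.2 * np.cos(2 * np.pi * t / 12.42) + 0.05 * rng.standard_normal(t.size)

        A = harmonic_design(t)
        coef, *_ = np.linalg.lstsq(A, u, rcond=None)
        resid = u - A @ coef
        phi = resid[1:] @ resid[:-1] / (resid[:-1] @ resid[:-1])  # AR(1) coefficient
        t_next = np.array([t[-1] + 1.0])
        u_next = harmonic_design(t_next) @ coef + phi * resid[-1]  # one-step forecast
        print(u_next)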

  6. Quantifying heterogeneous responses of fish community size structure using novel combined statistical techniques.

    PubMed

    Marshall, Abigail M; Bigg, Grant R; van Leeuwen, Sonja M; Pinnegar, John K; Wei, Hua-Liang; Webb, Thomas J; Blanchard, Julia L

    2016-05-01

    To understand changes in ecosystems, the appropriate scale at which to study them must be determined. Large marine ecosystems (LMEs) cover thousands of square kilometres and are a useful classification scheme for ecosystem monitoring and assessment. However, averaging across LMEs may obscure intricate dynamics within. The purpose of this study is to mathematically determine local and regional patterns of ecological change within an LME using empirical orthogonal functions (EOFs). After using EOFs to define regions with distinct patterns of change, a statistical model originating from control theory is applied (Nonlinear AutoRegressive Moving Average with eXogenous input - NARMAX) to assess potential drivers of change within these regions. We have selected spatial data sets (0.5° latitude × 1° longitude) of fish abundance from North Sea fisheries research surveys (spanning 1980-2008), as well as of temperature, oxygen, net primary production and a fishing pressure proxy, to which we apply the EOF and NARMAX methods. Two regions showed significant changes since 1980: the central North Sea displayed a decrease in community size structure, which the NARMAX model suggested was linked to changes in fishing; and the Norwegian trench region displayed an increase in community size structure which, as indicated by NARMAX results, was primarily linked to changes in sea-bottom temperature. These regions were compared to an area of no change along the eastern Scottish coast, where the model determined the community size structure was most strongly associated with net primary production. This study highlights the multifaceted effects of environmental change and fishing pressures in different regions of the North Sea. Furthermore, this spatial heterogeneity in community size structure change shows that important local spatial dynamics are often overlooked when the North Sea is considered as a broad-scale, homogeneous ecosystem (as normally is the case within the political

  7. Where in the Cell Are You? Probing HIV-1 Host Interactions through Advanced Imaging Techniques

    PubMed Central

    Dirk, Brennan S.; Van Nynatten, Logan R.; Dikeakos, Jimmy D.

    2016-01-01

    Viruses must continuously evolve to hijack the host cell machinery in order to successfully replicate and orchestrate key interactions that support their persistence. The type-1 human immunodeficiency virus (HIV-1) is a prime example of viral persistence within the host, having plagued the human population for decades. In recent years, advances in cellular imaging and molecular biology have aided the elucidation of key steps mediating the HIV-1 lifecycle and viral pathogenesis. Super-resolution imaging techniques such as stimulated emission depletion (STED) and photoactivation and localization microscopy (PALM) have been instrumental in studying viral assembly and release through both cell–cell transmission and cell–free viral transmission. Moreover, powerful methods such as Forster resonance energy transfer (FRET) and bimolecular fluorescence complementation (BiFC) have shed light on the protein-protein interactions HIV-1 engages within the host to hijack the cellular machinery. Specific advancements in live cell imaging in combination with the use of multicolor viral particles have become indispensable to unravelling the dynamic nature of these virus-host interactions. In the current review, we outline novel imaging methods that have been used to study the HIV-1 lifecycle and highlight advancements in the cell culture models developed to enhance our understanding of the HIV-1 lifecycle. PMID:27775563

  8. Where in the Cell Are You? Probing HIV-1 Host Interactions through Advanced Imaging Techniques.

    PubMed

    Dirk, Brennan S; Van Nynatten, Logan R; Dikeakos, Jimmy D

    2016-10-19

    Viruses must continuously evolve to hijack the host cell machinery in order to successfully replicate and orchestrate key interactions that support their persistence. The type-1 human immunodeficiency virus (HIV-1) is a prime example of viral persistence within the host, having plagued the human population for decades. In recent years, advances in cellular imaging and molecular biology have aided the elucidation of key steps mediating the HIV-1 lifecycle and viral pathogenesis. Super-resolution imaging techniques such as stimulated emission depletion (STED) and photoactivation and localization microscopy (PALM) have been instrumental in studying viral assembly and release through both cell-cell transmission and cell-free viral transmission. Moreover, powerful methods such as Forster resonance energy transfer (FRET) and bimolecular fluorescence complementation (BiFC) have shed light on the protein-protein interactions HIV-1 engages within the host to hijack the cellular machinery. Specific advancements in live cell imaging in combination with the use of multicolor viral particles have become indispensable to unravelling the dynamic nature of these virus-host interactions. In the current review, we outline novel imaging methods that have been used to study the HIV-1 lifecycle and highlight advancements in the cell culture models developed to enhance our understanding of the HIV-1 lifecycle.

  9. Applications of Advanced, Waveform Based AE Techniques for Testing Composite Materials

    NASA Technical Reports Server (NTRS)

    Prosser, William H.

    1996-01-01

    Advanced, waveform based acoustic emission (AE) techniques have been previously used to evaluate damage progression in laboratory tests of composite coupons. In these tests, broad band, high fidelity acoustic sensors were used to detect signals which were then digitized and stored for analysis. Analysis techniques were based on plate mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold crossing arrival time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite structures, the effects of wave propagation over larger distances and through structural complexities must be well characterized and understood. In this research, measurements were made of the attenuation of the extensional and flexural plate mode components of broad band simulated AE signals in large composite panels. As these materials have applications in a cryogenic environment, the effects of cryogenic insulation on the attenuation of plate mode AE signals were also documented.

  10. Biotechnology apprenticeship for secondary-level students: teaching advanced cell culture techniques for research.

    PubMed

    Lewis, Jennifer R; Kotur, Mark S; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A; Ferrell, Nick; Sullivan, Kathryn D; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss small-group apprenticeships (SGAs) as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments using both flow cytometry and laser scanning cytometry during the 1-month summer apprenticeship. In addition to effectively and efficiently teaching cell biology laboratory techniques, this course design provided an opportunity for research training, career exploration, and mentoring. Students participated in active research projects, working with a skilled interdisciplinary team of researchers in a large research institution with access to state-of-the-art instrumentation. The instructors, composed of graduate students, laboratory managers, and principal investigators, worked well together to present a real and worthwhile research experience. The students enjoyed learning cell culture techniques while contributing to active research projects. The institution's researchers were equally enthusiastic to instruct and serve as mentors. In this article, we clarify and illuminate the value of small-group laboratory apprenticeships to the institution and the students by presenting the results and experiences of seven middle and high school participants and their instructors.

  11. Advanced techniques for array processing. Final report, 1 Mar 89-30 Apr 91

    SciTech Connect

    Friedlander, B.

    1991-05-30

    Array processing technology is expected to be a key element in communication systems designed for the crowded and hostile environment of the future battlefield. While advanced array processing techniques have been under development for some time, their practical use has been very limited. This project addressed some of the issues which need to be resolved for a successful transition of these promising techniques from theory into practice. The main problem which was studied was that of finding the directions of multiple co-channel transmitters from measurements collected by an antenna array. Two key issues related to high-resolution direction finding were addressed: effects of system calibration errors, and effects of correlation between the received signals due to multipath propagation. A number of useful theoretical performance analysis results were derived, and computationally efficient direction estimation algorithms were developed. These results include: self-calibration techniques for antenna arrays, sensitivity analysis for high-resolution direction finding, extensions of the root-MUSIC algorithm to arbitrary arrays and to arrays with polarization diversity, and new techniques for direction finding in the presence of multipath based on array interpolation.

  12. Advanced condition monitoring techniques and plant life extension studies at EBR-2

    SciTech Connect

    Singer, R.M.; Gross, K.C. ); Perry, W.H.; King, R.W. )

    1991-01-01

    Numerous advanced techniques have been evaluated and tested at EBR-2 as part of a plant-life extension program for detection of degradation and other abnormalities in plant systems. Two techniques have been determined to be of considerable assistance in planning for the extended-life operation of EBR-2. The first, a computer-based pattern-recognition system (System State Analyzer or SSA) is used for surveillance of the primary system instrumentation, primary sodium pumps and plant heat balances. This surveillance has indicated that the SSA can detect instrumentation degradation and system performance degradation over varying time intervals and can be used to provide derived signal values to replace signals from failed sensors. The second technique, also a computer-based pattern-recognition system (Sequential Probability Ratio Test or SPRT) is used to validate signals and to detect incipient failures in sensors and components or systems. It is being used on the failed fuel detection system and is experimentally used on the primary coolant pumps. Both techniques are described and experience with their operation presented.

  13. Advancement of an Infra-Red Technique for Whole-Field Concentration Measurements in Fluidized Beds

    PubMed Central

    Medrano, Jose A.; de Nooijer, Niek C. A.; Gallucci, Fausto; van Sint Annaland, Martin

    2016-01-01

    For a better understanding and description of the mass transport phenomena in dense multiphase gas-solids systems such as fluidized bed reactors, detailed and quantitative experimental data on the concentration profiles is required, which demands advanced non-invasive concentration monitoring techniques with a high spatial and temporal resolution. A novel technique based on the selective detection of a gas component in a gas mixture using infra-red properties has been further developed. The first stage development was carried out using a very small sapphire reactor and CO2 as tracer gas. Although the measuring principle was demonstrated, the real application was hindered by the small reactor dimensions related to the high costs and difficult handling of large sapphire plates. In this study, a new system has been developed, that allows working at much larger scales and yet with higher resolution. In the new system, propane is used as tracer gas and quartz as reactor material. In this study, a thorough optimization and calibration of the technique is presented which is subsequently applied for whole-field measurements with high temporal resolution. The developed technique allows the use of a relatively inexpensive configuration for the measurement of detailed concentration fields and can be applied to a large variety of important chemical engineering topics. PMID:26927127

  14. Biotechnology Apprenticeship for Secondary-Level Students: Teaching Advanced Cell Culture Techniques for Research

    PubMed Central

    Lewis, Jennifer R.; Kotur, Mark S.; Butt, Omar; Kulcarni, Sumant; Riley, Alyssa A.; Ferrell, Nick; Sullivan, Kathryn D.; Ferrari, Mauro

    2002-01-01

    The purpose of this article is to discuss small-group apprenticeships (SGAs) as a method to instruct cell culture techniques to high school participants. The study aimed to teach cell culture practices and to introduce advanced imaging techniques to solve various biomedical engineering problems. Participants designed and completed experiments using both flow cytometry and laser scanning cytometry during the 1-month summer apprenticeship. In addition to effectively and efficiently teaching cell biology laboratory techniques, this course design provided an opportunity for research training, career exploration, and mentoring. Students participated in active research projects, working with a skilled interdisciplinary team of researchers in a large research institution with access to state-of-the-art instrumentation. The instructors, composed of graduate students, laboratory managers, and principal investigators, worked well together to present a real and worthwhile research experience. The students enjoyed learning cell culture techniques while contributing to active research projects. The institution's researchers were equally enthusiastic to instruct and serve as mentors. In this article, we clarify and illuminate the value of small-group laboratory apprenticeships to the institution and the students by presenting the results and experiences of seven middle and high school participants and their instructors. PMID:12587031

  15. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    SciTech Connect

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  16. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    PubMed Central

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632

  17. Use of Advanced Magnetic Resonance Imaging Techniques in Neuromyelitis Optica Spectrum Disorder.

    PubMed

    Kremer, Stephane; Renard, Felix; Achard, Sophie; Lana-Peixoto, Marco A; Palace, Jacqueline; Asgari, Nasrin; Klawiter, Eric C; Tenembaum, Silvia N; Banwell, Brenda; Greenberg, Benjamin M; Bennett, Jeffrey L; Levy, Michael; Villoslada, Pablo; Saiz, Albert; Fujihara, Kazuo; Chan, Koon Ho; Schippling, Sven; Paul, Friedemann; Kim, Ho Jin; de Seze, Jerome; Wuerfel, Jens T; Cabre, Philippe; Marignier, Romain; Tedder, Thomas; van Pelt, Danielle; Broadley, Simon; Chitnis, Tanuja; Wingerchuk, Dean; Pandit, Lekha; Leite, Maria Isabel; Apiwattanakul, Metha; Kleiter, Ingo; Prayoonwiwat, Naraporn; Han, May; Hellwig, Kerstin; van Herle, Katja; John, Gareth; Hooper, D Craig; Nakashima, Ichiro; Sato, Douglas; Yeaman, Michael R; Waubant, Emmanuelle; Zamvil, Scott; Stüve, Olaf; Aktas, Orhan; Smith, Terry J; Jacob, Anu; O'Connor, Kevin

    2015-07-01

    Brain parenchymal lesions are frequently observed on conventional magnetic resonance imaging (MRI) scans of patients with neuromyelitis optica (NMO) spectrum disorder, but the specific morphological and temporal patterns distinguishing them unequivocally from lesions caused by other disorders have not been identified. This review summarizes the literature on advanced quantitative imaging measures reported for patients with NMO spectrum disorder, including proton MR spectroscopy, diffusion tensor imaging, magnetization transfer imaging, quantitative MR volumetry, and ultrahigh-field-strength MRI. It was undertaken to consider the advanced MRI techniques used for patients with NMO by different specialists in the field. Although quantitative measures such as proton MR spectroscopy or magnetization transfer imaging have not reproducibly revealed diffuse brain injury, preliminary data from diffusion-weighted imaging and brain tissue volumetry indicate greater white matter than gray matter degradation. These findings could be confirmed by ultrahigh-field MRI. The use of nonconventional MRI techniques may further our understanding of the pathogenic processes in NMO spectrum disorders and may help us identify the distinct radiographic features corresponding to specific phenotypic manifestations of this disease.

  18. Advanced grazing-incidence techniques for modern soft-matter materials analysis.

    PubMed

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  19. Advanced MRI Techniques in the Evaluation of Complex Cystic Breast Lesions

    PubMed Central

    Popli, Manju Bala; Gupta, Pranav; Arse, Devraj; Kumar, Pawan; Kaur, Prabhjot

    2016-01-01

    OBJECTIVE The purpose of this research work was to evaluate complex cystic breast lesions by advanced MRI techniques and to correlate the imaging findings with histologic findings. METHODS AND MATERIALS In a cross-sectional design from September 2013 to August 2015, 50 patients with sonographically detected complex cystic lesions of the breast were included in the study. Morphological characteristics were assessed. Dynamic contrast-enhanced MRI, along with diffusion-weighted imaging and MR spectroscopy, was used to further classify lesions into benign and malignant categories. All findings were correlated with histopathology. RESULTS Of the 50 complex cystic lesions, 32 proved to be benign and 18 were malignant on histopathology. The MRI features of heterogeneous enhancement on CE-MRI (13/18), a Type III kinetic curve (13/18), a reduced apparent diffusion coefficient (18/18), and a tall choline peak (17/18) were strong predictors of malignancy. Thirteen of the 18 lesions showed a combination of a Type III curve, a reduced apparent diffusion coefficient value, and a tall choline peak. CONCLUSIONS Advanced MRI techniques such as dynamic imaging, diffusion-weighted sequences, and MR spectroscopy provide a high level of diagnostic confidence in the characterization of complex cystic breast lesions, thus allowing early diagnosis and significantly reducing patient morbidity and mortality. In our study, lesions showing heterogeneous contrast enhancement, a Type III kinetic curve, diffusion restriction, and a tall choline peak were significantly associated with malignant complex cystic lesions of the breast. PMID:27330299

  20. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    DOE PAGES

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  1. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for ASCENDS O2 Column Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.; Harrison, F. Wallace; Obland, Michael D.; Meadows, Byron

    2015-01-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface, as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques that push the resolution of the lidar beyond the limit implied by the bandwidth of the modulation, an approach shown to be useful for making tree canopy measurements.
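
    As a hedged, self-contained illustration of the ranging idea described above (not the authors' flight processing), the following Python sketch cross-correlates a pseudorandom BPSK chip sequence with a simulated noisy surface return; the code length, delay, and noise level are invented.

```python
# Hedged sketch (not the authors' flight code): matched-filter ranging
# with a pseudorandom BPSK chip sequence; code length, delay, and noise
# level are all invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

n_chips = 1024
code = rng.choice([-1.0, 1.0], size=n_chips)     # +/-1 BPSK chips

# Simulated return: the code delayed by 200 chips (a "surface echo")
# buried in detector noise.
delay = 200
echo = np.roll(code, delay) + 0.5 * rng.standard_normal(n_chips)

# Circular cross-correlation via FFT acts as the matched filter.
corr = np.fft.ifft(np.fft.fft(echo) * np.conj(np.fft.fft(code))).real

print("estimated delay (chips):", int(np.argmax(corr)))   # ~200
print("rough peak-to-sidelobe ratio:", corr.max() / np.sort(corr)[-2])
```

    The sharp, low-sidelobe correlation peak at the true delay is the property the abstract exploits to separate surface returns from intermediate cloud returns.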

  2. Advanced Intensity-Modulation Continuous-Wave Lidar Techniques for Column CO2 Measurements

    NASA Astrophysics Data System (ADS)

    Campbell, J. F.; Lin, B.; Nehrir, A. R.; Obland, M. D.; Liu, Z.; Browell, E. V.; Chen, S.; Kooi, S. A.; Fan, T. F.

    2015-12-01

    Global and regional atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission and Atmospheric Carbon and Transport (ACT) - America airborne investigation are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are being investigated as a means of facilitating CO2 measurements from space and airborne platforms to meet the mission science measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud returns. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of intervening optically thin clouds, thereby minimizing bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the Earth's surface, as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques and provides very high (sub-meter) range resolution. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These techniques are used in a new data processing architecture to support the ASCENDS CarbonHawk Experiment Simulator (ACES) and ACT-America programs.

  3. Advanced intensity-modulation continuous-wave lidar techniques for ASCENDS CO2 column measurements

    NASA Astrophysics Data System (ADS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.; Harrison, F. W.; Obland, Michael D.; Meadows, Byron

    2015-10-01

    Global atmospheric carbon dioxide (CO2) measurements for the NASA Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) space mission are critical for improving our understanding of global CO2 sources and sinks. Advanced Intensity-Modulated Continuous-Wave (IM-CW) lidar techniques are investigated as a means of facilitating CO2 measurements from space to meet the ASCENDS measurement requirements. In recent numerical, laboratory and flight experiments we have successfully used the Binary Phase Shift Keying (BPSK) modulation technique to uniquely discriminate surface lidar returns from intermediate aerosol and cloud contamination. We demonstrate the utility of BPSK to eliminate sidelobes in the range profile as a means of making Integrated Path Differential Absorption (IPDA) column CO2 measurements in the presence of optically thin clouds, thereby eliminating the need to correct for sidelobe bias errors caused by the clouds. Furthermore, high accuracy and precision ranging to the surface, as well as to the top of intermediate cloud layers, which is a requirement for the inversion of column CO2 number density measurements to column CO2 mixing ratios, has been demonstrated using new hyperfine interpolation techniques that take advantage of the periodicity of the modulation waveforms. This approach works well for both BPSK and linear swept-frequency modulation techniques. The BPSK technique under investigation has excellent auto-correlation properties while possessing a finite bandwidth. A comparison of BPSK and linear swept-frequency is also discussed in this paper. These results are extended to include Richardson-Lucy deconvolution techniques that push the resolution of the lidar beyond the limit implied by the bandwidth of the modulation, an approach shown to be useful for making tree canopy measurements.

  4. System Design Techniques for Reducing the Power Requirements of Advanced life Support Systems

    NASA Technical Reports Server (NTRS)

    Finn, Cory; Levri, Julie; Pawlowski, Chris; Crawford, Sekou; Luna, Bernadette (Technical Monitor)

    2000-01-01

    The high power requirement associated with overall operation of regenerative life support systems is a critical technological challenge. Optimization of individual processors alone will not be sufficient to produce an optimized system. System studies must be used in order to improve the overall efficiency of life support systems. Current research efforts at NASA Ames Research Center are aimed at developing approaches for reducing system power and energy usage in advanced life support systems. System energy integration and energy reuse techniques are being applied to advanced life support, in addition to advanced control methods for efficient distribution of power and thermal resources. An overview of current results of this work will be presented. The development of integrated system designs that reuse waste heat from sources such as crop lighting and solid waste processing systems will reduce overall power and cooling requirements. Using an energy integration technique known as Pinch analysis, system heat exchange designs are being developed that match hot and cold streams according to specific design principles. For various designs, the potential savings for power, heating and cooling are being identified and quantified. The use of state-of-the-art control methods for distribution of resources, such as system cooling water or electrical power, will also reduce overall power and cooling requirements. Control algorithms are being developed which dynamically adjust the use of system resources by the various subsystems and components in order to achieve an overall goal, such as smoothing of power usage and/or heat rejection profiles, while maintaining adequate reserves of food, water, oxygen, and other consumables, and preventing excessive build-up of waste materials. Reductions in the peak loading of the power and thermal systems will lead to lower overall requirements. Computer simulation models are being used to test various control system designs.
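
    The heat-exchange matching mentioned above follows the logic of Pinch analysis. As a rough sketch of that logic (not the NASA Ames models), the following problem-table cascade computes minimum heating and cooling utility targets for a set of invented hot and cold streams under an assumed minimum approach temperature.

```python
# Minimal problem-table (cascade) sketch of Pinch analysis; the stream
# data and dt_min below are hypothetical, not mission values.
dt_min = 10.0  # minimum approach temperature, K (assumed)

# (supply T, target T, heat capacity flow rate CP in kW/K)
hot_streams = [(180.0, 60.0, 2.0), (150.0, 30.0, 4.0)]
cold_streams = [(20.0, 135.0, 3.0), (80.0, 140.0, 5.0)]

# Shift hot streams down and cold streams up by dt_min / 2.
shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp, "hot")
           for ts, tt, cp in hot_streams]
shifted += [(ts + dt_min / 2, tt + dt_min / 2, cp, "cold")
            for ts, tt, cp in cold_streams]

# Temperature interval boundaries, highest first.
bounds = sorted({t for s in shifted for t in s[:2]}, reverse=True)

# Net heat surplus/deficit per interval, cascaded from the top.
cascade, heat = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = 0.0
    for ts, tt, cp, kind in shifted:
        top, bot = max(ts, tt), min(ts, tt)
        if top >= hi and bot <= lo:          # stream spans this interval
            net_cp += cp if kind == "hot" else -cp
    heat += net_cp * (hi - lo)
    cascade.append(heat)

q_hot_min = -min(cascade)                    # minimum hot utility, kW
q_cold_min = cascade[-1] + q_hot_min         # minimum cold utility, kW
print(f"Q_hot,min = {q_hot_min:.1f} kW, Q_cold,min = {q_cold_min:.1f} kW")
```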

  5. Use of advanced neuroimaging techniques in the evaluation of pediatric traumatic brain injury.

    PubMed

    Ashwal, Stephen; Holshouser, Barbara A; Tong, Karen A

    2006-01-01

    Advanced neuroimaging techniques are now used to expand our knowledge of traumatic brain injury, and increasingly, they are being applied to children. This review will examine four of these methods as they apply to children who present acutely after injury. (1) Susceptibility weighted imaging is a 3-dimensional high-resolution magnetic resonance imaging technique that is more sensitive than conventional imaging in detecting hemorrhagic lesions that are often associated with diffuse axonal injury. (2) Magnetic resonance spectroscopy acquires metabolite information reflecting neuronal integrity and function from multiple brain regions and provides sensitive, noninvasive assessment of neurochemical alterations that offers early prognostic information regarding the outcome. (3) Diffusion weighted imaging is based on differences in diffusion of water molecules within the brain and has been shown to be very sensitive in the early detection of ischemic injury. It is now being used to study the direct effects of traumatic injury as well as those due to secondary ischemia. (4) Diffusion tensor imaging is a form of diffusion weighted imaging and allows better evaluation of white matter fiber tracts by taking advantage of the intrinsic directionality (anisotropy) of water diffusion in human brain. It has been shown to be useful in identifying white matter abnormalities after diffuse axonal injury when conventional imaging appears normal. An important aspect of these advanced methods is that they demonstrate that 'normal-appearing' brain in many instances is not normal, i.e. there is evidence of significant undetected injury that may underlie a child's clinical status. Availability and integration of these advanced imaging methods will lead to better treatment and change the standard of care for use of neuroimaging to evaluate children with traumatic brain injury.

  6. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services.
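
    As a minimal sketch of the SPC idea applied here (the wait-time values below are invented, not the audited data), an individuals/moving-range (X-mR) chart flags cases whose turnover times fall outside statistically derived control limits.

```python
# Hedged sketch: an individuals (X-mR) control chart for surgeon wait
# times between cases; the minute values below are made up.
import numpy as np

wait_min = np.array([41, 62, 55, 38, 90, 47, 53, 71, 44, 58,
                     49, 66, 35, 80, 52], dtype=float)

mean = wait_min.mean()
# Average moving range between consecutive cases.
mr_bar = np.abs(np.diff(wait_min)).mean()

# Standard X-mR limits: mean +/- 2.66 * average moving range
# (2.66 = 3 / d2 with d2 = 1.128 for subgroups of size 2).
ucl = mean + 2.66 * mr_bar
lcl = max(mean - 2.66 * mr_bar, 0.0)

print(f"center = {mean:.1f} min, UCL = {ucl:.1f}, LCL = {lcl:.1f}")
print("out-of-control cases:", np.where((wait_min > ucl) | (wait_min < lcl))[0])
```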

  7. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014); this year's motto was "bridging disciplines". The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research, Data Analysis - Algorithms and Tools, and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciations go to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret who worked with the local contacts and made this conference possible as well as to the program

  8. A statistical model-based technique for accounting for prostate gland deformation in endorectal coil-based MR imaging.

    PubMed

    Tahmasebi, Amir M; Sharifi, Reza; Agarwal, Harsh K; Turkbey, Baris; Bernardo, Marcelino; Choyke, Peter; Pinto, Peter; Wood, Bradford; Kruecker, Jochen

    2012-01-01

    In prostate brachytherapy procedures, combining high-resolution endorectal coil (ERC) MRI with computed tomography (CT) images has been shown to improve the diagnostic specificity for malignant tumors. Despite this advantage, a major complication in fusing the two imaging modalities is the deformation of the prostate shape in ERC-MRI. Conventionally, nonlinear deformable registration techniques have been utilized to account for such deformation. In this work, we present a model-based technique for accounting for the deformation of the prostate gland in ERC-MR imaging, in which a unique deformation vector is estimated for every point within the prostate gland. Modes of deformation for every point in the prostate are statistically identified using an MR-based training set (with and without ERC-MRI). Deformation of the prostate from a deformed (ERC-MRI) to a non-deformed state in a different modality (CT) is then realized by first calculating partial deformation information for a limited number of points (such as surface points or anatomical landmarks) and then utilizing the calculated deformation from a subset of the points to determine the coefficient values for the modes of deformation provided by the statistical deformation model. Using leave-one-out cross-validation, our results demonstrated a mean estimation error of 1 mm for MR-to-MR registration.
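
    A hedged sketch of the general statistical-deformation-model idea, not the authors' implementation: principal component analysis extracts modes of deformation from a training set, and the mode coefficients are then fit by least squares from deformation known only at a subset of points (e.g., surface landmarks). All dimensions and values below are invented.

```python
# Illustrative sketch, not the paper's code: fit PCA modes of deformation
# to partial correspondences; training data are random stand-ins.
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical training set: 20 subjects, 100 points, 3D deformation
# vectors flattened to one row per subject (point i at columns 3i..3i+2).
train = rng.standard_normal((20, 100 * 3))

mean_def = train.mean(axis=0)
# Modes of deformation = principal components of the training set.
_, _, vt = np.linalg.svd(train - mean_def, full_matrices=False)
modes = vt[:5]                        # keep 5 dominant modes, shape (5, 300)

# Suppose deformation is known only at 30 of the 100 points.
known_pts = rng.choice(100, size=30, replace=False)
idx = np.concatenate([3 * known_pts + k for k in range(3)])
observed = (mean_def + 0.8 * modes[0] + 0.3 * modes[2])[idx]

# Least-squares fit of the mode coefficients from the partial data,
# then reconstruction of the deformation at every point.
coeffs, *_ = np.linalg.lstsq(modes[:, idx].T, observed - mean_def[idx],
                             rcond=None)
full_deformation = mean_def + coeffs @ modes
print("recovered coefficients:", np.round(coeffs, 2))   # ~[0.8, 0, 0.3, 0, 0]
```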

  9. Alternative calibration techniques for counteracting the matrix effects in GC-MS-SPE pesticide residue analysis - a statistical approach.

    PubMed

    Rimayi, Cornelius; Odusanya, David; Mtunzi, Fanyana; Tsoka, Shepherd

    2015-01-01

    This paper investigates the efficiency of four different multivariate calibration techniques, namely matrix-matched internal standard (MMIS), matrix-matched external standard (MMES), solvent-only internal standard (SOIS) and solvent-only external standard (SOES), on the detection and quantification of 20 organochlorine compounds from high-, low- and blank-matrix water samples by Gas Chromatography-Mass Spectrometry (GC-MS) coupled to solid phase extraction (SPE). Further statistical testing using the Statistical Package for the Social Sciences (SPSS), applying MANOVA, t-tests and Levene's F tests, indicates that matrix composition has a more significant effect on the efficiency of the analytical method than the calibration method of choice. Matrix effects are widely described as one of the major sources of error in GC-MS multiresidue analysis. Descriptive and inferential statistics proved that matrix-matched internal standard calibration was the best approach to use for samples of varying matrix composition, as it produced the most precise mean recovery of 87% across all matrices tested. The use of an internal standard calibration overall produced more precise total recoveries than external standard calibration, with mean values of 77% and 64% respectively. The internal standard calibration technique produced a particularly high overall standard deviation of 38% at the 95% confidence level, indicating that it is less robust than the external standard calibration method, which had an overall standard error of 32% at the 95% confidence level. Overall, the matrix-matched external standard calibration proved to be the best calibration approach for analysis of low-matrix samples, which consisted of the real sample matrix, as it had the most precise recovery of 98% compared to other calibration approaches for the low-matrix samples.
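
    As a minimal illustration of how internal standard calibration works in principle (all peak areas and concentrations below are invented, and this is not the paper's SPSS workflow), the analyte response is normalized by the internal-standard response before the calibration fit, which is what makes the method less sensitive to matrix-induced signal variation.

```python
# Hedged sketch of internal standard calibration: calibrate on the
# analyte/IS response ratio; all numbers are invented.
import numpy as np

# Spiked standards: analyte concentration (ng/mL) vs. analyte and
# internal-standard peak areas from GC-MS.
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
area_analyte = np.array([980., 2010., 4890., 10100., 19800.])
area_is = np.array([5000., 5100., 4950., 5050., 4980.])  # constant IS spike

# Linear fit of the response ratio against concentration.
ratio = area_analyte / area_is
slope, intercept = np.polyfit(conc, ratio, 1)

# Quantify an unknown sample from its measured ratio.
sample_ratio = 7850. / 5020.
sample_conc = (sample_ratio - intercept) / slope
recovery = 100.0 * sample_conc / 40.0     # vs. an assumed 40 ng/mL spike
print(f"estimated conc = {sample_conc:.1f} ng/mL, recovery = {recovery:.0f}%")
```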

  10. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    NASA Astrophysics Data System (ADS)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Multivariate statistical techniques, which can combine different sources of flooding in the investigation, have the potential to overcome this limitation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed sample. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modeling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as would be estimated assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
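
    A hedged sketch of the peaks-over-threshold step described above, using a synthetic record in place of the Elwood data: exceedances over a high threshold are fit with a Generalized Pareto distribution and a return level is read off. The threshold choice and return period are illustrative.

```python
# Hedged sketch: partial duration series (peaks over threshold) with a
# Generalized Pareto fit on synthetic "hourly rainfall" data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
rain = rng.gamma(shape=0.4, scale=3.0, size=20000)   # synthetic record

# Exceedances above a high threshold (99th percentile here).
u = np.quantile(rain, 0.99)
excess = rain[rain > u] - u

# Fit the GPD to the excesses with the location fixed at zero.
shape, loc, scale = stats.genpareto.fit(excess, floc=0)

# Return level: the value exceeded once per T observations on average.
events_per_obs = excess.size / rain.size
T = 10000
return_level = u + stats.genpareto.ppf(1 - 1 / (T * events_per_obs),
                                       shape, loc=0, scale=scale)
print(f"threshold u = {u:.2f}, GPD shape = {shape:.3f}, "
      f"{T}-obs return level = {return_level:.2f}")
```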

  11. Multi-Site and Multi-Variables Statistical Downscaling Technique in the Monsoon Dominated Region of Pakistan

    NASA Astrophysics Data System (ADS)

    Khan, Firdos; Pilz, Jürgen

    2016-04-01

    South Asia is under the severe impacts of changing climate and global warming. The last two decades showed that climate change and global warming are happening, and the first decade of the 21st century was the warmest decade on record in Pakistan, where the temperature reached 53 °C in 2010. Consequently, the spatio-temporal distribution and intensity of precipitation are badly affected, causing floods, cyclones and hurricanes in the region, which in turn have impacts on agriculture, water, health, etc. To cope with the situation, it is important to conduct impact assessment studies and take adaptation and mitigation remedies. For impact assessment studies, we need climate variables at higher resolution. Downscaling techniques are used to produce climate variables at higher resolution; these techniques are broadly divided into two types, statistical downscaling and dynamical downscaling. The target location of this study is the monsoon dominated region of Pakistan. One reason for choosing this area is that the contribution of monsoon rains there is more than 80% of the total rainfall. This study evaluates a statistical downscaling technique which can then be used for downscaling climatic variables. Two statistical techniques, quantile regression and copula modeling, are combined in order to produce realistic results for climate variables in the area under study. To reduce the dimension of the input data and deal with multicollinearity problems, empirical orthogonal functions will be used. Advantages of this new method are: (1) it is more robust to outliers as compared to ordinary least squares estimates and other estimation methods based on central tendency and dispersion measures; (2) it preserves the dependence among variables and among sites and (3) it can be used to combine different types of distributions. This is important in our case because we are dealing with climatic variables having different distributions over different meteorological
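
    As a hedged sketch of one building block of the proposed scheme, quantile regression, the following fits several conditional quantiles of a synthetic station rainfall series on a standardized large-scale predictor; the copula and EOF steps are not shown, and all data are invented.

```python
# Hedged sketch: quantile regression of station rainfall on a large-scale
# predictor; the predictor name and all data are synthetic stand-ins.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
gcm_humidity = rng.normal(size=n)                 # standardized predictor
station_rain = 2.0 + 1.5 * gcm_humidity + rng.gumbel(0.0, 1.0, size=n)

df = pd.DataFrame({"rain": station_rain, "hum": gcm_humidity})

# Fit several conditional quantiles instead of just the mean: more
# robust to outliers and preserves distributional information.
for q in (0.1, 0.5, 0.9):
    fit = smf.quantreg("rain ~ hum", df).fit(q=q)
    print(f"q={q}: intercept={fit.params['Intercept']:.2f}, "
          f"slope={fit.params['hum']:.2f}")
```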

  12. Measurements of the subcriticality using advanced technique of shooting source during operation of NPP reactors

    NASA Astrophysics Data System (ADS)

    Lebedev, G. V.; Petrov, V. V.; Bobylyov, V. T.; Butov, R. I.; Zhukov, A. M.; Sladkov, A. A.

    2014-12-01

    According to the rules of nuclear safety, measurements of the subcriticality of reactors should be carried out while performing nuclear hazardous operations. An advanced technique of shooting source of neutrons is proposed to meet this requirement. As such a source, a pulsed neutron source (PNS) is used. In order to realize this technique, it is recommended to operate a PNS with a frequency of 1-20 Hz. The PNS is stopped after a steady-state (on average) number of neutrons in the reactor volume is achieved. The change in the number of neutrons in the reactor volume is then measured in time with a sampling interval of ~0.1 s. The results of these measurements, together with a system of point-kinetics equations, are used to calculate the sought subcriticality. The basic idea of the proposed technique is elaborated in a series of experiments on the Kvant assembly. The conditions which should be met in order to obtain a positive measurement result are formulated. A block diagram of the basic version of the experimental setup is presented, whose main element is a pulsed neutron generator.
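
    A toy sketch of the inference step, under strong simplifications: delayed neutrons are ignored, so the post-shutdown decay is a single exponential whose slope yields the reactivity through one-group point kinetics. The kinetic parameters are invented, and a much finer sampling than the 0.1 s quoted above is used so that the prompt decay is visible in this simplified model.

```python
# Toy sketch only: infer subcriticality from the neutron decay after the
# pulsed source stops, using one-group point kinetics and ignoring
# delayed neutrons. All parameter values are invented.
import numpy as np

rng = np.random.default_rng(4)

beta = 0.0065        # delayed-neutron fraction (assumed)
Lam = 4.0e-5         # prompt-neutron generation time, s (assumed)
rho_true = -0.02     # "unknown" subcritical reactivity to recover

alpha = (rho_true - beta) / Lam          # prompt decay constant, 1/s
t = np.arange(0.0, 5e-3, 1e-4)           # 0.1 ms sampling of the tail
counts = rng.poisson(1e5 * np.exp(alpha * t)).astype(float)  # counting noise

# Log-linear fit gives alpha; then rho = alpha * Lambda + beta.
slope, _ = np.polyfit(t, np.log(np.maximum(counts, 1.0)), 1)
rho_est = slope * Lam + beta
print(f"estimated reactivity = {rho_est:.4f} (true {rho_true})")
```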

  13. New advanced surface modification technique: titanium oxide ceramic surface implants: long-term clinical results

    NASA Astrophysics Data System (ADS)

    Szabo, Gyorgy; Kovacs, Lajos; Barabas, Jozsef; Nemeth, Zsolt; Maironna, Carlo

    2001-11-01

    The purpose of this paper is to discuss the background to advanced surface modification technologies and to present a new technique, involving the formation of a titanium oxide ceramic coating, with relatively long-term results of its clinical utilization. Three general techniques are used to modify surfaces: the addition of material, the removal of material, and the change of material already present. Surface properties can also be changed without the addition or removal of material, through laser or electron beam thermal treatment. The new technique outlined in this paper relates to the production of a corrosion-resistant, 2000-2500 Å thick ceramic oxide layer with a coherent crystalline structure on the surface of titanium implants. The layer is grown electrochemically from the bulk of the metal and is modified by heat treatment. Such oxide ceramic-coated implants have a number of advantageous properties relative to implants covered with various other coatings: a higher external hardness, a greater force of adherence between the titanium and the oxide ceramic coating, a virtually perfect insulation between the organism and the metal (no possibility of metal allergy), etc. The coated implants were subjected to various physical, chemical, electron-microscopic and other tests for qualitative characterization. Finally, these implants (plates, screws for maxillofacial osteosynthesis and dental root implants) were applied in surgical practice for a period of 10 years. The tests and the experience acquired demonstrated the good properties of the titanium oxide ceramic-coated implants.

  14. Utilization of advanced calibration techniques in stochastic rock fall analysis of quarry slopes

    NASA Astrophysics Data System (ADS)

    Preh, Alexander; Ahmadabadi, Morteza; Kolenprat, Bernd

    2016-04-01

    In order to study rock fall dynamics, a research project was conducted by the Vienna University of Technology and the Austrian Central Labour Inspectorate (Federal Ministry of Labour, Social Affairs and Consumer Protection). A part of this project included 277 full-scale drop tests at three different quarries in Austria and recording key parameters of the rock fall trajectories. The tests involved a total of 277 boulders ranging from 0.18 to 1.8 m in diameter and from 0.009 to 8.1 Mg in mass. The geology of these sites included strong rock belonging to igneous, metamorphic and volcanic types. In this paper the results of the tests are used for calibration and validation of a new stochastic computer model. It is demonstrated that the error of the model (i.e. the difference between observed and simulated results) has a lognormal distribution. For two selected parameters, advanced calibration techniques including the Markov chain Monte Carlo technique, maximum likelihood and root mean square error (RMSE) are utilized to minimize the error. Validation of the model based on the cross-validation technique reveals that, in general, reasonable stochastic approximations of the rock fall trajectories are obtained in all dimensions, including runout, bounce heights and velocities. The approximations are compared to the measured data in terms of median, 95% and maximum values. The results of the comparisons indicate that approximate first-order predictions, using a single set of input parameters, are possible and can be used to aid practical hazard and risk assessment.
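
    As a hedged illustration of the calibration step (with an invented toy runout model standing in for the stochastic rock fall code), two restitution-like parameters are fit by minimizing the RMSE of log runouts, consistent with the lognormal error distribution reported above.

```python
# Hedged sketch: calibrate two parameters of a toy runout model by
# minimizing RMSE on the log scale (lognormal errors); the model form,
# parameter meanings, and all data are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)

def model_runout(rn, rt, h):
    # Toy stand-in for the stochastic rock fall model: the two terms
    # depend differently on drop height h, so both parameters are
    # identifiable from the drop-test data.
    return 3.0 * h * rn + 12.0 * np.sqrt(h) * rt

heights = rng.uniform(5.0, 40.0, size=60)            # drop heights, m
observed = model_runout(0.35, 0.8, heights) * rng.lognormal(0.0, 0.15, 60)

def log_rmse(params):
    rn, rt = params
    return np.sqrt(np.mean(
        (np.log(observed) - np.log(model_runout(rn, rt, heights))) ** 2))

best = minimize(log_rmse, x0=[0.5, 0.5], bounds=[(0.05, 1.0), (0.05, 1.0)])
print("calibrated (rn, rt):", np.round(best.x, 3), "RMSE:", round(best.fun, 4))
```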

  15. Measurements of the subcriticality using advanced technique of shooting source during operation of NPP reactors

    SciTech Connect

    Lebedev, G. V. Petrov, V. V.; Bobylyov, V. T.; Butov, R. I.; Zhukov, A. M.; Sladkov, A. A.

    2014-12-15

    According to the rules of nuclear safety, measurements of the subcriticality of reactors should be carried out while performing nuclear hazardous operations. An advanced technique of shooting source of neutrons is proposed to meet this requirement. As such a source, a pulsed neutron source (PNS) is used. In order to realize this technique, it is recommended to operate a PNS with a frequency of 1–20 Hz. The PNS is stopped after a steady-state (on average) number of neutrons in the reactor volume is achieved. The change in the number of neutrons in the reactor volume is then measured in time with a sampling interval of ∼0.1 s. The results of these measurements, together with a system of point-kinetics equations, are used to calculate the sought subcriticality. The basic idea of the proposed technique is elaborated in a series of experiments on the Kvant assembly. The conditions which should be met in order to obtain a positive measurement result are formulated. A block diagram of the basic version of the experimental setup is presented, whose main element is a pulsed neutron generator.

  16. Visualizing epigenetics: current advances and advantages in HDAC PET imaging techniques.

    PubMed

    Wang, C; Schroeder, F A; Hooker, J M

    2014-04-04

    Abnormal gene regulation as a consequence of flawed epigenetic mechanisms may be central to the initiation and persistence of many human diseases. However, progress in associating epigenetic dysfunction with disease and in developing therapeutic agents for treatment has been slow. Developing new methodologies used to visualize chromatin-modifying enzymes and their function in the human brain would be valuable for the diagnosis of brain disorders and drug discovery. We provide an overview of current invasive and noninvasive techniques for measuring expression and functions of chromatin-modifying enzymes in the brain, emphasizing tools applicable to histone deacetylase (HDAC) enzymes as a leading example. The majority of current techniques are invasive and difficult to translate to what is happening within a human brain in vivo. However, recent progress in molecular imaging provides new, noninvasive ways to visualize epigenetics in the human brain. Neuroimaging tool development presents a unique set of challenges in order to identify and validate CNS radiotracers for HDACs and other histone-modifying enzymes. We summarize advances in the effort to image HDACs and HDAC inhibitory effects in the brain using positron emission tomography (PET) and highlight generalizable techniques that can be adapted to investigate other specific components of epigenetic machinery. Translational tools like neuroimaging by PET and magnetic resonance imaging provide the best way to link our current understanding of epigenetic changes with in vivo function in normal and diseased brains. These tools will be a critical addition to ex vivo methods to evaluate, and intervene in, CNS dysfunction.

  17. Visualizing epigenetics: current advances and advantages in HDAC PET imaging techniques

    PubMed Central

    Wang, Changning; Schroeder, Frederick A.; Hooker, Jacob M.

    2013-01-01

    Abnormal gene regulation as a consequence of flawed epigenetic mechanisms may be central to the initiation and persistence of many human diseases. However, progress in associating epigenetic dysfunction with disease and in developing therapeutic agents for treatment has been slow. Developing new methodologies used to visualize chromatin-modifying enzymes and their function in the human brain would be valuable for the diagnosis of brain disorders and drug discovery. We provide an overview of current invasive and noninvasive techniques for measuring expression and functions of chromatin-modifying enzymes in the brain, emphasizing tools applicable to histone deacetylase (HDAC) enzymes as a leading example. The majority of current techniques are invasive and difficult to translate to what is happening within a human brain in vivo. However, recent progress in molecular imaging provides new, noninvasive ways to visualize epigenetics in the human brain. Neuroimaging tool development presents a unique set of challenges in order to identify and validate CNS radiotracers for HDACs and other histone-modifying enzymes. We summarize advances in the effort to image HDACs and HDAC inhibitory effects in the brain using PET and highlight generalizable techniques that can be adapted to investigate other specific components of epigenetic machinery. Translational tools like neuroimaging by PET and MRI provide the best way to link our current understanding of epigenetic changes with in vivo function in normal and diseased brains. These tools will be a critical addition to ex vivo methods to evaluate, and intervene in, CNS dysfunction. PMID:24051365

  18. Optical techniques for signal distribution and control in advanced radar and communication systems

    NASA Astrophysics Data System (ADS)

    Forrest, J. R.

    1985-03-01

    It is concluded that optical techniques offer some advantages for signal distribution and control in advanced radar and communication systems. They are clearly ideal for transporting microwave signals over considerable distances, as in remote positioning of radar receivers, provided high dynamic range is not required and an enclosed transmission path is essential. They are an elegant means of distributing low-level r.f. or i.f. signals around an active phased array where these signals are of relatively constant amplitude (as in mixer local oscillator applications). However, there is currently a rather restrictive limit on the size of distribution network possible. Optical techniques are obviously suitable for distributing digital control signals to phased array modules and confer considerable immunity to interference. They are less suitable for high dynamic range signals, such as the received radar returns, either at r.f. or when downconverted to i.f. Future developments in coherent optics or in fast optical A/D technology could, however, influence this conclusion. Currently, the optimum applications for optical techniques appear to be i.f. beamformers for multibeam communication satellite systems and in calibration/monitoring systems for phased arrays.

  19. Comparative forensic soil analysis of New Jersey state parks using a combination of simple techniques with multivariate statistics.

    PubMed

    Bonetti, Jennifer; Quarino, Lawrence

    2014-05-01

    This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications.
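
    A minimal sketch of the statistical pipeline described above, on synthetic stand-ins for the soil measurements: PCA for dimension reduction, a linear discriminant (used here in place of the paper's canonical discriminant analysis), and leave-one-out cross-validation for the error rate.

```python
# Hedged sketch: PCA + linear discriminant analysis with leave-one-out
# validation; the "soil" features below are synthetic stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)

# 12 parks x 5 samples, 6 features (size fractions, pH, loss on ignition).
X = np.vstack([rng.normal(loc=park, scale=0.7, size=(5, 6))
               for park in rng.normal(size=(12, 1)) * 3])
y = np.repeat(np.arange(12), 5)

clf = make_pipeline(PCA(n_components=4), LinearDiscriminantAnalysis())
acc = cross_val_score(clf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out error rate: {100 * (1 - acc):.2f}%")
```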

  20. Effects of age, system experience, and navigation technique on driving with an advanced traveler information system.

    PubMed

    Dingus, T A; Hulse, M C; Mollenhauer, M A; Fleischman, R N; McGehee, D V; Manakkal, N

    1997-06-01

    This paper explores the effects of age, system experience, and navigation technique on driving, navigation performance, and safety for drivers who used TravTek, an Advanced Traveler Information System. The first two studies investigated various route guidance configurations on the road in a specially equipped instrumented vehicle with an experimenter present. The third was a naturalistic quasi-experimental field study that collected data unobtrusively from more than 1200 TravTek rental car drivers with no in-vehicle experimenter. The results suggest that with increased experience, drivers become familiar with the system and develop strategies for substantially more efficient and safer use. The results also showed that drivers over age 65 had difficulty driving and navigating concurrently. They compensated by driving slowly and more cautiously. Despite this increased caution, older drivers made more safety-related errors than did younger drivers. The results also showed that older drivers benefited substantially from a well-designed ATIS driver interface.

  1. Advances in the transient dc photocurrent technique for excited state dipole moment measurements

    SciTech Connect

    Smirnov, S.N.; Braun, C.L.

    1998-08-01

    Recent advances in the transient dc photocurrent technique for measuring excited state dipole moments, developed in our group, are discussed. A variety of approaches, with detailed analyses of their advantages and disadvantages, including cell design, circuit construction tricks, the data acquisition procedure, calibration, and the theoretical treatment of different conditions, are presented. Sensitivity, time resolution limitations, and newly developed features, such as the signal's dependence on light polarization as well as charge separation at interfaces, are outlined. Dipole moments of a few molecules (diphenylcyclopropenone, bianthryl, dimethylaminonitrostilbene, Coumarin 153, and fluoroprobe) suitable for calibration purposes are reported, some of them for the first time. © 1998 American Institute of Physics.

  2. Vibrio parahaemolyticus: a review on the pathogenesis, prevalence, and advance molecular identification techniques.

    PubMed

    Letchumanan, Vengadesh; Chan, Kok-Gan; Lee, Learn-Han

    2014-01-01

    Vibrio parahaemolyticus is a Gram-negative halophilic bacterium that is found in estuarine, marine and coastal environments. V. parahaemolyticus is the leading causal agent of human acute gastroenteritis following the consumption of raw, undercooked, or mishandled marine products. In rare cases, V. parahaemolyticus causes wound infection, ear infection or septicaemia in individuals with pre-existing medical conditions. V. parahaemolyticus has two hemolysin virulence factors: thermostable direct hemolysin (tdh), a pore-forming protein that contributes to the invasiveness of the bacterium in humans, and TDH-related hemolysin (trh), which plays a role similar to that of tdh in disease pathogenesis. In addition, the bacterium also encodes adhesins and type III secretion systems (T3SS1 and T3SS2) to ensure its survival in the environment. This review aims at discussing V. parahaemolyticus growth and characteristics, pathogenesis, prevalence and advances in molecular identification techniques.

  3. Visualisation of Ecohydrological Processes and Relationships for Teaching Using Advanced Techniques

    NASA Astrophysics Data System (ADS)

    Guan, H.; Wang, H.; Gutierrez-Jurado, H. A.; Yang, Y.; Deng, Z.

    2014-12-01

    Ecohydrology is an emerging discipline with rapid research growth. This calls for enhancing ecohydrology education at both undergraduate and postgraduate levels. In other hydrology disciplines, hydrological processes are commonly observed in the environment (e.g. streamflow, infiltration) or easily demonstrated in labs (e.g. Darcy's column). It is relatively difficult to demonstrate ecohydrological concepts and processes (e.g. the soil-vegetation water relationship) in teaching. In this presentation, we report examples of using advanced techniques to illustrate ecohydrological concepts, relationships, and processes, with measurements based on a native vegetation catchment in South Australia. They include LIDAR images showing the relationship between topography-controlled hydroclimatic conditions and vegetation distribution; electrical resistivity tomography derived images showing stem structures; continuous stem water potential monitoring showing diurnal variations of plant water status, root zone moisture depletion during dry spells, and responses to precipitation inputs; and sapflow measurements demonstrating environmental stress on plant stomatal behaviour.

  4. Multielemental speciation analysis by advanced hyphenated technique - HPLC/ICP-MS: A review.

    PubMed

    Marcinkowska, Monika; Barałkiewicz, Danuta

    2016-12-01

    Speciation analysis has become an invaluable tool in human health risk assessment, environmental monitoring and food quality control. The next step is to develop reliable multielemental speciation methodologies to reduce the costs, waste and time needed for analysis. Separation and detection of species of several elements in a single analytical run can be accomplished by high performance liquid chromatography hyphenated to inductively coupled plasma mass spectrometry (HPLC/ICP-MS). Our review assembles articles concerning multielemental speciation determination of As, Se, Cr, Sb, I, Br, Pb, Hg, V, Mo, Te, Tl, Cd and W in environmental, biological, food and clinical samples analyzed with HPLC/ICP-MS. It addresses the procedures in terms of the following issues: sample collection and pretreatment, selection of optimal conditions for element species separation by HPLC, determination using ICP-MS, and the metrological approach. The presented work is the first review article concerning multielemental speciation analysis by the advanced hyphenated technique HPLC/ICP-MS.

  5. Vibrio parahaemolyticus: a review on the pathogenesis, prevalence, and advance molecular identification techniques

    PubMed Central

    Letchumanan, Vengadesh; Chan, Kok-Gan; Lee, Learn-Han

    2014-01-01

    Vibrio parahaemolyticus is a Gram-negative halophilic bacterium that is found in estuarine, marine and coastal environments. V. parahaemolyticus is the leading causal agent of human acute gastroenteritis following the consumption of raw, undercooked, or mishandled marine products. In rare cases, V. parahaemolyticus causes wound infection, ear infection or septicaemia in individuals with pre-existing medical conditions. V. parahaemolyticus has two hemolysin virulence factors: thermostable direct hemolysin (tdh), a pore-forming protein that contributes to the invasiveness of the bacterium in humans, and TDH-related hemolysin (trh), which plays a role similar to that of tdh in disease pathogenesis. In addition, the bacterium also encodes adhesins and type III secretion systems (T3SS1 and T3SS2) to ensure its survival in the environment. This review aims at discussing V. parahaemolyticus growth and characteristics, pathogenesis, prevalence and advances in molecular identification techniques. PMID:25566219

  6. Efficient Boolean and multi-input flow techniques for advanced mask data processing

    NASA Astrophysics Data System (ADS)

    Salazar, Daniel; Moore, Bill; Valadez, John

    2012-11-01

    Mask data preparation (MDP) typically involves multiple flows, sometimes consisting of many steps, to ensure that the data is properly written on the mask. This may include multiple inputs, transformations (scaling, orientation, etc.), and processing (layer extraction, sizing, Boolean operations, data filtering). Many MDP techniques currently in practice require multiple passes through the input data and/or multiple file I/O steps to achieve these goals. This paper details an approach which processes the data efficiently, resulting in minimal I/O and greatly improved turnaround times (TAT). This approach adapts advanced processing algorithms to produce an efficient and reliable data flow. In tandem with this processing flow, an internal jobdeck mapping approach, transparent to the user, allows an essentially unlimited number of pattern inputs to be handled in a single pass, resulting in increased flexibility and ease of use. Transformations and processing operations are critical to MDP. Transformations such as scaling, reverse tone and orientation, along with processing including sizing, Boolean operations and data filtering, are key parts of this. These techniques are often employed in sequence and/or in parallel in a complex functional chain. While transformations typically are done "up front" when the data is input, processing is less straightforward, involving multiple reads and writes to handle the more intricate functionality and also the collection of input patterns which may be required to produce the data that comprises a single mask. The approach detailed in this paper consists of two complementary techniques: efficient MDP flow and jobdeck mapping. Efficient MDP flow is achieved by pipelining the output of each step to the input of the subsequent step. Rather than writing the output of a particular processing step to file and then reading it in to the following step, the pipelining or chaining of the steps results in an efficient flow with
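
    A hedged sketch of the pipelining idea in Python generators (the step names and polygon data model are illustrative, not the product's internals): each stage consumes and yields a stream of figures, so an arbitrarily long chain runs in a single pass with no intermediate file I/O.

```python
# Illustrative generator pipeline: each step consumes and yields a
# stream of shapes, so the whole chain runs lazily in one pass.
def read_patterns(paths):
    for p in paths:                       # one pass over all inputs
        yield {"layer": 1, "coords": [(0, 0), (10, 0), (10, 5)], "src": p}

def extract_layer(shapes, layer):
    return (s for s in shapes if s["layer"] == layer)

def scale(shapes, factor):
    for s in shapes:
        yield {**s, "coords": [(x * factor, y * factor)
                               for x, y in s["coords"]]}

def size_filter(shapes, min_width):
    # Stand-in for data filtering: drop figures narrower than min_width.
    for s in shapes:
        xs = [x for x, _ in s["coords"]]
        if max(xs) - min(xs) >= min_width:
            yield s

# Chain the steps: the output of each is piped to the input of the next.
pipeline = size_filter(scale(extract_layer(
    read_patterns(["a.oas", "b.oas"]), layer=1), factor=0.5), min_width=2.0)

for shape in pipeline:
    print(shape["src"], shape["coords"])
```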

  7. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  8. Development of advanced techniques for rotorcraft state estimation and parameter identification

    NASA Technical Reports Server (NTRS)

    Hall, W. E., Jr.; Bohn, J. G.; Vincent, J. H.

    1980-01-01

    An integrated methodology for rotorcraft system identification consists of rotorcraft mathematical modeling, three distinct data processing steps, and a technique for designing inputs to improve the identifiability of the data. These elements are as follows: (1) a Kalman filter smoother algorithm which estimates states and sensor errors from error-corrupted data; gust time histories and statistics may also be estimated; (2) a model structure estimation algorithm for isolating a model which adequately explains the data; (3) a maximum likelihood algorithm for estimating the parameters and the variances of those estimates; and (4) an input design algorithm, based on a maximum likelihood approach, which provides inputs to improve the accuracy of parameter estimates. Each step is discussed with examples from both flight and simulated data.
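
    As a minimal sketch of the filtering core of element (1), a scalar Kalman filter estimates a state from error-corrupted measurements; the dynamics and noise variances are invented, and the smoother, gust estimation, and sensor-error terms are omitted.

```python
# Minimal scalar Kalman filter sketch; dynamics and noise levels are
# invented, and only the forward filtering pass is shown.
import numpy as np

rng = np.random.default_rng(7)

a, q, r = 0.95, 0.1, 0.5        # state transition, process/measurement var
x_true, x_est, p = 0.0, 0.0, 1.0

for _ in range(100):
    # Truth evolves; sensor returns an error-corrupted measurement.
    x_true = a * x_true + rng.normal(0, np.sqrt(q))
    z = x_true + rng.normal(0, np.sqrt(r))

    # Predict.
    x_pred = a * x_est
    p_pred = a * p * a + q

    # Update with the Kalman gain.
    k = p_pred / (p_pred + r)
    x_est = x_pred + k * (z - x_pred)
    p = (1 - k) * p_pred

print("final state estimate:", round(x_est, 3), "variance:", round(p, 3))
```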

  9. Translation of Untranslatable Words — Integration of Lexical Approximation and Phrase-Table Extension Techniques into Statistical Machine Translation

    NASA Astrophysics Data System (ADS)

    Paul, Michael; Arora, Karunesh; Sumita, Eiichiro

    This paper proposes a method for handling out-of-vocabulary (OOV) words that cannot be translated using conventional phrase-based statistical machine translation (SMT) systems. For a given OOV word, lexical approximation techniques are utilized to identify spelling and inflectional word variants that occur in the training data. All OOV words in the source sentence are then replaced with appropriate word variants found in the training corpus, thus reducing the number of OOV words in the input. Moreover, in order to increase the coverage of such word translations, the SMT translation model is extended by adding new phrase translations for all source language words that do not have a single-word entry in the original phrase-table but only appear in the context of larger phrases. The effectiveness of the proposed methods is investigated for the translation of Hindi to English, Chinese, and Japanese.
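
    A hedged sketch of the lexical-approximation idea only (not the paper's system): an OOV word is replaced by its closest spelling variant found in the training vocabulary. The vocabulary, similarity measure (difflib's ratio), and cutoff are illustrative choices.

```python
# Hedged sketch: replace OOV words with their closest spelling or
# inflectional variants from a (made-up) training vocabulary.
import difflib

train_vocab = {"walk", "walked", "walking", "house", "houses",
               "translate", "translated", "translation"}

def approximate_oov(word, vocab, cutoff=0.75):
    if word in vocab:
        return word                          # in-vocabulary, keep as-is
    matches = difflib.get_close_matches(word, vocab, n=1, cutoff=cutoff)
    return matches[0] if matches else word   # fall back to the OOV word

sentence = ["she", "walkd", "to", "the", "housse"]
print([approximate_oov(w, train_vocab) for w in sentence])
# -> ['she', 'walked', 'to', 'the', 'house']; words with no close
#    variant above the cutoff pass through unchanged.
```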

  10. Advancements in sensing and perception using structured lighting techniques :an LDRD final report.

    SciTech Connect

    Novick, David Keith; Padilla, Denise D.; Davidson, Patrick A. Jr.; Carlson, Jeffrey J.

    2005-09-01

    This report summarizes the analytical and experimental efforts for the Laboratory Directed Research and Development (LDRD) project entitled ''Advancements in Sensing and Perception using Structured Lighting Techniques''. There is an ever-increasing need for robust, autonomous ground vehicles for counterterrorism and defense missions. Although there has been nearly 30 years of government-sponsored research, it is undisputed that significant advancements in sensing and perception are necessary. We developed an innovative, advanced sensing technology for national security missions serving the Department of Energy, the Department of Defense, and other government agencies. The principal goal of this project was to develop an eye-safe, robust, low-cost, lightweight, 3D structured lighting sensor for use in broad daylight outdoor applications. The market for this technology is wide open due to the unavailability of such a sensor. Currently available laser scanners are slow, bulky and heavy, expensive, fragile, short-range, sensitive to vibration (highly problematic for moving platforms), and unreliable for outdoor use in bright sunlight conditions. Eye-safety issues are a primary concern for currently available laser-based sensors. Passive, stereo-imaging sensors are available for 3D sensing but suffer from several limitations: they are computationally intensive, require a lighted environment (natural or man-made light source), and don't work for many scenes or regions lacking texture or with ambiguous texture. Our approach leveraged the advanced capabilities of modern CCD camera technology and Center 6600's expertise in 3D world modeling, mapping, and analysis using structured lighting. We have a diverse customer base for indoor mapping applications, and this research extends our current technology's lifecycle and opens a new market base for outdoor 3D mapping. Applications include precision mapping, autonomous navigation, dexterous manipulation, surveillance and

  11. Assessment of surface water quality using multivariate statistical techniques: case study of the Nampong River and Songkhram River, Thailand.

    PubMed

    Muangthong, Somphinith; Shrestha, Sangam

    2015-09-01

    Multivariate statistical techniques such as cluster analysis (CA), principal component analysis (PCA), factor analysis (FA), and discriminant analysis (DA) were applied for the assessment of spatial and temporal variations in a large, complex water quality data set of the Nampong River and Songkhram River, generated over more than 10 years (1996-2012) by monitoring of 16 parameters at different sites. According to the water quality characteristics, hierarchical CA grouped the 13 sampling sites of the Nampong River into two clusters, i.e., upper stream (US) and lower stream (LS) sites, and the five sampling sites of the Songkhram River into three clusters, i.e., upper stream (US), middle stream (MS) and lower stream (LS) sites. PCA/FA applied to the data sets thus obtained yielded five latent factors explaining 69.80 and 69.32 % of the total variance in the water quality data sets of the LS and US areas, respectively, in the Nampong River, and six latent factors explaining 80.80, 73.95, and 73.78 % of the total variance in the water quality data sets of the LS, MS, and US areas, respectively, in the Songkhram River. This study highlights the usefulness of multivariate statistical assessment of complex databases in the identification of pollution sources to better comprehend the spatial and temporal variations for effective river water quality management.
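
    As a hedged sketch of the CA and PCA/FA workflow on made-up site data (not the Thai monitoring records), sites are grouped by Ward-linkage hierarchical clustering and latent factors are extracted from the standardized parameters.

```python
# Hedged sketch: hierarchical clustering of monitoring sites plus PCA of
# water quality parameters; all data below are synthetic stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(8)

# 13 sites x 16 parameters (e.g. BOD, DO, NO3-N, ...), site means.
sites = rng.normal(size=(13, 16))
sites[7:] += 2.0                       # make "downstream" sites dirtier

z = StandardScaler().fit_transform(sites)

# Hierarchical CA with Ward linkage, cut into two site groups.
groups = fcluster(linkage(z, method="ward"), t=2, criterion="maxclust")
print("site clusters:", groups)

# PCA/FA step: latent factors and the variance they explain.
pca = PCA(n_components=5).fit(z)
print(f"explained variance with 5 factors: "
      f"{100 * pca.explained_variance_ratio_.sum():.1f}%")
```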

  12. Comparison of Statistical Estimation Techniques for Mars Entry, Descent, and Landing Reconstruction from MEDLI-like Data Sources

    NASA Technical Reports Server (NTRS)

    Dutta, Soumyo; Braun, Robert D.; Russell, Ryan P.; Clark, Ian G.; Striepe, Scott A.

    2012-01-01

    Flight data from an entry, descent, and landing (EDL) sequence can be used to reconstruct the vehicle's trajectory, aerodynamic coefficients and the atmospheric profile experienced by the vehicle. Past Mars missions have contained instruments that do not provide direct measurement of the freestream atmospheric conditions. Thus, the uncertainties in the atmospheric reconstruction and the aerodynamic database knowledge could not be separated. The upcoming Mars Science Laboratory (MSL) will take measurements of the pressure distribution on the aeroshell forebody during entry and will allow freestream atmospheric conditions to be partially observable. This data provides a means to separate atmospheric and aerodynamic uncertainties and is part of the MSL EDL Instrumentation (MEDLI) project. Methods to estimate the flight performance statistically using on-board measurements are demonstrated here through the use of simulated Mars data. Different statistical estimators are used to demonstrate which estimator best quantifies the uncertainties in the flight parameters. The techniques demonstrated herein are planned for application to the MSL flight dataset after the spacecraft lands on Mars in August 2012.

  13. Nanostructural defects evidenced in failing silicon-based NMOS capacitors by advanced failure analysis techniques

    NASA Astrophysics Data System (ADS)

    Faivre, Emilie; Llido, Roxane; Putero, Magali; Fares, Lahouari; Muller, Christophe

    2014-04-01

    An experimental methodology compliant with industrial constraints was deployed to uncover the origin of soft breakdown events in large planar silicon-based NMOS capacitors. Complementary advanced failure analysis techniques were advantageously employed to localize, isolate and observe structural defects at nanoscale. After accurate localization of the failing area by optical beam-induced resistance change (OBIRCH), the focused ion beam (FIB) technique enabled preparing thin specimens adequate for transmission electron microscopy (TEM). Characterization of the gate oxide microstructure was performed by high-resolution TEM imaging and energy-filtered spectroscopy. A dedicated experimental protocol relying on iterative FIB thinning and TEM observation enabled improving the quality of electron imaging of defects at the atomic scale. In that way, the gate oxide integrity was evaluated, and electrical stress-induced silicon epitaxy was detected concomitantly with soft breakdown events appearing during constant voltage stress. The growth of silicon hillocks consumes part of the breakdown energy and may prevent the soft breakdown event from evolving into a hard breakdown that is catastrophic for device functionality.

  14. Advanced Modeling Techniques to Study Anthropogenic Influences on Atmospheric Chemical Budgets

    NASA Technical Reports Server (NTRS)

    Mathur, Rohit

    1997-01-01

    This research work is a collaborative effort between research groups at MCNC and the University of North Carolina at Chapel Hill. The overall objective of this research is to improve the level of understanding of the processes that determine the budgets of chemically and radiatively active compounds in the atmosphere through development and application of advanced methods for calculating the chemical change in atmospheric models. The research performed during the second year of this project focused on four major aspects: (1) The continued development and refinement of multiscale modeling techniques to address the issue of the disparate scales of the physico-chemical processes that govern the fate of atmospheric pollutants; (2) Development and application of analysis methods utilizing process and mass balance techniques to increase the interpretive powers of atmospheric models and to aid in complementary analysis of model predictions and observations; (3) Development of meteorological and emission inputs for initial application of the chemistry/transport model over the north Atlantic region; and, (4) The continued development and implementation of a totally new adaptive chemistry representation that changes the details of what is represented as the underlying conditions change.

  15. Development of Advanced In-Situ Techniques for Chemistry Monitoring and Corrosion Mitigation in SCWO Environments

    SciTech Connect

    Macdonald, D. D.; Lvov, S. N.

    2000-03-31

    This project is developing sensing technologies and corrosion monitoring techniques for use in supercritical water oxidation (SCWO) systems, which reduce the volume of mixed low-level nuclear waste by oxidizing organic components in a closed-cycle system where CO2 and other gaseous oxides are produced, leaving the radioactive elements concentrated in ash. The technique uses water at supercritical temperatures under highly oxidizing conditions, maintained by a high fugacity of molecular oxygen in the system, which causes high corrosion rates of even the most corrosion-resistant reactor materials. This project addresses the corrosion shortcoming through development of (a) advanced electrodes and sensors for in situ potentiometric monitoring of pH in high subcritical and supercritical aqueous solutions; (b) an approach for evaluating the association constants for 1-1 aqueous electrolytes using a flow-through electrochemical thermocell; (c) an electrochemical noise sensor for the in situ measurement of corrosion rate in subcritical and supercritical aqueous systems; and (d) a model for estimating the effect of pressure on reaction rates, including corrosion reactions, in high subcritical and supercritical aqueous systems. The project achieved all objectives, except for installing some of the sensors into a fully operating SCWO system.

  16. Advancing the frontiers in nanocatalysis, biointerfaces, and renewable energy conversion by innovations of surface techniques.

    PubMed

    Somorjai, Gabor A; Frei, Heinz; Park, Jeong Y

    2009-11-25

    The challenge of chemistry in the 21st century is to achieve 100% selectivity of the desired product molecule in multipath reactions ("green chemistry") and develop renewable energy based processes. Surface chemistry and catalysis play key roles in this enterprise. Development of in situ surface techniques such as high-pressure scanning tunneling microscopy, sum frequency generation (SFG) vibrational spectroscopy, time-resolved Fourier transform infrared methods, and ambient pressure X-ray photoelectron spectroscopy enabled the rapid advancement of three fields: nanocatalysts, biointerfaces, and renewable energy conversion chemistry. In materials nanoscience, synthetic methods have been developed to produce monodisperse metal and oxide nanoparticles (NPs) in the 0.8-10 nm range with controlled shape, oxidation states, and composition; these NPs can be used as selective catalysts since chemical selectivity appears to be dependent on all of these experimental parameters. New spectroscopic and microscopic techniques have been developed that operate under reaction conditions and reveal the dynamic change of molecular structure of catalysts and adsorbed molecules as the reactions proceed with changes in reaction intermediates, catalyst composition, and oxidation states. SFG vibrational spectroscopy detects amino acids, peptides, and proteins adsorbed at hydrophobic and hydrophilic interfaces and monitors the change of surface structure and interactions with coadsorbed water. Exothermic reactions and photons generate hot electrons in metal NPs that may be utilized in chemical energy conversion. The photosplitting of water and carbon dioxide, an important research direction in renewable energy conversion, is discussed.

  17. Advanced system identification techniques for wind turbine structures with special emphasis on modal parameters

    NASA Astrophysics Data System (ADS)

    Bialasiewicz, J. T.

    1995-06-01

    The goal is to develop advanced system identification techniques that can be used to accurately measure the frequency response functions of a wind-turbine structure immersed in wind noise. To allow for accurate identification, the authors have developed a special test signal called the pseudo-random binary sequence (PRBS). The Matlab program that generates this signal allows the user to interactively tailor its parameters for the frequency range of interest based on the response of the wind turbine under test. By controlling NREL's Mobile Hydraulic Shaker System, which is attached to the wind turbine structure, the PRBS signal produces the wide-band excitation necessary to perform system identification in the presence of wind noise. The techniques presented here will enable researchers to obtain modal parameters from an operating wind turbine, including frequencies, damping coefficients, and mode shapes. More importantly, the algorithms they have developed and tested (so far using input-output data from a simulated structure) permit state-space representation of the system under test, particularly the modal state space representation. This is the only system description that reveals the internal behavior of the system, such as the interaction between the physical parameters, and which, in contrast to transfer functions, is valid for non-zero initial conditions.
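
    A maximal-length PRBS of the kind described above can be generated with a linear feedback shift register (LFSR). The sketch below is a generic illustration, not NREL's Matlab tool: the register length and XOR tap positions are assumptions, and the interactive tailoring of clock rate and amplitude to the turbine's frequency band is omitted.

```python
# Hedged sketch: one period of a maximal-length +/-1 PRBS from a Fibonacci
# LFSR. Taps (7, 6) correspond to the primitive polynomial x^7 + x^6 + 1.
import numpy as np

def prbs(nbits=7, taps=(7, 6)):
    """Return one period (2**nbits - 1 samples) of a +/-1 PRBS."""
    state = [1] * nbits
    out = []
    for _ in range(2**nbits - 1):
        out.append(1.0 if state[-1] else -1.0)
        fb = state[taps[0] - 1] ^ state[taps[1] - 1]  # XOR feedback taps
        state = [fb] + state[:-1]
    return np.array(out)

signal = prbs()
# A maximal-length sequence has a nearly flat spectrum over its band,
# giving the wide-band excitation needed for system identification.
print(len(signal), signal[:10])
```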

  18. Investigation to advance prediction techniques of the low-speed aerodynamics of V/STOL aircraft

    NASA Technical Reports Server (NTRS)

    Maskew, B.; Strash, D.; Nathman, J.; Dvorak, F. A.

    1985-01-01

    A computer program, VSAERO, has been applied to a number of V/STOL configurations with a view to advancing prediction techniques for low-speed aerodynamic characteristics. The program couples a low-order panel method with surface streamline calculation and integral boundary layer procedures. The panel method, which uses piecewise constant source and doublet panels, includes an iterative procedure for wake shape and models the boundary layer displacement effect using the source transpiration technique. Certain improvements to a basic vortex tube jet model were installed in the code prior to evaluation. Very promising results were obtained for surface pressures near a jet issuing at 90 deg from a flat plate. A solid core model was used in the initial part of the jet with a simple entrainment model. Preliminary representation of the downstream separation zone significantly improved the correlation. The program accurately predicted the pressure distribution inside the inlet on the Grumman 698-411 design at a range of flight conditions. Furthermore, coupled viscous/potential flow calculations gave very close correlation with experimentally determined operational boundaries dictated by the onset of separation inside the inlet. Experimentally observed degradation of these operational boundaries between nacelle-alone tests and tests on the full configuration was also indicated by the calculation. Application of the program to the General Dynamics STOL fighter design was equally encouraging. Very close agreement was observed between experiment and calculation for the effects of power on pressure distribution, lift and lift curve slope.

  19. Advancing the Frontiers in Nanocatalysis, Biointerfaces, and Renewable Energy Conversion by Innovations of Surface Techniques

    SciTech Connect

    Somorjai, G.A.; Frei, H.; Park, J.Y.

    2009-07-23

    The challenge of chemistry in the 21st century is to achieve 100% selectivity of the desired product molecule in multipath reactions ('green chemistry') and develop renewable energy based processes. Surface chemistry and catalysis play key roles in this enterprise. Development of in situ surface techniques such as high-pressure scanning tunneling microscopy, sum frequency generation (SFG) vibrational spectroscopy, time-resolved Fourier transform infrared methods, and ambient pressure X-ray photoelectron spectroscopy enabled the rapid advancement of three fields: nanocatalysts, biointerfaces, and renewable energy conversion chemistry. In materials nanoscience, synthetic methods have been developed to produce monodisperse metal and oxide nanoparticles (NPs) in the 0.8-10 nm range with controlled shape, oxidation states, and composition; these NPs can be used as selective catalysts since chemical selectivity appears to be dependent on all of these experimental parameters. New spectroscopic and microscopic techniques have been developed that operate under reaction conditions and reveal the dynamic change of molecular structure of catalysts and adsorbed molecules as the reactions proceed with changes in reaction intermediates, catalyst composition, and oxidation states. SFG vibrational spectroscopy detects amino acids, peptides, and proteins adsorbed at hydrophobic and hydrophilic interfaces and monitors the change of surface structure and interactions with coadsorbed water. Exothermic reactions and photons generate hot electrons in metal NPs that may be utilized in chemical energy conversion. The photosplitting of water and carbon dioxide, an important research direction in renewable energy conversion, is discussed.

  20. Characterization and detection of Vero cells infected with Herpes Simplex Virus type 1 using Raman spectroscopy and advanced statistical methods.

    PubMed

    Salman, A; Shufan, E; Zeiri, L; Huleihel, M

    2014-07-01

    Herpes viruses are involved in a variety of human disorders. Herpes Simplex Virus type 1 (HSV-1) is the most common among the herpes viruses and is primarily involved in human cutaneous disorders. Although the symptoms of infection by this virus are usually minimal, in some cases HSV-1 might cause serious infections in the eyes and the brain leading to blindness and even death. A drug, acyclovir, is available to counter this virus. The drug is most effective when used during the early stages of the infection, which makes early detection and identification of these viral infections highly important for successful treatment. In the present study we evaluated the potential of Raman spectroscopy as a sensitive, rapid, and reliable method for the detection and identification of HSV-1 viral infections in cell cultures. Using Raman spectroscopy followed by advanced statistical methods enabled us, with sensitivity approaching 100%, to differentiate between a control group of Vero cells and another group of Vero cells that had been infected with HSV-1. Cell sites that were "rich in membrane" gave the best results in the differentiation between the two categories. The major changes were observed in the 1195-1726 cm⁻¹ range of the Raman spectrum. The features in this range are attributed mainly to proteins, lipids, and nucleic acids.
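
    A common pipeline for this kind of spectral classification is PCA for dimensionality reduction followed by linear discriminant analysis; the abstract does not publish the data or the exact statistical method, so the sketch below uses synthetic spectra and cross-validated accuracy as a stand-in for the reported sensitivity.

```python
# Minimal sketch (not the authors' pipeline): PCA + LDA classification of
# synthetic "spectra" standing in for Raman measurements of control vs.
# HSV-1-infected Vero cells.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
control = rng.normal(0.0, 1.0, size=(40, 500))    # 40 spectra x 500 wavenumbers
infected = rng.normal(0.3, 1.0, size=(40, 500))   # shifted mean mimics infection
X = np.vstack([control, infected])
y = np.array([0] * 40 + [1] * 40)

model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print(cross_val_score(model, X, y, cv=5).mean())  # classification accuracy
```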

  1. EPS in Environmental Microbial Biofilms as Examined by Advanced Imaging Techniques

    NASA Astrophysics Data System (ADS)

    Neu, T. R.; Lawrence, J. R.

    2006-12-01

    Biofilm communities are highly structured associations of cellular and polymeric components which are involved in biogenic and geogenic environmental processes. Furthermore, biofilms are also important in medical (infection), industrial (biofouling) and technological (biofilm engineering) processes. The interfacial microbial communities in a specific habitat are highly dynamic and change according to the environmental parameters, affecting not only the cellular but also the polymeric constituents of the system. Through their EPS, biofilms interact with dissolved, colloidal and particulate compounds from the bulk water phase. For a long time the focus in biofilm research was on the cellular constituents, and the polymer matrix of biofilms was rather neglected. The polymer matrix is produced not only by different bacteria and archaea but also by eukaryotic micro-organisms such as algae and fungi. The mostly unidentified mixture of EPS compounds is responsible for many biofilm properties and is involved in biofilm functionality. The chemistry of the EPS matrix represents a mixture of polymers including polysaccharides, proteins, nucleic acids, neutral polymers, charged polymers, amphiphilic polymers and refractory microbial polymers. The analysis of the EPS may be done destructively, by means of extraction and subsequent chemical analysis, or in situ, by means of specific probes in combination with advanced imaging. In the last 15 years laser scanning microscopy (LSM) has been established as an indispensable technique for studying microbial communities. LSM with 1-photon and 2-photon excitation in combination with fluorescence techniques allows 3-dimensional investigation of fully hydrated, living biofilm systems. This approach is able to reveal data on biofilm structural features as well as biofilm processes and interactions. The fluorescent probes available allow the quantitative assessment of cellular as well as polymer distribution. For this purpose

  2. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in computers, computing in Earth sciences, multivariate data analysis, and automated computation in quantum field theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF

  3. Recent Advances in Stable Isotope Techniques for N2O Source Partitioning in Soils

    NASA Astrophysics Data System (ADS)

    Baggs, E.; Mair, L.; Mahmood, S.

    2007-12-01

    The use of 13C, 15N and 18O enables us to overcome uncertainties associated with soil C and N processes and to assess the links between species diversity and ecosystem function. Recent advances in stable isotope techniques enable determination of process rates, and are fundamental for examining interactions between C and N cycles. Here we will introduce the 15N-, 18O- and 13C-enrichment techniques we have developed to distinguish between different N2O-producing processes in situ in soils, presenting selected results, and will critically assess their potential, alone and in combination with molecular techniques, to help address key research questions in soil biogeochemistry and microbial ecology. We have developed 15N- and 18O-enrichment techniques to distinguish between, and to quantify, N2O production during ammonia oxidation, nitrifier denitrification and denitrification. This provides a great advantage over natural abundance approaches, as it enables quantification of N2O from each microbial source, which can be coupled with quantification of N2 production and used to examine interactions between different processes and cycles. These approaches have also provided new insights into the N cycle and how it interacts with the C cycle. For example, we now know that ammonia oxidising bacteria contribute significantly to N2O emissions from soils, both via the traditionally accepted ammonia oxidation pathway and via denitrification (nitrifier denitrification), which can proceed even under aerobic conditions. We are also linking emissions from each source to the diversity and activity of relevant microbial functional groups, for example through the development and application of a specific nirK primer for the nitrite reductase in ammonia oxidising bacteria. Recently, isotopomers have been proposed as an alternative for source partitioning N2O at natural abundance levels; they offer the potential to investigate N2O production from nitrate ammonification, and overcome the

  4. A statistical approach for site error correction in lightning location networks with DF/TOA technique and its application results

    NASA Astrophysics Data System (ADS)

    Lu, Tao; Chen, Mingli; Du, Yaping; Qiu, Zongxu

    2017-02-01

    Lightning location networks (LLNs) with the combined DF/TOA (direction-finder/time-of-arrival) technique have been widely used around the world. However, the accuracy of the lightning data from such LLNs is still restricted by "site error", especially for strokes detected by only two DF/TOA sensors. In this paper we present a statistical approach for the evaluation and correction of "site error" in a DF/TOA-type LLN based on its lightning data. By comparing lightning locations recorded by at least four sensors between the DF and TOA techniques, the spatial characteristics of the "site error" for each sensor in the network can be obtained. The obtained "site error" can then be used to improve the accuracy of lightning locations, especially those recorded by only two sensors. With this approach, the "site error" patterns for 23 sensors in the Yunnan LLN are obtained. The features of these site error patterns are in good agreement with those in the literature. Significant differences in lightning locations before and after "site error" corrections indicate that the proposed approach works effectively.
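
    The fitting step of such an approach can be illustrated as follows. The sketch fits the two-cycle sinusoid commonly used for DF site-error curves to synthetic azimuth residuals; the functional form actually used for the Yunnan network is not stated in the abstract and is an assumption here.

```python
# Hedged sketch of azimuth-dependent site-error estimation for one DF sensor:
# residuals between DF-measured azimuths and azimuths back-computed from
# multi-sensor (>=4) TOA locations are fit with a two-cycle sinusoid.
# Data are synthetic; the paper's exact model may differ.
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0, 2 * np.pi, 500)            # DF azimuths [rad]
true_err = 1.5 * np.sin(2 * theta + 0.4)          # degrees, synthetic pattern
resid = true_err + rng.normal(scale=0.5, size=500)

# Design matrix for err(t) = a0 + a1*sin(t) + a2*cos(t) + a3*sin(2t) + a4*cos(2t)
A = np.column_stack([np.ones_like(theta), np.sin(theta), np.cos(theta),
                     np.sin(2 * theta), np.cos(2 * theta)])
coeffs, *_ = np.linalg.lstsq(A, resid, rcond=None)
print(coeffs)   # correction curve to subtract from future DF azimuths
```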

  5. The problem of sexual imbalance and techniques of the self in the Diagnostic and Statistical Manual of Mental Disorders.

    PubMed

    Flore, Jacinthe

    2016-09-01

    This article examines the problematization of sexual appetite and its imbalances in the development of the Diagnostic and Statistical Manual of Mental Disorders (DSM) in the twentieth and twenty-first centuries. The dominant strands of historiographies of sexuality have focused on historicizing sexual object choice and understanding the emergence of sexual identities. This article emphasizes the need to contextualize these histories within a broader frame of historical interest in the problematization of sexual appetite. The first part highlights how sexual object choice, as a paradigm of sexual dysfunctions, progressively receded from medical interest in the twentieth century as the clinical gaze turned to the problem of sexual appetite and its imbalances. The second part uses the example of the newly introduced Female Sexual Interest/Arousal Disorder in the DSM-5 to explore how the Manual functions as a technique for taking care of the self. I argue that the design of the Manual and associated inventories and questionnaires paved the way for their interpretation and application as techniques for self-examination.

  6. Identification and comparison of electrical tapes using instrumental and statistical techniques: I. Microscopic surface texture and elemental composition.

    PubMed

    Goodpaster, John V; Sturdevant, Amanda B; Andrews, Kristen L; Brun-Conti, Leanora

    2007-05-01

    Comparisons of polyvinyl chloride electrical tape typically rely upon evaluating class characteristics such as physical dimensions, surface texture, and chemical composition. Given the various techniques that are available for this purpose, a comprehensive study has been undertaken to establish an optimal analytical scheme for electrical tape comparisons. Of equal importance is the development of a quantitative means for sample discrimination. In this study, 67 rolls of black electrical tape representing 34 different nominal brands were analyzed via scanning electron microscopy and energy dispersive spectroscopy. Differences in surface roughness, calendering marks, and filler particle size were readily apparent, including between some rolls of the same nominal brand. The relative amounts of magnesium, aluminum, silicon, sulfur, lead, chlorine, antimony, calcium, titanium, and zinc varied greatly between brands and, in some cases, could be linked to the year of manufacture. For the first time, quantitative differentiation of electrical tapes was achieved through multivariate statistical techniques, with 36 classes identified within the sample population. A single-blind study was also completed where questioned tape samples were correctly associated with known exemplars. Finally, two case studies are presented where tape recovered from an improvised explosive device is compared with tape recovered from a suspect.

  7. Statistically advanced, self-similar, radial probability density functions of atmospheric and under-expanded hydrogen jets

    NASA Astrophysics Data System (ADS)

    Ruggles, Adam J.

    2015-11-01

    This paper presents improved statistical insight regarding the self-similar scalar mixing process of atmospheric hydrogen jets and the downstream region of under-expanded hydrogen jets. Quantitative planar laser Rayleigh scattering imaging is used to probe both jets. The self-similarity of statistical moments up to the sixth order (beyond the literature established second order) is documented in both cases. This is achieved using a novel self-similar normalization method that facilitated a degree of statistical convergence that is typically limited to continuous, point-based measurements. This demonstrates that image-based measurements of a limited number of samples can be used for self-similar scalar mixing studies. Both jets exhibit the same radial trends of these moments demonstrating that advanced atmospheric self-similarity can be applied in the analysis of under-expanded jets. Self-similar histograms away from the centerline are shown to be the combination of two distributions. The first is attributed to turbulent mixing. The second, a symmetric Poisson-type distribution centered on zero mass fraction, progressively becomes the dominant and eventually sole distribution at the edge of the jet. This distribution is attributed to shot noise-affected pure air measurements, rather than a diffusive superlayer at the jet boundary. This conclusion is reached after a rigorous measurement uncertainty analysis and inspection of pure air data collected with each hydrogen data set. A threshold based upon the measurement noise analysis is used to separate the turbulent and pure air data, and thus estimate intermittency. Beta-distributions (four parameters) are used to accurately represent the turbulent distribution moments. This combination of measured intermittency and four-parameter beta-distributions constitutes a new, simple approach to model scalar mixing. Comparisons between global moments from the data and moments calculated using the proposed model show excellent
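
    The four-parameter beta fit mentioned above maps directly onto scipy's beta distribution, whose fit method returns the two shape parameters plus location and scale. The sketch below uses synthetic data rather than the Rayleigh-scattering measurements.

```python
# Sketch of the modeling idea: fit a four-parameter beta distribution
# (shape a, shape b, location, scale) to the turbulent part of a
# mass-fraction sample; the data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
turbulent = stats.beta.rvs(2.0, 5.0, loc=0.0, scale=0.3, size=5000,
                           random_state=rng)

a, b, loc, scale = stats.beta.fit(turbulent)
model = stats.beta(a, b, loc=loc, scale=scale)
# Compare measured and modeled moments, as in the paper's validation
print(np.mean(turbulent), model.mean())
print(np.var(turbulent), model.var())
```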

  8. Exposure to Alcoholism in the Family: United States, 1988. Advance Data from Vital and Health Statistics of the National Center for Health Statistics. Number 205.

    ERIC Educational Resources Information Center

    Schoenborn, Charlotte A.

    This report is based on data from the 1988 National Health Interview Survey on Alcohol (NHIS-Alcohol), part of the ongoing National Health Interview Survey conducted by the National Center for Health Statistics. Interviews for the NHIS are conducted in person by staff of the United States Bureau of the Census. Information is collected on each…

  9. Craniospinal Irradiation Techniques: A Dosimetric Comparison of Proton Beams With Standard and Advanced Photon Radiotherapy

    SciTech Connect

    Yoon, Myonggeun; Shin, Dong Ho; Kim, Jinsung; Kim, Jong Won; Kim, Dae Woong; Park, Sung Yong; Lee, Se Byeong; Kim, Joo Young; Park, Hyeon-Jin; Park, Byung Kiu; Shin, Sang Hoon

    2011-11-01

    Purpose: To evaluate the dosimetric benefits of advanced radiotherapy techniques for craniospinal irradiation in children with cancer. Methods and Materials: Craniospinal irradiation (CSI) using three-dimensional conformal radiotherapy (3D-CRT), tomotherapy (TOMO), and proton beam treatment (PBT) in the scattering mode was planned for each of 10 patients at our institution. Dosimetric benefits and organ-specific radiation-induced cancer risks were based on comparisons of dose-volume histograms (DVHs) and on the application of organ equivalent doses (OEDs), respectively. Results: When we analyzed the organ-at-risk volumes that received 30%, 60%, and 90% of the prescribed dose (PD), we found that PBT was superior to TOMO and 3D-CRT. On average, the doses delivered by PBT to the esophagus, stomach, liver, lung, pancreas, and kidney were 19.4 Gy, 0.6 Gy, 0.3 Gy, 2.5 Gy, 0.2 Gy, and 2.2 Gy for the PD of 36 Gy, respectively, which were significantly lower than the doses delivered by TOMO (22.9 Gy, 4.5 Gy, 6.1 Gy, 4.0 Gy, 13.3 Gy, and 4.9 Gy, respectively) and 3D-CRT (34.6 Gy, 3.6 Gy, 8.0 Gy, 4.6 Gy, 22.9 Gy, and 4.3 Gy, respectively). Although the average doses delivered by PBT to the chest and abdomen were significantly lower than those of 3D-CRT or TOMO, these differences were reduced in the head-and-neck region. OED calculations showed that the risk of secondary cancers in organs such as the stomach, lungs, thyroid, and pancreas was much higher when 3D-CRT or TOMO was used than when PBT was used. Conclusions: Compared with photon techniques, PBT showed improvements in most dosimetric parameters for CSI patients, with lower OEDs to organs at risk.

  10. Application of Energy Integration Techniques to the Design of Advanced Life Support Systems

    NASA Technical Reports Server (NTRS)

    Levri, Julie; Finn, Cory

    2000-01-01

    Exchanging heat between hot and cold streams within an advanced life support system can save energy. These savings reduce the equivalent system mass (ESM) of the system. Different system configurations are examined under steady-state conditions for various percentages of food growth and waste treatment. The scenarios investigated represent possible design options for a Mars reference mission. Reference mission definitions are drawn from the ALSS Modeling and Analysis Reference Missions Document, which includes definitions for space station evolution, Mars landers, and a Mars base. For each scenario, streams requiring heating or cooling are identified and characterized by mass flow, supply and target temperatures, and heat capacities. The Pinch Technique is applied to identify good matches for energy exchange between the hot and cold streams and to calculate the minimum external heating and cooling requirements for the system. For each pair of hot and cold streams that are matched, there is a reduction in the amount of external heating and cooling required, and the original heating and cooling equipment is replaced with a heat exchanger. The net cost savings can be either positive or negative for each stream pairing, and the priority for implementing each pairing can be ranked according to its potential cost savings. Using the Pinch Technique, a complete system heat exchange network is developed and heat exchangers are sized to allow for calculation of ESM. The energy-integrated design typically has a lower total ESM than the original design with no energy integration. A comparison of ESM savings in each of the scenarios is made to direct future Pinch Analysis efforts.
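
    The minimum-utility calculation at the heart of the Pinch Technique can be sketched with a small problem-table algorithm, as below. The streams, heat-capacity flowrates and minimum approach temperature are illustrative assumptions, not values from the reference missions.

```python
# Problem-table sketch of the Pinch Technique: compute minimum external
# heating and cooling for assumed hot/cold streams and approach temperature.
dT_min = 10.0
# (supply T, target T, CP [kW/K]); hot streams cool, cold streams heat
hot = [(150.0, 60.0, 2.0), (90.0, 60.0, 8.0)]
cold = [(20.0, 125.0, 2.5), (25.0, 100.0, 3.0)]

# Shifted temperatures: hot streams down by dT_min/2, cold streams up by dT_min/2
bounds = sorted({t - dT_min / 2 for s in hot for t in s[:2]} |
                {t + dT_min / 2 for s in cold for t in s[:2]}, reverse=True)

heat_cascade, surplus = [0.0], 0.0
for hi, lo in zip(bounds, bounds[1:]):
    dT = hi - lo
    # streams present in this shifted interval
    cp_hot = sum(cp for s, t, cp in hot
                 if s - dT_min / 2 >= hi and t - dT_min / 2 <= lo)
    cp_cold = sum(cp for s, t, cp in cold
                  if t + dT_min / 2 >= hi and s + dT_min / 2 <= lo)
    surplus += (cp_hot - cp_cold) * dT
    heat_cascade.append(surplus)

q_hot_min = max(0.0, -min(heat_cascade))    # minimum external heating [kW]
q_cold_min = heat_cascade[-1] + q_hot_min   # minimum external cooling [kW]
print(q_hot_min, q_cold_min)
```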

  11. Differences in paleomagnetic interpretations due to the choice of statistical, demagnetization and correction techniques: Kapuskasing Structural Zone, northern Ontario, Canada

    NASA Astrophysics Data System (ADS)

    Borradaile, Graham J.; Werner, Tomasz; Lagroix, France

    2003-02-01

    The Kapuskasing Structural Zone (KSZ) reveals a section through the Archean lower crustal granoblastic gneisses. Our new paleomagnetic data largely agree with previous work but we show that interpretations vary according to the choices of statistical, demagnetization and field-correction techniques. First, where the orientation distribution of characteristic remanence directions on the sphere is not symmetrically circular, the commonly used statistical model is invalid [Fisher, R.A., Proc. R. Soc. A217 (1953) 295]. Any tendency to form an elliptical distribution indicates that the sample is drawn from a Bingham-type population [Bingham, C., 1964. Distributions on the sphere and on the projective plane. PhD thesis, Yale University]. Fisher and Bingham statistics produce different confidence estimates from the same data and the traditionally defined mean vector may differ from the maximum eigenvector of an orthorhombic Bingham distribution. It seems prudent to apply both models wherever a non-Fisher population is suspected and that may be appropriate in any tectonized rocks. Non-Fisher populations require larger sample sizes so that focussing on individual sites may not be the most effective policy in tectonized rocks. More dispersed sampling across tectonic structures may be more productive. Second, from the same specimens, mean vectors isolated by thermal and alternating field (AF) demagnetization differ. Which treatment gives more meaningful results is difficult to decipher, especially in metamorphic rocks where the history of the magnetic minerals is not easily related to the ages of tectonic and petrological events. In this study, thermal demagnetization gave lower inclinations for paleomagnetic vectors and thus more distant paleopoles. Third, of more parochial significance, tilt corrections may be unnecessary in the KSZ because magnetic fabrics and thrust ramp are constant in orientation to the depth at which they level off, at approximately 15-km depth. With
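
    For reference, the Fisher statistics at issue can be computed in a few lines: the mean direction, the precision parameter kappa, and the alpha-95 confidence cone. The directions below are synthetic; for elliptical orientation distributions the abstract's point is precisely that these estimates can mislead and Bingham statistics should be computed alongside them.

```python
# Hedged sketch of Fisher (1953) statistics for a set of unit vectors
# (synthetic stand-ins for characteristic remanence directions).
import numpy as np

rng = np.random.default_rng(5)
# Synthetic directions clustered about the +z axis
v = rng.normal([0, 0, 1], 0.1, size=(20, 3))
v /= np.linalg.norm(v, axis=1, keepdims=True)

n = len(v)
R = np.linalg.norm(v.sum(axis=0))             # resultant vector length
mean_dir = v.sum(axis=0) / R                  # Fisher mean direction
kappa = (n - 1) / (n - R)                     # precision parameter estimate
alpha95 = np.degrees(np.arccos(
    1 - (n - R) / R * ((1 / 0.05) ** (1 / (n - 1)) - 1)))
print(mean_dir, kappa, alpha95)
```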

  12. Advanced Treatment Monitoring for Olympic-Level Athletes Using Unsupervised Modeling Techniques

    PubMed Central

    Siedlik, Jacob A.; Bergeron, Charles; Cooper, Michael; Emmons, Russell; Moreau, William; Nabhan, Dustin; Gallagher, Philip; Vardiman, John P.

    2016-01-01

    Context Analysis of injury and illness data collected at large international competitions provides the US Olympic Committee and the national governing bodies for each sport with information to best prepare for future competitions. Research in which authors have evaluated medical contacts to provide the expected level of medical care and sports medicine services at international competitions is limited. Objective To analyze the medical-contact data for athletes, staff, and coaches who participated in the 2011 Pan American Games in Guadalajara, Mexico, using unsupervised modeling techniques to identify underlying treatment patterns. Design Descriptive epidemiology study. Setting Pan American Games. Patients or Other Participants A total of 618 US athletes (337 males, 281 females) participated in the 2011 Pan American Games. Main Outcome Measure(s) Medical data were recorded from the injury-evaluation and injury-treatment forms used by clinicians assigned to the central US Olympic Committee Sport Medicine Clinic and satellite locations during the operational 17-day period of the 2011 Pan American Games. We used principal components analysis and agglomerative clustering algorithms to identify and define grouped modalities. Lift statistics were calculated for within-cluster subgroups. Results Principal component analyses identified 3 components, accounting for 72.3% of the variability in datasets. Plots of the principal components showed that individual contacts focused on 4 treatment clusters: massage, paired manipulation and mobilization, soft tissue therapy, and general medical. Conclusions Unsupervised modeling techniques were useful for visualizing complex treatment data and provided insights for improved treatment modeling in athletes. Given its ability to detect clinically relevant treatment pairings in large datasets, unsupervised modeling should be considered a feasible option for future analyses of medical-contact data from international competitions. PMID
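
    A lift statistic of the kind calculated for within-cluster subgroups compares the observed co-occurrence of two treatment modalities with the rate expected under independence. The sketch below uses a synthetic contact-by-modality matrix, not the Games data.

```python
# Sketch of a lift statistic for co-occurring treatment modalities; the
# binary contact-by-modality data here are synthetic.
import numpy as np

rng = np.random.default_rng(6)
# 200 medical contacts; True = modality used in that contact
manipulation = rng.random(200) < 0.4
mobilization = np.where(manipulation, rng.random(200) < 0.7,
                        rng.random(200) < 0.2)

p_a = manipulation.mean()
p_b = mobilization.mean()
p_ab = (manipulation & mobilization).mean()
lift = p_ab / (p_a * p_b)   # > 1 means the pair co-occurs more than chance
print(lift)
```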

  13. Advanced techniques and painless procedures for nonlinear contact analysis and forming simulation via implicit FEM

    NASA Astrophysics Data System (ADS)

    Zhuang, Shoubing

    2013-05-01

    Nonlinear contact analysis, including forming simulation, via finite element methods has crucial and practical applications in many engineering fields. However, because of its high nonlinearity, nonlinear contact analysis remains an extremely challenging obstacle for many industrial applications. The implicit finite element scheme is generally more accurate than the explicit scheme, but it has a known convergence challenge arising from complex geometries, large relative motion and rapid contact state changes. Diagnosing the convergence issues of nonlinear contact can be a very painful process. Most complicated contact models have a great many contact surfaces, and it is hard to define the contact pairs well using common contact definition methods, which either result in hundreds of contact pairs or are time-consuming. This paper presents advanced techniques for nonlinear contact analysis and forming simulation via the implicit finite element scheme and the penalty method. The calculation of the default automatic contact stiffness is addressed. Furthermore, this paper presents the idea of selection groups to help define contact pairs easily and efficiently for complicated contact analysis, and discusses the corresponding implementation and usage. Lastly, typical nonlinear contact models and forming models with nonlinear material models are shown to demonstrate the presented methods and technologies.
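
    The penalty idea referenced above can be reduced to a one-node toy problem: penetration of a rigid wall is resisted by a stiff penalty spring, and Newton iteration drives the residual to zero. All stiffness and load values are assumptions chosen for illustration.

```python
# Toy penalty-method contact: a single node pushed toward a rigid wall.
# The penalty stiffness plays the role of the automatic contact stiffness
# discussed in the paper; this is an illustration, not the paper's solver.
k_spring = 100.0      # structural stiffness [N/m]
f_ext = 250.0         # applied force pushing the node toward the wall [N]
gap0 = 2.0            # initial gap to the rigid wall [m]
k_pen = 1e5           # penalty (contact) stiffness, assumed value

u = 0.0
for _ in range(50):
    pen = max(0.0, u - gap0)                 # penetration depth
    residual = f_ext - k_spring * u - k_pen * pen
    tangent = k_spring + (k_pen if pen > 0.0 else 0.0)
    du = residual / tangent                  # Newton update
    u += du
    if abs(du) < 1e-10:
        break
print(u, max(0.0, u - gap0))  # displacement and residual penetration
```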

  14. Simultaneous evaluation of prepulse inhibition with EMG and EEG using advanced artifact removal techniques.

    PubMed

    Fraga, Francisco J; Noya, Claudemiro V; Zimiani, Maria I; Avila, Milton A; Shuhama, Rosana; Del-Ben, Cristina M; Menezes, Paulo R; Martin, Rodrigo S; Salum, Cristiane

    2016-08-01

    Prepulse inhibition (PPI) consists of a reduction of the acoustic startle reflex (SR) magnitude (measured with EMG) when a startling stimulus is preceded by a non-startling one. This behavior has been extensively investigated in studies related to schizophrenia, since sensory-motor deficit plays a central role in its pathophysiology. However, the same auditory stimuli that trigger the SR also provoke intense auditory evoked responses (AEP), which can be measured with EEG. Comparing these two types of responses, acquired simultaneously, is a great opportunity to investigate the dependence and interdependence of their neural pathways. Nonetheless, so far very few studies have dared to perform such simultaneous recordings, because SR produces strong eye blinks and muscle contraction artifacts that contaminate EEG electrodes placed on the scalp. In this study we investigated the possibility of simultaneously obtaining both the acoustic SR (using EMG) and the AEP (using EEG) measures, through the use of advanced artifact removal techniques, to better characterize PPI in healthy humans.
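
    One widely used family of advanced artifact-removal techniques is independent component analysis (ICA); the abstract does not state which method the study used, so ICA here is an assumption. The sketch below unmixes a synthetic two-channel recording, drops the burst-like (EMG/blink-like) component, and reconstructs the cleaned channels.

```python
# Hedged ICA sketch: remove a burst-like artifact component from synthetic
# two-channel "EEG". Whether the study used ICA is not stated in the abstract.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 1000)
neural = np.sin(2 * np.pi * 10 * t)               # 10 Hz "EEG" source
artifact = (t > 0.5) * rng.normal(size=1000) * 5  # burst-like "EMG" source
mixing = np.array([[1.0, 0.8], [0.6, 1.0]])
X = np.column_stack([neural, artifact]) @ mixing.T  # two contaminated channels

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(X)                          # unmixed components

# Identify the artifact component by its burstiness (kurtosis proxy) and
# reconstruct the channels without it.
art = np.argmax(np.mean(S**4, axis=0) / np.mean(S**2, axis=0) ** 2)
S_clean = S.copy()
S_clean[:, art] = 0.0
X_clean = ica.inverse_transform(S_clean)
print(X_clean.shape)
```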

  15. Analysis of deformation patterns through advanced DINSAR techniques in Istanbul megacity

    NASA Astrophysics Data System (ADS)

    Balik Sanli, F.; Calò, F.; Abdikan, S.; Pepe, A.; Gorum, T.

    2014-09-01

    As a result of Turkey's economic growth and heavy migration from rural areas, Istanbul has experienced a high urbanization rate, with severe impacts on the environment in terms of pressure on natural resources, land-cover changes and uncontrolled sprawl. As a consequence, the city has become extremely vulnerable to natural and man-made hazards, which induce ground deformation phenomena that threaten buildings and infrastructure and often cause significant socio-economic losses. Therefore, the detection and monitoring of such deformation patterns is of primary importance for hazard and risk assessment as well as for the design and implementation of effective mitigation strategies. The aim of this work is to analyze the spatial distribution and temporal evolution of deformations affecting the Istanbul metropolitan area by exploiting advanced Differential SAR Interferometry (DInSAR) techniques. In particular, we apply the Small BAseline Subset (SBAS) approach to a dataset of 43 TerraSAR-X images acquired between November 2010 and June 2012 along descending orbits, with an 11-day revisit time and a 3 m × 3 m spatial resolution. The SBAS processing allowed us to remotely detect and monitor subsidence patterns over the whole urban area as well as to provide detailed information at the scale of the single building. Such SBAS measurements, effectively integrated with ground-based monitoring data and thematic maps, allow us to explore the relationship between the detected deformation phenomena and urbanization, contributing to improved urban planning and management.

  16. Nanocasting technique to prepare lotus-leaf-like superhydrophobic electroactive polyimide as advanced anticorrosive coatings.

    PubMed

    Chang, Kung-Chin; Lu, Hsin-I; Peng, Chih-Wei; Lai, Mei-Chun; Hsu, Sheng-Chieh; Hsu, Min-Hsiang; Tsai, Yuan-Kai; Chang, Chi-Hao; Hung, Wei-I; Wei, Yen; Yeh, Jui-Ming

    2013-02-01

    A nanocasting technique was used to obtain a biomimetic superhydrophobic electroactive polyimide (SEPI) surface structure from a natural Xanthosoma sagittifolium leaf. An electroactive polyimide (EPI) was first synthesized through thermal imidization. An impression of the superhydrophobic Xanthosoma sagittifolium leaf was then nanocasted onto the surface of the EPI so that the resulting EPI was superhydrophobic and would prevent corrosion. Polydimethylsiloxane (PDMS) was then used as a negative template to transfer the impression of the superhydrophobic surface of the biomimetic EPI onto a cold-rolled steel (CRS) electrode. The superhydrophobic electroactive material could be used as an advanced coating that protects metals against corrosion. The morphology of the surface of the as-synthesized SEPI coating was investigated using scanning electron microscopy (SEM). The surface showed numerous micromastoids, each decorated with many nanowrinkles. The water contact angle (CA) for the SEPI coating was 155°, which was significantly larger than that for the EPI coating (i.e., CA = 87°). The significant increase in the contact angle indicated that the biomimetic morphology effectively repelled water. Potentiodynamic and electrochemical impedance spectroscopic measurements indicated that the SEPI coating offered better protection against corrosion than the EPI coating did.

  17. Recent Advance in Liquid Chromatography/Mass Spectrometry Techniques for Environmental Analysis in Japan

    PubMed Central

    Suzuki, Shigeru

    2014-01-01

    The techniques and measurement methods developed in the Environmental Survey and Monitoring of Chemicals by Japan’s Ministry of the Environment, as well as the large amount of knowledge archived in the survey, have advanced environmental analysis. Recently, technologies such as non-target liquid chromatography/high-resolution mass spectrometry and liquid chromatography with microbore columns have further developed the field. Here, the general strategy of a method developed for the liquid chromatography/mass spectrometry (LC/MS) analysis of environmental chemicals is presented with a brief description. A non-target analysis for the identification of environmental pollutants using a provisional fragment database and “MsMsFilter,” an elemental composition elucidation tool, is also presented. This analytical method is shown to be highly effective in the identification of a model chemical, the pesticide Bendiocarb. Our improved micro-liquid chromatography injection system showed substantially enhanced sensitivity to perfluoroalkyl substances, with peak areas 32–71 times larger than those observed in conventional LC/MS. PMID:26819891

  18. The development of optical microscopy techniques for the advancement of single-particle studies

    SciTech Connect

    Marchuk, Kyle

    2013-05-15

    Single particle orientation and rotational tracking (SPORT) has recently become a powerful optical microscopy tool that can expose many molecular motions. Unfortunately, no single microscopy technique can yet decipher all particle motions under all environmental conditions, so current technologies have limitations. In this work, the two powerful microscopy tools of total internal reflection and interferometry are advanced to determine the position, orientation, and optical properties of metallic nanoparticles in a variety of environments. Total internal reflection is an optical phenomenon that has been applied to microscopy to produce either fluorescent or scattered light. The non-invasive far-field imaging technique is coupled with a near-field illumination scheme that allows for better axial resolution than confocal microscopy and epi-fluorescence microscopy. By controlling the incident illumination angle using total internal reflection fluorescence (TIRF) microscopy, a new type of imaging probe called “non-blinking” quantum dots (NBQDs) was super-localized in the axial direction to sub-10-nm precision. These particles were also used to study the rotational motion of microtubules being propelled by the motor protein kinesin across the substrate surface. The same instrument was modified to function under total internal reflection scattering (TIRS) microscopy to study metallic anisotropic nanoparticles and their dynamic interactions with synthetic lipid bilayers. Utilizing two illumination lasers with opposite polarization directions at wavelengths corresponding to the short- and long-axis surface plasmon resonance (SPR) of the nanoparticles, both the in-plane and out-of-plane movements of many particles could be tracked simultaneously. When combined with Gaussian point spread function (PSF) fitting for particle super-localization, the binding status and rotational movement could be resolved without degeneracy. TIRS microscopy was also used to

  19. Application of multivariate statistical techniques in the assessment of water quality in the Southwest New Territories and Kowloon, Hong Kong.

    PubMed

    Zhang, Xuan; Wang, Qishan; Liu, Yanfang; Wu, Jing; Yu, Miao

    2011-02-01

    The application of different multivariate statistical techniques for the interpretation of a complex data matrix obtained during 2000-2007 from the watercourses in the Southwest New Territories and Kowloon, Hong Kong was presented in this study. The data set consisted of the analytical results of 23 parameters measured monthly at 16 different sampling sites. Hierarchical cluster analysis grouped the 12 months into two periods and the 16 sampling sites into three groups based on similarity in water quality characteristics. Discriminant analysis (DA) provided better results both temporally and spatially. DA also offered an important data reduction as it only used four parameters for temporal analysis, affording 84.2% correct assignations, and eight parameters for spatial analysis, affording 96.1% correct assignations. Principal component analysis/factor analysis identified four latent factors standing for organic pollution, industrial pollution, nonpoint pollution, and fecal pollution, respectively. KN1, KN4, KN5, and KN7 were greatly affected by organic pollution, industrial pollution, and nonpoint pollution. The main pollution sources of TN1 and TN2 were organic pollution and nonpoint pollution, respectively. Industrial pollution had high effect on TN3, TN4, TN5, and TN6.

  20. Use of statistical and GIS techniques to assess and predict concentrations of heavy metals in soils of Lahore City, Pakistan.

    PubMed

    Alam, Nayab; Ahmad, Sajid Rashid; Qadir, Abdul; Ashraf, Muhammad Imran; Lakhan, Calvin; Lakhan, V Chris

    2015-10-01

    Soils from different land use areas in Lahore City, Pakistan, were analyzed for concentrations of the heavy metals cadmium (Cd), chromium (Cr), nickel (Ni), and lead (Pb). One hundred one samples were randomly collected from six land use areas categorized as park, commercial, agricultural, residential, urban, and industrial. Each sample was analyzed in the laboratory with the tri-acid digestion method. Metal concentrations in each sample were obtained with an atomic absorption spectrophotometer. The statistical techniques of analysis of variance, correlation analysis, and cluster analysis were used to analyze all data. In addition, kriging, a geostatistical procedure supported by ArcGIS, was used to model and predict the spatial concentrations of the four heavy metals (Cd, Cr, Ni, and Pb). The results demonstrated significant correlations among the heavy metals in the urban and industrial areas. The dendrogram and the associated cluster analysis results indicated that the agricultural, commercial, and park areas had high concentrations of Cr, Ni, and Pb. High concentrations of Cd and Ni were also observed in the residential and industrial areas, respectively. The maximum concentrations of both Cd and Pb exceeded world toxic limit values. The kriging method demonstrated increasing spatial diffusion of both Cd and Pb concentrations throughout and beyond the Lahore City area.
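
    The kriging step can be illustrated with a hand-rolled ordinary-kriging system (the study itself used ArcGIS). The sample locations, concentrations and exponential covariance model below are synthetic assumptions.

```python
# Minimal ordinary-kriging sketch: predict a heavy-metal concentration at an
# unsampled point from scattered samples, using an assumed exponential
# covariance model. Data and variogram parameters are synthetic.
import numpy as np

rng = np.random.default_rng(8)
pts = rng.uniform(0, 10, size=(30, 2))          # sample locations [km]
z = 5 + 0.3 * pts[:, 0] + rng.normal(scale=0.5, size=30)  # e.g. Pb [mg/kg]

def cov(h, sill=1.0, corr_len=3.0):
    return sill * np.exp(-h / corr_len)         # exponential covariance model

d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
n = len(pts)
# Ordinary-kriging system with a Lagrange multiplier for unbiasedness
A = np.ones((n + 1, n + 1))
A[:n, :n] = cov(d)
A[n, n] = 0.0

target = np.array([5.0, 5.0])
b = np.ones(n + 1)
b[:n] = cov(np.linalg.norm(pts - target, axis=1))

w = np.linalg.solve(A, b)
print(w[:n] @ z)                                # kriged estimate at target
```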

  1. Assessment of Water Quality in a Subtropical Alpine Lake Using Multivariate Statistical Techniques and Geostatistical Mapping: A Case Study

    PubMed Central

    Liu, Wen-Cheng; Yu, Hwa-Lung; Chung, Chung-En

    2011-01-01

    Concerns about the water quality in Yuan-Yang Lake (YYL), a shallow, subtropical alpine lake located in north-central Taiwan, have been rapidly increasing recently due to natural and anthropogenic pollution. In order to understand the underlying physical and chemical processes as well as their associated spatial distribution in YYL, this study analyzes fourteen physico-chemical water quality parameters recorded at eight sampling stations during 2008–2010 using multivariate statistical techniques and a geostatistical method. Hierarchical clustering analysis (CA) is first applied to distinguish three general water quality patterns among the stations, followed by the use of principal component analysis (PCA) and factor analysis (FA) to extract and recognize the major underlying factors contributing to the variations among the water quality measures. The spatial distribution of the identified major contributing factors is obtained by using a kriging method. Results show that four principal components, i.e., nitrogen nutrients, a meteorological factor, turbidity, and nitrate factors, account for 65.52% of the total variance among the water quality parameters. The spatial distribution of the principal components further confirms that nitrogen sources constitute an important pollutant contribution in YYL. PMID:21695032

  2. APPLICATION OF ADVANCED IN VITRO TECHNIQUES TO MEASURE, UNDERSTAND AND PREDICT THE KINETICS AND MECHANISMS OF XENOBIOTIC METABOLISM

    EPA Science Inventory

    We have developed a research program in metabolism that involves numerous collaborators across EPA as well as other federal and academic labs. A primary goal is to develop and apply advanced in vitro techniques to measure, understand and predict the kinetics and mechanisms of xen...

  3. Atmospheric statistics for aerospace vehicle operations

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Batts, G. W.

    1993-01-01

    Statistical analysis of atmospheric variables was performed for the Shuttle Transportation System (STS) design trade studies and the establishment of launch commit criteria. Atmospheric constraint statistics have been developed for the NASP test flight, the Advanced Launch System, and the National Launch System. The concepts and analysis techniques discussed in the paper are applicable to the design and operations of any future aerospace vehicle.

  4. Landslide detection and long-term monitoring in urban area by means of advanced interferometric techniques

    NASA Astrophysics Data System (ADS)

    Cigna, Francesca; Del Ventisette, Chiara; Liguori, Vincenzo; Casagli, Nicola

    2010-05-01

    This work aims at illustrating the potential of advanced interferometric techniques for the detection and long-term monitoring of landslide ground deformations at the local scale. Space-borne InSAR (Synthetic Aperture Radar Interferometry) has been successfully exploited in recent years to measure ground deformations associated with processes with slow kinematics, such as landslides, tectonic motions, subsidence or volcanic activity, thanks to both the standard single-interferogram approach (centimeter accuracy) and advanced time-series analyses of long temporal stacks of radar satellite data (millimeter accuracy), such as Persistent Scatterers Interferometry (PSI) techniques. In order to get a complete overview and in-depth knowledge of an investigated landslide, InSAR satellite measurements can support conventional in situ data. This methodology allows studying the spatial pattern and temporal evolution of ground deformations, improving the spatial coverage and overcoming issues related to the installation of ground-based instrumentation and data acquisition in unstable areas. Here we describe the application of the above-mentioned methodology to the test area of Agrigento, Sicily (Italy), which is affected by hydrogeological risk. The town is located in southern Sicily, at the edge of the Apennine-Maghrebian thrust belt, on the Plio-Pleistocene and Miocene sediments of the Gela Nappe. Ground instabilities affect the urban area and involve the infrastructure of its NW side, such as the Cathedral, the Seminary and many private buildings. An integration of InSAR analyses and conventional field investigations (e.g. surveys of structural damage and fractures) was therefore carried out, to support Regional Civil Protection authorities in emergency management and risk mitigation. The results of the InSAR analysis highlighted a general stability of the whole urban area between 1992 and 2007. However, very high deformation rates (up to 10-12 mm/y) were identified in 1992-2000 in the W slope of the

  5. MicroRNA changes in advanced radiotherapy techniques and its effect to secondary cancers.

    PubMed

    Sert, Fatma

    2012-09-01

    MicroRNAs (miRNAs) are small, single-stranded RNAs, 21-25 nucleotides in length, produced from endogenous hairpin-shaped transcripts. Recent studies have revealed that hundreds of miRNAs are found in the human genome and are responsible for diverse cellular processes including the control of developmental timing, cell proliferation, apoptosis and tumorigenesis. In the case of DNA damage, miRNAs can initiate apoptosis, cell cycle arrest and aging by stimulating the tumor suppressor target gene p53 directly and indirectly. DNA damage is caused by multiple stress factors including ionizing radiation, reactive oxygen species, UV exposure and drugs such as doxorubicin and camptothecin. Radiation is used widely in health care, academia, and industry, and for producing electricity; as a result of this widespread use, environmental radiation exposure is increasing as well. High-dose radiation exposure causes DNA damage, ionizes the molecules of living cells, and accelerates malignant tumor formation. In radiotherapy applications, fields receiving high-dose radiation are evaluated in terms of adverse effects, therapeutic efficacy and secondary malignancies, and dose distributions are re-created when required. On the other hand, fields receiving low doses, and the doses to which the patient is exposed during simulation and/or portal imaging, are often overlooked. The changes in miRNA levels arising in low-dose radiation fields, and their effect on neoplastic processes in the cell, will be a pathfinder with respect to secondary cancers or second primary cancers. It has been shown that the miRNA level changes in low-dose fields, which are overlooked in daily practice because they do not result in acute or chronic side effects, differ from the miRNA level changes in high-dose fields. With the help of verifying these differences in the low-dose fields that occur in advanced radiation techniques

  6. Analysis of Volatile Compounds by Advanced Analytical Techniques and Multivariate Chemometrics.

    PubMed

    Lubes, Giuseppe; Goodarzi, Mohammad

    2017-03-17

    Smelling is one of the five senses, and it plays an important role in our everyday lives. Volatile compounds are, for example, characteristic of foods, where some of them can be perceived by humans because of their aroma. They have a great influence on consumers' decisions about whether or not to use a product. When a product has a strong, offensive aroma, many consumers might not appreciate it. On the contrary, soft and fresh natural aromas definitely increase the acceptance of a given product. These properties can drastically influence the economy; thus, it has been of great importance to manufacturers that the aroma of their food products is characterized by analytical means, providing a basis for further optimization processes. A lot of research has been devoted to this domain in order to link the quality of, e.g., a food to its aroma. By knowing the aromatic profile of a food, one can understand the nature of a given product, leading to the development of new products that are more acceptable to consumers. There are two ways to analyze volatiles: one is to use human senses and/or sensory instruments, and the other is based on advanced analytical techniques. This work focuses on the latter. Although simple, low-cost technology is an attractive research target in this domain, most of the data are generated with very high-resolution analytical instruments. Data gathered with different analytical instruments normally have broad, overlapping sensitivity profiles and require substantial data analysis. In this review, we address not only the application of chemometrics to aroma analysis but also the use of different analytical instruments in this field, highlighting the research needed for future focus.

  7. Advanced Multivariate Inversion Techniques for High Resolution 3D Geophysical Modeling (Invited)

    NASA Astrophysics Data System (ADS)

    Maceira, M.; Zhang, H.; Rowe, C. A.

    2009-12-01

    We focus on the development and application of advanced multivariate inversion techniques to generate a realistic, comprehensive, high-resolution 3D model of the seismic structure of the crust and upper mantle that satisfies several independent geophysical datasets. Building on previous joint inversion efforts using surface wave dispersion measurements, gravity data, and receiver functions, we have added a fourth dataset, seismic body wave P and S travel times, to the simultaneous joint inversion method. We present a 3D seismic velocity model of the crust and upper mantle of northwest China resulting from the simultaneous joint inversion of these four data types. Surface wave dispersion measurements are primarily sensitive to seismic shear-wave velocities, but at shallow depths it is difficult to obtain high-resolution velocities and to constrain the structure, owing to the depth-averaging of the more easily modeled, longer-period surface waves. Gravity inversions have the greatest resolving power at shallow depths, and they provide constraints on rock density variations. Moreover, while surface wave dispersion measurements are primarily sensitive to vertical shear-wave velocity averages, body wave receiver functions are sensitive to shear-wave velocity contrasts and vertical travel times. The addition of the fourth dataset, seismic travel-time data, helps to constrain the shear-wave velocities both vertically and horizontally in the model cells crossed by the ray paths. Incorporation of both P and S body wave travel times allows us to invert for both P and S velocity structure, capitalizing on empirical relationships between both wave types' seismic velocities and rock densities, thus eliminating the need for ad hoc assumptions regarding the Poisson ratios. Our new tomography algorithm is a modification of the Maceira and Ammon joint inversion code, in combination with the Zhang and Thurber TomoDD (double-difference tomography) program.
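
    The structure of such a simultaneous joint inversion can be sketched as a stacked, noise-weighted least-squares system in which several data types constrain one shared model vector. The sensitivity matrices below are random placeholders, not real surface-wave, gravity, receiver-function or travel-time kernels.

```python
# Schematic joint inversion: stack sensitivity matrices from two data types
# into one weighted least-squares system for a shared model vector.
import numpy as np

rng = np.random.default_rng(9)
m_true = rng.normal(size=20)            # "true" model perturbations

G1 = rng.normal(size=(50, 20))          # dataset 1 sensitivity kernel (placeholder)
G2 = rng.normal(size=(30, 20))          # dataset 2 sensitivity kernel (placeholder)
d1 = G1 @ m_true + rng.normal(scale=0.1, size=50)
d2 = G2 @ m_true + rng.normal(scale=0.5, size=30)

w1, w2 = 1.0 / 0.1, 1.0 / 0.5           # weight each dataset by its noise level
G = np.vstack([w1 * G1, w2 * G2])
d = np.concatenate([w1 * d1, w2 * d2])

m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
print(np.linalg.norm(m_est - m_true))   # recovery error of the joint solution
```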

  8. Advances in the regionalization approach: geostatistical techniques for estimating flood quantiles

    NASA Astrophysics Data System (ADS)

    Chiarello, Valentina; Caporali, Enrica; Matthies, Hermann G.

    2015-04-01

    The knowledge of peak flow discharges and associated floods is of primary importance in engineering practice for water resources planning and risk assessment. Streamflow characteristics are usually estimated from measurements of river discharge at stream gauging stations. However, the lack of observations at the site of interest, as well as measurement inaccuracies, inevitably leads to the need for predictive models. Regional analysis is a classical approach to estimating river flow characteristics at sites where little or no data exist. Specific techniques are needed to regionalize the hydrological variables over the area considered. Top-kriging, or topological kriging, is a kriging interpolation procedure that takes into account the geometric organization and structure of the hydrographic network, the catchment area, and the nested nature of catchments. The continuous processes in space defined for the point variables are represented by a variogram. In Top-kriging, the measurements are not point values but are defined over a non-zero catchment area. Top-kriging is applied here over the geographical space of the Tuscany Region, in Central Italy. The analysis is carried out on the discharge data of 57 consistent runoff gauges, recorded from 1923 to 2014. Top-kriging also provides an estimate of the prediction uncertainty in addition to the prediction itself. The results are validated using a cross-validation procedure implemented in the package rtop of the open-source statistical environment R, and are compared through different error measures. Top-kriging seems to perform better in nested and larger-scale catchments, but not for headwater catchments or where there is high variability among neighbouring catchments.
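
    For readers unfamiliar with the underlying machinery, the sketch below solves the ordinary (point-support) kriging system that Top-kriging generalizes to catchment supports. The variogram model, coordinates, and values are hypothetical, and production work would use the rtop package mentioned above.

```python
import numpy as np

# Simplified illustration: ordinary (point-support) kriging with an
# exponential variogram.  Top-kriging generalizes this by integrating the
# point variogram over catchment areas; all numbers here are made up.
def gamma(h, nugget=0.0, sill=1.0, rng_=50.0):
    return nugget + sill * (1.0 - np.exp(-h / rng_))   # exponential variogram

obs_xy = np.array([[10.0, 20.0], [40.0, 15.0], [25.0, 60.0], [70.0, 45.0]])
obs_z = np.array([3.2, 2.7, 4.1, 3.0])                 # e.g., flood quantiles
target = np.array([35.0, 35.0])

n = len(obs_z)
# Ordinary kriging system:  [Gamma 1; 1^T 0] [w; mu] = [gamma0; 1]
H = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
A = np.zeros((n + 1, n + 1))
A[:n, :n] = gamma(H)
A[:n, n] = A[n, :n] = 1.0
b = np.append(gamma(np.linalg.norm(obs_xy - target, axis=1)), 1.0)

sol = np.linalg.solve(A, b)
w, mu = sol[:n], sol[n]
z_hat = w @ obs_z
krig_var = w @ b[:n] + mu                              # prediction variance
print(f"estimate {z_hat:.2f}, kriging variance {krig_var:.3f}")
```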

  9. The Use of Multi-Component Statistical Techniques in Understanding Subduction Zone Arc Granitic Geochemical Data Sets

    NASA Astrophysics Data System (ADS)

    Pompe, L.; Clausen, B. L.; Morton, D. M.

    2015-12-01

    Multi-component statistical techniques and GIS visualization are emerging trends in understanding large data sets. Our research applies these techniques to a large igneous geochemical data set from southern California to better understand magmatic and plate tectonic processes. A set of 480 granitic samples collected by Baird from this area were analyzed for 39 geochemical elements. Of these samples, 287 are from the Peninsular Ranges Batholith (PRB) and 164 from part of the Transverse Ranges (TR). Principal component analysis (PCA) summarized the 39 variables into 3 principal components (PC) by matrix multiplication, interpreted for the PRB as follows: PC1, with about 30% of the variation, included mainly compatible elements and SiO2 and indicates extent of differentiation; PC2, with about 20% of the variation, included HFS elements and may indicate crustal contamination as usually identified by Sri; PC3, with about 20% of the variation, included mainly HRE elements and may indicate magma source depth, as often displayed using REE spider diagrams and possibly Sr/Y. Several elements did not fit well in any of the three components: Cr, Ni, U, and Na2O. For the PRB, the PC1 correlation with SiO2 was r=-0.85, the PC2 correlation with Sri was r=0.80, and the PC3 correlation with Gd/Yb was r=-0.76 and with Sr/Y was r=-0.66. Extending this method to the TR, the correlations were r=-0.85, -0.21, -0.06, and -0.64, respectively. A similar extent of correlation for both areas was visually evident using GIS interpolation. PC1 seems to do well at indicating differentiation index for both the PRB and TR and correlates very well with SiO2, Al2O3, MgO, FeO*, CaO, K2O, Sc, V, and Co, but poorly with Na2O and Cr. If the crustal component is represented by Sri, PC2 correlates well, and less expensively, with this indicator in the PRB, but not in the TR. Source depth has been related to the slope on REE spidergrams, and PC3 based on only the HREE and using the Sr/Y ratios gives a reasonable
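
    A minimal sketch of the PCA workflow described above follows, using synthetic stand-ins for the 480-sample by 39-element table. The steps (standardize, extract three components, correlate scores with an external index such as SiO2) mirror the analysis, but the data are random.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Sketch of the PCA workflow on synthetic stand-ins for the geochemical
# table (random values; the SiO2 relationship is contrived for the demo).
rng = np.random.default_rng(1)
X = rng.normal(size=(480, 39))                  # samples x geochemical variables
sio2 = X[:, 0] * 2.0 + rng.normal(size=480)     # pretend SiO2 tracks variable 0

Xs = StandardScaler().fit_transform(X)          # z-score each element first
pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)                      # PC1-PC3 sample scores

print("variance explained:", np.round(pca.explained_variance_ratio_, 3))
# Correlate each PC with an external index (here SiO2), as in the abstract
for i in range(3):
    r = np.corrcoef(scores[:, i], sio2)[0, 1]
    print(f"PC{i+1} vs SiO2: r = {r:+.2f}")
```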

  10. Advanced Sensing and Control Techniques to Facilitate Semi-Autonomous Decommissioning

    SciTech Connect

    Schalkoff, Robert J.

    1999-06-01

    This research is intended to advance the technology of semi-autonomous teleoperated robotics as applied to Decontamination and Decommissioning (D&D) tasks. Specifically, research leading to a prototype dual-manipulator mobile work cell is underway. This cell is supported and enhanced by computer vision, virtual reality and advanced robotics technology.

  11. An advanced technique for speciation of organic nitrogen in atmospheric aerosols

    NASA Astrophysics Data System (ADS)

    Samy, S.; Robinson, J.; Hays, M. D.

    2011-12-01

    threshold as water-soluble free AA, with an average concentration of 22 ± 9 ng m-3 (N=13). Following microwave-assisted gas-phase hydrolysis, the total AA concentration in the forest environment increased significantly (70 ± 35 ng m-3), and additional compounds (methionine, isoleucine) were detected above the reporting threshold. The ability to quantify AA in aerosol samples without derivatization reduces time-consuming preparation procedures while providing selective mass determination that eliminates potential interferences associated with traditional fluorescence detection. This step forward in precise mass determination, together with internal standardization, improves the confidence of compound identification. With the increasing focus on WSOC (including ON) characterization in the atmospheric science community, native detection by LC-MS (Q-TOF) will play a central role in determining the most direct approach to quantifying an increasing fraction of the co-extracted polar organic compounds. Method application for further characterization of atmospheric ON will be discussed. Reference: Samy, S., Robinson, J., and M.D. Hays. "An Advanced LC-MS (Q-TOF) Technique for the Detection of Amino Acids in Atmospheric Aerosols", Analytical and Bioanalytical Chemistry, 2011, DOI: 10.1007/s00216-011-5238-2
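
    As a generic illustration of the internal-standard quantification credited above (illustrative numbers, not the study's calibration), the calculation looks like:

```python
# Generic internal-standard quantification (illustrative numbers only):
# normalizing the analyte response by a co-injected labelled standard
# cancels matrix and injection-volume effects.
area_analyte = 1.85e5        # LC-MS peak area of the amino acid
area_istd = 2.10e5           # peak area of the internal standard
conc_istd = 50.0             # ng/mL spiked into the extract
rrf = 0.92                   # relative response factor from calibration

conc_analyte = (area_analyte / area_istd) * conc_istd / rrf
print(f"analyte concentration: {conc_analyte:.1f} ng/mL of extract")
```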

  12. Techniques Optimized for Reducing Instabilities in Advanced Nickel-Base Superalloys for Turbine Blades

    NASA Technical Reports Server (NTRS)

    MacKay, Rebecca A.; Locci, Ivan E.; Garg, Anita; Ritzert, Frank J.

    2002-01-01

    is a three-phase constituent composed of TCP and stringers of gamma phase in a matrix of gamma prime. An incoherent grain boundary separates the SRZ from the gamma/gamma-prime microstructure of the superalloy. The SRZ is believed to form as a result of local chemistry changes in the superalloy due to the application of the diffusion aluminide bondcoat. Locally high surface stresses also appear to promote the formation of the SRZ. Thus, techniques that change the local alloy chemistry or reduce surface stresses have been examined for their effectiveness in reducing SRZ. These SRZ-reduction steps are performed on the test specimen or the turbine blade before the bondcoat is applied. Stress-relief heat treatments developed at NASA Glenn have been demonstrated to significantly reduce the amount of SRZ that develops during subsequent high-temperature exposures. Stress-relief heat treatments reduce surface stresses by recrystallizing a thin surface layer of the superalloy. However, in alloys with a very high propensity to form SRZ, stress-relief heat treatments alone do not eliminate SRZ entirely. Thus, techniques that modify the local chemistry under the bondcoat have been emphasized and optimized successfully at Glenn. One such technique is carburization, which changes the local chemistry by forming submicron carbides near the surface of the superalloy. Detailed characterizations have demonstrated that the depth and uniform distribution of these carbides are enhanced when a stress-relief treatment and an appropriate surface preparation are employed in advance of the carburization treatment. Even in alloys that have the propensity to develop a continuous SRZ layer beneath the diffusion zone, the SRZ has been completely eliminated or reduced to low, manageable levels when this combination of techniques is utilized. Now that the techniques to mitigate SRZ have been established at Glenn, TCP phase formation is being emphasized in ongoing work under the UEET Program. The

  13. Investigation of Advanced Dose Verification Techniques for External Beam Radiation Treatment

    NASA Astrophysics Data System (ADS)

    Asuni, Ganiyu Adeniyi

    Intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT) have been introduced in radiation therapy to achieve highly conformal dose distributions around the tumour while minimizing dose to surrounding normal tissues. These techniques have increased the need for comprehensive quality assurance tests to verify that customized patient treatment plans are accurately delivered during treatment. In vivo dose verification, performed during treatment delivery, confirms that the actual dose delivered is the same as the prescribed dose, helping to reduce treatment delivery errors. In vivo measurements may be accomplished using entrance or exit detectors. The objective of this project is to investigate a novel entrance detector designed for in vivo dose verification. This thesis is separated into three main investigations, focusing on a prototype entrance transmission detector (TRD) developed by IBA Dosimetry, Germany. First, contaminant electrons generated by the TRD in a 6 MV photon beam were investigated using Monte Carlo (MC) simulation. This study demonstrates that modification of the contaminant electron model in the treatment planning system is required for accurate patient dose calculation in buildup regions when using the device. Second, the ability of the TRD to accurately measure dose from IMRT and VMAT was investigated by characterising the spatial resolution of the device. This was accomplished by measuring the point spread function, with further validation provided by MC simulation. Comparisons of measured and calculated doses show that the spatial resolution of the TRD allows for measurement of clinical IMRT fields within acceptable tolerance. Finally, a new general research tool was developed to perform MC simulations for VMAT and IMRT treatments, simultaneously tracking dose deposition in both the patient CT geometry and an arbitrary planar detector system, generalized to handle either entrance or exit orientations. It was
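
    A hedged sketch of the point-spread-function idea follows: the detector's reading can be modeled as the incident fluence convolved with its PSF, and the resulting edge broadening quantifies spatial resolution. A Gaussian PSF of hypothetical width is assumed here; this is not the TRD's measured response.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Hedged sketch: model a planar detector's reading as the incident fluence
# convolved with its point spread function (PSF).  A Gaussian PSF with a
# hypothetical width is assumed; this is not the TRD's measured response.
fluence = np.zeros((101, 101))
fluence[30:70, 40:60] = 1.0                  # idealized rectangular segment

psf_sigma_px = 3.0                           # assumed PSF width, pixels
measured = gaussian_filter(fluence, psf_sigma_px)

# Edge broadening along a central profile quantifies spatial resolution
profile = measured[50]
p20 = int(np.argmax(profile > 0.2 * profile.max()))
p80 = int(np.argmax(profile > 0.8 * profile.max()))
print(f"20-80% edge width: {p80 - p20} pixels")
```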

  14. Advanced imaging techniques II: using a compound microscope for photographing point-mount specimens

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Digital imaging technology has revolutionized the practice of photographing insects for scientific study. Herein described are lighting and mounting techniques designed for imaging micro-Hymenoptera. The techniques described here are applicable to all small insects, as well as other invertebrates. The ke...

  15. Advanced Techniques for Assessment of Postural and Locomotor Ataxia, Spatial Orientation, and Gaze Stability

    NASA Technical Reports Server (NTRS)

    Wall, Conrad, III

    1999-01-01

    and quantified. We are improving this situation by applying methodologies such as nonlinear orbital stability analysis to quantify responses, and by using multivariate statistical approaches to link the responses across separate tests. In this way we can exploit the available information and increase the ability to discriminate between normal and pathological responses. Measures of stability and orientation are compared with measures such as dynamic visual acuity and with balance function tests. The responses of normal human subjects and of patients having well-documented pathophysiologies are being characterized. When these studies are completed, we should have a clearer idea about normal and abnormal patterns of eye, head, and body movements during locomotion and their stability in a wide range of environments. We plan eventually to use this information to validate the efficacy of candidate neurovestibular and neuromuscular rehabilitative techniques. Some representative studies from this year are summarized.

  16. New Advanced Fabrication Technique for Millimeter-Wave Planar Components based on Fluororesin Substrates using Graft Polymerization

    NASA Astrophysics Data System (ADS)

    Ito, Naoki; Mase, Atsushi; Kogi, Yuichiro; Seko, Noriaki; Tamada, Masao; Sakata, Eiji

    2008-06-01

    As the importance of advanced millimeter-wave diagnostics increases, a reliable and accurate fabrication technique for high-performance devices and relevant components is essential. We describe a new, improved fabrication technique for millimeter-wave planar components, such as antennas, using low-loss fluororesin substrates. With conventional fabrication techniques, fragile adhesion between the copper foil and the fluororesin substrate, together with limited accuracy of the device pattern, has been the prime suspect in device failures. To solve these problems, surface treatment of fluororesin films by graft polymerization and a fabrication method using electro-fine-forming (EF2) are proposed. The peel adhesion strength between the metal and fluororesin films and the value of the dielectric constant of the fluororesin films before and after grafting are reported. A prototype antenna using conventional fluororesin substrates and grafted poly(tetrafluoroethylene) (PTFE) films produced using the EF2 fabrication technique is also introduced.

  17. The investigation of advanced remote sensing, radiative transfer and inversion techniques for the measurement of atmospheric constituents

    NASA Technical Reports Server (NTRS)

    Deepak, Adarsh; Wang, Pi-Huan

    1985-01-01

    This report documents a research program on developing space- and ground-based remote sensing techniques, performed during the period from December 15, 1977 to March 15, 1985. The program involved the application of sophisticated radiative transfer codes and inversion methods to various advanced remote sensing concepts for determining atmospheric constituents, particularly aerosols. It covers detailed discussions of the solar aureole technique for monitoring columnar aerosol size distribution; the multispectral limb scattered radiance and limb attenuated radiance (solar occultation) techniques; and the upwelling scattered solar radiance method for determining aerosol and gaseous characteristics. In addition, analytical models of aerosol size distribution, simulation studies of the limb solar aureole radiance technique, and the variability of ozone at high altitudes during satellite sunrise/sunset events are described in detail.

  18. Interim Guidance on the Use of SiteStat/GridStats and Other Army Corps of Engineers Statistical Techniques to Characterize Military Ranges

    EPA Pesticide Factsheets

    The purpose of this memorandum is to inform recipients of concerns regarding Army Corps of Engineers statistical techniques, to provide a list of installations and FWS where SiteStat/GridStats (SS/GS) have been used, and to provide direction on communicating with the public about the use of these 'tools' by USACE.

  19. The impulse resistance welding: A new technique for joining advanced thermoplastic composite parts

    SciTech Connect

    Arias, M.; Ziegmann, G.

    1996-12-31

    Welding is a joining technique suitable for thermoplastic composites. This paper presents the development of a new, fast joining technique based on the common resistance welding process. Heat is introduced into the heating area by electrical power pulses; the technique is therefore called Impulse Resistance Welding (IRW). The new technique is described and discussed, and its application is demonstrated by joining ribs to the skin of an aerodynamic spoiler part. The potential for automating the impulse resistance welding process is also shown. Carbon fibre/PEEK (APC-2/AS4) was selected as the material for both the skin and the rib.

  20. Groundwater quality assessment of the shallow aquifers west of the Nile Delta (Egypt) using multivariate statistical and geostatistical techniques

    NASA Astrophysics Data System (ADS)

    Masoud, Alaa A.

    2014-07-01

    Extensive urban, agricultural, and industrial expansion on the western fringe of the Nile Delta of Egypt has placed a heavy load on water supplies and led to groundwater quality deterioration. Documenting the spatial variation of groundwater quality and its controlling factors is vital to ensure sustainable water management and safe use. A comprehensive dataset of 451 shallow groundwater samples was collected in 2011 and 2012. On-site field measurements of total dissolved solids (TDS), electrical conductivity (EC), pH, and temperature, as well as lab-based analyses of the major and trace ionic components, were performed. Groundwater types were derived and the suitability for irrigation use was evaluated. Multivariate statistical techniques of factor analysis and K-means clustering were integrated with geostatistical semi-variogram modeling to evaluate the spatial hydrochemical variations and their driving factors, as well as for hydrochemical pattern recognition. Most hydrochemical parameters showed very wide ranges: TDS (201-24,400 mg/l), pH (6.72-8.65), Na+ (28.30-7774 mg/l), and Cl- (7-12,186 mg/l), suggesting complex hydrochemical processes with multiple sources. TDS violated the limit (1200 mg/l) of the Egyptian standards for drinking water quality in many localities. Extreme concentrations of Fe2+, Mn2+, Zn2+, Cu2+, and Ni2+ are mostly related to their natural content in the water-bearing sediments and/or to contamination from industrial leakage. Very high nitrate concentrations exceeding the permissible limit (50 mg/l) were maximized toward hydrologic discharge zones and related to wastewater leakage. Three main water types, NaCl (29%), Na2SO4 (26%), and NaHCO3 (20%), formed 75% of the groundwater and dominated, respectively, in the saline depressions, on the sloping sides of the coastal ridges of the depressions, and in the cultivated/newly reclaimed lands intensely covered by irrigation canals. Water suitability for irrigation use clarified that the
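
    A minimal sketch of the multivariate workflow (factor analysis for the driving factors, K-means for pattern recognition) is shown below on synthetic stand-ins for the 451-sample hydrochemical table; the parameter list and data are illustrative only.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import KMeans

# Sketch on synthetic stand-ins for the hydrochemical table (random values;
# the column list is illustrative, not the study's full parameter set).
rng = np.random.default_rng(7)
params = ["TDS", "EC", "pH", "Na", "Cl", "SO4", "HCO3", "NO3", "Fe", "Mn"]
X = rng.lognormal(mean=1.0, sigma=0.6, size=(451, len(params)))

Xs = StandardScaler().fit_transform(np.log10(X))   # log-transform, then z-score
fa = FactorAnalysis(n_components=3).fit(Xs)        # driving hydrochemical factors
scores = fa.transform(Xs)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)
for k in range(3):
    print(f"cluster {k}: {np.sum(km.labels_ == k)} samples")  # water groups
```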

  1. [Advancement of colloidal gold chromatographic technique in screening of ochratoxin A].

    PubMed

    Zhou, Wei-lu; Wang, Yu-ting; Kong, Wei-jun; Yang, Mei-hua; Zhao, Ming; Ou-Yang, Zhen

    2015-08-01

    Ochratoxin A (OTA) is a toxic secondary metabolite mainly produced by Aspergillus and Penicillium species and found in a variety of foodstuffs and Chinese medicines. OTA is difficult to detect in practice because of characteristics such as its trace-level occurrence, toxicity, and presence in complex matrices. Among the numerous detection technologies, colloidal gold chromatographic techniques are highly sensitive, specific, cost-effective, and user-friendly, and are being used increasingly for OTA screening. Recently, with the development of aptamer technology and its application to chromatographic techniques, a new colloidal gold aptamer chromatographic technique has been developed. This review elaborates the structures and principles of both the traditional and the new colloidal gold chromatographic techniques, focuses on the new colloidal gold aptamer chromatographic technique, and summarizes and compares their use in the rapid detection of OTA. Finally, to provide a reference for future work in this area, the development trends of this novel technique are outlined.

  2. Adaptations of advanced safety and reliability techniques to petroleum and other industries

    NASA Technical Reports Server (NTRS)

    Purser, P. E.

    1974-01-01

    The underlying philosophy of the general approach to failure reduction and control is presented. Safety and reliability management techniques developed in the industries which have participated in the U.S. space and defense programs are described along with adaptations to nonaerospace activities. The examples given illustrate the scope of applicability of these techniques. It is indicated that any activity treated as a 'system' is a potential user of aerospace safety and reliability management techniques.

  3. Development of heat transfer enhancement techniques for external cooling of an advanced reactor vessel

    NASA Astrophysics Data System (ADS)

    Yang, Jun

    Nucleate boiling is a well-recognized means for passively removing high heat loads (up to ~10^6 W/m2) generated by a molten reactor core under severe accident conditions while maintaining a relatively low reactor vessel temperature (<800 °C). With the upgrade and development of advanced power reactors, however, enhancing the nucleate boiling rate and its upper limit, the Critical Heat Flux (CHF), becomes the key to the success of external passive cooling of a reactor vessel undergoing a core disruption accident. In the present study, two boiling heat transfer enhancement methods have been proposed, experimentally investigated, and theoretically modelled. The first involves the use of a suitable surface coating to enhance the downward-facing boiling rate and CHF limit so as to substantially increase the possibility of the reactor vessel surviving a high thermal load attack. The second involves an enhanced vessel/insulation design that facilitates steam venting through the annular channel formed between the reactor vessel and the insulation structure, which in turn further enhances both the boiling rate and the CHF limit. Among the various available surface coating techniques, metallic micro-porous layer surface coating has been identified as an appropriate coating material for use in External Reactor Vessel Cooling (ERVC), based on overall consideration of enhanced performance, durability, and ease of manufacturing and application. Since no previous research had explored the feasibility of applying such a metallic micro-porous layer surface coating on a large, downward-facing, curved surface such as the bottom head of a reactor vessel, a series of characterization tests and experiments were performed in the present study to determine a suitable coating material composition and application method. Using the optimized metallic micro-porous surface coatings, quenching and steady-state boiling experiments were conducted in the Sub
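
    For orientation, the ~10^6 W/m2 figure quoted above is consistent with the classical Zuber flat-plate pool-boiling CHF correlation; the worked example below evaluates it for saturated water at 1 atm. The study's downward-facing, coated geometry differs, so this only sets the order of magnitude.

```python
# Worked example: classical Zuber pool-boiling CHF correlation for a flat
# surface, q''_CHF = 0.131 * h_fg * sqrt(rho_v) * (sigma*g*(rho_l-rho_v))**0.25.
# Saturated water at 1 atm; the study's geometry and coatings differ, so
# this only illustrates the ~1 MW/m^2 order of magnitude.
h_fg = 2.257e6      # latent heat of vaporization, J/kg
rho_l = 958.0       # liquid density, kg/m^3
rho_v = 0.597       # vapour density, kg/m^3
sigma = 0.0589      # surface tension, N/m
g = 9.81            # gravitational acceleration, m/s^2

q_chf = 0.131 * h_fg * rho_v**0.5 * (sigma * g * (rho_l - rho_v))**0.25
print(f"Zuber CHF estimate: {q_chf/1e6:.2f} MW/m^2")   # ~1.1 MW/m^2
```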

  4. Advanced imaging techniques for the study of plant growth and development

    PubMed Central

    Sozzani, Rosangela; Busch, Wolfgang; Spalding, Edgar P.; Benfey, Philip N.

    2014-01-01

    A variety of imaging methodologies are being used to collect data for quantitative studies of plant growth and development from living plants. Multi-level data, from macroscopic to molecular and from weeks to seconds, can be acquired. Furthermore, advances in parallelized and automated image acquisition provide the throughput needed to capture images from large populations of plants under specific growth conditions. Image-processing capabilities allow for 3D or 4D reconstruction of image data and automated quantification of biological features. These advances facilitate the integration of imaging data with genome-wide molecular data to enable systems-level modeling. PMID:24434036

  5. Advanced imaging techniques for the study of plant growth and development.

    PubMed

    Sozzani, Rosangela; Busch, Wolfgang; Spalding, Edgar P; Benfey, Philip N

    2014-05-01

    A variety of imaging methodologies are being used to collect data for quantitative studies of plant growth and development from living plants. Multi-level data, from macroscopic to molecular and from weeks to seconds, can be acquired. Furthermore, advances in parallelized and automated image acquisition provide the throughput needed to capture images from large populations of plants under specific growth conditions. Image-processing capabilities allow for 3D or 4D reconstruction of image data and automated quantification of biological features. These advances facilitate the integration of imaging data with genome-wide molecular data to enable systems-level modeling.

  6. Topology for statistical modeling of petascale data.

    SciTech Connect

    Pascucci, Valerio; Mascarenhas, Ajith Arthur; Rusek, Korben; Bennett, Janine Camille; Levine, Joshua; Pebay, Philippe Pierre; Gyulassy, Attila; Thompson, David C.; Rojas, Joseph Maurice

    2011-07-01

    This document presents current technical progress and dissemination of results for the Mathematics for Analysis of Petascale Data (MAPD) project titled 'Topology for Statistical Modeling of Petascale Data', funded by the Office of Science Advanced Scientific Computing Research (ASCR) Applied Math program. Many commonly used algorithms for mathematical analysis do not scale well enough to accommodate the size or complexity of petascale data produced by computational simulations. The primary goal of this project is thus to develop new mathematical tools that address both the petascale size and uncertain nature of current data. At a high level, our approach is based on the complementary techniques of combinatorial topology and statistical modeling. In particular, we use combinatorial topology to filter out spurious data that would otherwise skew statistical modeling techniques, and we employ advanced algorithms from algebraic statistics to efficiently find globally optimal fits to statistical models. This document summarizes the technical advances we have made to date that were made possible in whole or in part by MAPD funding. These technical contributions can be divided loosely into three categories: (1) advances in the field of combinatorial topology, (2) advances in statistical modeling, and (3) new integrated topological and statistical methods.
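
    As a toy illustration of the topology-based filtering described above, the sketch below computes 0-dimensional persistence of a noisy 1D signal with a small union-find and discards low-persistence minima as noise; it is illustrative only, not the project's petascale codes.

```python
import numpy as np

# Toy sketch: 0-dimensional persistence of a 1-D signal via union-find,
# used to discard low-persistence (noise) minima before statistical fitting.
def persistence_pairs(y):
    order = np.argsort(y)        # sweep samples from lowest to highest value
    comp, root_min, pairs = {}, {}, []

    def find(i):
        while comp[i] != i:
            comp[i] = comp[comp[i]]   # path compression
            i = comp[i]
        return i

    for i in order:
        i = int(i)
        roots = {find(j) for j in (i - 1, i + 1) if j in comp}
        comp[i], root_min[i] = i, y[i]
        for r in roots:
            ri = find(i)
            if r == ri:
                continue
            # the component whose minimum is younger (higher) dies at y[i]
            old, young = (r, ri) if root_min[r] <= root_min[ri] else (ri, r)
            if root_min[young] < y[i]:
                pairs.append((root_min[young], y[i]))   # (birth, death)
            comp[young] = old
    return pairs                  # the global minimum never dies

rng = np.random.default_rng(3)
x = np.linspace(0, 4 * np.pi, 400)
sig = np.sin(x) + 0.15 * rng.normal(size=x.size)

pairs = persistence_pairs(sig)
robust = [(b, d) for b, d in pairs if d - b > 0.5]   # persistence threshold
print(f"{len(pairs)} minima paired, {len(robust)} survive the noise filter")
```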

  7. Techniques to Assess and Mitigate the Environmental Risk Posed by use of Airguns: Recent Advances from Academic Research Programs

    NASA Astrophysics Data System (ADS)

    Miller, P. J.; Tyack, P. L.; Johnson, M. P.; Madsen, P. T.; King, R.

    2006-05-01

    There is considerable uncertainty about the ways in which marine mammals might react to noise, the biological significance of reactions, and the effectiveness of planning and real-time mitigation techniques. A planning tool commonly used to assess the environmental risk of acoustic activities uses simulations to predict acoustic exposures received by animals, and translates exposure to response using a dose-response function to yield an estimate of the undesired impact on a population. Recent advances show promise for converting this planning tool into a real-time mitigation tool, using Bayesian statistical methods. In this approach, being developed for use by the British Navy, the environmental risk simulation is updated continuously during field operations. The distribution of exposure, set initially based on animal density, is updated in real time using animal sensing data or environmental data known to correlate with the absence or presence of marine mammals. This conditional probability of animal presence should therefore be more accurate than prior probabilities used during planning, which enables a more accurate and quantitative assessment of both the impact of activities and the reduction of impact via mitigation decisions. Two key areas of uncertainty in addition to animal presence/absence are (1) how biologically relevant behaviours are affected by exposure to noise, and (2) whether animals avoid loud noise sources, which is the basis of ramp-up as a mitigation tool. With support from MMS and industry partners, we assessed foraging behaviour and avoidance movements of 8 tagged sperm whales in the Gulf of Mexico during experimental exposure to airguns. The whale that was approached most closely prolonged a surface resting bout hours longer than typical, but resumed foraging immediately after the airguns ceased, suggesting avoidance of the deep diving necessary for foraging near active airguns. Behavioral indices of foraging rate (echolocation buzzes produced during prey
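
    A minimal sketch of the Bayesian updating idea follows: a planning-stage prior probability of animal presence is revised interval by interval with acoustic detection data. The detection and false-alarm probabilities are hypothetical placeholders.

```python
# Hedged sketch of the Bayesian updating idea: a planning-stage prior on
# animal presence is revised after each listening interval.  The detection
# and false-alarm probabilities below are hypothetical placeholders.
p_present = 0.10              # prior from planning-stage density estimates
p_det_present = 0.60          # P(acoustic detection | animal present)
p_det_absent = 0.05           # P(acoustic detection | animal absent)

observations = [False, False, True, True]          # detections per interval
for det in observations:
    like_p = p_det_present if det else 1 - p_det_present
    like_a = p_det_absent if det else 1 - p_det_absent
    p_present = like_p * p_present / (like_p * p_present + like_a * (1 - p_present))
    print(f"detection={det}: P(present) = {p_present:.3f}")
```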

  8. Advances in CIS devices fabricated by a non-vacuum technique

    SciTech Connect

    Leidholm, C.R.; Norsworthy, G.A.; Roe, R.; Halani, A.; Basol, B.M.; Kapur, V.K.

    1999-03-01

    A novel, non-vacuum technique based on nano-particle deposition has been developed for the formation of CIS-type solar cell absorbers. Solar cells with >12% efficiency were previously demonstrated using this technique. Improvements in module integration processes have recently yielded 8% minimodules of 75 cm² area. © 1999 American Institute of Physics.

  9. Advanced techniques for the measurement of multiple recombination parameters in solar cells

    NASA Technical Reports Server (NTRS)

    Newhouse, M.; Wolf, M.

    1985-01-01

    A survey of bulk recombination measurement techniques was presented. Classical methods were reviewed along with their limiting assumptions and simplifications. A modulated-light measurement system was built and demonstrated the large effects of junction capacitance. Techniques extending the classical methods to multiparameter, multiregression measurements were identified and analyzed.

  10. Advance Appropriations: A Needless and Confusing Education Budget Technique. Federal Education Budget Project

    ERIC Educational Resources Information Center

    Delisle, Jason

    2007-01-01

    This report argues that advance appropriations serve no functional purpose for schools but create a loss of transparency, comparability, and simplicity in federal education budgeting. Advance appropriations allocate spending before future budgets have been established. The approach was originally used to skirt spending limits and budget procedures in place…

  11. The Advance Organizer: A Review of Research Using Glass's Technique of Meta-Analysis.

    ERIC Educational Resources Information Center

    Luiten, John; And Others

    Using Glass's technique of meta-analysis, of which "effect size" is the fundamental measure, 135 research studies on Ausubel's advance organizer theory were reviewed to determine the effect of advance organizers on learning and retention. Variables such as grade level, subject area, organizer presentation mode, and ability level were also examined. In most of these…
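
    For concreteness, Glass's effect size is the treatment-control mean difference scaled by the control-group standard deviation; a worked example with made-up scores:

```python
# Worked example of Glass's effect size, the fundamental measure of the
# meta-analysis:  Delta = (mean_treated - mean_control) / sd_control.
# The scores below are made up purely for illustration.
mean_organizer = 76.0        # mean post-test score, advance-organizer group
mean_control = 70.0          # mean post-test score, control group
sd_control = 12.0            # control-group standard deviation

delta = (mean_organizer - mean_control) / sd_control
print(f"effect size Delta = {delta:.2f}")   # 0.50: half a control SD above
```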

  12. Advanced karst hydrological and contaminant monitoring techniques for real-time and high resolution applications

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In telogenetic and soil-mantled karst aquifers, the movement of autogenic recharge through the epikarstic zone and into the regional aquifer can be a complex process and have implications for flooding, groundwater contamination, and other difficult to capture processes. Recent advances in instrument...

  13. Recent advances in electronic nose techniques for monitoring of fermentation process.

    PubMed

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-12-01

    Microbial fermentation processes are often sensitive to even slight changes of conditions that may result in unacceptable end-product quality. Monitoring of the process is therefore critical for discovering unfavorable deviations as early as possible and taking the appropriate measures. However, traditional analytical techniques are often time-consuming and labor-intensive. In this sense, the most effective way of developing a rapid, accurate, and relatively economical method for quality assurance in microbial fermentation is the use of novel chemical sensor systems. Electronic nose techniques have particular advantages for non-invasive monitoring of microbial fermentation. Therefore, in this review, we present an overview of the most important contributions dealing with quality control in microbial fermentation using electronic nose techniques. After a brief description of the fundamentals of the sensor techniques, examples of potential monitoring applications are provided, including the implementation of control strategies and the combination with other monitoring tools (i.e., sensor fusion). Finally, the electronic nose techniques are critically assessed, with their strengths and weaknesses highlighted. On the basis of the observed trends, we also outline the technical challenges and future outlook for electronic nose techniques.

  14. Time-frequency and advanced frequency estimation techniques for the investigation of bat echolocation calls.

    PubMed

    Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve

    2010-02-01

    In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. In particular, enhanced-resolution techniques are developed and/or used in this specific context for the first time. Compared to traditional time-frequency representation methods, the proposed techniques are more capable of revealing previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study focuses on bat echolocation recordings, the results are more general and applicable to many other types of signal.
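
    As a baseline for comparison with the enhanced-resolution methods the paper develops, a conventional short-time Fourier spectrogram of a synthetic FM sweep (made-up parameters, loosely bat-like) can be computed as follows:

```python
import numpy as np
from scipy import signal

# Baseline illustration: a conventional short-time Fourier spectrogram of a
# synthetic downward FM sweep, a crude stand-in for an FM bat call.  All
# parameters (sampling rate, duration, sweep limits) are made up.
fs = 250_000                                   # 250 kHz sampling rate
t = np.arange(0, 0.005, 1 / fs)                # 5 ms call
call = signal.chirp(t, f0=80_000, t1=t[-1], f1=25_000, method="hyperbolic")

f, tt, Sxx = signal.spectrogram(call, fs=fs, nperseg=256, noverlap=224)
ridge = f[np.argmax(Sxx, axis=0)]              # coarse frequency-track estimate
print("start/end of ridge (Hz):", ridge[0], ridge[-1])
```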

  15. An overview on in situ micronization technique – An emerging novel concept in advanced drug delivery

    PubMed Central

    Vandana, K.R.; Prasanna Raju, Y.; Harini Chowdary, V.; Sushma, M.; Vijay Kumar, N.

    2013-01-01

    The use of drug powders containing micronized drug particles has been increasing in several pharmaceutical dosage forms to overcome dissolution and bioavailability problems. Most newly developed drugs are poorly water-soluble, which limits dissolution rate and bioavailability. The dissolution rate can be enhanced by micronization of the drug particles. The properties of the micronized drug substance, such as particle size, size distribution, shape, surface properties, agglomeration behaviour, and powder flow, are affected by the type of micronization technique used. Mechanical comminution, spray drying, and supercritical fluid (SCF) technology are the most commonly employed techniques for producing micronized drug particles, but the characteristics of the resulting drug product cannot be well controlled using these techniques. Hence, a newer technique called in situ micronization has been developed to overcome the limitations associated with the other techniques. This review summarizes the existing knowledge on in situ micronization techniques. The properties of the drug substances obtained by in situ micronization are also compared. PMID:25161371

  16. Assessment of recent advances in measurement techniques for atmospheric carbon dioxide and methane observations

    NASA Astrophysics Data System (ADS)

    Zellweger, Christoph; Emmenegger, Lukas; Firdaus, Mohd; Hatakka, Juha; Heimann, Martin; Kozlova, Elena; Spain, T. Gerard; Steinbacher, Martin; van der Schoot, Marcel V.; Buchmann, Brigitte

    2016-09-01

    Until recently, atmospheric carbon dioxide (CO2) and methane (CH4) measurements were made almost exclusively using nondispersive infrared (NDIR) absorption and gas chromatography with flame ionisation detection (GC/FID), respectively. Recently, commercially available instruments based on spectroscopic techniques such as cavity ring-down spectroscopy (CRDS), off-axis integrated cavity output spectroscopy (OA-ICOS) and Fourier transform infrared (FTIR) spectroscopy have become more widely available and affordable. This has resulted in widespread use of these techniques at many measurement stations. This paper focuses on the comparison of a CRDS "travelling instrument", used during performance audits within the Global Atmosphere Watch (GAW) programme of the World Meteorological Organization (WMO), with instruments incorporating other, more traditional techniques for measuring CO2 and CH4 (NDIR and GC/FID). We demonstrate that CRDS instruments, and likely other spectroscopic techniques, are suitable for WMO/GAW stations and allow a smooth continuation of historic CO2 and CH4 time series. Moreover, the analysis of the audit results indicates that the spectroscopic techniques have a number of advantages over the traditional methods, which should lead to improved accuracy of atmospheric CO2 and CH4 measurements.

  17. Advances in atmospheric light scattering theory and remote-sensing techniques

    NASA Astrophysics Data System (ADS)

    Videen, Gorden; Sun, Wenbo; Gong, Wei

    2017-02-01

    This issue focuses especially on characterizing particles in the Earth-atmosphere system. The significant role of aerosol particles in this system was recognized in the mid-1970s [1]. Since that time, our appreciation for the role they play has only increased. It has been and continues to be one of the greatest unknown factors in the Earth-atmosphere system as evidenced by the most recent Intergovernmental Panel on Climate Change (IPCC) assessments [2]. With increased computational capabilities, in terms of both advanced algorithms and in brute-force computational power, more researchers have the tools available to address different aspects of the role of aerosols in the atmosphere. In this issue, we focus on recent advances in this topical area, especially the role of light scattering and remote sensing. This issue follows on the heels of four previous topical issues on this subject matter that have graced the pages of this journal [3-6].

  18. External Magnetic Field Reduction Techniques for the Advanced Stirling Radioisotope Generator

    NASA Technical Reports Server (NTRS)

    Niedra, Janis M.; Geng, Steven M.

    2013-01-01

    Linear alternators coupled to high efficiency Stirling engines are strong candidates for thermal-to-electric power conversion in space. However, the magnetic field emissions, both AC and DC, of these permanent magnet excited alternators can interfere with sensitive instrumentation onboard a spacecraft. Effective methods to mitigate the AC and DC electromagnetic interference (EMI) from solenoidal type linear alternators (like that used in the Advanced Stirling Convertor) have been developed for potential use in the Advanced Stirling Radioisotope Generator. The methods developed avoid the complexity and extra mass inherent in data extraction from multiple sensors or the use of shielding. This paper discusses these methods, and also provides experimental data obtained during breadboard testing of both AC and DC external magnetic field devices.

  19. Advanced techniques for the storage and use of very large, heterogeneous spatial databases

    NASA Technical Reports Server (NTRS)

    Peuquet, Donna J.

    1987-01-01

    Progress is reported in the development of a prototype knowledge-based geographic information system. The overall purpose of this project is to investigate and demonstrate the use of advanced methods in order to greatly improve the capabilities of geographic information system technology in the handling of large, multi-source collections of spatial data in an efficient manner, and to make these collections of data more accessible and usable for the Earth scientist.

  20. Advances in projection of climate change impacts using supervised nonlinear dimensionality reduction techniques

    NASA Astrophysics Data System (ADS)

    Sarhadi, Ali; Burn, Donald H.; Yang, Ge; Ghodsi, Ali

    2017-02-01

    One of the main challenges in climate change studies is accurate projection of global warming impacts on the probabilistic behaviour of hydro-climate processes. Due to the complexity of climate-associated processes, identification of predictor variables from high-dimensional atmospheric variables is considered a key factor for improving climate change projections in statistical downscaling approaches. For this purpose, the present paper adopts a new supervised dimensionality reduction approach, called "Supervised Principal Component Analysis" (Supervised PCA), for regression-based statistical downscaling. This method is a generalization of PCA that extracts a sequence of principal components of the atmospheric variables which have maximal dependence on the response hydro-climate variable. To capture the nonlinear variability between hydro-climatic response variables and predictors, a kernelized version of Supervised PCA is also applied for nonlinear dimensionality reduction. The effectiveness of the Supervised PCA methods, in comparison with some state-of-the-art dimensionality reduction algorithms, is evaluated for the statistical downscaling of precipitation at a specific site using two nonlinear machine learning methods, Support Vector Regression and the Relevance Vector Machine. The results demonstrate that the Supervised PCA methods yield a significant improvement in performance accuracy.
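
    A minimal sketch of one common HSIC-based formulation of Supervised PCA (the eigendecomposition of X^T H L H X, with H the centering matrix and L a kernel on the response) is shown below on synthetic data; this is a generic illustration of the method family, not the paper's downscaling pipeline.

```python
import numpy as np

# Hedged sketch of an HSIC-based formulation of Supervised PCA: take the
# top eigenvectors of  X^T H L H X,  where H centers the data and L is a
# kernel on the response y (linear kernel here).  Synthetic data only.
rng = np.random.default_rng(42)
n, p = 200, 30
X = rng.normal(size=(n, p))                    # stand-in atmospheric predictors
y = X[:, 0] - 0.5 * X[:, 3] + 0.1 * rng.normal(size=n)   # stand-in response

H = np.eye(n) - np.ones((n, n)) / n            # centering matrix
L = np.outer(y, y)                             # linear kernel on the response
Q = X.T @ H @ L @ H @ X                        # symmetric p x p matrix

eigvals, eigvecs = np.linalg.eigh(Q)           # ascending eigenvalues
U = eigvecs[:, ::-1][:, :2]                    # top-2 supervised directions
Z = X @ U                                      # reduced predictors for regression

# Predictors 0 and 3 drive y, so they should dominate the first direction
print("first-direction loadings:", np.round(U[:5, 0], 2))
print("reduced predictor shape:", Z.shape)
```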