Sample records for analysis techniques include

  1. Factor Analysis and Counseling Research

    ERIC Educational Resources Information Center

    Weiss, David J.

    1970-01-01

    Topics discussed include factor analysis versus cluster analysis, analysis of Q correlation matrices, ipsativity and factor analysis, and tests for the significance of a correlation matrix prior to application of factor analytic techniques. Techniques for factor extraction discussed include principal components, canonical factor analysis, alpha…

  2. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  3. Advances in the analysis and design of constant-torque springs

    NASA Technical Reports Server (NTRS)

    McGuire, John R.; Yura, Joseph A.

    1996-01-01

    In order to improve the design procedure of constant-torque springs used in aerospace applications, several new analysis techniques have been developed. These techniques make it possible to accurately construct a torque-rotation curve for any general constant-torque spring configuration. These new techniques allow for friction in the system to be included in the analysis, an area of analysis that has heretofore been unexplored. The new analysis techniques also include solutions for the deflected shape of the spring as well as solutions for drum and roller support reaction forces. A design procedure incorporating these new capabilities is presented.

  4. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
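
    As an illustration of one technique on this list, the sketch below applies principal component analysis to a small synthetic multiband image using plain NumPy; the data and dimensions are hypothetical and the code is not drawn from the cited report.

```python
# Minimal sketch (not from the cited report): principal component analysis of
# multiband remotely sensed data, one of the techniques listed above.
# The synthetic 3-band "image" below is a stand-in for real sensor data.
import numpy as np

rng = np.random.default_rng(0)
bands, rows, cols = 3, 64, 64
image = rng.normal(size=(bands, rows, cols))          # hypothetical 3-band image
pixels = image.reshape(bands, -1).T                   # (n_pixels, n_bands)

centered = pixels - pixels.mean(axis=0)
cov = np.cov(centered, rowvar=False)                  # band-to-band covariance
eigvals, eigvecs = np.linalg.eigh(cov)                # ascending eigenvalues
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]                        # principal axes
scores = centered @ components                        # PC scores per pixel

pc_images = scores.T.reshape(bands, rows, cols)       # back to image layout
explained = eigvals[order] / eigvals.sum()
print("variance explained per PC:", np.round(explained, 3))
```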

  5. Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory

    NASA Astrophysics Data System (ADS)

    Steels, R. S., Jr.; Babelay, E. F., Jr.

    1980-07-01

    Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high performance composite flywheels are presented. Real time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.


  6. Analysis of Synthetic Polymers.

    ERIC Educational Resources Information Center

    Smith, Charles G.; And Others

    1989-01-01

    Reviews techniques for the characterization and analysis of synthetic polymers, copolymers, and blends. Includes techniques for structure determination, separation, and quantitation of additives and residual monomers; determination of molecular weight; and the study of thermal properties including degradation mechanisms. (MVL)

  7. Approaches to answering critical CER questions.

    PubMed

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.

  8. Computer-assisted techniques to evaluate fringe patterns

    NASA Astrophysics Data System (ADS)

    Sciammarella, Cesar A.; Bhat, Gopalakrishna K.

    1992-01-01

    Strain measurement using interferometry requires an efficient way to extract the desired information from interferometric fringes. Availability of digital image processing systems makes it possible to use digital techniques for the analysis of fringes. In the past, there have been several developments in the area of one dimensional and two dimensional fringe analysis techniques, including the carrier fringe method (spatial heterodyning) and the phase stepping (quasi-heterodyning) technique. This paper presents some new developments in the area of two dimensional fringe analysis, including a phase stepping technique supplemented by the carrier fringe method and a two dimensional Fourier transform method to obtain the strain directly from the discontinuous phase contour map.
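
    The phase stepping (quasi-heterodyning) idea mentioned above can be illustrated with a short, generic sketch; the four-frame algorithm, synthetic fringe data, and the simple row/column unwrapping below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (an assumption, not the authors' algorithm): recovering a wrapped
# phase map from four phase-stepped interferograms.
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Phase from frames stepped by 90 degrees: I_k = A + B*cos(phi + k*pi/2)."""
    return np.arctan2(i3 - i1, i0 - i2)      # wrapped to (-pi, pi]

# Synthetic test: a tilted phase ramp with bias and modulation.
y, x = np.mgrid[0:128, 0:128]
phi_true = 0.05 * x + 0.02 * y
frames = [10 + 5 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
phi_wrapped = four_step_phase(*frames)

# Unwrap along each axis (fine for smooth synthetic data; real fringes need
# a more robust 2-D unwrapper).
phi_unwrapped = np.unwrap(np.unwrap(phi_wrapped, axis=0), axis=1)
print("max phase error:", np.abs(phi_unwrapped - phi_true).max())
```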

  9. Technologies for Clinical Diagnosis Using Expired Human Breath Analysis

    PubMed Central

    Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji

    2015-01-01

    This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions for health and disease states. There are various techniques to analyze some exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath and their related effects. Additionally, various medical monitoring techniques used for breath analysis have been discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142

  10. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
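
    A minimal sketch of the model-fitting step the standard lists (linear, quadratic, and exponential trend models fitted to time-series data) is given below; the data are synthetic and the least-squares approach is a generic illustration, not the standard's prescribed procedure.

```python
# Minimal sketch (assumed data, not from the standard): fitting the linear,
# quadratic, and exponential trend models listed above to a time series.
import numpy as np

t = np.arange(24, dtype=float)                      # e.g. 24 monthly observations
rng = np.random.default_rng(1)
y = 5.0 * np.exp(0.08 * t) + rng.normal(scale=1.0, size=t.size)  # synthetic data

lin = np.polyfit(t, y, 1)                           # y ~ a*t + b
quad = np.polyfit(t, y, 2)                          # y ~ a*t^2 + b*t + c
# Exponential model y ~ A*exp(r*t), fitted in log space (valid while y > 0).
log_coef = np.polyfit(t, np.log(y), 1)
A, r = np.exp(log_coef[1]), log_coef[0]

def sse(pred):
    return float(np.sum((y - pred) ** 2))

print("linear SSE     :", sse(np.polyval(lin, t)))
print("quadratic SSE  :", sse(np.polyval(quad, t)))
print("exponential SSE:", sse(A * np.exp(r * t)))
```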

  11. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  12. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  13. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  14. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternate and complementary with respect to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Also, the methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  15. SEP thrust subsystem performance sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Atkins, K. L.; Sauer, C. G., Jr.; Kerrisk, D. J.

    1973-01-01

    This is a two-part report on solar electric propulsion (SEP) performance sensitivity analysis. The first part describes the preliminary analysis of the SEP thrust system performance for an Encke rendezvous mission. A detailed description of thrust subsystem hardware tolerances on mission performance is included together with nominal spacecraft parameters based on these tolerances. The second part describes the method of analysis and graphical techniques used in generating the data for Part 1. Included is a description of both the trajectory program used and the additional software developed for this analysis. Part 2 also includes a comprehensive description of the use of the graphical techniques employed in this performance analysis.

  16. Reduced-Smoke Solid Propellant Combustion Products Analysis. Development of a Micromotor Combustor Technique.

    DTIC Science & Technology

    1976-10-01

    A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique...includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of

  17. Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.

    PubMed

    Sisco, Edward; Dake, Jeffrey; Bridge, Candice

    2013-10-10

    Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Three Techniques for Task Analysis: Examples from the Nuclear Utilities.

    ERIC Educational Resources Information Center

    Carlisle, Kenneth E.

    1984-01-01

    Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)

  19. A manual for inexpensive methods of analyzing and utilizing remote sensor data

    NASA Technical Reports Server (NTRS)

    Elifrits, C. D.; Barr, D. J.

    1978-01-01

    Instructions are provided for inexpensive methods of using remote sensor data to assist in meeting the need to observe the earth's surface. When possible, relative costs were included. Equipment needed for analysis of remote sensor data is described, and methods of use of these equipment items are included, as well as advantages and disadvantages of the use of individual items. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, landcover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application, along with simple techniques for creating an enhanced data item. Techniques described include additive and subtractive (Diazo processes) color techniques and enlargement of photos or images. Applications of these processes, including mappings of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.

  20. Theoretical and software considerations for nonlinear dynamic analysis

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1983-01-01

    In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.

  1. Advanced analysis technique for the evaluation of linear alternators and linear motors

    NASA Technical Reports Server (NTRS)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  2. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominately used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  3. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

    With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Also included are common components such as forest plot interpretation and the software that may be used; special cases for meta-analysis, such as subgroup analysis, individual patient data, and meta-regression; and a discussion of criticisms.
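
    The core computations behind the concepts discussed (effect size pooling, heterogeneity, and the random-effects model) can be sketched in a few lines; the study effects and standard errors below are invented, and the DerSimonian-Laird estimator is used as one common choice, not necessarily the one the commentary describes.

```python
# Minimal sketch (illustrative numbers only): inverse-variance pooling of study
# effect sizes with a heterogeneity estimate, the computation behind the
# meta-analysis concepts (effect size, heterogeneity, forest plots) above.
import numpy as np

effects = np.array([0.30, 0.10, 0.45, 0.25, 0.05])   # hypothetical study effect sizes
se = np.array([0.12, 0.15, 0.20, 0.10, 0.18])        # their standard errors

w = 1.0 / se**2                                      # fixed-effect weights
fixed = np.sum(w * effects) / np.sum(w)

# Cochran's Q and I^2 quantify between-study heterogeneity.
Q = np.sum(w * (effects - fixed) ** 2)
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100.0

# DerSimonian-Laird between-study variance tau^2, then random-effects pooling.
tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
w_re = 1.0 / (se**2 + tau2)
random_effects = np.sum(w_re * effects) / np.sum(w_re)

print(f"fixed effect  = {fixed:.3f}")
print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%")
print(f"random effect = {random_effects:.3f} (tau^2 = {tau2:.4f})")
```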

  4. Non-destructive evaluation techniques, high temperature ceramic component parts for gas turbines

    NASA Technical Reports Server (NTRS)

    Reiter, H.; Hirsekorn, S.; Lottermoser, J.; Goebbels, K.

    1984-01-01

    This report concerns studies of various tests performed on materials without destroying them. Tests included: microradiographic techniques; vibration analysis; high-frequency ultrasonic tests, including evaluation of defects and structure through analysis of ultrasonic scattering data; microwave tests; and analysis of sound emission.

  5. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
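
    A generic sketch of the central quantity in this kind of analysis, the singular values of the return difference matrix and their parameter gradients, is shown below; the 2x2 loop transfer function and gains are hypothetical, and a finite-difference gradient stands in for the analytical singular-value gradient equations derived in the paper.

```python
# Minimal sketch (not the SVA program): singular values of the return difference
# matrix I + L(jw) for a simple two-loop example, plus a finite-difference
# gradient of the minimum singular value with respect to a controller gain.
import numpy as np

def loop_tf(w, k1, k2):
    """Hypothetical 2x2 loop transfer matrix L(jw), for illustration only."""
    s = 1j * w
    return np.array([[k1 / (s + 1.0), 0.2 / (s + 2.0)],
                     [0.1 / (s + 3.0), k2 / (s + 0.5)]])

def min_sv_return_difference(w, k1, k2):
    L = loop_tf(w, k1, k2)
    return np.linalg.svd(np.eye(2) + L, compute_uv=False).min()

w, k1, k2 = 2.0, 4.0, 1.5
sigma_min = min_sv_return_difference(w, k1, k2)

# Sensitivity of the stability-margin measure to gain k1 (central difference).
h = 1e-6
grad_k1 = (min_sv_return_difference(w, k1 + h, k2)
           - min_sv_return_difference(w, k1 - h, k2)) / (2 * h)

print(f"min singular value of I+L at w={w}: {sigma_min:.4f}")
print(f"d(sigma_min)/d(k1): {grad_k1:.4f}")
```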

  6. "Soft"or "hard" ionisation? Investigation of metastable gas temperature effect on direct analysis in real-time analysis of Voriconazole.

    PubMed

    Lapthorn, Cris; Pullen, Frank

    2009-01-01

    The performance of the direct analysis in real-time (DART) technique was evaluated across a range of metastable gas temperatures for a pharmaceutical compound, Voriconazole, in order to investigate the effect of metastable gas temperature on molecular ion intensity and fragmentation. The DART source has been used to analyse a range of analytes and from a range of matrices including drugs in solid tablet form and preparations, active ingredients in ointment, naturally occurring plant alkaloids, flavours and fragrances, from thin layer chromatography (TLC) plates, melting point tubes and biological matrices including hair, urine and blood. The advantages of this technique include rapid analysis time (as little as 5 s), a reduction in sample preparation requirements, elimination of mobile phase requirement and analysis of samples not typically amenable to atmospheric pressure ionisation (API) techniques. This technology has therefore been proposed as an everyday tool for identification of components in crude organic reaction mixtures.

  7. Prospects for laser-induced breakdown spectroscopy for biomedical applications: a review.

    PubMed

    Singh, Vivek Kumar; Rai, Awadhesh Kumar

    2011-09-01

    We review the different spectroscopic techniques including the most recent laser-induced breakdown spectroscopy (LIBS) for the characterization of materials in any phase (solid, liquid or gas) including biological materials. A brief history of the laser and its application in bioscience is presented. The development of LIBS, its working principle and its instrumentation (different parts of the experimental set up) are briefly summarized. The generation of laser-induced plasma and detection of light emitted from this plasma are also discussed. The merits and demerits of LIBS are discussed in comparison with other conventional analytical techniques. The work done using the laser in the biomedical field is also summarized. The analysis of different tissues, mineral analysis in different organs of the human body, characterization of different types of stone formed in the human body, and analysis of biological aerosols using the LIBS technique are also summarized. The unique abilities of LIBS including detection of molecular species and calibration-free LIBS are compared with those of other conventional techniques including atomic absorption spectroscopy, inductively coupled plasma atomic emission spectroscopy and mass spectroscopy, and X-ray fluorescence.

  8. The Review of Nuclear Microscopy Techniques: An Approach for Nondestructive Trace Elemental Analysis and Mapping of Biological Materials.

    PubMed

    Mulware, Stephen Juma

    2015-01-01

    The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Scientists have over the years tried various techniques, including classical physical and chemical analysis techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, the nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed. The details for sample preparation, the detection, and data collection and analysis are discussed. Finally, an application of the techniques to analysis of corn roots for elemental distribution and concentration is presented.

  9. Sample preparation for the analysis of isoflavones from soybeans and soy foods.

    PubMed

    Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A

    2009-01-02

    This manuscript provides a review of the actual state and the most recent advances as well as current trends and future prospects in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.

  10. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards is done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: There exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  11. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
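
    A minimal sketch of the Modal Assurance Criterion comparison underlying this kind of mode tracking is shown below; the mode-shape matrices are random stand-ins and the greedy pairing is a simplification of the adaptive tracking algorithm described above.

```python
# Minimal sketch (not the authors' tool): a Modal Assurance Criterion (MAC)
# matrix between two sets of mode shapes, with a simple greedy pairing that
# "tracks" each mode of model A to its best match in model B.
import numpy as np

def mac_matrix(phi_a, phi_b):
    """phi_a, phi_b: (n_dof, n_modes) real mode-shape matrices on matching DOFs."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.sum(phi_a**2, axis=0), np.sum(phi_b**2, axis=0))
    return num / den

rng = np.random.default_rng(3)
phi_a = rng.normal(size=(50, 6))                      # hypothetical model-A shapes
phi_b = phi_a[:, [1, 0, 2, 3, 5, 4]] + 0.05 * rng.normal(size=(50, 6))  # perturbed, reordered

mac = mac_matrix(phi_a, phi_b)
pairs = mac.argmax(axis=1)                            # best B-mode for each A-mode
for i, j in enumerate(pairs):
    print(f"A mode {i} <-> B mode {j}  (MAC = {mac[i, j]:.2f})")
```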

  12. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
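
    Importance sampling, one of the statistical techniques the review introduces, can be illustrated with a short sketch; the exponential failure model, threshold, and proposal distribution below are arbitrary choices for demonstration.

```python
# Minimal sketch (illustrative only): importance sampling to accelerate a
# Monte Carlo estimate of a rare failure probability.
import numpy as np

rng = np.random.default_rng(7)
lam, threshold, n = 1.0, 12.0, 100_000
p_true = np.exp(-lam * threshold)                    # P(X > 12) for Exp(1)

# Plain Monte Carlo: almost no samples land in the rare region.
x = rng.exponential(1.0 / lam, size=n)
p_mc = np.mean(x > threshold)

# Importance sampling: draw from a heavier Exp(rate=0.2) proposal and reweight
# by the likelihood ratio f(x)/g(x).
rate_q = 0.2
xq = rng.exponential(1.0 / rate_q, size=n)
weights = (lam * np.exp(-lam * xq)) / (rate_q * np.exp(-rate_q * xq))
p_is = np.mean((xq > threshold) * weights)

print(f"true      : {p_true:.3e}")
print(f"plain MC  : {p_mc:.3e}")
print(f"importance: {p_is:.3e}")
```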

  13. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed and ways of using FFT techniques to do the deconvolution considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
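
    A generic sketch of FFT-based cross-correlation decoding for a coded-mask instrument follows; the random mask, point sources, and noise-free circular-convolution model are illustrative assumptions, not the paper's telescope or its partial cycle averaging scheme.

```python
# Minimal sketch (generic, not the paper's pipeline): decoding a coded-mask
# shadowgram by FFT-based cross-correlation with the mask pattern.
import numpy as np

rng = np.random.default_rng(11)
n = 64
mask = (rng.random((n, n)) < 0.5).astype(float)       # hypothetical random mask

sky = np.zeros((n, n))
sky[10, 20] = 100.0                                   # two point sources
sky[40, 45] = 60.0

# Detector image = circular convolution of sky with mask (noise-free toy model).
shadowgram = np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(mask)))

# Cross-correlation decoding: correlate with the zero-mean mask via FFTs.
decode = mask - mask.mean()
image = np.real(np.fft.ifft2(np.fft.fft2(shadowgram) * np.conj(np.fft.fft2(decode))))

peaks = np.unravel_index(np.argsort(image, axis=None)[-2:], image.shape)
print("recovered source pixels:", list(zip(*peaks)))
```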

  14. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.

  15. Analysis of Gold Ores by Fire Assay

    ERIC Educational Resources Information Center

    Blyth, Kristy M.; Phillips, David N.; van Bronswijk, Wilhelm

    2004-01-01

    Students of an Applied Chemistry degree course carried out a fire-assay exercise. The analysis showed that the technique was a worthwhile quantitative analytical technique and covered interesting theory including acid-base and redox chemistry and other concepts such as inquarting and cupelling.

  16. Capillary electrophoresis for drug analysis

    NASA Astrophysics Data System (ADS)

    Lurie, Ira S.

    1999-02-01

    Capillary electrophoresis (CE) is a high resolution separation technique which is amenable to a wide variety of solutes, including compounds which are thermally degradable, non-volatile and highly polar, and is therefore well suited for drug analysis. Techniques which have been used in our laboratory include electrokinetic chromatography (ECC), free zone electrophoresis (CZE) and capillary electrochromatography (CEC). ECC, which uses a charged run buffer additive which migrates counter to osmotic flow, is excellent for many applications, including, drug screening and analyses of heroin, cocaine and methamphetamine samples. ECC approaches include the use of micelles and charged cyclodextrins, which allow for the separation of complex mixtures. Simultaneous separation of acidic, neutral and basic solutes and the resolution of optical isomers and positional isomers are possible. CZE has been used for the analysis of small ions (cations and anions) in heroin exhibits. For the ECC and CZE experiments performed in our laboratory, uncoated capillaries were used. In contrast, CEC uses capillaries packed with high performance liquid chromatography stationary phases, and offers both high peak capacities and unique selectivities. Applications include the analysis of cannabinoids and drug screening. Although CE suffers from limited concentration sensitivity, it is still applicable to trace analysis of drug samples, especially when using injection techniques such as stacking, or detection schemes such as laser induced fluorescence and extended pathlength UV.

  17. Low-thrust chemical propulsion system propellant expulsion and thermal conditioning study. Executive summary

    NASA Technical Reports Server (NTRS)

    Merino, F.; Wakabayashi, I.; Pleasant, R. L.; Hill, M.

    1982-01-01

    Preferred techniques for providing abort pressurization and engine feed system net positive suction pressure (NPSP) for low thrust chemical propulsion systems (LTPS) were determined. A representative LTPS vehicle configuration is presented. Analysis tasks include: propellant heating analysis; pressurant requirements for abort propellant dump; and comparative analysis of pressurization techniques and thermal subcoolers.

  18. Cost considerations in using simulations for medical training.

    PubMed

    Fletcher, J D; Wind, Alexander P

    2013-10-01

    This article reviews simulation used for medical training, techniques for assessing simulation-based training, and cost analyses that can be included in such assessments. Simulation in medical training appears to take four general forms: human actors who are taught to simulate illnesses and ailments in standardized ways; virtual patients who are generally presented via computer-controlled, multimedia displays; full-body manikins that simulate patients using electronic sensors, responders, and controls; and part-task anatomical simulations of various body parts and systems. Techniques for assessing costs include benefit-cost analysis, return on investment, and cost-effectiveness analysis. Techniques for assessing the effectiveness of simulation-based medical training include the use of transfer effectiveness ratios and incremental transfer effectiveness ratios to measure transfer of knowledge and skill provided by simulation to the performance of medical procedures. Assessment of costs and simulation effectiveness can be combined with measures of transfer using techniques such as isoperformance analysis to identify ways of minimizing costs without reducing performance effectiveness or maximizing performance without increasing costs. In sum, economic analysis must be considered in training assessments if training budgets are to compete successfully with other requirements for funding. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  19. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one is a philosophical technique and the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  20. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in marine and offshore industry. © 2017 Society for Risk Analysis.
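
    As a toy illustration of the hierarchical Bayesian idea (not the authors' model), the sketch below pools failure counts from several data sources through a Beta population distribution evaluated on a coarse hyperparameter grid, capturing source-to-source variability; all counts and grid ranges are invented.

```python
# Minimal sketch (toy example, not the paper's model): a two-level hierarchical
# Bayesian estimate of a failure probability from several data sources.
import numpy as np
from scipy.special import betaln

failures = np.array([2, 0, 5, 1])          # hypothetical failures per source
trials   = np.array([50, 40, 80, 60])      # exposures per source

# Hyperparameter grid: Beta(a, b) population distribution of source-level rates.
a_grid = np.linspace(0.2, 10.0, 60)
b_grid = np.linspace(1.0, 200.0, 60)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")

# Marginal (beta-binomial) log-likelihood of each source, summed over sources.
loglik = np.zeros_like(A)
for k, n in zip(failures, trials):
    loglik += betaln(A + k, B + n - k) - betaln(A, B)
post = np.exp(loglik - loglik.max())
post /= post.sum()                         # flat hyperprior on the grid

# Posterior mean failure probability for a new, unobserved source: E[a/(a+b)].
p_new = float(np.sum(post * A / (A + B)))
print(f"predicted failure probability for a new source: {p_new:.4f}")
```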

  1. Solid State Audio/Speech Processor Analysis.

    DTIC Science & Technology

    1980-03-01

    techniques. The techniques were demonstrated to be worthwhile in an efficient real-time AWR system. Finally, microprocessor architectures were designed to...do not include custom chip development, detailed hardware design, construction or testing. ITTDCD is very encouraged by the results obtained in this...California, Berkeley, was responsible for furnishing the simulation data of OD speech analysis techniques and for the design and development of the hardware OD

  2. Programmable Logic Application Notes

    NASA Technical Reports Server (NTRS)

    Katz, Richard

    2000-01-01

    This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter will continue a series of notes concentrating on analysis techniques with this issue's section discussing: Digital Timing Analysis Tools and Techniques. Articles in this issue include: SX and SX-A Series Devices Power Sequencing; JTAG and SX/SX-A/SX-S Series Devices; Analysis Techniques (i.e., notes on digital timing analysis tools and techniques); Status of the Radiation Hard Reconfigurable Field Programmable Gate Array Program; Input Transition Times; Apollo Guidance Computer Logic Study; RT54SX32S Prototype Data Sets; A54SX32A - 0.22 micron/UMC Test Results; Ramtron FM1608 FRAM; and Analysis of VHDL Code and Synthesizer Output.

  3. Physical vs. photolithographic patterning of plasma polymers: an investigation by ToF-SSIMS and multivariate analysis

    PubMed Central

    Mishra, Gautam; Easton, Christopher D.; McArthur, Sally L.

    2009-01-01

    Physical and photolithographic techniques are commonly used to create chemical patterns for a range of technologies including cell culture studies, bioarrays and other biomedical applications. In this paper, we describe the fabrication of chemical micropatterns from commonly used plasma polymers. Atomic force microscopy (AFM) imaging, Time-of-Flight Static Secondary Ion Mass Spectrometry (ToF-SSIMS) imaging and multivariate analysis have been employed to visualize the chemical boundaries created by these patterning techniques and assess the spatial and chemical resolution of the patterns. ToF-SSIMS analysis demonstrated that well defined chemical and spatial boundaries were obtained from photolithographic patterning, while the resolution of physical patterning via a transmission electron microscopy (TEM) grid varied depending on the properties of the plasma system including the substrate material. In general, physical masking allowed diffusion of the plasma species below the mask and bleeding of the surface chemistries. Multivariate analysis techniques including Principal Component Analysis (PCA) and Region of Interest (ROI) assessment were used to investigate the ToF-SSIMS images of a range of different plasma polymer patterns. In the most challenging case, where two strongly reacting polymers, allylamine and acrylic acid, were deposited, PCA confirmed the fabrication of micropatterns with defined spatial resolution. ROI analysis allowed for the identification of an interface between the two plasma polymers for patterns fabricated using the photolithographic technique which has been previously overlooked. This study clearly demonstrated the versatility of photolithographic patterning for the production of multichemistry plasma polymer arrays and highlighted the need for complementary characterization and analytical techniques during the fabrication of plasma polymer micropatterns. PMID:19950941

  4. Radionuclides in Diagnosis.

    ERIC Educational Resources Information Center

    Williams, E. D.

    1989-01-01

    Discussed is a radionuclide imaging technique, including the gamma camera, image analysis computer, radiopharmaceuticals, and positron emission tomography. Several pictures showing the use of this technique are presented. (YP)

  5. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  6. Transgender Phonosurgery: A Systematic Review and Meta-analysis.

    PubMed

    Song, Tara Elena; Jiang, Nancy

    2017-05-01

    Objectives: Different surgical techniques have been described in the literature to increase vocal pitch. The purpose of this study is to systematically review these surgeries and perform a meta-analysis to determine which technique increases pitch the most. Data Sources: CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct. Review Methods: A systematic review and meta-analysis of the literature was performed using the CINAHL, Cochrane, Embase, Medline, PubMed, and Science Direct databases. Studies were eligible for inclusion if they evaluated pitch-elevating phonosurgical techniques in live humans and performed pre- and postoperative acoustic analysis. Data were gathered regarding surgical technique, pre- and postoperative fundamental frequencies, perioperative care measures, and complications. Results: Twenty-nine studies were identified. After applying inclusion and exclusion criteria, a total of 13 studies were included in the meta-analysis. Mechanisms of pitch elevation included increasing vocal cord tension (cricothyroid approximation), shortening the vocal cord length (cold knife glottoplasty, laser-shortening glottoplasty), and decreasing mass (laser reduction glottoplasty). The most common interventions were shortening techniques and cricothyroid approximation (6 studies each). The largest increase in fundamental frequency was seen with techniques that shortened the vocal cords. Preoperative speech therapy, postoperative voice rest, and reporting of patient satisfaction were inconsistent. Many of the studies were limited by low power and short length of follow-up. Conclusions: Multiple techniques for elevation of vocal pitch exist, but vocal cord shortening procedures appear to result in the largest increase in fundamental frequency.

  7. Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle

    NASA Technical Reports Server (NTRS)

    Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero

    2000-01-01

    This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered, one based on the Modal Cost technique, one based on Balanced Singular Value technique, a technique known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered; one based on the Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on an optimum sensor selection technique using Genetic Algorithm (GA) search technique combined with a criteria based on Hankel Singular Values (HSV's). For selecting shaker locations, four techniques were also considered; one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search technique combined with a criteria based on HSV's. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor & shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered including, Guyan, IRS, Modal, and Hybrid. Based on a pre-test cross-orthogonality checks using various reduction techniques, a Hybrid TAM reduction technique was selected and was used for all three vehicle fuel level configurations.

  8. Laboratory Spectrometer for Wear Metal Analysis of Engine Lubricants.

    DTIC Science & Technology

    1986-04-01

    analysis, the acid digestion technique for sample pretreatment is the best approach available to date because of its relatively large sample size (1000...microliters or more). However, this technique has two major shortcomings limiting its application: (1) it requires the use of hydrofluoric acid (a...accuracy. Sample preparation including filtration or acid digestion may increase analysis times by 20 minutes or more. b. Repeatability In the analysis

  9. Multidimensional chromatography in food analysis.

    PubMed

    Herrero, Miguel; Ibáñez, Elena; Cifuentes, Alejandro; Bernal, Jose

    2009-10-23

    In this work, the main developments and applications of multidimensional chromatographic techniques in food analysis are reviewed. Different aspects related to the existing couplings involving chromatographic techniques are examined. These couplings include multidimensional GC, multidimensional LC, multidimensional SFC as well as all their possible combinations. Main advantages and drawbacks of each coupling are critically discussed and their key applications in food analysis described.

  10. Science in Drama: Using Television Programmes to Teach Concepts and Techniques

    ERIC Educational Resources Information Center

    Rutter, Gordon

    2011-01-01

    By using a specific episode of the popular television cartoon series "The Simpsons," a range of techniques can be communicated, including microscope setup and use, simple chemical analysis, observation, and interpretation. Knowledge of blood groups and typing, morphological comparison of hair samples, fingerprint analysis, and DNA fingerprinting…

  11. High resolution frequency analysis techniques with application to the redshift experiment

    NASA Technical Reports Server (NTRS)

    Decher, R.; Teuber, D.

    1975-01-01

    High resolution frequency analysis methods, with application to the gravitational probe redshift experiment, are discussed. For this experiment a resolution of .00001 Hz is required to measure a slowly varying, low frequency signal of approximately 1 Hz. Major building blocks include fast Fourier transform, discrete Fourier transform, Lagrange interpolation, golden section search, and adaptive matched filter technique. Accuracy, resolution, and computer effort of these methods are investigated, including test runs on an IBM 360/65 computer.
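
    A small sketch of two of the building blocks named above, an FFT peak search refined by a golden section search over the DTFT magnitude, is given below; the sample rate, record length, and tone are synthetic and are not the experiment's actual parameters.

```python
# Minimal sketch (not the flight software): estimating a slowly varying ~1 Hz
# tone to sub-bin resolution by combining an FFT peak search with a golden
# section refinement of the DTFT magnitude.
import numpy as np

fs, T = 20.0, 200.0                                   # sample rate (Hz), duration (s)
t = np.arange(0, T, 1.0 / fs)
f_true = 1.000037
x = np.sin(2 * np.pi * f_true * t) + 0.1 * np.random.default_rng(5).normal(size=t.size)

# Coarse estimate: FFT peak (bin spacing = 1/T = 0.005 Hz).
spec = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(x.size, 1.0 / fs)
k = np.argmax(spec)

def dtft_mag(f):
    return np.abs(np.sum(x * np.exp(-2j * np.pi * f * t)))

# Golden section search for the magnitude peak within one bin either side.
g = (np.sqrt(5.0) - 1.0) / 2.0
a, b = freqs[k - 1], freqs[k + 1]
for _ in range(40):
    c, d = b - g * (b - a), a + g * (b - a)
    if dtft_mag(c) > dtft_mag(d):
        b = d
    else:
        a = c
f_est = 0.5 * (a + b)
print(f"FFT bin estimate: {freqs[k]:.6f} Hz, refined: {f_est:.6f} Hz")
```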

  12. Wood lens design philosophy based on a binary additive manufacturing technique

    NASA Astrophysics Data System (ADS)

    Marasco, Peter L.; Bailey, Christopher

    2016-04-01

    Using additive manufacturing techniques in optical engineering to construct a gradient index (GRIN) optic may overcome a number of limitations of GRIN technology. Such techniques are maturing quickly, yielding additional design degrees of freedom for the engineer. How best to employ these degrees of freedom is not completely clear at this time. This paper describes a preliminary design philosophy, including assumptions, pertaining to a particular printing technique for GRIN optics. It includes an analysis based on simulation and initial component measurement.

  13. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.
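
    A minimal sketch of spectral clustering on a graph, the flavor of technique discussed above, is shown below; the six-bus network and unit edge weights are hypothetical, and the two-way Fiedler-vector split is a simplification of the methods compared in the paper.

```python
# Minimal sketch (generic spectral method, not the paper's algorithm): splitting
# a small graph into two coherent groups using the sign of the Fiedler vector
# of the graph Laplacian.
import numpy as np

# Hypothetical 6-bus network: two triangles joined by a single tie line.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0                 # unit weights; could be line admittances

L = np.diag(A.sum(axis=1)) - A              # graph Laplacian
eigvals, eigvecs = np.linalg.eigh(L)
fiedler = eigvecs[:, 1]                     # eigenvector of 2nd-smallest eigenvalue

clusters = (fiedler > 0).astype(int)
print("cluster assignment per bus:", clusters)   # expect {0,1,2} vs {3,4,5}
```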

  14. Component-specific modeling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1985-01-01

    A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate, and unified approach to analyzing combustor burner liners, hollow air cooled turbine blades, and air cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.

  15. X-Ray Microanalysis and Electron Energy Loss Spectrometry in the Analytical Electron Microscope: Review and Future Directions

    NASA Technical Reports Server (NTRS)

    Goldstein, J. I.; Williams, D. B.

    1992-01-01

    This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt. % range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 microns range and MDL as low as 0.01 wt. %. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the 1% accuracy and precision level with light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given along with a discussion of the limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
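
    For a thin binary specimen the ratio (Cliff-Lorimer) method mentioned above reduces to C_A/C_B = k_AB (I_A/I_B) with C_A + C_B = 1. A minimal Python sketch follows, with hypothetical intensities and k-factor; it deliberately omits the absorption correction that the abstract identifies as the main barrier to 1% accuracy.

    ```python
    def binary_composition(I_A, I_B, k_AB):
        """Cliff-Lorimer ratio method for a thin binary specimen:
        C_A / C_B = k_AB * (I_A / I_B), with C_A + C_B = 1 (weight fractions)."""
        ratio = k_AB * I_A / I_B          # C_A / C_B
        C_A = ratio / (1.0 + ratio)
        return C_A, 1.0 - C_A

    # Hypothetical characteristic X-ray intensities and k-factor (no absorption correction).
    print(binary_composition(I_A=1500.0, I_B=900.0, k_AB=1.2))
    ```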

  16. Some failure modes and analysis techniques for terrestrial solar cell modules

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Stern, K. H.

    1978-01-01

    Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance evaluation, laser scanning techniques, and the SEM.

  17. 37 CFR 351.10 - Evidence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...” include still photographs, video tapes, and motion pictures. (2) Separation of irrelevant portions... considered in the analysis, the techniques of data collection, the techniques of estimation and testing, and...

  18. 37 CFR 351.10 - Evidence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...” include still photographs, video tapes, and motion pictures. (2) Separation of irrelevant portions... considered in the analysis, the techniques of data collection, the techniques of estimation and testing, and...

  19. 37 CFR 351.10 - Evidence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...” include still photographs, video tapes, and motion pictures. (2) Separation of irrelevant portions... considered in the analysis, the techniques of data collection, the techniques of estimation and testing, and...

  20. 37 CFR 351.10 - Evidence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...” include still photographs, video tapes, and motion pictures. (2) Separation of irrelevant portions... considered in the analysis, the techniques of data collection, the techniques of estimation and testing, and...

  1. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
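
    The report's specific maneuvers and models are not reproduced here, but the sketch below illustrates the general kind of frequency-domain assessment it describes, using standard spectral estimators on hypothetical throttle-sweep data (a simple low-pass filter stands in for the combined model and instrumentation lag).

    ```python
    import numpy as np
    from scipy import signal

    fs = 50.0                                            # sample rate, Hz
    t = np.arange(0, 120, 1 / fs)
    excitation = signal.chirp(t, f0=0.05, t1=t[-1], f1=2.0)       # throttle frequency sweep
    b, a = signal.butter(2, 1.0, fs=fs)                           # stand-in for model/sensor lag
    rng = np.random.default_rng(6)
    response = signal.lfilter(b, a, excitation) + 0.05 * rng.normal(size=t.size)

    # H1 frequency-response estimate between the excitation and the calculated-thrust response.
    f, Pxx = signal.welch(excitation, fs=fs, nperseg=1024)
    _, Pxy = signal.csd(excitation, response, fs=fs, nperseg=1024)
    H = Pxy / Pxx
    print(np.abs(H[:5]))                                 # gain roll-off reveals dynamic-response limits
    print(np.angle(H[:5]))                               # phase lag
    ```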

  2. Analysis of thin plates with holes by using exact geometrical representation within XFEM.

    PubMed

    Perumal, Logah; Tso, C P; Leng, Lim Thong

    2016-05-01

    This paper presents an analysis of thin plates with holes within the context of XFEM. New integration techniques are developed for exact geometrical representation of the holes. Numerical and exact integration techniques are presented, with some limitations for the exact integration technique. Simulation results show that the proposed techniques help to reduce the solution error, due to the exact geometrical representation of the holes and utilization of appropriate quadrature rules. A discussion of the minimum integration order needed to achieve good accuracy and convergence for the techniques presented in this work is also included.

  3. Accuracy of trace element determinations in alternate fuels

    NASA Technical Reports Server (NTRS)

    Greenbauer-Seng, L. A.

    1980-01-01

    A review of the techniques used at Lewis Research Center (LeRC) in trace metals analysis is presented, including the results of Atomic Absorption Spectrometry and DC Arc Emission Spectrometry of blank levels and recovery experiments for several metals. The design of an Interlaboratory Study conducted by LeRC is presented. Several factors were investigated, including: laboratory, analytical technique, fuel type, concentration, and ashing additive. Conclusions drawn from the statistical analysis will help direct research efforts toward those areas most responsible for the poor interlaboratory analytical results.

  4. Position and speed control of brushless DC motors using sensorless techniques and application trends.

    PubMed

    Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime

    2010-01-01

    This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including the background analysis using sensors, limitations and advances. The performance and reliability of BLDC motor drivers have been improved because the conventional control and sensing techniques have been improved through sensorless technology. Then, in this paper sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which includes Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. Also, the most relevant techniques based on estimation and models are briefly analysed, such as Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks.

  5. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction: A Meta-analysis of Randomized Controlled Trials.

    PubMed

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-10-01

    Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction. The aim of this study was to compare piezosurgery with rotary osteotomy techniques, with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus. We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar. The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removing impacted third molars; the outcomes of interest include surgery time, trismus, swelling or pain; the studies are randomized controlled trials. We used random-effects models to calculate the difference in the outcomes, and the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference if otherwise. A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75-5.52, P < 0.0001). Patients receiving the piezoelectric technique had less swelling at postoperative days 1, 3, 5, and 7 (all Ps ≤0.023). Additionally, there was a trend of less postoperative pain and trismus in the piezosurgery groups. The number of included randomized controlled trials and the sample size of each trial were relatively small, double blinding was not possible, and cost analysis was unavailable due to a lack of data. Our meta-analysis indicates that although patients undergoing piezosurgery experienced longer surgery time, they had less postoperative swelling, indicating that piezosurgery is a promising alternative technique for extraction of impacted third molars.
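
    A compact sketch of the random-effects pooling described above (a DerSimonian-Laird estimate with a normal-approximation 95% confidence interval); the per-study mean differences and variances below are hypothetical, not the values extracted from the seven included trials.

    ```python
    import numpy as np

    def dersimonian_laird(effects, variances):
        """Random-effects pooled estimate (DerSimonian-Laird) with a 95% CI."""
        y = np.asarray(effects, dtype=float)
        v = np.asarray(variances, dtype=float)
        w = 1.0 / v                                     # fixed-effect weights
        fixed = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - fixed) ** 2)                # Cochran's Q
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - (len(y) - 1)) / c)         # between-study variance
        w_re = 1.0 / (v + tau2)
        pooled = np.sum(w_re * y) / np.sum(w_re)
        se = np.sqrt(1.0 / np.sum(w_re))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    # Hypothetical per-study mean differences in surgery time (minutes) and their variances.
    print(dersimonian_laird([3.5, 4.8, 4.0, 4.4], [0.6, 0.9, 0.5, 1.1]))
    ```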

  6. Interest rate next-day variation prediction based on hybrid feedforward neural network, particle swarm optimization, and multiresolution techniques

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2016-02-01

    Multiresolution analysis techniques including the continuous wavelet transform, empirical mode decomposition, and variational mode decomposition are tested in the context of interest rate next-day variation prediction. In particular, the multiresolution analysis techniques are used to decompose the actual interest rate variation, and a feedforward neural network is used for training and prediction. A particle swarm optimization technique is adopted to optimize the network's initial weights. For comparison purposes, an autoregressive moving average model, a random walk process, and a naive model are used as the main reference models. In order to show the feasibility of the presented hybrid models that combine multiresolution analysis techniques and a feedforward neural network optimized by particle swarm optimization, we used a set of six illustrative interest rates, including Moody's seasoned Aaa corporate bond yield, Moody's seasoned Baa corporate bond yield, 3-Month, 6-Month and 1-Year treasury bills, and the effective federal funds rate. The forecasting results show that all multiresolution-based prediction systems outperform the conventional reference models on the criteria of mean absolute error, mean absolute deviation, and root mean-squared error. Therefore, it is advantageous to adopt hybrid multiresolution techniques and soft computing models to forecast interest rate daily variations, as they provide good forecasting performance.
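
    The sketch below is a simplified stand-in for the hybrid approach, with two substitutions: a discrete wavelet multiresolution decomposition replaces the continuous wavelet/EMD/VMD decompositions used in the paper, and the particle swarm weight initialization is omitted in favour of the library default. The interest-rate series and all parameters are synthetic.

    ```python
    import numpy as np
    import pywt
    from sklearn.neural_network import MLPRegressor

    def mra_components(series, wavelet="db4", level=3):
        """Additive multiresolution components of a 1-D series via the discrete wavelet transform."""
        coeffs = pywt.wavedec(series, wavelet, level=level)
        comps = []
        for i in range(len(coeffs)):
            kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
            comps.append(pywt.waverec(kept, wavelet)[: len(series)])
        return np.array(comps)                       # shape (level + 1, n)

    # Synthetic daily interest-rate variations; predict tomorrow's value from today's components.
    rng = np.random.default_rng(0)
    variation = 0.05 * np.sin(np.linspace(0, 20, 600)) + rng.normal(0, 0.01, 600)
    comps = mra_components(variation)
    X, y = comps[:, :-1].T, variation[1:]            # features at day t, target at day t + 1
    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X[:-50], y[:-50])
    print("test MAE:", np.mean(np.abs(model.predict(X[-50:]) - y[-50:])))
    ```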

  7. A Market-oriented Approach To Maximizing Product Benefits: Cases in U.S. Forest Products Industries

    Treesearch

    Vijay S. Reddy; Robert J. Bush; Ronen Roudik

    1996-01-01

    Conjoint analysis, a decompositional customer preference modelling technique, has seen little application to forest products. However, the technique provides useful information for marketing decisions by quantifying consumer preference functions for multiattribute product alternatives. The results of a conjoint analysis include the contribution of each attribute and...

  8. Matrix Perturbation Techniques in Structural Dynamics

    NASA Technical Reports Server (NTRS)

    Caughey, T. K.

    1973-01-01

    Matrix perturbation techniques are developed which can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering and other fields concerned with the dynamical analysis of large complex structures or systems of second-order differential equations. A number of simple examples are included to illustrate the techniques.
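
    For the symmetric, undamped case the basic first-order result is λ_i(K + ΔK) ≈ λ_i(K) + v_iᵀ ΔK v_i, where the v_i are the unit eigenvectors of K. The small numerical sketch below (a toy stiffness matrix with a hypothetical perturbation) illustrates only this elementary case, not the complex-eigenvalue machinery needed for general damping matrices.

    ```python
    import numpy as np

    def perturbed_eigenvalues(K, dK):
        """First-order eigenvalue perturbation of a symmetric matrix K under a small symmetric dK."""
        vals, vecs = np.linalg.eigh(K)
        corrections = np.einsum("ij,jk,ki->i", vecs.T, dK, vecs)   # v_i^T dK v_i for each i
        return vals + corrections

    K = np.array([[2.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 2.0]])   # toy stiffness matrix
    dK = 1e-3 * np.array([[1.0, 0.5, 0.0], [0.5, 1.0, 0.5], [0.0, 0.5, 1.0]])
    print(perturbed_eigenvalues(K, dK))
    print(np.linalg.eigvalsh(K + dK))     # exact eigenvalues, for comparison
    ```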

  9. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective at overcoming the limitations of traditional in-vitro experiments on the constantly increasing sequence data. The most critical problems that have caught the attention of researchers include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, analysis of microarray gene expression data, etc. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision tree and HMM-based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy, and decreasing the running time overheads of the learning algorithms. The application of this research would be potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.

  10. Applications of liquid-based separation in conjunction with mass spectrometry to the analysis of forensic evidence.

    PubMed

    Moini, Mehdi

    2018-05-01

    In the past few years, there has been a significant effort by the forensic science community to develop new scientific techniques for the analysis of forensic evidence. Forensic chemists have spearheaded the development of information-rich confirmatory technologies and techniques and their application to a broad array of forensic challenges. The purpose of these confirmatory techniques is to provide alternatives to presumptive techniques that rely on data such as color changes, pattern matching, or retention time alone, which are prone to more false positives. To this end, the application of separation techniques in conjunction with mass spectrometry has played an important role in the analysis of forensic evidence. Moreover, in the past few years the role of liquid separation techniques, such as liquid chromatography and capillary electrophoresis in conjunction with mass spectrometry, has gained significant traction, and these techniques have been applied to a wide range of chemicals, from small molecules such as drugs and explosives to large molecules such as proteins. For example, proteomics and peptidomics have been used for identification of humans, organs, and bodily fluids. A wide range of HPLC techniques, including reversed phase, hydrophilic interaction, mixed-mode, supercritical fluid, multidimensional chromatography, and nanoLC, as well as several modes of capillary electrophoresis mass spectrometry, including capillary zone electrophoresis, partial filling, full filling, and micellar electrokinetic chromatography, have been applied to the analysis of drugs, explosives, and questioned documents. In this article, we review recent (2015-2017) applications of liquid separation in conjunction with mass spectrometry to the analysis of forensic evidence. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Design approach of an aquaculture cage system for deployment in the constructed channel flow environments of a power plant

    PubMed Central

    Lee, Jihoon; Fredriksson, David W.; DeCew, Judson; Drach, Andrew; Yim, Solomon C.

    2018-01-01

    This study provides an engineering approach for designing an aquaculture cage system for use in constructed channel flow environments. As sustainable aquaculture has grown globally, many novel techniques have been introduced such as those implemented in the global Atlantic salmon industry. The advent of several highly sophisticated analysis software systems enables the development of such novel engineering techniques. These software systems commonly include three-dimensional (3D) drafting, computational fluid dynamics, and finite element analysis. In this study, a combination of these analysis tools is applied to evaluate a conceptual aquaculture system for potential deployment in a power plant effluent channel. The channel is supposedly clean; however, it includes elevated water temperatures and strong currents. The first portion of the analysis includes the design of a fish cage system with specific net solidities using 3D drafting techniques. Computational fluid dynamics is then applied to evaluate the flow reduction through the system from the previously generated solid models. Implementing the same solid models, a finite element analysis is performed on the critical components to assess the material stresses produced by the drag force loads that are calculated from the fluid velocities. PMID:29897954

  12. Sample detection and analysis techniques for electrophoretic separation

    NASA Technical Reports Server (NTRS)

    Falb, R. D.; Hughes, K. E.; Powell, T. R.

    1975-01-01

    Methods for detecting and analyzing biological agents suitable for space flight operations were studied, primarily through literature searches of cell separation techniques. Detection methods discussed include: photometric, electric, radiometric, micrometry, ultrasonic, microscopic, and photographic. A bibliography and a directory of vendors are included, along with an index of commercial hardware.

  13. DNA-PCR analysis of bloodstains sampled by the polyvinyl-alcohol method.

    PubMed

    Schyma, C; Huckenbeck, W; Bonte, W

    1999-01-01

    Among the usual techniques of sampling gunshot residues (GSR), the polyvinyl-alcohol method (PVAL) includes the advantage of embedding all particles, foreign bodies and stains on the surface of the shooter's hand in exact and reproducible topographic localization. The aim of the present study on ten persons killed by firearms was to check the possibility of DNA-PCR typing of blood traces embedded in the PVAL gloves in a second step following GSR analysis. The results of these examinations verify that the PVAL technique does not include factors that inhibit successful PCR typing. Thus the PVAL method can be recommended as a combination technique to secure and preserve inorganic and biological traces at the same time.

  14. Reduction and analysis of data collected during the electromagnetic tornado experiment

    NASA Technical Reports Server (NTRS)

    Davisson, L. D.

    1976-01-01

    Techniques for data processing and analysis are described to support tornado detection by analysis of radio frequency interference in various frequency bands, and sea state determination from short pulse radar measurements. Activities include: strip chart recording of tornado data; the development and implementation of computer programs for digitalization and analysis of the data; data reduction techniques for short pulse radar data, and the simulation of radar returns from the sea surface by computer models.

  15. Reliability techniques for computer executive programs

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Computer techniques for increasing the stability and reliability of executive and supervisory systems were studied. Program segmentation characteristics are discussed along with a validation system which is designed to retain the natural top down outlook in coding. An analysis of redundancy techniques and roll back procedures is included.

  16. Site 765: Sedimentology

    USGS Publications Warehouse

    ,

    1990-01-01

    Various techniques were used to decipher the sedimentation history of Site 765, including Markov chain analysis of facies transitions, XRD analysis of clay and other minerals, and multivariate analysis of smear-slide data, in addition to the standard descriptive procedures employed by the shipboard sedimentologist. This chapter presents brief summaries of methodology and major findings of these three techniques, a summary of the sedimentation history, and a discussion of trends in sedimentation through time.

  17. Ozone data and mission sampling analysis

    NASA Technical Reports Server (NTRS)

    Robbins, J. L.

    1980-01-01

    A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
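
    A minimal sketch of the empirical orthogonal function decomposition mentioned above, computed from the SVD of a (time x location) anomaly matrix; the gridded "ozone" field below is synthetic and stands in for the actual mission data.

    ```python
    import numpy as np

    def empirical_orthogonal_functions(field, n_modes=3):
        """EOF decomposition of a (time x space) data matrix via the SVD of its anomalies."""
        anomalies = field - field.mean(axis=0)           # remove the time mean at each location
        u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
        variance_fraction = s ** 2 / np.sum(s ** 2)
        eofs = vt[:n_modes]                              # spatial patterns
        pcs = u[:, :n_modes] * s[:n_modes]               # principal component time series
        return eofs, pcs, variance_fraction[:n_modes]

    # Synthetic field: 120 time samples over 50 grid points with one dominant coherent mode.
    rng = np.random.default_rng(1)
    data = rng.normal(size=(120, 50)) + np.outer(np.sin(np.linspace(0, 12, 120)), np.linspace(1, 2, 50))
    eofs, pcs, frac = empirical_orthogonal_functions(data)
    print("variance explained by the leading modes:", np.round(frac, 3))
    ```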

  18. A strategy for selecting data mining techniques in metabolomics.

    PubMed

    Banimustafa, Ahmed Hmaidan; Hardy, Nigel W

    2012-01-01

    There is a general agreement that the development of metabolomics depends not only on advances in chemical analysis techniques but also on advances in computing and data analysis methods. Metabolomics data usually requires intensive pre-processing, analysis, and mining procedures. Selecting and applying such procedures requires attention to issues including justification, traceability, and reproducibility. We describe a strategy for selecting data mining techniques which takes into consideration the goals of data mining techniques on the one hand, and the goals of metabolomics investigations and the nature of the data on the other. The strategy aims to ensure the validity and soundness of results and promote the achievement of the investigation goals.

  19. Monitoring Air Quality with Leaf Yeasts.

    ERIC Educational Resources Information Center

    Richardson, D. H. S.; And Others

    1985-01-01

    Proposes that leaf yeast serve as quick, inexpensive, and effective techniques for monitoring air quality. Outlines procedures and provides suggestions for data analysis. Includes results from sample school groups who employed this technique. (ML)

  20. Waveguide design, modeling, and optimization: from photonic nanodevices to integrated photonic circuits

    NASA Astrophysics Data System (ADS)

    Bordovsky, Michal; Catrysse, Peter; Dods, Steven; Freitas, Marcio; Klein, Jackson; Kotacka, Libor; Tzolov, Velko; Uzunov, Ivan M.; Zhang, Jiazong

    2004-05-01

    We present the state of the art for commercial design and simulation software in the 'front end' of photonic circuit design. One recent advance is to extend the flexibility of the software by using more than one numerical technique on the same optical circuit. There are a number of popular and proven techniques for analysis of photonic devices. Examples of these techniques include the Beam Propagation Method (BPM), the Coupled Mode Theory (CMT), and the Finite Difference Time Domain (FDTD) method. For larger photonic circuits, it may not be practical to analyze the whole circuit by any one of these methods alone, but often some smaller part of the circuit lends itself to at least one of these standard techniques. Later the whole problem can be analyzed on a unified platform. This kind of approach can enable analysis for cases that would otherwise be cumbersome, or even impossible. We demonstrate solutions for more complex structures ranging from the sub-component layout, through the entire device characterization, to the mask layout and its editing. We also present recent advances in the above well established techniques. This includes the analysis of nano-particles, metals, and non-linear materials by FDTD, photonic crystal design and analysis, and improved models for high concentration Er/Yb co-doped glass waveguide amplifiers.

  1. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    PubMed Central

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto

    2013-01-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793

  2. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments.

    PubMed

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana M F; Silva, Rosângela; Souza, Sheila Mendonça de; Araujo, Adauto

    2013-04-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  3. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  4. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
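
    By way of illustration, the sketch below computes two variability measures from different domains of the proposed classification, a statistical one (standard deviation) and an informational one (sample entropy), on synthetic series; it is a compact, unoptimized example rather than a reference implementation of any of the reviewed techniques.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        """Sample entropy: -ln(A/B), where B counts template matches of length m and A of
        length m + 1 (Chebyshev distance <= r, self-matches excluded)."""
        x = np.asarray(x, dtype=float)
        r = 0.2 * np.std(x) if r is None else r
        def matches(length):
            t = np.array([x[i:i + length] for i in range(len(x) - length)])
            d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
            return (np.sum(d <= r) - len(t)) / 2.0       # unordered pairs, excluding self-matches
        return -np.log(matches(m + 1) / matches(m))

    rng = np.random.default_rng(2)
    regular = np.sin(np.linspace(0, 50, 500))
    irregular = rng.normal(size=500)
    for name, series in [("regular", regular), ("irregular", irregular)]:
        print(name, "SD =", round(float(np.std(series)), 3), "SampEn =", round(float(sample_entropy(series)), 3))
    ```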

  5. Evaluation of analysis techniques for low frequency interior noise and vibration of commercial aircraft

    NASA Technical Reports Server (NTRS)

    Landmann, A. E.; Tillema, H. F.; Marshall, S. E.

    1989-01-01

    The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation, including finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were used to compare with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to remain cautious and to lead to recommendations for further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to complexity of the aircraft structure and low modal densities.

  6. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  7. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

    Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responder or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, while seven of these additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.
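
    A small sketch of the responder/non-responder subgrouping idea: k-means applied to simulated response patterns (hypothetical post/pre MEP ratios at three time points after stimulation). The review compares several such techniques and responder criteria; this example adopts one simple convention, labelling the cluster with the larger mean facilitation as the responders.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical post/pre MEP ratios for 40 participants at three post-stimulation time points.
    rng = np.random.default_rng(3)
    responders = rng.normal(loc=[1.3, 1.4, 1.5], scale=0.1, size=(20, 3))
    non_responders = rng.normal(loc=[1.0, 0.95, 1.0], scale=0.1, size=(20, 3))
    ratios = np.vstack([responders, non_responders])

    # Two-cluster k-means on the response patterns, then label the cluster with the
    # larger mean facilitation as the "responder" subgroup.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(ratios)
    responder_cluster = int(np.argmax([ratios[labels == k].mean() for k in (0, 1)]))
    print("responder indices:", np.where(labels == responder_cluster)[0])
    ```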

  8. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  9. Chromosome Analysis

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Perceptive Scientific Instruments, Inc., provides the foundation for the Powergene line of chromosome analysis and molecular genetic instrumentation. This product employs image processing technology from NASA's Jet Propulsion Laboratory and image enhancement techniques from Johnson Space Center. Originally developed to send pictures back to earth from space probes, digital imaging techniques have been developed and refined for use in a variety of medical applications, including diagnosis of disease.

  10. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Matthew W.

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  11. Neutron spectrometry for UF6 enrichment verification in storage cylinders

    DOE PAGES

    Mengesha, Wondwosen; Kiff, Scott D.

    2015-01-29

    Verification of declared UF6 enrichment and mass in storage cylinders is of great interest in nuclear material nonproliferation. Nondestructive assay (NDA) techniques are commonly used for safeguards inspections to ensure accountancy of declared nuclear materials. Common NDA techniques used include gamma-ray spectrometry and both passive and active neutron measurements. In the present study, neutron spectrometry was investigated for verification of UF6 enrichment in 30B storage cylinders based on an unattended and passive measurement approach. MCNP5 and Geant4 simulated neutron spectra, for selected UF6 enrichments and filling profiles, were used in the investigation. The simulated neutron spectra were analyzed using principal component analysis (PCA). The PCA technique is a well-established technique and has a wide area of application including feature analysis, outlier detection, and gamma-ray spectral analysis. Results obtained demonstrate that neutron spectrometry supported by spectral feature analysis has potential for assaying UF6 enrichment in storage cylinders. Thus the results from the present study also showed that difficulties associated with the UF6 filling profile and observed in other unattended passive neutron measurements can possibly be overcome using the approach presented.
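
    A toy sketch of the spectral feature analysis step: principal component analysis applied to simulated, count-normalized neutron spectra. The spectral shapes, enrichment labels and binning below are arbitrary placeholders, not MCNP5 or Geant4 output.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Placeholder "simulated" neutron spectra: counts in 64 energy bins for cylinders at
    # three nominal enrichment levels, with Poisson counting noise.
    rng = np.random.default_rng(4)
    energy = np.linspace(0.1, 10.0, 64)
    templates = {e: 1e4 * np.exp(-energy / (1.0 + 0.05 * e)) for e in (1, 3, 5)}   # arbitrary shapes
    spectra, labels = [], []
    for e, shape in templates.items():
        for _ in range(30):
            spectra.append(rng.poisson(shape))
            labels.append(e)
    spectra = np.asarray(spectra, dtype=float)
    spectra /= spectra.sum(axis=1, keepdims=True)        # normalize away the total count rate

    scores = PCA(n_components=2).fit_transform(spectra)  # spectra from different enrichments
    print(scores[:3])                                    # separate in the score plot
    ```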

  12. A Passive System Reliability Analysis for a Station Blackout

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David

    2015-05-03

    The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.

  13. Position and Speed Control of Brushless DC Motors Using Sensorless Techniques and Application Trends

    PubMed Central

    Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime

    2010-01-01

    This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including the background analysis using sensors, limitations and advances. The performance and reliability of BLDC motor drivers have been improved because the conventional control and sensing techniques have been improved through sensorless technology. Then, in this paper sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which includes Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. Also, the most relevant techniques based on estimation and models are briefly analysed, such as Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks. PMID:22163582

  14. CHEMICAL ANALYSIS METHODS FOR ATMOSPHERIC AEROSOL COMPONENTS

    EPA Science Inventory

    This chapter surveys the analytical techniques used to determine the concentrations of aerosol mass and its chemical components. The techniques surveyed include mass, major ions (sulfate, nitrate, ammonium), organic carbon, elemental carbon, and trace elements. As reported in...

  15. Development of a CFD Code for Analysis of Fluid Dynamic Forces in Seals

    NASA Technical Reports Server (NTRS)

    Athavale, Mahesh M.; Przekwas, Andrzej J.; Singhal, Ashok K.

    1991-01-01

    The aim is to develop a 3-D computational fluid dynamics (CFD) code for the analysis of fluid flow in cylindrical seals and evaluation of the dynamic forces on the seals. This code is expected to serve as a scientific tool for detailed flow analysis as well as a check for the accuracy of the 2D industrial codes. The features necessary in the CFD code are outlined. The initial focus was to develop or modify and implement new techniques and physical models. These include collocated grid formulation, rotating coordinate frames and moving grid formulation. Other advanced numerical techniques include higher order spatial and temporal differencing and an efficient linear equation solver. These techniques were implemented in a 2D flow solver for initial testing. Several benchmark test cases were computed using the 2D code, and the results of these were compared to analytical solutions or experimental data to check the accuracy. Tests presented here include planar wedge flow, flow due to an enclosed rotor, and flow in a 2D seal with a whirling rotor. Comparisons between numerical and experimental results for an annular seal and a 7-cavity labyrinth seal are also included.

  16. TOPICAL REVIEW: Human soft tissue analysis using x-ray or gamma-ray techniques

    NASA Astrophysics Data System (ADS)

    Theodorakou, C.; Farquharson, M. J.

    2008-06-01

    This topical review is intended to describe the x-ray techniques used for human soft tissue analysis. X-ray techniques have been applied to human soft tissue characterization and interesting results have been presented over the last few decades. The motivation behind such studies is to provide improved patient outcome by using the data obtained to better understand a disease process and improve diagnosis. An overview of theoretical background as well as a complete set of references is presented. For each study, a brief summary of the methodology and results is given. The x-ray techniques include x-ray diffraction, x-ray fluorescence, Compton scattering, Compton to coherent scattering ratio and attenuation measurements. The soft tissues that have been classified using x-rays or gamma rays include brain, breast, colon, fat, kidney, liver, lung, muscle, prostate, skin, thyroid and uterus.

  17. Authentication techniques for smart cards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system.

  18. GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration

    PubMed Central

    Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng

    2015-01-01

    The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315

  19. Micromechanical Characterization and Texture Analysis of Direct Cast Titanium Alloys Strips

    NASA Technical Reports Server (NTRS)

    2000-01-01

    This research was conducted to determine a post-processing technique to optimize mechanical and material properties of a number of Titanium based alloys and aluminides processed via Melt Overflow Solidification Technique (MORST). This technique was developed by NASA for the development of thin sheet titanium and titanium aluminides used in high temperature applications. The materials investigated in this study included conventional titanium alloy strips and foils, Ti-1100, Ti-24Al-11Nb (Alpha-2), and Ti-48Al-2Ta (Gamma). The methodology used included micro-characterization, heat-treatment, mechanical processing and mechanical testing. Characterization techniques included optical, electron microscopy, and x-ray texture analysis. The processing included heat-treatment and mechanical deformation through cold rolling. The initial as-cast materials were evaluated for their microstructure and mechanical properties. Different heat-treatment and rolling steps were chosen to process these materials. The properties were evaluated further and a processing relationship was established in order to obtain an optimum processing condition. The results showed that the as-cast material exhibited a Widmanstatten (fine grain) microstructure that developed into a microstructure with larger grains through processing steps. The texture intensity showed little change for all processing performed in this investigation.

  20. On the Power of Abstract Interpretation

    NASA Technical Reports Server (NTRS)

    Reddy, Uday S.; Kamin, Samuel N.

    1991-01-01

    Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code would be altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability. We use denotationally-oriented arguments instead of the detailed operational arguments used by Sekar et al. Hence, our proofs are much simpler, and they should be useful for future improvements.

  1. Computer-delivered interventions for reducing alcohol consumption: meta-analysis and meta-regression using behaviour change techniques and theory.

    PubMed

    Black, Nicola; Mullan, Barbara; Sharpe, Louise

    2016-09-01

    The current aim was to examine the effectiveness of behaviour change techniques (BCTs), theory and other characteristics in increasing the effectiveness of computer-delivered interventions (CDIs) to reduce alcohol consumption. Included were randomised studies with a primary aim of reducing alcohol consumption, which compared self-directed CDIs to assessment-only control groups. CDIs were coded for the use of 42 BCTs from an alcohol-specific taxonomy, the use of theory according to a theory coding scheme and general characteristics such as length of the CDI. Effectiveness of CDIs was assessed using random-effects meta-analysis and the association between the moderators and effect size was assessed using univariate and multivariate meta-regression. Ninety-three CDIs were included in at least one analysis and produced small, significant effects on five outcomes (d+ = 0.07-0.15). Larger effects occurred with some personal contact, provision of normative information or feedback on performance, prompting commitment or goal review, the social norms approach and in samples with more women. Smaller effects occurred when information on the consequences of alcohol consumption was provided. These findings can be used to inform both intervention- and theory-development. Intervention developers should focus on, including specific, effective techniques, rather than many techniques or more-elaborate approaches.

  2. Application of modern tools and techniques to maximize engineering productivity in the development of orbital operations plans for the Space Station Program

    NASA Technical Reports Server (NTRS)

    Manford, J. S.; Bennett, G. R.

    1985-01-01

    The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: an approach for rigorous analysis of operations functions, the use of the resources of a large computer network, and provisions for efficient research and access to information.

  3. Chemical Principles Revisited: Archaeological Dating.

    ERIC Educational Resources Information Center

    Rowe, M. W.

    1986-01-01

    Discusses methods used to date archaeological artifacts and other remains. They include: (1) nuclear dating techniques (radiocarbon dating, accelerator radiocarbon dating, thermoluminescence, and others); (2) chemical dating techniques (amino acid racemization, obsidian hydration dating, elemental content changes, and thermal analysis dating); and…

  4. Space Construction System Analysis. Part 2: Executive summary

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A detailed, end-to-end analysis of the activities, techniques, equipment and Shuttle provisions required to construct a reference project system is described. Included are: platform definition; construction analysis; cost and programmatics; and space construction experiments concepts.

  5. Techniques and Tools of NASA's Space Shuttle Columbia Accident Investigation

    NASA Technical Reports Server (NTRS)

    McDanels, Steve J.

    2005-01-01

    The Space Shuttle Columbia accident investigation was a fusion of many disciplines into a single effort. From the recovery and reconstruction of the debris, Figure 1, to the analysis, both destructive and nondestructive, of chemical and metallurgical samples, Figure 2, a multitude of analytical techniques and tools were employed. Destructive and non-destructive testing were utilized in tandem to determine if a breach in the left wing of the Orbiter had occurred, and if so, the path of the resultant high temperature plasma flow. Nondestructive analysis included topometric scanning, laser mapping, and real-time radiography. These techniques were useful in constructing a three dimensional virtual representation of the reconstruction project, specifically the left wing leading edge reinforced carbon/carbon heat protectant panels. Similarly, they were beneficial in determining where sampling should be performed on the debris. Analytic testing included such techniques as Energy Dispersive Electron Microprobe Analysis (EMPA), Electron Spectroscopy Chemical Analysis (ESCA), and X-Ray dot mapping; these techniques related the characteristics of intermetallics deposited on the leading edge of the left wing adjacent to the location of a suspected plasma breach during reentry. The methods and results of the various analyses, along with their implications into the accident, are discussed, along with the findings and recommendations of the Columbia Accident Investigation Board. Likewise, NASA's Return To Flight efforts are highlighted.

  6. Detection techniques for tenuous planetary atmospheres

    NASA Technical Reports Server (NTRS)

    Hoenig, S. A.

    1972-01-01

    The research for the development of new types of detectors for analysis of planetary atmospheres is summarized. Topics discussed include: corona discharge humidity detector, surface catalysis and exo-electron emission, and analysis of soil samples by means of exo-electron emission. A report on the exo-electron emission during heterogeneous catalysis is included.

  7. Using EIGER for Antenna Design and Analysis

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Khayat, Michael; Kennedy, Timothy F.; Fink, Patrick W.

    2007-01-01

    EIGER (Electromagnetic Interactions GenERalized) is a frequency-domain electromagnetics software package that is built upon a flexible framework, designed using object-oriented techniques. The analysis methods used include moment method solutions of integral equations, finite element solutions of partial differential equations, and combinations thereof. The framework design permits new analysis techniques (boundary conditions, Green's functions, etc.) to be added to the software suite with a sensible effort. The code has been designed to execute (in serial or parallel) on a wide variety of platforms, from Intel-based PCs to Unix-based workstations. Recently, new potential integration schemes that avoid singularity extraction techniques have been added for integral equation analysis. These new integration schemes are required for facilitating the use of higher-order elements and basis functions. Higher-order elements are better able to model geometrical curvature using fewer elements than when using linear elements. Higher-order basis functions are beneficial for simulating structures with rapidly varying fields or currents. Results presented here will demonstrate current and future capabilities of EIGER with respect to analysis of installed antenna system performance in support of NASA's mission of exploration. Examples include antenna coupling within an enclosed environment and antenna analysis on electrically large manned space vehicles.

  8. Multispectral Analysis of NMR Imagery

    NASA Technical Reports Server (NTRS)

    Butterfield, R. L.; Vannier, M. W. And Associates; Jordan, D.

    1985-01-01

    Conference paper discusses initial efforts to adapt multispectral satellite-image analysis to nuclear magnetic resonance (NMR) scans of human body. Flexibility of these techniques makes it possible to present NMR data in variety of formats, including pseudocolor composite images of pathological internal features. Techniques do not have to be greatly modified from form in which used to produce satellite maps of such Earth features as water, rock, or foliage.

  9. Factor weighting in DRASTIC modeling.

    PubMed

    Pacheco, F A L; Pires, L M G R; Santos, R M B; Sanches Fernandes, L F

    2015-02-01

    Evaluation of aquifer vulnerability comprehends the integration of very diverse data, including soil characteristics (texture), hydrologic settings (recharge), aquifer properties (hydraulic conductivity), environmental parameters (relief), and ground water quality (nitrate contamination). It is therefore a multi-geosphere problem to be handled by a multidisciplinary team. The DRASTIC model remains the most popular technique in use for aquifer vulnerability assessments. The algorithm calculates an intrinsic vulnerability index based on a weighted addition of seven factors. In many studies, the method is subject to adjustments, especially in the factor weights, to meet the particularities of the studied regions. However, adjustments made by different techniques may lead to markedly different vulnerabilities and hence to insecurity in the selection of an appropriate technique. This paper reports the comparison of 5 weighting techniques, an enterprise not attempted before. The studied area comprises 26 aquifer systems located in Portugal. The tested approaches include: the Delphi consensus (original DRASTIC, used as reference), Sensitivity Analysis, Spearman correlations, Logistic Regression and Correspondence Analysis (used as adjustment techniques). In all cases but Sensitivity Analysis, adjustment techniques have privileged the factors representing soil characteristics, hydrologic settings, aquifer properties and environmental parameters, by leveling their weights to ≈4.4, and have subordinated the factors describing the aquifer media by downgrading their weights to ≈1.5. Logistic Regression predicts the highest and Sensitivity Analysis the lowest vulnerabilities. Overall, the vulnerability indices may be separated by a maximum value of 51 points. This represents an uncertainty of 2.5 vulnerability classes, because they are 20 points wide. Given this ambiguity, the selection of a weighting technique to integrate a vulnerability index may require additional expertise to be set up satisfactorily. Following a general criterion that weights must be proportional to the range of the ratings, Correspondence Analysis may be recommended as the best adjustment technique. Copyright © 2014 Elsevier B.V. All rights reserved.
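
    The DRASTIC index described above is a weighted addition of seven factor ratings. The sketch below illustrates that calculation; the weights are the commonly cited original Delphi-consensus values and the ratings are invented example numbers, not data from the 26 Portuguese aquifer systems.

```python
# The DRASTIC intrinsic vulnerability index as a weighted addition of seven
# factor ratings (D, R, A, S, T, I, C; ratings typically 1-10). The weights are
# the original Delphi-consensus values; the ratings are invented examples.
WEIGHTS = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}

def drastic_index(ratings, weights=WEIGHTS):
    """Return the weighted-sum vulnerability index for one aquifer cell."""
    return sum(weights[f] * ratings[f] for f in weights)

example_ratings = {"D": 7, "R": 6, "A": 8, "S": 6, "T": 9, "I": 8, "C": 4}
print(drastic_index(example_ratings))
```

    An adjustment technique such as logistic regression or correspondence analysis would re-estimate the values in WEIGHTS from observed contamination data while keeping this same weighted-sum form.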

  10. Meta-analysis of studies assessing the efficacy of projective techniques in discriminating child sexual abuse.

    PubMed

    West, M M

    1998-11-01

    This meta-analysis of 12 studies assesses the efficacy of projective techniques to discriminate between sexually abused children and nonsexually abused children. A literature search was conducted to identify published studies that used projective instruments with sexually abused children. Those studies that reported statistics that allowed an effect size to be calculated were then included in the meta-analysis. There were 12 studies that fit the criteria. The projectives reviewed include The Rorschach, The Hand Test, The Thematic Apperception Test (TAT), the Kinetic Family Drawings, Human Figure Drawings, Draw Your Favorite Kind of Day, The Rosebush: A Visualization Strategy, and The House-Tree-Person. The results of this analysis gave an overall effect size of d = .81, which is a large effect. Six studies included only a norm group of nondistressed, nonabused children with the sexual abuse group. The average effect size was d = .87, which is impressive. Six studies did include a clinical group of distressed nonsexually abused subjects, and the effect size dropped to d = .76, which is a medium to large effect. This indicates that projective instruments can discriminate distressed children from nondistressed subjects quite well. In the studies that included a clinical group of distressed children who were not sexually abused, the lower effect size indicates that the instruments were less able to discriminate the type of distress. This meta-analysis gives evidence that projective techniques have the ability to discriminate between children who have been sexually abused and those who were not abused sexually. However, further research that is designed to include clinical groups of distressed children is needed in order to determine how well the projectives can discriminate the type of distress.
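
    For readers unfamiliar with the d statistic quoted above, the following minimal sketch computes Cohen's d as the difference in group means divided by the pooled standard deviation; the score arrays are synthetic illustrations, not data from the reviewed studies.

```python
# Cohen's d: standardized mean difference between two groups. The two score
# arrays are synthetic illustrations, not study data.
import numpy as np

def cohens_d(group1, group2):
    g1, g2 = np.asarray(group1, float), np.asarray(group2, float)
    n1, n2 = len(g1), len(g2)
    pooled_var = ((n1 - 1) * g1.var(ddof=1) + (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2)
    return (g1.mean() - g2.mean()) / np.sqrt(pooled_var)

abused = [12, 15, 14, 18, 16, 13, 17]       # hypothetical projective-test scores
comparison = [9, 11, 10, 12, 8, 10, 11]
print(round(cohens_d(abused, comparison), 2))
```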

  11. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction

    PubMed Central

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-01-01

    Abstract Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction. The aim of this study was to compare piezosurgery with rotary osteotomy techniques, with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus. We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar. The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removing impacted third molars; the outcomes of interest include surgery time, trismus, swelling or pain; the studies are randomized controlled trials. We used random-effects models to calculate the difference in the outcomes, and the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference if otherwise. A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75–5.52, P < 0.0001). Patients receiving the piezoelectric technique had less swelling at postoperative days 1, 3, 5, and 7 (all Ps ≤0.023). Additionally, there was a trend of less postoperative pain and trismus in the piezosurgery groups. The number of included randomized controlled trials and the sample size of each trial were relatively small, double blinding was not possible, and cost analysis was unavailable due to a lack of data. Our meta-analysis indicates that although patients undergoing piezosurgery experienced longer surgery time, they had less postoperative swelling, indicating that piezosurgery is a promising alternative technique for extraction of impacted third molars. PMID:26469902

  12. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
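
    As an illustration of the simplest metamodel surveyed (a second-order response surface), the sketch below fits a quadratic polynomial to samples of a stand-in "expensive" function; the function and design are hypothetical, not taken from the paper.

```python
# Quadratic response-surface metamodel: fit a second-order polynomial to outputs
# of a placeholder "expensive" analysis sampled at random design points.
import numpy as np

def expensive_analysis(x1, x2):            # stand-in for a costly simulation code
    return (x1 - 0.3) ** 2 + 2.0 * (x2 + 0.1) ** 2 + 0.5 * x1 * x2

def fit_quadratic_rsm(X, y):
    """Least-squares fit of y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    x1, x2 = X[:, 0], X[:, 1]
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 2))       # a simple sampling of the design space
y = expensive_analysis(X[:, 0], X[:, 1])
print(fit_quadratic_rsm(X, y).round(3))    # recovers the underlying coefficients
```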

  13. Analysis strategies for high-resolution UHF-fMRI data.

    PubMed

    Polimeni, Jonathan R; Renvall, Ville; Zaretskaya, Natalia; Fischl, Bruce

    2018-03-01

    Functional MRI (fMRI) benefits from both increased sensitivity and specificity with increasing magnetic field strength, making it a key application for Ultra-High Field (UHF) MRI scanners. Most UHF-fMRI studies utilize the dramatic increases in sensitivity and specificity to acquire high-resolution data reaching sub-millimeter scales, which enable new classes of experiments to probe the functional organization of the human brain. This review article surveys advanced data analysis strategies developed for high-resolution fMRI at UHF. These include strategies designed to mitigate distortion and artifacts associated with higher fields in ways that attempt to preserve spatial resolution of the fMRI data, as well as recently introduced analysis techniques that are enabled by these extremely high-resolution data. Particular focus is placed on anatomically-informed analyses, including cortical surface-based analysis, which are powerful techniques that can guide each step of the analysis from preprocessing to statistical analysis to interpretation and visualization. New intracortical analysis techniques for laminar and columnar fMRI are also reviewed and discussed. Prospects for single-subject individualized analyses are also presented and discussed. Altogether, there are both specific challenges and opportunities presented by UHF-fMRI, and the use of proper analysis strategies can help these valuable data reach their full potential. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. A method for nonlinear exponential regression analysis

    NASA Technical Reports Server (NTRS)

    Junkin, B. G.

    1971-01-01

    A computer-oriented technique is presented for performing a nonlinear exponential regression analysis on decay-type experimental data. The technique involves the least squares procedure wherein the nonlinear problem is linearized by expansion in a Taylor series. A linear curve fitting procedure for determining the initial nominal estimates for the unknown exponential model parameters is included as an integral part of the technique. A correction matrix was derived and then applied to the nominal estimate to produce an improved set of model parameters. The solution cycle is repeated until some predetermined criterion is satisfied.
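
    A minimal sketch of the described approach, under the assumption of a single-term decay model y = a*exp(b*t): a log-linear fit supplies the initial nominal estimates, the model is linearized by a first-order Taylor expansion, and corrections are applied until they fall below a tolerance. This is an illustration, not the report's program.

```python
# Gauss-Newton style fit of y ~ a*exp(b*t): log-linear fit for nominal estimates,
# then iterative corrections from the linearized (Taylor-expanded) model.
import numpy as np

def fit_exponential(t, y, tol=1e-9, max_iter=50):
    t, y = np.asarray(t, float), np.asarray(y, float)
    # Initial nominal estimates from a linear fit to ln(y) = ln(a) + b*t
    b0, ln_a0 = np.polyfit(t, np.log(y), 1)
    p = np.array([np.exp(ln_a0), b0])
    for _ in range(max_iter):
        a, b = p
        model = a * np.exp(b * t)
        J = np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])  # d(model)/d(a,b)
        delta, *_ = np.linalg.lstsq(J, y - model, rcond=None)        # correction step
        p += delta
        if np.max(np.abs(delta)) < tol:                              # stopping criterion
            break
    return p

t = np.linspace(0.0, 5.0, 20)
y = 3.0 * np.exp(-0.8 * t)                 # synthetic decay-type data
print(fit_exponential(t, y))               # approximately [3.0, -0.8]
```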

  15. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  16. An iterative analytical technique for the design of interplanetary direct transfer trajectories including perturbations

    NASA Astrophysics Data System (ADS)

    Parvathi, S. P.; Ramanan, R. V.

    2018-06-01

    An iterative analytical trajectory design technique that includes perturbations in the departure phase of interplanetary orbiter missions is proposed. Perturbations such as the non-spherical gravity of the Earth and third-body perturbations due to the Sun and Moon are included in the analytical design process. The design is first obtained using the iterative patched conic technique without the perturbations and is then modified to include them. The modification is based on (i) backward analytical propagation of the state vector obtained from the iterative patched conic technique at the sphere of influence, including the perturbations, and (ii) quantification of the deviations in the orbital elements at periapsis of the departure hyperbolic orbit. The orbital elements at the sphere of influence are changed to nullify the deviations at the periapsis. The analytical backward propagation is carried out using the linear approximation technique. The new analytical design technique, named the biased iterative patched conic technique, does not depend upon numerical integration, and all computations are carried out using closed-form expressions. The improved design is very close to the numerical design. The design analysis using the proposed technique provides realistic insight into the mission aspects. Also, the proposed design is an excellent initial guess for numerical refinement and helps arrive at the four distinct design options for a given opportunity.

  17. Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.

    ERIC Educational Resources Information Center

    Bertrand, Jane T.; And Others

    1989-01-01

    An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)

  18. Observation of the Earth by radar

    NASA Technical Reports Server (NTRS)

    Elachi, C.

    1982-01-01

    Techniques and applications of radar observation from Earth satellites are discussed. Processing and analysis of the resulting images are described, as is radar imaging from aircraft. Uses of these data include ocean wave analysis, surface water evaluation, and topographic analysis.

  19. E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence

    DTIC Science & Technology

    2018-03-01

    This research examined lone-actor violence and a set of predefined decision-making protocols. It included running four simulations using the Monte Carlo technique and applying a "runs test" to determine whether a temporal pattern exists in lone-actor violence.

  20. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  1. Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009

    USGS Publications Warehouse

    Soller, David R.

    2011-01-01

    As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  2. Real-Time Condition Monitoring and Fault Diagnosis of Gear Train Systems Using Instantaneous Angular Speed (IAS) Analysis

    NASA Astrophysics Data System (ADS)

    Sait, Abdulrahman S.

    This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on the gear tooth profile of a spur gear under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understanding of the effects of damage on the IAS by analyzing the digital signal output of a rotary incremental optical encoder. A comprehensive literature review of the state of knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox systems, is presented. Progress and new developments over the past 30 years in failure detection techniques for rotating machinery, including engines, bearings, and gearboxes, are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing an angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domain using signal averaging to determine the existence and position of faults on the gear train system. Forces between the mating tooth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders for gear fault detection is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions. A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
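
    The core IAS calculation from an incremental encoder is the angle subtended per pulse divided by the measured time between pulse edges. The sketch below illustrates this with synthetic pulse timestamps; it is not the dissertation's LabVIEW code.

```python
# Instantaneous angular speed (IAS) from incremental-encoder pulse times: each
# pulse spans a known angle, so IAS is that angle over the inter-pulse interval.
import numpy as np

def instantaneous_angular_speed(pulse_times, pulses_per_rev):
    """Return IAS (rad/s) estimated between consecutive encoder pulse edges."""
    dt = np.diff(np.asarray(pulse_times, float))
    dtheta = 2.0 * np.pi / pulses_per_rev       # angle per pulse (rad)
    return dtheta / dt                          # one IAS sample per pulse interval

# Synthetic example: a shaft near 10 rev/s with a small periodic speed fluctuation
ppr = 1024
nominal_hz = 10.0
angles = np.arange(4 * ppr) * 2 * np.pi / ppr
times = angles / (2 * np.pi * nominal_hz) + 1e-5 * np.sin(angles)  # perturbed arrivals
ias = instantaneous_angular_speed(times, ppr)
print(ias.mean() / (2 * np.pi))                 # approximately 10 revolutions per second
```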

  3. Next-Generation Technologies for Multiomics Approaches Including Interactome Sequencing

    PubMed Central

    Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko

    2015-01-01

    The development of high-speed analytical techniques such as next-generation sequencing and microarrays allows high-throughput analysis of biological information at a low cost. These techniques contribute to medical and bioscience advancements and provide new avenues for scientific research. Here, we outline a variety of new innovative techniques and discuss their use in omics research (e.g., genomics, transcriptomics, metabolomics, proteomics, and interactomics). We also discuss the possible applications of these methods, including an interactome sequencing technology that we developed, in future medical and life science research. PMID:25649523

  4. Module Degradation Mechanisms Studied by a Multi-Scale Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, Steve; Al-Jassim, Mowafak; Hacke, Peter

    2016-11-21

    A key pathway to meeting the Department of Energy SunShot 2020 goals is to reduce financing costs by improving investor confidence through improved photovoltaic (PV) module reliability. A comprehensive approach to further understand and improve PV reliability includes characterization techniques and modeling from module to atomic scale. Imaging techniques, which include photoluminescence, electroluminescence, and lock-in thermography, are used to locate localized defects responsible for module degradation. Small area samples containing such defects are prepared using coring techniques and are then suitable and available for microscopic study and specific defect modeling and analysis.

  5. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

    Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to the currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample pre-preparation, fast and easy operation, and chemical-free processing. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples, covering LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although there have been many research works focusing on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers in better understanding the aqueous LIBS technique and help identify current research needs for environmental monitoring of water quality.

  6. The incidence of secondary vertebral fracture of vertebral augmentation techniques versus conservative treatment for painful osteoporotic vertebral fractures: a systematic review and meta-analysis.

    PubMed

    Song, Dawei; Meng, Bin; Gan, Minfeng; Niu, Junjie; Li, Shiyan; Chen, Hao; Yuan, Chenxi; Yang, Huilin

    2015-08-01

    Percutaneous vertebroplasty (PVP) and balloon kyphoplasty (BKP) are minimally invasive and effective vertebral augmentation techniques for managing osteoporotic vertebral compression fractures (OVCFs). Recent meta-analyses have compared the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques or conservative treatment; however, the inclusions were not thorough and rigorous enough, and the effects of each technique on the incidence of secondary vertebral fractures remain unclear. To perform an updated systematic review and meta-analysis of the studies with more rigorous inclusion criteria on the effects of vertebral augmentation techniques and conservative treatment for OVCF on the incidence of secondary vertebral fractures. PubMed, MEDLINE, EMBASE, SpringerLink, Web of Science, and the Cochrane Library database were searched for relevant original articles comparing the incidence of secondary vertebral fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. Randomized controlled trials (RCTs) and prospective non-randomized controlled trials (NRCTs) were identified. The methodological qualities of the studies were evaluated, relevant data were extracted and recorded, and an appropriate meta-analysis was conducted. A total of 13 articles were included. The pooled results from included studies showed no statistically significant differences in the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques and conservative treatment. Subgroup analysis comparing different study designs, durations of symptoms, follow-up times, races of patients, and techniques were conducted, and no significant differences in the incidence of secondary fractures were identified (P > 0.05). No obvious publication bias was detected by either Begg's test (P = 0.360 > 0.05) or Egger's test (P = 0.373 > 0.05). Despite current thinking in the field that vertebral augmentation procedures may increase the incidence of secondary fractures, we found no differences in the incidence of secondary fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. © The Foundation Acta Radiologica 2014.
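
    For context on the publication-bias checks cited above, the sketch below implements Egger's regression test (standardized effect regressed on precision, testing the intercept); the effect sizes and standard errors are hypothetical, not values from the 13 included studies.

```python
# Egger's regression test for funnel-plot asymmetry: regress effect/SE on 1/SE
# and test whether the intercept differs from zero. Inputs are hypothetical.
import numpy as np
from scipy import stats

def eggers_test(effects, ses):
    effects, ses = np.asarray(effects, float), np.asarray(ses, float)
    z = effects / ses                       # standardized effects
    precision = 1.0 / ses
    X = np.column_stack([np.ones_like(precision), precision])
    beta, *_ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    dof = len(z) - 2
    cov = (resid @ resid / dof) * np.linalg.inv(X.T @ X)
    t_stat = beta[0] / np.sqrt(cov[0, 0])   # test the intercept
    p = 2 * stats.t.sf(abs(t_stat), dof)
    return beta[0], p

print(eggers_test([0.20, 0.15, 0.30, 0.05, 0.25, 0.10],
                  [0.05, 0.08, 0.12, 0.04, 0.10, 0.06]))
```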

  7. DIMENSIONS OF SIMULATION.

    ERIC Educational Resources Information Center

    CRAWFORD, MEREDITH P.

    Open- and closed-loop simulation is discussed from the viewpoint of research and development in training techniques. Areas discussed include--(1) open-loop environmental simulation, (2) simulation not involving people, (3) analysis of occupations, (4) simulation for training, (5) real-size system simulation, (6) techniques of miniaturization, and…

  8. Hanford Environmental Analytical Methods (methods as of March 1990). Volume 2, Appendix A1-O and appendix A1-I

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goheen, S.C.; McCulloch, M.; Daniel, J.L.

    1993-05-01

    Techniques in use at the Hanford Reservation as of March, 1990 for the analysis of liquids, organic wastes, soils, and sediments, are described. Limitations and applications of the techniques are included.

  9. Inferior or double joint spaces injection versus superior joint space injection for temporomandibular disorders: a systematic review and meta-analysis.

    PubMed

    Li, Chunjie; Zhang, Yifan; Lv, Jun; Shi, Zongdao

    2012-01-01

    To compare the effect and safety of inferior or double temporomandibular joint space drug injection versus superior temporomandibular joint space injection in the treatment of temporomandibular disorders. MEDLINE (via Ovid, 1948 to March 2011), CENTRAL (Issue 1, 2011), Embase (1984 to March 2011), CBM (1978 to March 2011), and the World Health Organization International Clinical Trials Registry Platform were searched electronically; relevant journals as well as references of included studies were hand-searched for randomized controlled trials comparing the effect or safety of the inferior or double joint space drug injection technique with those of the superior space injection technique. Risk-of-bias assessment with the tool recommended by the Cochrane Collaboration, reporting quality assessment with CONSORT, and data extraction were carried out independently by 2 reviewers. Meta-analysis was performed with RevMan 5.0.23. Four trials with 349 participants were included. All the included studies had moderate risk of bias. Meta-analysis showed that the inferior or double space injection technique could significantly increase maximal mouth opening by 2.88 mm (P = .0001) and reduce pain intensity in the temporomandibular area by an average of 9.01 mm on the visual analog scale (P = .0001) compared with the superior space injection technique, but did not markedly change the synthesized clinical index (P = .05) in the short term; nevertheless, it showed greater improvements in maximal mouth opening (P = .002), pain relief (P < .0001), and the synthesized clinical variable (P < .0001) in the long term than superior space injection. No serious adverse events were reported. The inferior or double temporomandibular joint space drug injection technique shows a better effect than the superior space injection technique, and its safety is confirmed. However, more high-quality studies are still needed to test and verify the evidence. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  10. Gravitational waves: search results, data analysis and parameter estimation: Amaldi 10 Parallel session C2.

    PubMed

    Astone, Pia; Weinstein, Alan; Agathos, Michalis; Bejger, Michał; Christensen, Nelson; Dent, Thomas; Graff, Philip; Klimenko, Sergey; Mazzolo, Giulio; Nishizawa, Atsushi; Robinet, Florent; Schmidt, Patricia; Smith, Rory; Veitch, John; Wade, Madeline; Aoudia, Sofiane; Bose, Sukanta; Calderon Bustillo, Juan; Canizares, Priscilla; Capano, Colin; Clark, James; Colla, Alberto; Cuoco, Elena; Da Silva Costa, Carlos; Dal Canton, Tito; Evangelista, Edgar; Goetz, Evan; Gupta, Anuradha; Hannam, Mark; Keitel, David; Lackey, Benjamin; Logue, Joshua; Mohapatra, Satyanarayan; Piergiovanni, Francesco; Privitera, Stephen; Prix, Reinhard; Pürrer, Michael; Re, Virginia; Serafinelli, Roberto; Wade, Leslie; Wen, Linqing; Wette, Karl; Whelan, John; Palomba, C; Prodi, G

    The Amaldi 10 Parallel Session C2 on gravitational wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, and 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data from the advanced detector network. This included methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

  11. Gravitational Waves: Search Results, Data Analysis and Parameter Estimation. Amaldi 10 Parallel Session C2

    NASA Technical Reports Server (NTRS)

    Astone, Pia; Weinstein, Alan; Agathos, Michalis; Bejger, Michal; Christensen, Nelson; Dent, Thomas; Graff, Philip; Klimenko, Sergey; Mazzolo, Giulio; Nishizawa, Atsushi

    2015-01-01

    The Amaldi 10 Parallel Session C2 on gravitational wave (GW) search results, data analysis and parameter estimation included three lively sessions of lectures by 13 presenters, and 34 posters. The talks and posters covered a huge range of material, including results and analysis techniques for ground-based GW detectors, targeting anticipated signals from different astrophysical sources: compact binary inspiral, merger and ringdown; GW bursts from intermediate mass binary black hole mergers, cosmic string cusps, core-collapse supernovae, and other unmodeled sources; continuous waves from spinning neutron stars; and a stochastic GW background. There was considerable emphasis on Bayesian techniques for estimating the parameters of coalescing compact binary systems from the gravitational waveforms extracted from the data from the advanced detector network. This included methods to distinguish deviations of the signals from what is expected in the context of General Relativity.

  12. Phospholipid Fatty Acid Analysis: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Findlay, R. H.

    2008-12-01

    With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but not equal technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks. Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM and the technique promises to be a useful tool for assigning ecological function to microbial populations.
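
    A minimal sketch of the principal component analysis step mentioned above, applied to a hypothetical table of FAME mole-percent profiles (samples by fatty acids); it illustrates the general approach, not the authors' workflow.

```python
# Principal component analysis of phospholipid fatty acid (FAME) profiles via SVD.
# The profile matrix below is hypothetical (rows = samples, columns = fatty acids).
import numpy as np

def pca(profiles, n_components=2):
    """Return sample scores on the first principal components and explained variance."""
    X = np.asarray(profiles, float)
    X = X / X.sum(axis=1, keepdims=True) * 100.0   # normalize rows to mole percent
    Xc = X - X.mean(axis=0)                        # center each fatty acid column
    u, s, vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s ** 2 / np.sum(s ** 2)
    return Xc @ vt[:n_components].T, explained[:n_components]

profiles = [[40, 30, 20, 10],
            [38, 32, 19, 11],
            [10, 20, 30, 40],
            [12, 18, 32, 38]]
scores, explained = pca(profiles)
print(scores.round(2), explained.round(2))         # two clusters separate on PC1
```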

  13. Planning for Cost Effectiveness.

    ERIC Educational Resources Information Center

    Schlaebitz, William D.

    1984-01-01

    A heat pump life-cycle cost analysis is used to explain the technique. Items suggested for the life-cycle analysis approach include lighting, longer-life batteries, site maintenance, and retaining experts to inspect specific building components. (MLF)

  14. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  15. Fluorescence Fluctuation Approaches to the Study of Adhesion and Signaling

    PubMed Central

    Bachir, Alexia I.; Kubow, Kristopher E.; Horwitz, Alan R.

    2013-01-01

    Cell–matrix adhesions are large, multimolecular complexes through which cells sense and respond to their environment. They also mediate migration by serving as traction points and signaling centers and allow the cell to modify the surrounding tissue. Due to their fundamental role in cell behavior, adhesions are germane to nearly all major human health pathologies. However, adhesions are extremely complex and dynamic structures that include over 100 known interacting proteins and operate over multiple space (nm–µm) and time (ms–min) regimes. Fluorescence fluctuation techniques are well suited for studying adhesions. These methods are sensitive over a large spatiotemporal range and provide a wealth of information including molecular transport dynamics, interactions, and stoichiometry from a single time series. Earlier chapters in this volume have provided the theoretical background, instrumentation, and analysis algorithms for these techniques. In this chapter, we discuss their implementation in living cells to study adhesions in migrating cells. Although each technique and application has its own unique instrumentation and analysis requirements, we provide general guidelines for sample preparation, selection of imaging instrumentation, and optimization of data acquisition and analysis parameters. Finally, we review several recent studies that implement these techniques in the study of adhesions. PMID:23280111

  16. POD/MAC-Based Modal Basis Selection for a Reduced Order Nonlinear Response Analysis

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2007-01-01

    A feasibility study was conducted to explore the applicability of a POD/MAC basis selection technique to a nonlinear structural response analysis. For the case studied, the application of the POD/MAC technique resulted in a substantial improvement of the reduced-order simulation when compared to a classic approach utilizing only the low-frequency modes present in the excitation bandwidth. Further studies aim to expand the application of the presented technique to more complex structures, including non-planar and two-dimensional configurations. For non-planar structures, the separation of different displacement components may not be necessary or desirable.
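
    A rough sketch of the general POD/MAC idea, under the assumption that POD modes are extracted from response snapshots by SVD and candidate normal modes are ranked by their modal assurance criterion (MAC) against the dominant POD modes; all arrays are synthetic and the ranking logic is illustrative, not the paper's implementation.

```python
# POD/MAC basis selection sketch: POD modes come from an SVD of response
# snapshots; each candidate normal mode is scored by its best MAC with the
# retained POD modes. All data here are synthetic placeholders.
import numpy as np

def pod_modes(snapshots, n_modes):
    """Proper orthogonal decomposition of a (dof x time) snapshot matrix."""
    u, s, _ = np.linalg.svd(snapshots - snapshots.mean(axis=1, keepdims=True),
                            full_matrices=False)
    return u[:, :n_modes]

def mac(phi, psi):
    """Modal assurance criterion between two mode-shape vectors."""
    return np.abs(phi @ psi) ** 2 / ((phi @ phi) * (psi @ psi))

rng = np.random.default_rng(0)
normal_modes = np.linalg.qr(rng.standard_normal((50, 8)))[0]   # candidate basis (dof x modes)
weights = rng.standard_normal((3, 200))
snapshots = normal_modes[:, [0, 2, 5]] @ weights                # response dominated by 3 modes
pod = pod_modes(snapshots, 3)

# Rank candidate modes by their best MAC with any retained POD mode
ranking = [max(mac(normal_modes[:, j], pod[:, k]) for k in range(pod.shape[1]))
           for j in range(normal_modes.shape[1])]
print(np.argsort(ranking)[::-1][:3])  # indices of best-matching modes, likely {0, 2, 5}
```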

  17. Atlas of computerized blood flow analysis in bone disease.

    PubMed

    Gandsman, E J; Deutsch, S D; Tyson, I B

    1983-11-01

    The role of computerized blood flow analysis in routine bone scanning is reviewed. Cases illustrating the technique include proven diagnoses of toxic synovitis, Legg-Perthes disease, arthritis, avascular necrosis of the hip, fractures, benign and malignant tumors, Paget's disease, cellulitis, osteomyelitis, and shin splints. Several examples also show the use of the technique in monitoring treatment. The use of quantitative data from the blood flow, bone uptake phase, and static images suggests specific diagnostic patterns for each of the diseases presented in this atlas. Thus, this technique enables increased accuracy in the interpretation of the radionuclide bone scan.

  18. Blue dye for identification of sentinel nodes in breast cancer and malignant melanoma: a systematic review and meta-analysis.

    PubMed

    Peek, Mirjam Cl; Charalampoudis, Petros; Anninga, Bauke; Baker, Rose; Douek, Michael

    2017-02-01

    The combined technique (radioisotope and blue dye) is the gold standard for sentinel lymph node biopsy (SLNB) and there is wide variation in techniques and blue dyes used. We performed a systematic review and meta-analysis to assess the need for radioisotope and the optimal blue dye for SLNB. A total of 21 studies were included. The SLNB identification rates are high with all the commonly used blue dyes. Furthermore, methylene blue is superior to iso-sulfan blue and Patent Blue V with respect to false-negative rates. The combined technique remains the most accurate and effective technique for SLNB. In order to standardize the SLNB technique, comparative trials to determine the most effective blue dye and national guidelines are required.

  19. Proceedings of our national landscape: a conference on applied techniques for analysis and management of the visual resource [Incline Village, Nev., April 23-25, 1979

    Treesearch

    Gary H. Elsner; Richard C. Smardon; technical coordinators

    1979-01-01

    These 104 papers were presented at "Our National Landscape: A Conference on Applied Techniques for Analysis and Management of the Visual Resource," Incline Village, Nevada, April 23-25, 1979. Included in this proceedings are state-of-the-art papers on landscape planning. Emphasis is upon planning the visual aspects of the large and wildland areas of the...

  20. The integrated manual and automatic control of complex flight systems

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.

    1986-01-01

    The topics of research in this program include pilot/vehicle analysis techniques, identification of pilot dynamics, and control and display synthesis techniques for optimizing aircraft handling qualities. The project activities are discussed. The current technical activity is directed at extending and validating the active display synthesis procedure, and the pilot/vehicle analysis of the NLR rate-command flight configurations in the landing task. Two papers published by the researchers are attached as appendices.

  1. Current techniques for the real-time processing of complex radar signatures

    NASA Astrophysics Data System (ADS)

    Clay, E.

    A real-time processing technique has been developed for the microwave receiver of the Brahms radar station. The method allows such target signatures as the radar cross section (RCS) of the airframes and rotating parts, the one-dimensional tomography of aircraft, and the RCS of electromagnetic decoys to be characterized. The method allows optimization of experimental parameters including the analysis frequency band, the receiver gain, and the wavelength range of EM analysis.

  2. Planning for Downtown Circulation Systems. Volume 2. Analysis Techniques.

    DOT National Transportation Integrated Search

    1983-10-01

    This volume contains the analysis and refinement stages of downtown circulator planning. Included are sections on methods for estimating patronage, costs, revenues, and impacts, and a section on methods for performing micro-level analyses.

  3. Extracranial glioblastoma diagnosed by examination of pleural effusion using the cell block technique: case report.

    PubMed

    Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko

    2018-06-01

    Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.

  4. Comparing the Efficiency of Two Different Extraction Techniques in Removal of Maxillary Third Molars: A Randomized Controlled Trial.

    PubMed

    Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K

    2017-12-01

    Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. A new technique for extraction of the maxillary third molar, the Joedds technique, is introduced in this study and compared with the conventional technique. One hundred people were included in the study and were divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t test. Analysis of the 100 patients showed that the novel Joedds technique caused minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was <2 min compared with the other group. This novel technique has proved to be better than the conventional third molar extraction technique, with minimal complications, provided that proper case selection and the right technique are used.

  5. Structural analysis of cell wall polysaccharides using PACE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mortimer, Jennifer C.

    The plant cell wall is composed of many complex polysaccharides. The composition and structure of the polysaccharides affect various cell properties including cell shape, cell function and cell adhesion. Many techniques to characterize polysaccharide structure are complicated, requiring expensive equipment and specialized operators e.g. NMR, MALDI-MS. PACE (Polysaccharide Analysis using Carbohydrate gel Electrophoresis) uses a simple, rapid technique to analyze polysaccharide quantity and structure (Goubet et al. 2002). Whilst the method here describes xylan analysis, it can be applied (by use of the appropriate glycosyl hydrolase) to any cell wall polysaccharide.

  6. LIVER ULTRASONOGRAPHY IN DOLPHINS: USE OF ULTRASONOGRAPHY TO ESTABLISH A TECHNIQUE FOR HEPATOBILIARY IMAGING AND TO EVALUATE METABOLIC DISEASE-ASSOCIATED LIVER CHANGES IN BOTTLENOSE DOLPHINS (TURSIOPS TRUNCATUS).

    PubMed

    Seitz, Kelsey E; Smith, Cynthia R; Marks, Stanley L; Venn-Watson, Stephanie K; Ivančić, Marina

    2016-12-01

    The objective of this study was to establish a comprehensive technique for ultrasound examination of the dolphin hepatobiliary system and apply this technique to 30 dolphins to determine what, if any, sonographic changes are associated with blood-based indicators of metabolic syndrome (insulin greater than 14 μIU/ml or glucose greater than 112 mg/dl) and iron overload (transferrin saturation greater than 65%). A prospective study of individuals in a cross-sectional population with and without elevated postprandial insulin levels was performed. Twenty-nine bottlenose dolphins (Tursiops truncatus) in a managed collection were included in the final data analysis. An in-water ultrasound technique was developed that included detailed analysis of the liver and pancreas. Dolphins with elevated insulin concentrations had larger livers compared with dolphins with nonelevated concentrations. Using stepwise, multivariate regression including blood-based indicators of metabolic syndrome in dolphins, glucose was the best predictor of and had a positive linear association with liver size (P = 0.007, R² = 0.24). Bottlenose dolphins are susceptible to metabolic syndrome and associated complications that affect the liver, including fatty liver disease and iron overload. This study facilitated the establishment of a technique for a rapid, diagnostic, and noninvasive ultrasonographic evaluation of the dolphin liver. In addition, the study identified ultrasound-detectable hepatic changes associated primarily with elevated glucose concentration in dolphins. Future investigations will strive to detail the pathophysiological mechanisms for these changes.

  7. Physiologic Waveform Analysis for Early Detection of Hemorrhage during Transport and Higher Echelon Medical Care of Combat Casualties

    DTIC Science & Technology

    2014-03-01

    … waveforms that are easier to measure than ABP (e.g., pulse oximeter waveforms); (3) an NIH SBIR Phase I proposal with Retia Medical to develop automated … the training dataset. Integrating the technique with non-invasive pulse transit time (PTT) was most effective. The integrated technique specifically … the peripheral ABP waveforms in the training dataset. These techniques included the rudimentary mean ABP technique, the classic pulse pressure times …

  8. Efficacy of physical activity interventions in post-natal populations: systematic review, meta-analysis and content coding of behaviour change techniques.

    PubMed

    Gilinsky, Alyssa Sara; Dale, Hannah; Robinson, Clare; Hughes, Adrienne R; McInnes, Rhona; Lavallee, David

    2015-01-01

    This systematic review and meta-analysis reports the efficacy of post-natal physical activity change interventions with content coding of behaviour change techniques (BCTs). Electronic databases (MEDLINE, CINAHL and PsychINFO) were searched for interventions published from January 1980 to July 2013. Inclusion criteria were: (i) interventions including ≥1 BCT designed to change physical activity behaviour, (ii) studies reporting ≥1 physical activity outcome, (iii) interventions commencing later than four weeks after childbirth and (iv) studies including participants who had given birth within the last year. Controlled trials were included in the meta-analysis. Interventions were coded using the 40-item Coventry, Aberdeen & London - Refined (CALO-RE) taxonomy of BCTs and study quality assessment was conducted using Cochrane criteria. Twenty studies were included in the review (meta-analysis: n = 14). Seven were interventions conducted with healthy inactive post-natal women. Nine were post-natal weight management studies. Two studies included women with post-natal depression. Two studies focused on improving general well-being. Studies in healthy populations but not for weight management successfully changed physical activity. Interventions increased frequency but not volume of physical activity or walking behaviour. Efficacious interventions always included the BCTs 'goal setting (behaviour)' and 'prompt self-monitoring of behaviour'.

  9. Comparison of soft tissue balancing, femoral component rotation, and joint line change between the gap balancing and measured resection techniques in primary total knee arthroplasty: A meta-analysis.

    PubMed

    Moon, Young-Wan; Kim, Hyun-Jung; Ahn, Hyeong-Sik; Park, Chan-Deok; Lee, Dae-Hee

    2016-09-01

    This meta-analysis was designed to compare the accuracy of soft tissue balancing and femoral component rotation as well as change in joint line positions, between the measured resection and gap balancing techniques in primary total knee arthroplasty. Studies were included in the meta-analysis if they compared soft tissue balancing and/or radiologic outcomes in patients who underwent total knee arthroplasty with the gap balancing and measured resection techniques. Comparisons included differences in flexion/extension, medial/lateral flexion, and medial/lateral extension gaps (LEGs), femoral component rotation, and change in joint line positions. Finally, 8 studies identified via electronic (MEDLINE, EMBASE, and the Cochrane Library) and manual searches were included. All 8 studies showed a low risk of selection bias and provided detailed demographic data. There was some inherent heterogeneity due to uncontrolled bias, because all included studies were observational comparison studies. The pooled mean difference in gap differences between the gap balancing and measured resection techniques did not differ significantly (-0.09 mm, 95% confidence interval [CI]: -0.40 to +0.21 mm; P = 0.55), except that the medial/LEG difference was 0.58 mm greater for measured resection than gap balancing (95% CI: -1.01 to -0.15 mm; P = 0.008). Conversely, the pooled mean difference in femoral component external rotation (0.77°, 95% CI: 0.18° to 1.35°; P = 0.01) and joint line change (1.17 mm, 95% CI: 0.82 to 1.52 mm; P < 0.001) were significantly greater for the gap balancing than the measured resection technique. The gap balancing and measured resection techniques showed similar soft tissue balancing, except for medial/LEG difference. However, the femoral component was more externally rotated and the joint line was more elevated with gap balancing than measured resection. These differences were minimal (around 1 mm or 1°) and therefore may have little effect on the biomechanics of the knee joint. This suggests that the gap balancing and measured resection techniques are not mutually exclusive.

  10. Crash Certification by Analysis - Are We There Yet?

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Fasanella, Edwin L.; Lyle, Karen H.

    2006-01-01

    This paper addresses the issue of crash certification by analysis. This broad topic encompasses many ancillary issues including model validation procedures, uncertainty in test data and analysis models, probabilistic techniques for test-analysis correlation, verification of the mathematical formulation, and establishment of appropriate qualification requirements. This paper will focus on certification requirements for crashworthiness of military helicopters; capabilities of the current analysis codes used for crash modeling and simulation, including some examples of simulations from the literature to illustrate the current approach to model validation; and future directions needed to achieve "crash certification by analysis."

  11. The Expanding Role of the Atom in the Humanities

    ERIC Educational Resources Information Center

    Seaborg, Glenn T.

    1970-01-01

    The techniques of radioactive dating, thermoluminescence dating, cesium magnetometer detection, x-ray fluorescence analysis, and neutron radiography are briefly explained. Examples are given of the use of these techniques in determining the age and composition of paintings, ceramics, and archeological finds. Included is a history of Lawrence Radiation…

  12. 40 CFR 85.2120 - Maintenance and submittal of records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... testing program, including all production part sampling techniques used to verify compliance of the... subsequent analyses of that data; (7) A description of all the methodology, analysis, testing and/or sampling techniques used to ascertain the emission critical parameter specifications of the original equipment part...

  13. Elementary review of electron microprobe techniques and correction requirements

    NASA Technical Reports Server (NTRS)

    Hart, R. K.

    1968-01-01

    Report contains requirements for correction of instrumented data on the chemical composition of a specimen, obtained by electron microprobe analysis. A condensed review of electron microprobe techniques is presented, including background material for obtaining X ray intensity data corrections and absorption, atomic number, and fluorescence corrections.

  14. Conference Planning.

    ERIC Educational Resources Information Center

    Carter, Richard

    1982-01-01

    Presents an overview of the management planning technique known as Break Even Analysis and outlines its use as a tool in financial planning for organizations intending to conduct or sponsor a conference, seminar, or workshop. Three figures illustrating Break Even Analysis concepts and a Break Even Analysis worksheet are included. (JL)
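
    As a hedged illustration of the arithmetic behind Break Even Analysis for a conference budget, the short Python sketch below computes the attendance at which registration revenue covers costs; all figures are invented for the example and are not drawn from the cited article or its worksheet.

      def break_even_attendance(fixed_costs, fee_per_attendee, variable_cost_per_attendee):
          """Attendance at which registration revenue equals total cost."""
          contribution = fee_per_attendee - variable_cost_per_attendee   # margin per attendee
          if contribution <= 0:
              raise ValueError("Fee must exceed the per-attendee variable cost.")
          return fixed_costs / contribution

      # Hypothetical conference: $12,000 fixed costs (venue, speakers), $150 registration
      # fee, and $50 of variable cost per attendee (meals, materials).
      print(break_even_attendance(12000, 150, 50))   # 120.0 attendees needed to break even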

  15. Neutron scattering for the analysis of biological structures. Brookhaven symposia in biology. Number 27

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenborn, B P

    1976-01-01

    Sessions were included on neutron scattering and biological structure analysis, protein crystallography, neutron scattering from oriented systems, solution scattering, preparation of deuterated specimens, inelastic scattering, data analysis, experimental techniques, and instrumentation. Separate entries were made for the individual papers.

  16. Advanced techniques for determining long term compatibility of materials with propellants

    NASA Technical Reports Server (NTRS)

    Green, R. L.; Stebbins, J. P.; Smith, A. W.; Pullen, K. E.

    1973-01-01

    A method for the prediction of propellant-material compatibility for periods of time up to ten years is presented. Advanced sensitive measurement techniques used in the prediction method are described. These include: neutron activation analysis, radioactive tracer technique, and atomic absorption spectroscopy with a graphite tube furnace sampler. The results of laboratory tests performed to verify the prediction method are presented.

  17. Moving Target Techniques: Leveraging Uncertainty for CyberDefense

    DTIC Science & Technology

    2015-12-15

    cyberattacks is a continual struggle for system managers. Attackers often need only find one vulnerability (a flaw or bug that an attacker can exploit...additional parsing code itself could have security-relevant software bugs. Dynamic Network Techniques in the dynamic network domain change the...evaluation of MT techniques can benefit from a variety of evaluation approaches, including abstract analysis, modeling and simulation, test bed

  18. Application of Information-Theoretic Data Mining Techniques in a National Ambulatory Practice Outcomes Research Network

    PubMed Central

    Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin

    2005-01-01

    The Medical Quality Improvement Consortium data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis, an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those risk factors identified in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research, and RA as a technique for mining clinical data warehouses. PMID:16779156

  19. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 x 15 x 25 cm³ to 30 x 30 x 25 cm³ in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.

  20. Biostatistics Series Module 10: Brief Overview of Multivariate Methods.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2017-01-01

    Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, that make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count type of data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with wider availability, and increasing sophistication of statistical software and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
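
    As a minimal, hedged sketch of two of the techniques surveyed above, the following Python example fits a multiple linear regression (a dependence technique) and a principal component analysis (an interdependence technique) to synthetic data; it assumes NumPy and scikit-learn are available, and all data and variable names are invented.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 4))                  # four numerical predictors
      y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=200)

      # Dependence technique: predict one numerical outcome from several predictors.
      reg = LinearRegression().fit(X, y)
      print("Regression coefficients:", reg.coef_)

      # Interdependence technique: summarize the predictors with two components.
      pca = PCA(n_components=2).fit(X)
      print("Variance explained by two components:", pca.explained_variance_ratio_.sum())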

  1. Taming the Wild: A Unified Analysis of Hogwild!-Style Algorithms.

    PubMed

    De Sa, Christopher; Zhang, Ce; Olukotun, Kunle; Ré, Christopher

    2015-12-01

    Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems. Researchers and industry have developed several techniques to optimize SGD's runtime performance, including asynchronous execution and reduced precision. Our main result is a martingale-based analysis that enables us to capture the rich noise models that may arise from such techniques. Specifically, we use our new analysis in three ways: (1) we derive convergence rates for the convex case (Hogwild!) with relaxed assumptions on the sparsity of the problem; (2) we analyze asynchronous SGD algorithms for non-convex matrix problems including matrix completion; and (3) we design and analyze an asynchronous SGD algorithm, called Buckwild!, that uses lower-precision arithmetic. We show experimentally that our algorithms run efficiently for a variety of problems on modern hardware.
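
    To make the idea of Hogwild!-style lock-free updates concrete, here is a small, hedged Python sketch of asynchronous SGD for least squares in which several threads update a shared parameter vector without locking. It illustrates only the access pattern; it is not the authors' implementation, and CPython's global interpreter lock limits true parallelism.

      import threading
      import numpy as np

      rng = np.random.default_rng(0)
      n, d = 2000, 10
      X = rng.normal(size=(n, d))
      true_w = rng.normal(size=d)
      y = X @ true_w + rng.normal(scale=0.1, size=n)

      w = np.zeros(d)      # shared parameters, written by all threads without locks
      step = 0.01

      def worker(seed, num_updates):
          local_rng = np.random.default_rng(seed)      # per-thread sampling
          for _ in range(num_updates):
              i = local_rng.integers(n)                # pick one training example
              grad = (X[i] @ w - y[i]) * X[i]          # least-squares gradient
              w[:] = w - step * grad                   # lock-free, in-place update

      threads = [threading.Thread(target=worker, args=(k, 5000)) for k in range(4)]
      for t in threads:
          t.start()
      for t in threads:
          t.join()

      print("Parameter error after asynchronous updates:", np.linalg.norm(w - true_w))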

  2. Enrichment and separation techniques for large-scale proteomics analysis of the protein post-translational modifications.

    PubMed

    Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa

    2014-11-06

    Comprehensive analysis of the post-translational modifications (PTMs) on proteins at the proteome level is crucial to elucidate the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategies, the identification of protein PTMs has made tremendous progress. A huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduce the recent progress in PTM enrichment methods for the analysis of several major PTMs including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarize the challenges for PTM enrichment. Finally, we introduce the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Acoustic mode measurements in the inlet of a model turbofan using a continuously rotating rake: Data collection/analysis techniques

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Heidelberg, Laurence; Konno, Kevin

    1993-01-01

    The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design is included of the data analysis software and the performance of the rotating rake apparatus. The effect of experiment errors is also discussed.

  4. Acoustic mode measurements in the inlet of a model turbofan using a continuously rotating rake - Data collection/analysis techniques

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Heidelberg, Laurence; Konno, Kevin

    1993-01-01

    The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design is included of the data analysis software and the performance of the rotating rake apparatus. The effect of experiment errors is also discussed.

  5. A systematic comparison of the closed shoulder reduction techniques.

    PubMed

    Alkaduhimi, H; van der Linde, J A; Willigenburg, N W; van Deurzen, D F P; van den Bekerom, M P J

    2017-05-01

    To identify the optimal technique for closed reduction for shoulder instability, based on success rates, reduction time, complication risks, and pain level. A PubMed and EMBASE query was performed, screening all relevant literature of closed reduction techniques mentioning the success rate written in English, Dutch, German, and Arabic. Studies with a fracture dislocation or lacking information on success rates for closed reduction techniques were excluded. We used the modified Coleman Methodology Score (CMS) to assess the quality of included studies and excluded studies with a poor methodological quality (CMS < 50). Finally, a meta-analysis was performed on the data from all studies combined. 2099 studies were screened for their title and abstract, of which 217 studies were screened full-text and finally 13 studies were included. These studies included 9 randomized controlled trials, 2 retrospective comparative studies, and 2 prospective non-randomized comparative studies. A combined analysis revealed that the scapular manipulation is the most successful (97%), fastest (1.75 min), and least painful reduction technique (VAS 1.47); the "Fast, Reliable, and Safe" (FARES) method also scores high in terms of successful reduction (92%), reduction time (2.24 min), and intra-reduction pain (VAS 1.59); the traction-countertraction technique is highly successful (95%), but slower (6.05 min) and more painful (VAS 4.75). For closed reduction of anterior shoulder dislocations, the combined data from the selected studies indicate that scapular manipulation is the most successful and fastest technique, with the shortest mean hospital stay and least pain during reduction. The FARES method seems the best alternative.

  6. [The progress in speciation analysis of trace elements by atomic spectrometry].

    PubMed

    Wang, Zeng-Huan; Wang, Xu-Nuo; Ke, Chang-Liang; Lin, Qin

    2013-12-01

    The main purpose of the present work is to review the different non-chromatographic methods for the speciation analysis of trace elements in geological, environmental, biological and medical areas. In this paper, the sample processing methods in speciation analysis were summarized, and the main strategies for non-chromatographic technique were evaluated. The basic principles of the liquid extractions proposed in the published literatures recently and their advantages and disadvantages were discussed, such as conventional solvent extraction, cloud point extraction, single droplet microextraction, and dispersive liquid-liquid microextraction. Solid phase extraction, as a non-chromatographic technique for speciation analysis, can be used in batch or in flow detection, and especially suitable for the online connection to atomic spectrometric detector. The developments and applications of sorbent materials filled in the columns of solid phase extraction were reviewed. The sorbents include chelating resins, nanometer materials, molecular and ion imprinted materials, and bio-sorbents. Other techniques, e. g. hydride generation technique and coprecipitation, were also reviewed together with their main applications.

  7. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis only. In our study, we investigate the possibilities and specific techniques for allocation of erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of the soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for allocation of the erosion microrelief structures. Also, these methods need to be formalized for convenient use.

  8. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered a crucial component of strategic control over a broad variety of disciplines in business, science and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt and extend data mining techniques to analyze time series data. By using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g. irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.), but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique, which could be used in predicting time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to keep information about frequent trend patterns); and (c) use trend pattern vectors to predict future time series sequences.
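
    The following hedged Python sketch illustrates the first two phases in a simplified form: it converts a numeric series into a trend (symbol) sequence and counts frequent fixed-length trend patterns. It is a naive stand-in for the proposed algorithm; the symbols, thresholds, and data are illustrative assumptions.

      from collections import Counter

      def trend_sequence(values, flat_tol=0.0):
          """Map consecutive differences to 'U' (up), 'D' (down), or 'F' (flat)."""
          symbols = []
          for prev, curr in zip(values, values[1:]):
              diff = curr - prev
              symbols.append('U' if diff > flat_tol else 'D' if diff < -flat_tol else 'F')
          return ''.join(symbols)

      def frequent_patterns(trends, length, min_support):
          """Count fixed-length trend patterns and keep those meeting min_support."""
          counts = Counter(trends[i:i + length] for i in range(len(trends) - length + 1))
          return {p: c for p, c in counts.items() if c >= min_support}

      series = [10, 11, 13, 12, 12, 14, 15, 13, 13, 16, 17, 15]
      trends = trend_sequence(series)
      print(trends)                              # 'UUDFUUDFUUD' for this toy series
      print(frequent_patterns(trends, 3, 2))     # length-3 patterns seen at least twice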

  9. Flight testing techniques for the evaluation of light aircraft stability derivatives: A review and analysis

    NASA Technical Reports Server (NTRS)

    Smetana, F. O.; Summery, D. C.; Johnson, W. D.

    1972-01-01

    Techniques quoted in the literature for the extraction of stability derivative information from flight test records are reviewed. A recent technique developed at NASA's Langley Research Center was regarded as the most productive yet developed. Results of tests of the sensitivity of this procedure to various types of data noise and to the accuracy of the estimated values of the derivatives are reported. Computer programs for providing these initial estimates are given. The literature review also includes a discussion of flight test measuring techniques, instrumentation, and piloting techniques.

  10. RAT SPERM MOTILITY ANALYSIS: METHODOLOGICAL CONSIDERATIONS

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer assisted sperm analysis (CASA) of rat epididymal spermatozoa. ethodological issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample ...

  11. Rat sperm motility analysis: methodologic considerations

    EPA Science Inventory

    The objective of these studies was to optimize conditions for computer-assisted sperm analysis (CASA) of rat epididymal spermatozoa. Methodologic issues addressed include sample collection technique, sampling region within the epididymis, type of diluent medium used, and sample c...

  12. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
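
    For readers unfamiliar with these products, the hedged NumPy sketch below illustrates two of the named techniques, univariate image differencing and change-vector analysis, on a tiny synthetic two-band scene; the arrays and thresholds are invented and do not reflect the Landsat TM processing described above.

      import numpy as np

      rng = np.random.default_rng(1)
      # Two dates, two spectral bands each, on a tiny 100 x 100 synthetic scene.
      date1 = rng.normal(loc=0.3, scale=0.05, size=(2, 100, 100))
      date2 = date1.copy()
      date2[:, 40:60, 40:60] += 0.2          # simulate a patch of land-cover change

      # Univariate image differencing on one band: large absolute differences flag change.
      diff = date2[0] - date1[0]
      changed_simple = np.abs(diff) > 0.1

      # Change-vector analysis: magnitude of the spectral change vector across bands.
      magnitude = np.sqrt(((date2 - date1) ** 2).sum(axis=0))
      changed_cva = magnitude > 0.15          # illustrative threshold

      print("Pixels flagged by differencing:", changed_simple.sum())
      print("Pixels flagged by change-vector analysis:", changed_cva.sum())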

  13. Bladder radiotherapy treatment: A retrospective comparison of 3-dimensional conformal radiotherapy, intensity-modulated radiation therapy, and volumetric-modulated arc therapy plans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pasciuti, Katia, E-mail: k.pasciuti@virgilio.it; Kuthpady, Shrinivas; Anderson, Anne

    To examine tumor and organ response when different radiotherapy planning techniques are used, ten patients with confirmed bladder tumors were first treated using 3-dimensional conformal radiotherapy (3DCRT), and the original plans were subsequently re-optimized using intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT) techniques. Target coverage in terms of conformity and homogeneity indices, tumor control probability (TCP), and organ dose limits, including an integral dose analysis, were evaluated. In addition, monitor units (MUs) and treatment delivery times were compared. Better minimum target coverage (1.3%) was observed in VMAT plans compared with 3DCRT and IMRT plans, confirmed by statistically significant conformity index (CI) results. Large differences were observed among techniques in the integral dose to the femoral heads. Although no statistically significant differences were reported for the rectum and normal tissue, a larger energy deposition was observed in 3DCRT plans. In any case, VMAT plans provided better organ and tissue sparing, confirmed by the normal tissue complication probability (NTCP) analysis as well as a better TCP result. Our analysis showed better overall planning results with VMAT techniques. Furthermore, the reduction in total treatment time observed among techniques, including gantry and collimator rotation, could encourage use of the more recent technique, reducing target movement and patient discomfort.

  14. Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.

    PubMed

    Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J

    2016-05-01

    There are a vast number of smartphone applications (apps) aimed at promoting medication adherence on the market; however, the theory and evidence base in terms of applying established health behavior change techniques underpinning these apps remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  15. Mountain Plains Learning Experience Guide: Marketing. Course: Advanced Salesmanship.

    ERIC Educational Resources Information Center

    Preston, T.; Egan, B.

    One of thirteen individualized courses included in a marketing curriculum, this course covers wholesale and retail selling techniques, sales performance analysis, and intensive sales presentation practice. The course is comprised of four units: (1) Sales Preparation, (2) The Selling Process, (3) Special Selling Techniques, and (4) Sales…

  16. Asking the Right Questions: Techniques for Collaboration and School Change. 2nd Edition.

    ERIC Educational Resources Information Center

    Holcomb, Edie L.

    This work provides school change leaders with tools, techniques, tips, examples, illustrations, and stories about promoting school change. Tools provided include histograms, surveys, run charts, weighted voting, force-field analysis, decision matrices, and many others. Chapter 1, "Introduction," applies a matrix for asking questions…

  17. Collection Evaluation Techniques: A Short, Selective, Practical, Current, Annotated Bibliography, 1990-1998. RUSA Occasional Papers Number 24.

    ERIC Educational Resources Information Center

    Strohl, Bonnie, Comp.

    This bibliography contains annotations of 110 journal articles on topics related to library collection evaluation techniques, including academic library collections, access-vs-ownership, "Books for College Libraries," business collections, the OCLC/AMIGOS Collection Analysis CD, circulation data, citation-checking, collection bias,…

  18. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  19. Practical techniques for enhancing the high-frequency MASW method

    USDA-ARS?s Scientific Manuscript database

    For soil exploration in the vadose zone, a high-frequency multi-channel analysis of surface waves (HF-MASW) method has been developed. In the study, several practical techniques were applied to enhance the overtone image of the HF-MASW method. They included (1) the self-adaptive MASW method using a ...

  20. Paternity testing.

    PubMed

    Onoja, A M

    2011-01-01

    Molecular diagnostic techniques have found application in virtually all areas of medicine, including criminal investigations and forensic analysis. The techniques have become so precise that it is now possible to conclusively determine paternity using DNA from grandparents, cousins, or even saliva left on a discarded cigarette butt. This is a broad overview of paternity testing.

  1. A Streamlined Molecular Biology Module for Undergraduate Biochemistry Labs

    ERIC Educational Resources Information Center

    Muth, Gregory W.; Chihade, Joseph W.

    2008-01-01

    Site-directed mutagenesis and other molecular biology techniques, including plasmid manipulation and restriction analysis, are commonly used tools in the biochemistry research laboratory. In redesigning our biochemistry lab curricula, we sought to integrate these techniques into a term-long, project-based course. In the module presented here,…

  2. Introduction to Psychology and Leadership. Typological Analysis of Student Characteristics: Preliminary Report.

    ERIC Educational Resources Information Center

    Bessemer, David W.; Shrage, Jules H.

    Recommendations for an alternative plan, based on typological analysis techniques, for the evaluation of student characteristics related to media, presentation design, and academic performance are presented. Difficulties with present evaluation plans are discussed, and different methods of typological analysis are described. Included are…

  3. The Use of Neutron Analysis Techniques for Detecting The Concentration And Distribution of Chloride Ions in Archaeological Iron

    PubMed Central

    Watkinson, D; Rimmer, M; Kasztovszky, Z; Kis, Z; Maróti, B; Szentmiklósi, L

    2014-01-01

    Chloride (Cl) ions diffuse into iron objects during burial and drive corrosion after excavation. Located under corrosion layers, Cl is inaccessible to many analytical techniques. Neutron analysis offers non-destructive avenues for determining Cl content and distribution in objects. A pilot study used prompt gamma activation analysis (PGAA) and prompt gamma activation imaging (PGAI) to analyse the bulk concentration and longitudinal distribution of Cl in archaeological iron objects. This correlated with the object corrosion rate measured by oxygen consumption, and compared well with Cl measurement using a specific ion meter. High-Cl areas were linked with visible damage to the corrosion layers and attack of the iron core. Neutron techniques have significant advantages in the analysis of archaeological metals, including penetration depth and low detection limits. PMID:26028670

  4. Design techniques for low-voltage analog integrated circuits

    NASA Astrophysics Data System (ADS)

    Rakús, Matej; Stopjaková, Viera; Arbet, Daniel

    2017-08-01

    In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (IC) are performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic threshold MOS transistors and MOS transistors operating in weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with the power supply voltage of 600 mV (or even lower), which is the key feature towards integrated systems for modern portable applications.

  5. Further Developments of the Fringe-Imaging Skin Friction Technique

    NASA Technical Reports Server (NTRS)

    Zilliac, Gregory C.

    1996-01-01

    Various aspects and extensions of the Fringe-Imaging Skin Friction technique (FISF) have been explored through the use of several benchtop experiments and modeling. The technique has been extended to handle three-dimensional flow fields with mild shear gradients. The optical and imaging system has been refined and a PC-based application has been written that has made it possible to obtain high resolution skin friction field measurements in a reasonable period of time. The improved method was tested on a wingtip and compared with Navier-Stokes computations. Additionally, a general approach to interferogram-fringe spacing analysis has been developed that should have applications in other areas of interferometry. A detailed error analysis of the FISF technique is also included.

  6. Fourier transform infrared microspectroscopy for the analysis of the biochemical composition of C. elegans worms.

    PubMed

    Sheng, Ming; Gorzsás, András; Tuck, Simon

    2016-01-01

    Changes in intermediary metabolism have profound effects on many aspects of C. elegans biology including growth, development and behavior. However, many traditional biochemical techniques for analyzing chemical composition require relatively large amounts of starting material precluding the analysis of mutants that cannot be grown in large amounts as homozygotes. Here we describe a technique for detecting changes in the chemical compositions of C. elegans worms by Fourier transform infrared microspectroscopy. We demonstrate that the technique can be used to detect changes in the relative levels of carbohydrates, proteins and lipids in one and the same worm. We suggest that Fourier transform infrared microspectroscopy represents a useful addition to the arsenal of techniques for metabolic studies of C. elegans worms.

  7. Multidimensional chromatographic techniques for hydrophilic copolymers II. Analysis of poly(ethylene glycol)-poly(vinyl acetate) graft copolymers.

    PubMed

    Knecht, Daniela; Rittig, Frank; Lange, Ronald F M; Pasch, Harald

    2006-10-13

    A large variety of hydrophilic copolymers is applied in different fields of chemical industry including bio, pharma and pharmaceutical applications. For example, poly(ethylene glycol)-poly(vinyl alcohol) graft copolymers that are used as tablet coatings are responsible for the controlled release of the active compounds. These copolymers are produced by grafting of vinyl acetate onto polyethylene glycol (PEG) and subsequent hydrolysis of the poly(ethylene glycol)-poly(vinyl acetate) graft copolymers. The poly(ethylene glycol)-poly(vinyl acetate) copolymers are distributed with regard to molar mass and chemical composition. In addition, they frequently contain the homopolymers polyethylene glycol and polyvinyl acetate. The comprehensive analysis of such complex systems requires hyphenated analytical techniques, including two-dimensional liquid chromatography and combined LC and nuclear magnetic resonance spectroscopy. The development and application of these techniques are discussed in the present paper.

  8. The dynamics and control of large flexible space structures, 6

    NASA Technical Reports Server (NTRS)

    Bainum, P. M.

    1983-01-01

    The controls analysis based on a truncated finite element model of the 122m. Hoop/Column Antenna System focuses on an analysis of the controllability as well as the synthesis of control laws. Graph theoretic techniques are employed to consider controllability for different combinations of number and locations of actuators. Control law synthesis is based on an application of the linear regulator theory as well as pole placement techniques. Placement of an actuator on the hoop can result in a noticeable improvement in the transient characteristics. The problem of orientation and shape control of an orbiting flexible beam, previously examined, is now extended to include the influence of solar radiation environmental forces. For extremely flexible thin structures modification of control laws may be required and techniques for accomplishing this are explained. Effects of environmental torques are also included in previously developed models of orbiting flexible thin platforms.

  9. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
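
    As a generic, hedged illustration of how a probabilistic analysis produces power-capability results that reflect input uncertainties, the Python sketch below propagates invented uncertainty distributions through a toy solar-array power model using Monte Carlo sampling; it is not the SPACE model or the fast probabilistic methods cited above.

      import numpy as np

      rng = np.random.default_rng(4)
      n_samples = 100_000

      # Toy model: delivered power = array area * solar flux * cell efficiency
      # * degradation factor * distribution efficiency, with invented uncertainties.
      area_m2 = 300.0
      flux_w_m2 = 1360.0
      cell_eff = rng.normal(loc=0.14, scale=0.005, size=n_samples)
      degradation = rng.uniform(0.90, 0.98, size=n_samples)   # end-of-life factor
      dist_eff = rng.normal(loc=0.88, scale=0.01, size=n_samples)

      power_kw = area_m2 * flux_w_m2 * cell_eff * degradation * dist_eff / 1000.0

      p5, p50, p95 = np.percentile(power_kw, [5, 50, 95])
      print(f"Power capability: median {p50:.1f} kW, 90% interval [{p5:.1f}, {p95:.1f}] kW")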

  10. Selected environmental plutonium research reports of the NAEG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, M.G.; Dunaway, P.B.

    Twenty-one papers were presented on various aspects of plutonium and radioisotope ecology at the Nevada Test Site. These include studies of wildlife, microorganisms, and the plant-soil system. Analysis and sampling techniques are also included.

  11. Minimally Invasive Repair of Pectus Excavatum Without Bar Stabilizers Using Endo Close.

    PubMed

    Pio, Luca; Carlucci, Marcello; Leonelli, Lorenzo; Erminio, Giovanni; Mattioli, Girolamo; Torre, Michele

    2016-02-01

    Since the introduction of the Nuss technique for pectus excavatum (PE) repair, stabilization of the bar has been a matter of debate and a crucial point for the outcome, as bar dislocation remains one of the most frequent complications. Several techniques have been described, most of them including the use of a metal stabilizer, which, however, can increase morbidity and be difficult to remove. Our study compares bar stabilization techniques in two groups of patients, respectively, with and without the metal stabilizer. A retrospective study on patients affected by PE and treated by the Nuss technique from January 2012 to June 2013 at our institution was performed in order to evaluate the efficacy of metal stabilizers. Group 1 included patients who did not have the metal stabilizer inserted; stabilization was achieved with multiple (at least four) bilateral pericostal Endo Close™ (Auto Suture, US Surgical; Tyco Healthcare Group, Norwalk, CT) sutures. Group 2 included patients who had a metal stabilizer placed because pericostal sutures could not be used bilaterally. We compared the two groups in terms of bar dislocation rate, surgical operative time, and other complications. Statistical analysis was performed with the Mann-Whitney U test and Fisher's exact test. Fifty-seven patients were included in the study: 37 in Group 1 and 20 in Group 2. Two patients from Group 2 had a bar dislocation. Statistical analysis showed no difference between the two groups in dislocation rate or other complications. In our experience, the placement of a metal stabilizer did not reduce the rate of bar dislocation. Bar stabilization by the pericostal Endo Close suture technique appears to have no increase in morbidity or migration compared with the metal lateral stabilizer technique.

  12. Integrated Application of Active Controls (IAAC) technology to an advanced subsonic transport project: Current and advanced act control system definition study. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Hanks, G. W.; Shomber, H. A.; Dethman, H. A.; Gratzer, L. B.; Maeshiro, A.; Gangsaas, D.; Blight, J. D.; Buchan, S. M.; Crumb, C. B.; Dorwart, R. J.

    1981-01-01

    The current status of the Active Controls Technology (ACT) for the advanced subsonic transport project is investigated through analysis of the systems technical data. Control systems technologies under examination include computerized reliability analysis, pitch axis fly by wire actuator, flaperon actuation system design trade study, control law synthesis and analysis, flutter mode control and gust load alleviation analysis, and implementation of alternative ACT systems. Extensive analysis of the computer techniques involved in each system is included.

  13. Laboratory and quality assurance protocols for the analysis of herbicides in ground water from the Management Systems Evaluation Area, Princeton, Minnesota

    USGS Publications Warehouse

    Larson, S.J.; Capel, P.D.; VanderLoop, A.G.

    1996-01-01

    Laboratory and quality assurance procedures for the analysis of ground-water samples for herbicides at the Management Systems Evaluation Area near Princeton, Minnesota are described. The target herbicides include atrazine, de-ethylatrazine, de-isopropylatrazine, metribuzin, alachlor, 2,6-diethylaniline, and metolachlor. The analytical techniques used are solid-phase extraction, and analysis by gas chromatography with mass-selective detection. Descriptions of cleaning procedures, preparation of standard solutions, isolation of analytes from water, sample transfer methods, instrumental analysis, and data analysis are included.

  14. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and these depend heavily on the comprehensive analysis of the chemical components in the plants. With advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications for the comprehensive analysis of medicinal plants. The application of various MS and hyphenated MS techniques to the analysis of medicinal plants, including but not limited to one-dimensional chromatography, multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, has been reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a major role as the platform for further research aimed at understanding both their empirical therapeutic efficacy and their quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  15. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions-variance homogeneity and normality-that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
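
    The following hedged Python sketch illustrates one of the two remedies discussed, a Box-Cox transformation of the response prior to nonlinear regression, using SciPy on an invented concentration-response data set; the model, data, and settings are illustrative assumptions rather than the authors' protocol.

      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(2)
      conc = np.repeat([0.1, 0.3, 1.0, 3.0, 10.0, 30.0], 5)   # test concentrations

      def log_logistic(c, top, ec50, slope):
          return top / (1.0 + (c / ec50) ** slope)

      # Simulated responses whose spread changes with the mean (variance heterogeneity);
      # clipped to stay positive, which the Box-Cox transform requires.
      mean = log_logistic(conc, top=100.0, ec50=2.0, slope=1.5)
      y = np.clip(mean + rng.normal(scale=0.08 * mean + 0.5), 0.05, None)

      # Box-Cox transform of the response; lambda is chosen by maximum likelihood.
      y_bc, lam = stats.boxcox(y)

      def model_bc(c, top, ec50, slope):
          # Fit on the same transformed scale as the transformed data.
          return stats.boxcox(log_logistic(c, top, ec50, slope), lmbda=lam)

      params, _ = optimize.curve_fit(model_bc, conc, y_bc,
                                     p0=[100.0, 2.0, 1.5], bounds=(1e-6, np.inf))
      print("Estimated top, EC50, slope:", params)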

  16. Proceedings of the First National Workshop on the Global Weather Experiment: Current Achievements and Future Directions, volume 2, part 1

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Topics covered include: data systems and quality; analysis and assimilation techniques; impacts on forecasts; tropical forecasts; analysis intercomparisons; improvements in predictability; and heat sources and sinks.

  17. An Analysis of Nondestructive Evaluation Techniques for Polymer Matrix Composite Sandwich Materials

    NASA Technical Reports Server (NTRS)

    Cosgriff, Laura M.; Roberts, Gary D.; Binienda, Wieslaw K.; Zheng, Diahua; Averbeck, Timothy; Roth, Donald J.; Jeanneau, Philippe

    2006-01-01

    Structural sandwich materials composed of triaxially braided polymer matrix composite material face sheets sandwiching a foam core are being utilized for applications including aerospace components and recreational equipment. Since full scale components are being made from these sandwich materials, it is necessary to develop proper inspection practices for their manufacture and in-field use. Specifically, nondestructive evaluation (NDE) techniques need to be investigated for analysis of components made from these materials. Hockey blades made from sandwich materials and a flat sandwich sample were examined with multiple NDE techniques including thermographic, radiographic, and shearographic methods to investigate damage induced in the blades and flat panel components. Hockey blades used during actual play and a flat polymer matrix composite sandwich sample with damage inserted into the foam core were investigated with each technique. NDE images from the samples were presented and discussed. Structural elements within each blade were observed with radiographic imaging. Damaged regions and some structural elements of the hockey blades were identified with thermographic imaging. Structural elements, damaged regions, and other material variations were detected in the hockey blades with shearography. Each technique's advantages and disadvantages were considered in making recommendations for inspection of components made from these types of materials.

  18. Accelerated Bayesian model-selection and parameter-estimation in continuous gravitational-wave searches with pulsar-timing arrays

    NASA Astrophysics Data System (ADS)

    Taylor, Stephen; Ellis, Justin; Gair, Jonathan

    2014-11-01

    We describe several new techniques which accelerate Bayesian searches for continuous gravitational-wave emission from supermassive black-hole binaries using pulsar-timing arrays. These techniques mitigate the problematic increase of search dimensionality with the size of the pulsar array which arises from having to include an extra parameter per pulsar as the array is expanded. This extra parameter corresponds to searching over the phase of the gravitational wave as it propagates past each pulsar so that we can coherently include the pulsar term in our search strategies. Our techniques make the analysis tractable with powerful evidence-evaluation packages like MultiNest. We find good agreement of our techniques with the parameter-estimation and Bayes factor evaluation performed with full signal templates and conclude that these techniques make excellent first-cut tools for detection and characterization of continuous gravitational-wave signals with pulsar-timing arrays. Crucially, at low to moderate signal-to-noise ratios the factor by which the analysis is sped up can be ≳100 , permitting rigorous programs of systematic injection and recovery of signals to establish robust detection criteria within a Bayesian formalism.

  19. Comparison of femur tunnel aperture location in patients undergoing transtibial and anatomical single-bundle anterior cruciate ligament reconstruction.

    PubMed

    Lee, Dae-Hee; Kim, Hyun-Jung; Ahn, Hyeong-Sik; Bin, Seong-Il

    2016-12-01

    Although three-dimensional computed tomography (3D-CT) has been used to compare femoral tunnel position following transtibial and anatomical anterior cruciate ligament (ACL) reconstruction, no consensus has been reached on which technique results in a more anatomical position because methods of quantifying femoral tunnel position on 3D-CT have not been consistent. This meta-analysis was therefore performed to compare femoral tunnel location following transtibial and anatomical ACL reconstruction, in both the low-to-high and deep-to-shallow directions. This meta-analysis included all studies that used 3D-CT to compare femoral tunnel location, using quadrant or anatomical coordinate axis methods, following transtibial and anatomical (AM portal or OI) single-bundle ACL reconstruction. Six studies were included in the meta-analysis. When measured using the anatomical coordinate axis method, femoral tunnel location was 18% higher in the low-to-high direction with the transtibial technique than with the anatomical methods, but the difference in the deep-to-shallow direction was not significant. When measured using the quadrant method, however, femoral tunnel positions were significantly higher (21%) and shallower (6%) with transtibial than anatomical methods of ACL reconstruction. The anatomical ACL reconstruction techniques led to a lower femoral tunnel aperture location than the transtibial technique, suggesting the superiority of anatomical techniques, in terms of femoral tunnel aperture location in the low-to-high direction, for creating new femoral tunnels during revision ACL reconstruction. However, the mean difference in the deep-to-shallow direction differed by method of measurement. Meta-analysis, Level II.

  20. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
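
    As a hedged illustration of the simplest of these approaches, the Python sketch below estimates a mediation indirect effect (the product of the X-to-M and M-to-Y paths) after listwise deletion on toy data with missing mediator values; the data are invented and the other approaches (pairwise deletion, MI, two-stage maximum likelihood) are not implemented here.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 500
      x = rng.normal(size=n)
      m = 0.5 * x + rng.normal(scale=1.0, size=n)      # mediator
      y = 0.4 * m + 0.2 * x + rng.normal(scale=1.0, size=n)

      # Introduce missingness in the mediator, then drop incomplete cases (listwise deletion).
      m_obs = m.copy()
      m_obs[rng.random(n) < 0.2] = np.nan
      keep = ~np.isnan(m_obs)
      xk, mk, yk = x[keep], m_obs[keep], y[keep]

      def ols_slopes(design, response):
          """Least-squares coefficients for a design matrix with an intercept column."""
          coef, *_ = np.linalg.lstsq(design, response, rcond=None)
          return coef

      a = ols_slopes(np.column_stack([np.ones(keep.sum()), xk]), mk)[1]        # X -> M path
      b = ols_slopes(np.column_stack([np.ones(keep.sum()), mk, xk]), yk)[1]    # M -> Y path given X
      print("Indirect effect (a*b) under listwise deletion:", a * b)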

  1. 10 CFR 503.34 - Inability to comply with applicable environmental requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...

  2. 10 CFR 503.34 - Inability to comply with applicable environmental requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... environmental compliance of the facility, including an analysis of its ability to meet applicable standards and... will be based solely on an analysis of the petitioner's capacity to physically achieve applicable... exemption. All such analysis must be based on accepted analytical techniques, such as air quality modeling...

  3. Advanced Navigation Strategies For Asteroid Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.

    2010-01-01

    Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.

  4. [Progress in industrial bioprocess engineering in China].

    PubMed

    Zhuang, Yingping; Chen, Hongzhang; Xia, Jianye; Tang, Wenjun; Zhao, Zhimin

    2015-06-01

    Advances in industrial biotechnology depend strongly on the development of industrial bioprocess research. In China, we are facing several challenges because of the country's huge industrial fermentation capacity. Industrial bioprocess development has passed through several main stages. This work reviews the development of industrial bioprocess engineering in China over the past 30 to 40 years, including early kinetic model studies derived from classical chemical engineering, research methods based on control theory, multi-parameter analysis supported by on-line measuring instruments and techniques, multi-scale analysis theory, and solid-state fermentation techniques and fermenters. In addition, the cutting edge of bioprocess engineering is also addressed.

  5. Air pollution source identification

    NASA Technical Reports Server (NTRS)

    Fordyce, J. S.

    1975-01-01

    Techniques for air pollution source identification are reviewed, and some results obtained with them are evaluated. Described techniques include remote sensing from satellites and aircraft, on-site monitoring, and the use of injected tracers and pollutants themselves as tracers. The use of a large number of trace elements in ambient airborne particulate matter as a practical means of identifying sources is discussed in detail. Sampling and analysis techniques are described, and it is shown that elemental constituents can be related to specific source types such as those found in the earth's crust and those associated with specific industries. Source identification systems are noted that utilize charged-particle X-ray fluorescence analysis of original field data.

  6. A manual for microcomputer image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rich, P.M.; Ranken, D.M.; George, J.S.

    1989-12-01

    This manual is intended to serve three basic purposes: as a primer in microcomputer image analysis theory and techniques, as a guide to the use of IMAGE©, a public domain microcomputer program for image analysis, and as a stimulus to encourage programmers to develop microcomputer software suited for scientific use. Topics discussed include the principles of image processing and analysis, use of standard video for input and display, spatial measurement techniques, and the future of microcomputer image analysis. A complete reference guide that lists the commands for IMAGE is provided. IMAGE includes capabilities for digitization, input and output of images, hardware display lookup table control, editing, edge detection, histogram calculation, measurement along lines and curves, measurement of areas, examination of intensity values, output of analytical results, conversion between raster and vector formats, and region movement and rescaling. The control structure of IMAGE emphasizes efficiency, precision of measurement, and scientific utility. 18 refs., 18 figs., 2 tabs.

  7. Eversion Technique to Prevent Biliary Stricture After Living Donor Liver Transplantation in the Universal Minimal Hilar Dissection Era.

    PubMed

    Ikegami, Toru; Shimagaki, Tomonari; Kawasaki, Junji; Yoshizumi, Tomoharu; Uchiyama, Hideaki; Harada, Noboru; Harimoto, Norifumi; Itoh, Shinji; Soejima, Yuji; Maehara, Yoshihiko

    2017-01-01

    Biliary anastomosis stricture (BAS) is still among the major concerns after living donor liver transplantation (LDLT), even after technical refinements including the universal use of the blood flow-preserving hilar dissection technique. The aim of this study was to investigate which factors are still associated with BAS after LDLT. An analysis was performed of 279 adult-to-adult LDLT grafts (left lobe, n = 161; right lobe, n = 118) with duct-to-duct biliary reconstruction carried out since the universal application of the minimal hilar dissection technique and the gradual introduction of the eversion technique. There were 39 patients with BAS. Univariate analysis showed that a right lobe graft (P = 0.008), multiple bile ducts (P < 0.001), ductoplasty (P < 0.001), not using the eversion technique (P = 0.004) and fewer biliary stents than bile duct orifices (P = 0.002) were among the factors associated with BAS. The 1-year and 5-year BAS rates were 17.7% and 21.2% in the noneversion group (n = 134), and 6.2% and 7.9% in the eversion group (n = 145), respectively (P = 0.002). The perioperative factors including graft biliary anatomy were not different between everted (n = 134) and noneverted (n = 145) patients. The application of the eversion technique together with the minimal hilar dissection technique could be key to preventing BAS in duct-to-duct biliary reconstruction in LDLT.

  8. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis.

    PubMed

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-03-09

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively.

  9. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis

    PubMed Central

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-01-01

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg−1 and 8 μg kg−1, respectively. PMID:28276454

  10. Fabrication of a Dipole-assisted Solid Phase Extraction Microchip for Trace Metal Analysis in Water Samples

    PubMed Central

    Chen, Ping-Hung; Chen, Shun-Niang; Tseng, Sheng-Hao; Deng, Ming-Jay; Lin, Yang-Wei; Sun, Yuh-Chang

    2016-01-01

    This paper describes a fabrication protocol for a dipole-assisted solid phase extraction (SPE) microchip available for trace metal analysis in water samples. A brief overview of the evolution of chip-based SPE techniques is provided. This is followed by an introduction to specific polymeric materials and their role in SPE. To develop an innovative dipole-assisted SPE technique, a chlorine (Cl)-containing SPE functionality was implanted into a poly(methyl methacrylate) (PMMA) microchip. Herein, diverse analytical techniques including contact angle analysis, Raman spectroscopic analysis, and laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) analysis were employed to validate the utility of the implantation protocol of the C-Cl moieties on the PMMA. The analytical results of the X-ray absorption near-edge structure (XANES) analysis also demonstrated the feasibility of the Cl-containing PMMA used as an extraction medium by virtue of the dipole-ion interactions between the highly electronegative C-Cl moieties and the positively charged metal ions. PMID:27584954

  11. Portable Infrared Laser Spectroscopy for On-site Mycotoxin Analysis

    NASA Astrophysics Data System (ADS)

    Sieger, Markus; Kos, Gregor; Sulyok, Michael; Godejohann, Matthias; Krska, Rudolf; Mizaikoff, Boris

    2017-03-01

    Mycotoxins are toxic secondary metabolites of fungi that spoil food, and severely impact human health (e.g., causing cancer). Therefore, the rapid determination of mycotoxin contamination including deoxynivalenol and aflatoxin B1 in food and feed samples is of prime interest for commodity importers and processors. While chromatography-based techniques are well established in laboratory environments, only very few (i.e., mostly immunochemical) techniques exist enabling direct on-site analysis for traders and manufacturers. In this study, we present MYCOSPEC - an innovative approach for spectroscopic mycotoxin contamination analysis at EU regulatory limits for the first time utilizing mid-infrared tunable quantum cascade laser (QCL) spectroscopy. This analysis technique facilitates on-site mycotoxin analysis by combining QCL technology with GaAs/AlGaAs thin-film waveguides. Multivariate data mining strategies (i.e., principal component analysis) enabled the classification of deoxynivalenol-contaminated maize and wheat samples, and of aflatoxin B1 affected peanuts at EU regulatory limits of 1250 μg kg-1 and 8 μg kg-1, respectively.

  12. A Categorization of Dynamic Analyzers

    NASA Technical Reports Server (NTRS)

    Lujan, Michelle R.

    1997-01-01

    Program analysis techniques and tools are essential to the development process because of the support they provide in detecting errors and deficiencies at different phases of development. The types of information rendered through analysis include the following: statistical measurements of code, type checks, dataflow analysis, consistency checks, test data, verification of code, and debugging information. Analyzers can be broken into two major categories: dynamic and static. Static analyzers examine programs with respect to syntax errors and structural properties. This includes gathering statistical information on program content, such as the number of lines of executable code, source lines, and cyclomatic complexity. In addition, static analyzers provide the ability to check for the consistency of programs with respect to variables. Dynamic analyzers in contrast are dependent on input and the execution of a program, providing the ability to find errors that cannot be detected through the use of static analysis alone. Dynamic analysis provides information on the behavior of a program rather than on the syntax. Both types of analysis detect errors in a program, but dynamic analyzers accomplish this through run-time behavior. This paper focuses on the following broad classification of dynamic analyzers: 1) Metrics; 2) Models; and 3) Monitors. Metrics are those analyzers that provide measurement. The next category, models, captures those analyzers that present the state of the program to the user at specified points in time. The last category, monitors, checks specified code based on some criteria. The paper discusses each classification and the techniques that are included under them. In addition, the role of each technique in the software life cycle is discussed. Familiarization with the tools that measure, model and monitor programs provides a framework for understanding the program's dynamic behavior from different perspectives through analysis of the input/output data.

  13. The dream of a one-stop-shop: Meta-analysis on myocardial perfusion CT.

    PubMed

    Pelgrim, Gert Jan; Dorrius, Monique; Xie, Xueqian; den Dekker, Martijn A M; Schoepf, U Joseph; Henzler, Thomas; Oudkerk, Matthijs; Vliegenthart, Rozemarijn

    2015-12-01

    To determine the diagnostic performance of computed tomography (CT) perfusion techniques for the detection of functionally relevant coronary artery disease (CAD) in comparison to reference standards, including invasive coronary angiography (ICA), single photon emission computed tomography (SPECT), and magnetic resonance imaging (MRI). PubMed, Web of Knowledge and Embase were searched from January 1, 1998 until July 1, 2014. The search yielded 9475 articles. After duplicate removal, 6041 were screened on title and abstract. The resulting 276 articles were independently analyzed in full-text by two reviewers, and included if the inclusion criteria were met. The articles reporting diagnostic parameters including true positive, true negative, false positive and false negative were subsequently evaluated for the meta-analysis. Results were pooled according to CT perfusion technique, namely snapshot techniques: single-phase rest, single-phase stress, single-phase dual-energy stress and combined coronary CT angiography [rest] and single-phase stress, as well as the dynamic technique: dynamic stress CT perfusion. Twenty-two articles were included in the meta-analysis (1507 subjects). Pooled per-patient sensitivity and specificity of single-phase rest CT compared to rest SPECT were 89% (95% confidence interval [CI], 82-94%) and 88% (95% CI, 78-94%), respectively. Vessel-based sensitivity and specificity of single-phase stress CT compared to ICA-based >70% stenosis were 82% (95% CI, 64-92%) and 78% (95% CI, 61-89%). Segment-based sensitivity and specificity of single-phase dual-energy stress CT in comparison to stress MRI were 75% (95% CI, 60-85%) and 95% (95% CI, 80-99%). Segment-based sensitivity and specificity of dynamic stress CT perfusion compared to stress SPECT were 77% (95% CI, 67-85%) and 89% (95% CI, 78-95%). For combined coronary CT angiography and single-phase stress CT, vessel-based sensitivity and specificity in comparison to ICA-based >50% stenosis were 84% (95% CI, 67-93%) and 93% (95% CI, 89-96%). This meta-analysis shows considerable variation in techniques and reference standards for CT of myocardial blood supply. While CT seems sensitive and specific for evaluation of hemodynamically relevant CAD, studies so far are limited in size. Standardization of myocardial perfusion CT technique is essential. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
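
    The pooling step described above can be illustrated with a minimal sketch: given per-study 2x2 counts (true/false positives and negatives), sensitivity and specificity are combined by an inverse-variance average on the logit scale. The counts below are invented placeholders rather than data from this meta-analysis, and published diagnostic meta-analyses typically use more elaborate bivariate random-effects models.

        # Minimal sketch of pooling diagnostic accuracy across studies; the counts
        # are invented placeholders, not data from the cited meta-analysis.
        import numpy as np

        # Per-study counts: (true positives, false negatives, false positives, true negatives)
        studies = [(45, 5, 8, 42), (30, 6, 4, 60), (55, 9, 10, 80)]

        def pooled_proportion(events, totals):
            """Inverse-variance pooled proportion on the logit scale."""
            events, totals = np.asarray(events, float), np.asarray(totals, float)
            p = (events + 0.5) / (totals + 1.0)        # continuity correction
            logit = np.log(p / (1 - p))
            weights = totals * p * (1 - p)             # inverse of the logit variance
            pooled_logit = np.sum(weights * logit) / np.sum(weights)
            return 1.0 / (1.0 + np.exp(-pooled_logit))

        tp, fn, fp, tn = (np.array(c) for c in zip(*studies))
        print("pooled sensitivity:", round(pooled_proportion(tp, tp + fn), 3))
        print("pooled specificity:", round(pooled_proportion(tn, tn + fp), 3))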

  14. The Fabrication Technique and Property Analysis of Racetrack-Type High Temperature Superconducting Magnet for High Power Motor

    NASA Astrophysics Data System (ADS)

    Xie, S. F.; Wang, Y.; Wang, D. Y.; Zhang, X. J.; Zhao, B.; Zhang, Y. Y.; Li, L.; Li, Y. N.; Chen, P. M.

    2013-03-01

    The superconducting motor is now the focus of research on the application of high temperature superconducting (HTS) materials. In this manuscript, we mainly introduce recent progress on the fabrication technique and property research of the superconducting motor magnet at the Luoyang Ship Material Research Institute (LSMRI) in China, including the materials, the winding and impregnation techniques, and property measurements of the magnet. Several techniques and devices were developed to manufacture the magnet, including the insulation and thermal-conduction technique and the device for winding the racetrack-type magnet. Finally, the superconducting magnet for the MW-class motor was successfully developed; it is currently the largest superconducting motor magnet in China. The critical current of the superconducting magnet exceeds the design value (90 A at 30 K).

  15. Recent applications of multivariate data analysis methods in the authentication of rice and the most analyzed parameters: A review.

    PubMed

    Maione, Camila; Barbosa, Rommel Melgaço

    2018-01-24

    Rice is one of the most important staple foods around the world. Authentication of rice is one of the most addressed concerns in the present literature, which includes recognition of its geographical origin and variety, certification of organic rice and many other issues. Good results have been achieved by multivariate data analysis and data mining techniques when combined with specific parameters for ascertaining authenticity and many other useful characteristics of rice, such as quality, yield and others. This paper presents a review of recent research on discrimination and authentication of rice using multivariate data analysis and data mining techniques. We found that data obtained from image processing, molecular and atomic spectroscopy, elemental fingerprinting, genetic markers, molecular content and others are promising sources of information regarding geographical origin, variety and other aspects of rice, and are widely used in combination with multivariate data analysis techniques. Principal component analysis and linear discriminant analysis are the preferred methods, but several other data classification techniques such as support vector machines, artificial neural networks and others are also frequently present in some studies and show high performance for discrimination of rice.
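
    A minimal sketch of the chemometrics workflow described above (standardization, principal component analysis, then linear discriminant analysis) is given below; the synthetic feature matrix stands in for real spectral or elemental fingerprint data, and the class labels are hypothetical origins invented for the example.

        # Hedged sketch of a PCA + LDA authentication pipeline on synthetic data.
        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        n_per_class, n_features = 40, 200
        X = np.vstack([rng.normal(loc=shift, scale=1.0, size=(n_per_class, n_features))
                       for shift in (0.0, 0.3, 0.6)])      # three hypothetical origins
        y = np.repeat(["origin_A", "origin_B", "origin_C"], n_per_class)

        model = make_pipeline(StandardScaler(), PCA(n_components=10),
                              LinearDiscriminantAnalysis())
        scores = cross_val_score(model, X, y, cv=5)         # 5-fold cross-validation
        print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))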

  16. Closed-form Static Analysis with Inertia Relief and Displacement-Dependent Loads Using a MSC/NASTRAN DMAP Alter

    NASA Technical Reports Server (NTRS)

    Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.

    1995-01-01

    Solving for the displacements of free-free coupled systems acted upon by static loads is commonly performed throughout the aerospace industry. Many times, these problems are solved using static analysis with inertia relief. This solution technique allows for a free-free static analysis by balancing the applied loads with inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus displacement-dependent loads. Solving for the final displacements of such systems is commonly performed using iterative solution techniques. Unfortunately, these techniques can be time-consuming and labor-intensive. Since the coupled system equations for free-free systems with displacement-dependent loads can be written in closed-form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. Using a MSC/NASTRAN DMAP Alter, displacement-dependent loads have been included in static analysis with inertia relief. Such an Alter has been used successfully to solve efficiently a common aerospace problem typically solved using an iterative technique.
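
    The closed-form idea can be sketched in generic notation (which is assumed here and is not taken from the DMAP Alter itself): inertia relief replaces the applied load by a self-equilibrated load, and when part of that load depends linearly on the displacements the dependence can be moved to the left-hand side and the system solved once instead of iterated; the supplementary support needed because the free-free stiffness matrix is singular is left implicit.

        % Schematic formulation only, in assumed generic notation: K stiffness,
        % M mass, Phi_r rigid-body modes, F_0 the displacement-independent load,
        % A the matrix generating displacement-dependent loads.
        \[
          P = I - M\,\Phi_r\left(\Phi_r^{T} M\,\Phi_r\right)^{-1}\Phi_r^{T},
          \qquad K\,u = P\,F
        \]
        % Substituting F = F_0 + A u and collecting the displacement terms gives a
        % single linear system in place of an iterative update:
        \[
          \left(K - P\,A\right)\,u = P\,F_0 .
        \]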

  17. Novel Passive Clearing Methods for the Rapid Production of Optical Transparency in Whole CNS Tissue.

    PubMed

    Woo, Jiwon; Lee, Eunice Yoojin; Park, Hyo-Suk; Park, Jeong Yoon; Cho, Yong Eun

    2018-05-08

    Since the development of CLARITY, a bioelectrochemical clearing technique that allows for three-dimensional phenotype mapping within transparent tissues, a multitude of novel clearing methodologies including CUBIC (clear, unobstructed brain imaging cocktails and computational analysis), SWITCH (system-wide control of interaction time and kinetics of chemicals), MAP (magnified analysis of the proteome), and PACT (passive clarity technique), have been established to further expand the existing toolkit for the microscopic analysis of biological tissues. The present study aims to improve upon and optimize the original PACT procedure for an array of intact rodent tissues, including the whole central nervous system (CNS), kidneys, spleen, and whole mouse embryos. Termed psPACT (process-separate PACT) and mPACT (modified PACT), these novel techniques provide highly efficacious means of mapping cell circuitry and visualizing subcellular structures in intact normal and pathological tissues. In the following protocol, we provide a detailed, step-by-step outline of how to achieve maximal tissue clearance with minimal compromise of structural integrity via psPACT and mPACT.

  18. Survey of Analysis of Crime Detection Techniques Using Data Mining and Machine Learning

    NASA Astrophysics Data System (ADS)

    Prabakaran, S.; Mitra, Shilpa

    2018-04-01

    Data mining is the field concerned with procedures for finding patterns in huge datasets; it includes strategies at the convergence of machine learning and database systems. It can be applied to various fields like future healthcare, market basket analysis, education, manufacturing engineering, crime investigation etc. Among these, crime investigation is an interesting application in which crime characteristics are processed for the benefit of society. This paper surveys various data mining techniques used in this domain. This study may be helpful in designing new strategies for crime prediction and analysis.

  19. Echocardiographic Evaluation of Left Atrial Mechanics: Function, History, Novel Techniques, Advantages, and Pitfalls.

    PubMed

    Leischik, Roman; Littwitz, Henning; Dworrak, Birgit; Garg, Pankaj; Zhu, Meihua; Sahn, David J; Horlitz, Marc

    2015-01-01

    Left atrial (LA) functional analysis has an established role in assessing left ventricular diastolic function. The current standard echocardiographic parameters used to study left ventricular diastolic function include pulsed-wave Doppler mitral inflow analysis, tissue Doppler imaging measurements, and LA dimension estimation. However, the above-mentioned parameters do not directly quantify LA performance. Deformation studies using strain and strain-rate imaging to assess LA function were validated in previous research, but this technique is not currently used in routine clinical practice. This review discusses the history, importance, and pitfalls of strain technology for the analysis of LA mechanics.

  20. Detection of Genetically Modified Sugarcane by Using Terahertz Spectroscopy and Chemometrics

    NASA Astrophysics Data System (ADS)

    Liu, J.; Xie, H.; Zha, B.; Ding, W.; Luo, J.; Hu, C.

    2018-03-01

    A methodology is proposed to identify genetically modified sugarcane from non-genetically modified sugarcane by using terahertz spectroscopy and chemometrics techniques, including linear discriminant analysis (LDA), support vector machine-discriminant analysis (SVM-DA), and partial least squares-discriminant analysis (PLS-DA). The classification rate of the above-mentioned methods is compared, and different types of preprocessing are considered. According to the experimental results, the best option is PLS-DA, with an identification rate of 98%. The results indicated that THz spectroscopy and chemometrics techniques are powerful tools for identifying genetically modified and non-genetically modified sugarcane.
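
    The PLS-DA step can be sketched, under assumptions, with scikit-learn's PLSRegression by regressing a class indicator on the spectra and thresholding the predicted score; the random "spectra" below are placeholders for real terahertz data, and the 0.5 decision threshold is the usual simple choice, not necessarily the one used in the cited study.

        # Hedged PLS-DA sketch on synthetic stand-in spectra (not the study's data).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        n, n_points = 60, 300                          # samples x spectral points
        X_gm  = rng.normal(0.0, 1.0, (n, n_points)) + np.linspace(0, 0.8, n_points)
        X_non = rng.normal(0.0, 1.0, (n, n_points))
        X = np.vstack([X_gm, X_non])
        y = np.array([1] * n + [0] * n)                # 1 = genetically modified

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0, stratify=y)
        pls = PLSRegression(n_components=5)
        pls.fit(X_tr, y_tr)                            # regress class indicator on spectra
        y_pred = (pls.predict(X_te).ravel() > 0.5).astype(int)   # PLS-DA decision rule
        print("identification rate:", (y_pred == y_te).mean())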

  1. Impact of novel techniques on minimally invasive adrenal surgery: trends and outcomes from a contemporary international large series in urology.

    PubMed

    Pavan, Nicola; Autorino, Riccardo; Lee, Hak; Porpiglia, Francesco; Sun, Yinghao; Greco, Francesco; Jeff Chueh, S; Han, Deok Hyun; Cindolo, Luca; Ferro, Matteo; Chen, Xiang; Branco, Anibal; Fornara, Paolo; Liao, Chun-Hou; Miyajima, Akira; Kyriazis, Iason; Puglisi, Marco; Fiori, Cristian; Yang, Bo; Fei, Guo; Altieri, Vincenzo; Jeong, Byong Chang; Berardinelli, Francesco; Schips, Luigi; De Cobelli, Ottavio; Chen, Zhi; Haber, Georges-Pascal; He, Yao; Oya, Mototsugu; Liatsikos, Evangelos; Brandao, Luis; Challacombe, Benjamin; Kaouk, Jihad; Darweesh, Ithaar

    2016-10-01

    To evaluate contemporary international trends in the implementation of minimally invasive adrenalectomy and to assess contemporary outcomes of different minimally invasive techniques performed at urologic centers worldwide. Patients who underwent minimally invasive adrenalectomy from 2008 to 2013 at 14 urology institutions worldwide were included in this retrospective multinational multicenter analysis. Cases were categorized based on the minimally invasive adrenalectomy technique: conventional laparoscopy (CL), robot-assisted laparoscopy (RAL), laparoendoscopic single-site surgery (LESS), and mini-laparoscopy (ML). The rates of the four treatment modalities were determined according to the year of surgery, and a regression analysis was performed for trends in all surgical modalities. Overall, a total of 737 adrenalectomies were performed across participating institutions and included in this analysis: 337 CL (46 % of cases), 57 ML (8 %), 263 LESS (36 %), and 80 RAL (11 %). Overall, 204 (28 %) operations were performed with a retroperitoneal approach. The overall number of adrenalectomies increased from 2008 to 2013 (p = 0.05). A transperitoneal approach was preferred in all but the ML group (p < 0.001). European centers mostly adopted CL and ML techniques, whereas those from Asia and South America reported the highest rate in LESS procedures, and RAL was adopted to a larger extent in the USA. LESS had the fastest increase in utilization at 6 %/year. The rate of RAL procedures increased at slower rates (2.2 %/year), similar to ML (1.7 %/year). Limitations of this study are the retrospective design and the lack of a cost analysis. Several minimally invasive surgical techniques for the management of adrenal masses are successfully implemented in urology institutions worldwide. CL and LESS seem to represent the most commonly adopted techniques, whereas ML and RAL are growing at a slower rate. All the MIS techniques can be safely and effectively performed for a variety of adrenal diseases.

  2. WAATS: A computer program for Weights Analysis of Advanced Transportation Systems

    NASA Technical Reports Server (NTRS)

    Glatt, C. R.

    1974-01-01

    A historical weight estimating technique for advanced transportation systems is presented. The classical approach to weight estimation is discussed and sufficient data is presented to estimate weights for a large spectrum of flight vehicles including horizontal and vertical takeoff aircraft, boosters and reentry vehicles. A computer program, WAATS (Weights Analysis for Advanced Transportation Systems) embracing the techniques discussed has been written and user instructions are presented. The program was developed for use in the ODIN (Optimal Design Integration System) system.

  3. Battery Test Manual For 48 Volt Mild Hybrid Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Lee Kenneth

    2017-03-01

    This manual details the U.S. Advanced Battery Consortium and U.S. Department of Energy Vehicle Technologies Program goals, test methods, and analysis techniques for a 48 Volt Mild Hybrid Electric Vehicle system. The test methods are outlined, starting with characterization tests, followed by life tests. The final section details standardized analysis techniques for 48 V systems that allow for the comparison of different programs that use this manual. An example test plan is included, along with guidance on filling in gap table numbers.

  4. Analysis of the Apollo spacecraft operational data management system. Executive summary

    NASA Technical Reports Server (NTRS)

    1971-01-01

    A study was made of Apollo, Skylab, and several other data management systems to determine those techniques which could be applied to the management of operational data for future manned spacecraft programs. The results of the study are presented and include: (1) an analysis of present data management systems, (2) a list of requirements for future operational data management systems, (3) an evaluation of automated data management techniques, and (4) a plan for data management applicable to future space programs.

  5. Analysing uncertainties of supply and demand in the future use of hydrogen as an energy vector

    NASA Astrophysics Data System (ADS)

    Lenel, U. R.; Davies, D. G. S.; Moore, M. A.

    An analytical technique (Analysis with Uncertain Quantities), developed at Fulmer, is being used to examine the sensitivity of the outcome to uncertainties in input quantities in order to highlight which input quantities critically affect the potential role of hydrogen. The work presented here includes an outline of the model and the analysis technique, along with basic considerations of the input quantities to the model (demand, supply and constraints). Some examples are given of probabilistic estimates of input quantities.
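
    As a generic illustration only (not the proprietary Fulmer technique described above), the sketch below propagates uncertain demand, supply and constraint inputs through a toy supply-balance model by Monte Carlo sampling and ranks which input most strongly drives the spread of the outcome; the model form and all ranges are invented.

        # Generic Monte Carlo uncertainty/sensitivity sketch; model and ranges invented.
        import numpy as np

        rng = np.random.default_rng(4)
        n = 10_000
        demand = rng.triangular(50, 80, 120, n)        # uncertain hydrogen demand
        supply = rng.triangular(40, 90, 130, n)        # uncertain supply capacity
        constraint = rng.uniform(0.6, 1.0, n)          # fraction usable under constraints

        unmet = np.maximum(demand - supply * constraint, 0.0)   # toy outcome measure
        print("P(unmet demand > 0) = %.2f" % (unmet > 0).mean())

        # Rank input influence by correlation with the outcome (a crude sensitivity index).
        for name, values in [("demand", demand), ("supply", supply), ("constraint", constraint)]:
            print(name, "correlation with unmet demand:", round(np.corrcoef(values, unmet)[0, 1], 2))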

  6. Surgical interventions for meniscal tears: a closer look at the evidence.

    PubMed

    Mutsaerts, Eduard L A R; van Eck, Carola F; van de Graaf, Victor A; Doornberg, Job N; van den Bekerom, Michel P J

    2016-03-01

    The aim of the present study was to compare the outcomes of various surgical treatments for meniscal injuries including (1) total and partial meniscectomy; (2) meniscectomy and meniscal repair; (3) meniscectomy and meniscal transplantation; (4) open and arthroscopic meniscectomy and (5) various different repair techniques. The Bone, Joint and Muscle Trauma Group Register, Cochrane Database, MEDLINE, EMBASE and CINAHL were searched for all (quasi) randomized controlled clinical trials comparing various surgical techniques for meniscal injuries. Primary outcomes of interest included patient-reported outcomes scores, return to pre-injury activity level, level of sports participation and persistence of pain using the visual analogue score. Where possible, data were pooled and a meta-analysis was performed. A total of nine studies were included, involving a combined 904 subjects: 330 patients underwent a meniscal repair, 402 a meniscectomy and 160 received a collagen meniscal implant. The only surgical treatments that were compared in homogeneous fashion across more than one study were the arrow and inside-out technique, which showed no difference in re-tear or complication rate. Strong evidence-based recommendations regarding the other surgical treatments that were compared could not be made. This meta-analysis illustrates the lack of level I evidence to guide the surgical management of meniscal tears. Level I meta-analysis.

  7. Systems Analysis in Small Educational Systems: A Case Study.

    ERIC Educational Resources Information Center

    Vazquez-Abad, Jesus; And Others

    1982-01-01

    The use of systems analysis in transforming a graduate program in educational technology from a lecture-based system to a self-instructional one is described. Several operational research techniques are illustrated. A bibliography of 10 items is included. (CHC)

  8. Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions

    ERIC Educational Resources Information Center

    Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.

    2006-01-01

    In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…

  9. CMOS array design automation techniques. [metal oxide semiconductors

    NASA Technical Reports Server (NTRS)

    Ramondetta, P.; Feller, A.; Noto, R.; Lombardi, T.

    1975-01-01

    A low cost, quick turnaround technique for generating custom metal oxide semiconductor arrays using the standard cell approach was developed, implemented, tested and validated. Basic cell design topology and guidelines are defined based on an extensive analysis that includes circuit, layout, process, array topology and required performance considerations particularly high circuit speed.

  10. Information Landscaping: Information Mapping, Charting, Querying and Reporting Techniques for Total Quality Knowledge Management.

    ERIC Educational Resources Information Center

    Tsai, Bor-sheng

    2003-01-01

    Total quality management and knowledge management are merged and used as a conceptual model to direct and develop information landscaping techniques through the coordination of information mapping, charting, querying, and reporting. Goals included: merge citation analysis and data mining, and apply data visualization and information architecture…

  11. Chemical Fingerprinting of Materials Developed Due To Environmental Issues

    NASA Technical Reports Server (NTRS)

    Smith, Doris A.; McCool, A. (Technical Monitor)

    2000-01-01

    This paper presents viewgraphs on chemical fingerprinting of materials developed due to environmental issues. Some of the topics include: 1) Aerospace Materials; 2) Building Blocks of Capabilities; 3) Spectroscopic Techniques; 4) Chromatographic Techniques; 5) Factors that Determine Fingerprinting Approach; and 6) Fingerprinting: Combination of instrumental analysis methods that diagnostically characterize a material.

  12. Proceedings of the 1990 IPMAAC Conference on Personnel Assessment (14th, San Diego, California, June 24-28, 1990).

    ERIC Educational Resources Information Center

    International Personnel Management Association, Washington, DC.

    Fifty-seven papers presented at the annual meeting of the International Personnel Management Association Assessment Council (IPMAAC) in 1990 are provided. Selected topics include: using the cloze technique for reading skills assessment; examining assessment techniques; job analysis; alternate strategies for assessing writing skills; assessment of…

  13. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  14. Understanding a Normal Distribution of Data (Part 2).

    PubMed

    Maltenfort, Mitchell

    2016-02-01

    Completing the discussion of data normality, advanced techniques for analysis of non-normal data are discussed including data transformation, Generalized Linear Modeling, and bootstrapping. Relative strengths and weaknesses of each technique are helpful in choosing a strategy, but help from a statistician is usually necessary to analyze non-normal data using these methods.
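
    The bootstrapping idea mentioned above can be shown in a few lines: resample the observed data with replacement and take percentiles of the resampled statistic as a confidence interval, with no normality assumption; the skewed sample below is simulated purely for illustration.

        # Minimal percentile-bootstrap sketch on a simulated, clearly non-normal sample.
        import numpy as np

        rng = np.random.default_rng(3)
        data = rng.exponential(scale=2.0, size=80)

        boot_means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                               for _ in range(5000)])
        lo, hi = np.percentile(boot_means, [2.5, 97.5])
        print("sample mean: %.2f, 95%% bootstrap CI: (%.2f, %.2f)" % (data.mean(), lo, hi))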

  15. A Phase-Only technique for enhancing the high-frequency MASW method

    USDA-ARS?s Scientific Manuscript database

    For soil exploration in the vadose zone, a high-frequency multi-channel analysis of surface waves (HF-MASW) method has been developed. In the study, several practical techniques were applied to enhance the overtone image of the HF-MASW method. They included (1) the self-adaptive MASW method using a ...

  16. Discovering Authorities and Hubs in Different Topological Web Graph Structures.

    ERIC Educational Resources Information Center

    Meghabghab, George

    2002-01-01

    Discussion of citation analysis on the Web considers Web hyperlinks as a source to analyze citations. Topics include basic graph theory applied to Web pages, including matrices, linear algebra, and Web topology; and hubs and authorities, including a search technique called HITS (Hyperlink Induced Topic Search). (Author/LRW)

  17. Uranium Detection - Technique Validation Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletti, Lisa Michelle; Garduno, Katherine; Lujan, Elmer J.

    As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ ‘Accelerator Technology’ we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrixes. The FY15 work scope incorporated technical development that would improve accuracy, specificity, linearity & range, precision & ruggedness, and comparative analysis. Significant progress was achieved throughout FY15 addressing these technical challenges, as is summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (impacting both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of this presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectra in 1 M H2SO4, at λmax = 419.5 nm.

  18. Physics faculty beliefs and values about the teaching and learning of problem solving. II. Procedures for measurement and analysis

    NASA Astrophysics Data System (ADS)

    Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia

    2007-12-01

    To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.

  19. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    NASA Technical Reports Server (NTRS)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  20. Logic Model Checking of Unintended Acceleration Claims in the 2005 Toyota Camry Electronic Throttle Control System

    NASA Technical Reports Server (NTRS)

    Gamble, Ed; Holzmann, Gerard

    2011-01-01

    Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  1. Microwave techniques for measuring complex permittivity and permeability of materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guillon, P.

    1995-08-01

    Different materials are of fundamental importance to the aerospace, microwave, electronics and communications industries, including, for example, microwave absorbing materials, antenna lenses and radomes, substrates for MMICs, and microwave components and antennas. Basic measurements of the complex permittivity and permeability of such homogeneous solid materials in the microwave spectral region are described, including hardware, instrumentation and analysis. Elevated-temperature measurements, as well as measurement intercomparisons with a discussion of the strengths and weaknesses of each technique, are also presented.

  2. Overview: MURI Center on spectroscopic and time domain detection of trace explosives in condensed and vapor phases

    NASA Astrophysics Data System (ADS)

    Spicer, James B.; Dagdigian, Paul; Osiander, Robert; Miragliotta, Joseph A.; Zhang, Xi-Cheng; Kersting, Roland; Crosley, David R.; Hanson, Ronald K.; Jeffries, Jay

    2003-09-01

    The research center established by the Army Research Office under the Multidisciplinary University Research Initiative program pursues a multidisciplinary approach to investigate and advance the use of complementary analytical techniques for sensing of explosives and/or explosive-related compounds as they occur in the environment. The techniques being investigated include Terahertz (THz) imaging and spectroscopy, Laser-Induced Breakdown Spectroscopy (LIBS), Cavity Ring Down Spectroscopy (CRDS) and Resonance Enhanced Multiphoton Ionization (REMPI). This suite of techniques encompasses a diversity of sensing approaches that can be applied to detection of explosives in condensed phases such as adsorbed species in soil or can be used for vapor phase detection above the source. Some techniques allow for remote detection while others have highly specific and sensitive analysis capabilities. This program is addressing a range of fundamental, technical issues associated with trace detection of explosive related compounds using these techniques. For example, while both LIBS and THz can be used to carry out remote analysis of condensed phase analyte from a distance in excess of several meters, the sensitivities of these techniques to surface adsorbed explosive-related compounds are not currently known. In current implementations, both CRDS and REMPI require sample collection techniques that have not been optimized for environmental applications. Early program elements will pursue the fundamental advances required for these techniques, including signature identification for explosive-related compounds/interferents and trace analyte extraction. Later program tasks will explore simultaneous application of two or more techniques to assess the benefits of sensor fusion.

  3. Theory and design of variable conductance heat pipes

    NASA Technical Reports Server (NTRS)

    Marcus, B. D.

    1972-01-01

    A comprehensive review and analysis of all aspects of heat pipe technology pertinent to the design of self-controlled, variable conductance devices for spacecraft thermal control is presented. Subjects considered include hydrostatics, hydrodynamics, heat transfer into and out of the pipe, fluid selection, materials compatibility and variable conductance control techniques. The report includes a selected bibliography of pertinent literature, analytical formulations of various models and theories describing variable conductance heat pipe behavior, and the results of numerous experiments on the steady state and transient performance of gas controlled variable conductance heat pipes. Also included is a discussion of VCHP design techniques.

  4. Recent Methodology in Ginseng Analysis

    PubMed Central

    Baek, Seung-Hoon; Bae, Ok-Nam; Park, Jeong Hill

    2012-01-01

    As much as the popularity of ginseng in herbal prescriptions or remedies, ginseng has become the focus of research in many scientific fields. Analytical methodologies for ginseng, referred to as ginseng analysis hereafter, have been developed for bioactive component discovery, phytochemical profiling, quality control, and pharmacokinetic studies. This review summarizes the most recent advances in ginseng analysis in the past half-decade including emerging techniques and analytical trends. Ginseng analysis includes all of the leading analytical tools and serves as a representative model for the analytical research of herbal medicines. PMID:23717112

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucia, M., E-mail: mlucia@pppl.gov; Kaita, R.; Majeski, R.

    The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.

  6. Fallon, Nevada FORGE Fluid Geochemistry

    DOE Data Explorer

    Blankenship, Doug; Ayling, Bridget

    2018-03-13

    Fluid geochemistry analysis for wells supporting the Fallon FORGE project. Samples were collected from geothermal wells using standard geothermal water sampling techniques, including filtration and acidification of the cation sample to pH < 2 prior to geochemical analysis. Analyses after 2005 were done in reputable commercial laboratories that follow standard protocols for aqueous chemistry analysis.

  7. Increasing Effectiveness in Teaching Ethics to Undergraduate Business Students.

    ERIC Educational Resources Information Center

    Lampe, Marc

    1997-01-01

    Traditional approaches to teaching business ethics (philosophical analysis, moral quandaries, executive cases) may not be effective in persuading undergraduates of the importance of ethical behavior. Better techniques include values education, ethical decision-making models, analysis of ethical conflicts, and role modeling. (SK)

  8. [Multi-channel in vivo recording techniques: signal processing of action potentials and local field potentials].

    PubMed

    Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian

    2014-06-25

    Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFP) simultaneously. One of the key points for the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We intend to introduce data processing approaches for action potentials and LFP based on the original data collected through a multi-channel recording system. Action potential signals are high-frequency signals, hence a high sampling rate of 40 kHz is normally chosen for recording. Based on the waveforms of extracellularly recorded action potentials, tetrode technology combined with principal component analysis can be used to discriminate spiking signals from neurons with different spatial distributions, in order to obtain accurate single-neuron spiking activity. LFPs are low-frequency signals (lower than 300 Hz), hence a sampling rate of 1 kHz is used for LFPs. Digital filtering is required for LFP analysis to isolate different frequency oscillations, including theta oscillation (4-12 Hz), which is dominant in active exploration and rapid-eye-movement (REM) sleep, gamma oscillation (30-80 Hz), which accompanies theta oscillation during cognitive processing, and high-frequency ripple oscillation (100-250 Hz) in awake immobility and slow wave sleep (SWS) state in the rodent hippocampus. For the obtained signals, common data post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
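
    The band-isolation step described above can be sketched with a zero-phase Butterworth bandpass applied to a 1 kHz LFP trace; the synthetic signal and filter order below are assumptions standing in for recorded data and lab-specific settings.

        # Hedged sketch: extract theta, gamma and ripple bands from a synthetic LFP.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 1000.0                                    # LFP sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        lfp = (np.sin(2 * np.pi * 8 * t)               # theta-like component
               + 0.3 * np.sin(2 * np.pi * 60 * t)      # gamma-like component
               + 0.1 * np.random.default_rng(5).normal(size=t.size))

        def bandpass(signal, low, high, fs, order=4):
            b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
            return filtfilt(b, a, signal)              # zero-phase (forward-backward) filtering

        theta  = bandpass(lfp, 4, 12, fs)
        gamma  = bandpass(lfp, 30, 80, fs)
        ripple = bandpass(lfp, 100, 250, fs)
        print([round(float(np.std(band)), 3) for band in (theta, gamma, ripple)])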

  9. Measurement techniques for trace metals in coal-plant effluents: A brief review

    NASA Technical Reports Server (NTRS)

    Singh, J. J.

    1979-01-01

    The strong features and limitations of techniques for determining trace elements in aerosols emitted from coal plants are discussed. Techniques reviewed include atomic absorption spectroscopy, charged particle scattering and activation, instrumental neutron activation analysis, gas/liquid chromatography, gas chromatographic/mass spectrometric methods, X-ray fluorescence, and charged-particle-induced X-ray emission. The latter two methods are emphasized. They provide simultaneous, sensitive multielement analyses and lend themselves readily to depth profiling. It is recommended that whenever feasible, two or more complementary techniques should be used for analyzing environmental samples.

  10. Computed tomography for non-destructive evaluation of composites: Applications and correlations

    NASA Technical Reports Server (NTRS)

    Goldberg, B.; Hediger, L.; Noel, E.

    1985-01-01

    The state-of-the-art fabrication techniques for composite materials are such that stringent species-specific acceptance criteria must be generated to insure product reliability. Non-destructive evaluation techniques including computed tomography (CT), X-ray radiography (RT), and ultrasonic scanning (UT) are investigated and compared to determine their applicability and limitations to graphite epoxy, carbon-carbon, and carbon-phenolic materials. While the techniques appear complementary, CT is shown to provide significant, heretofore unattainable data. Finally, a correlation of NDE techniques to destructive analysis is presented.

  11. Graphic/symbol segmentation for Group 4 facsimile systems

    NASA Astrophysics Data System (ADS)

    Deutermann, A. R.

    1982-04-01

    The purpose of this study was to examine possible techniques for segmenting documents into graphic and symbol areas, and for assembling a code that represents the entire document. Parameters to be considered include compression, commonality with facsimile and TELETEX* transmissions, and complexity of implementation. Six segmentation techniques were selected for analysis. The techniques were designed to differ from each other as much as possible, so as to display a wide variety of characteristics. For each technique, many minor modifications would be possible, but it is not expected that these modifications would alter the conclusions drawn from the study.

  12. Rocket nozzle thermal shock tests in an arc heater facility

    NASA Technical Reports Server (NTRS)

    Painter, James H.; Williamson, Ronald A.

    1986-01-01

    A rocket motor nozzle thermal structural test technique that utilizes arc heated nitrogen to simulate a motor burn was developed. The technique was used to test four heavily instrumented full-scale Star 48 rocket motor 2D carbon/carbon segments at conditions simulating the predicted thermal-structural environment. All four nozzles survived the tests without catastrophic or other structural failures. The test technique demonstrated promise as a low cost, controllable alternative to rocket motor firing. The technique includes the capability of rapid termination in the event of failure, allowing post-test analysis.

  13. SeeSway - A free web-based system for analysing and exploring standing balance data.

    PubMed

    Clark, Ross A; Pua, Yong-Hao

    2018-06-01

    Computerised posturography can be used to assess standing balance, and can predict poor functional outcomes in many clinical populations. A key limitation is the disparate signal filtering and analysis techniques, with many methods requiring custom computer programs. This paper discusses the creation of a freely available web-based software program, SeeSway (www.rehabtools.org/seesway), which was designed to provide powerful tools for pre-processing, analysing and visualising standing balance data in an easy to use and platform independent website. SeeSway links an interactive web platform with file upload capability to software systems including LabVIEW, Matlab, Python and R to perform the data filtering, analysis and visualisation of standing balance data. Input data can consist of any signal that comprises an anterior-posterior and medial-lateral coordinate trace such as center of pressure or mass displacement. This allows it to be used with systems including criterion reference commercial force platforms and three dimensional motion analysis, smartphones, accelerometers and low-cost technology such as Nintendo Wii Balance Board and Microsoft Kinect. Filtering options include Butterworth, weighted and unweighted moving average, and discrete wavelet transforms. Analysis methods include standard techniques such as path length, amplitude, and root mean square in addition to less common but potentially promising methods such as sample entropy, detrended fluctuation analysis and multiresolution wavelet analysis. These data are visualised using scalograms, which chart the change in frequency content over time, scatterplots and standard line charts. This provides the user with a detailed understanding of their results, and how their different pre-processing and analysis method selections affect their findings. An example of the data analysis techniques is provided in the paper, with graphical representation of how advanced analysis methods can better discriminate between someone with neurological impairment and a healthy control. The goal of SeeSway is to provide a simple yet powerful educational and research tool to explore how standing balance is affected in aging and clinical populations. Copyright © 2018 Elsevier B.V. All rights reserved.
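
    A minimal sketch, assuming a center-of-pressure trace sampled at 100 Hz, of a few of the standard measures listed above (path length, amplitude and root mean square) after Butterworth low-pass filtering is given below; it illustrates the computations only and is not the SeeSway implementation, whose cut-off and parameter choices are not assumed here.

        # Hedged sketch of basic sway measures from a synthetic center-of-pressure trace.
        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 100.0                                     # sampling rate, Hz
        rng = np.random.default_rng(6)
        ap = np.cumsum(rng.normal(0, 0.05, 3000))      # anterior-posterior trace (cm)
        ml = np.cumsum(rng.normal(0, 0.05, 3000))      # medial-lateral trace (cm)

        b, a = butter(2, 10 / (fs / 2), btype="low")   # 10 Hz low-pass (example cut-off)
        ap_f, ml_f = filtfilt(b, a, ap), filtfilt(b, a, ml)

        path_length = np.sum(np.hypot(np.diff(ap_f), np.diff(ml_f)))
        amplitude_ap = ap_f.max() - ap_f.min()
        rms_ml = np.sqrt(np.mean((ml_f - ml_f.mean()) ** 2))
        print(round(path_length, 2), round(amplitude_ap, 2), round(rms_ml, 2))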

  14. Supercritical fluid chromatography for lipid analysis in foodstuffs.

    PubMed

    Donato, Paola; Inferrera, Veronica; Sciarrone, Danilo; Mondello, Luigi

    2017-01-01

    The task of lipid analysis has always challenged separation scientists, and new techniques in chromatography were often developed for the separation of lipids; however, no single technique or methodology is yet capable of affording a comprehensive screening of all lipid species and classes. This review surveys the role of supercritical fluid chromatography within the field of lipid analysis, from the early capillary separations based on pure CO2, to the most recent techniques employing packed columns under subcritical conditions, including the niche multidimensional techniques using supercritical fluids in at least one of the separation dimensions. A short history of supercritical fluid chromatography is introduced first, from its early popularity in the late 1980s, through its sudden fall into oblivion, to the regain of interest within the chromatographic community over the last decade. Afterwards, the subject of lipid nomenclature and classification is briefly dealt with, before discussing the main applications of supercritical fluid chromatography for food analysis, according to the specific class of lipids. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Diagnostic Chemical Analysis of Exhaled Human Breath Using a Novel Sub-Millimeter Spectroscopic Approach

    NASA Astrophysics Data System (ADS)

    Fosnight, Alyssa M.; Moran, Benjamin L.; Branco, Daniela R.; Thomas, Jessica R.; Medvedev, Ivan R.

    2013-06-01

    As many as 3000 chemicals are reported to be found in exhaled human breath. Many of these chemicals are linked to certain health conditions and environmental exposures. Present state-of-the-art techniques used for analysis of exhaled human breath include mass spectrometry-based methods, infrared spectroscopic sensors, electrochemical sensors and semiconductor oxide-based testers. Some of these techniques are commercially available but are somewhat limited in their specificity and exhibit fairly high probability of false alarm. Here, we present the results of our most recent study, which demonstrated a novel application of a high-resolution terahertz spectroscopic technique to the analysis of exhaled human breath, focused on detection of ethanol in the exhaled breath of a person who consumed an alcoholic drink. This technique possesses nearly "absolute" specificity and we demonstrated its ability to uniquely identify ethanol, methanol, and acetone in human breath. This project is now complete and we are looking to extend this method of chemical analysis of exhaled human breath to a broader range of chemicals in an attempt to demonstrate its potential for biomedical diagnostic purposes.

  16. Comprehensive Analysis of LC/MS Data Using Pseudocolor Plots

    NASA Astrophysics Data System (ADS)

    Crutchfield, Christopher A.; Olson, Matthew T.; Gourgari, Evgenia; Nesterova, Maria; Stratakis, Constantine A.; Yergey, Alfred L.

    2013-02-01

    We have developed new applications of the pseudocolor plot for the analysis of LC/MS data. These applications include spectral averaging, analysis of variance, differential comparison of spectra, and qualitative filtering by compound class. These applications have been motivated by the need to better understand LC/MS data generated from analysis of human biofluids. The examples presented use data generated to profile steroid hormones in urine extracts from a Cushing's disease patient relative to a healthy control, but are general to any discovery-based scanning mass spectrometry technique. In addition to new visualization techniques, we introduce a new metric of variance: the relative maximum difference from the mean. We also introduce the concept of substructure-dependent analysis of steroid hormones using precursor ion scans. These new analytical techniques provide an alternative approach to traditional untargeted metabolomics workflow. We present an approach to discovery using MS that essentially eliminates alignment or preprocessing of spectra. Moreover, we demonstrate the concept that untargeted metabolomics can be achieved using low mass resolution instrumentation.
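
    The abstract introduces a variance metric, the relative maximum difference from the mean, without giving its formula; one plausible reading, shown below purely as an illustrative assumption in Python, is the largest absolute deviation of the scan intensities from their mean, expressed relative to that mean.

      import numpy as np

      def relative_max_difference_from_mean(intensities, axis=0):
          """Largest absolute deviation from the mean, relative to the mean.
          One possible interpretation of the metric named in the abstract,
          not the authors' published definition."""
          x = np.asarray(intensities, dtype=float)
          mean = x.mean(axis=axis, keepdims=True)
          max_dev = np.max(np.abs(x - mean), axis=axis)
          return max_dev / mean.squeeze(axis=axis)

    Applied column-wise to an intensity matrix of scans by m/z bins, this yields one dimensionless variability value per m/z channel, which could then be rendered as a pseudocolor plot in the spirit of the paper.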

  17. Processing and analysis techniques involving in-vessel material generation

    DOEpatents

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2011-01-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.

  18. Processing and analysis techniques involving in-vessel material generation

    DOEpatents

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2012-09-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.

  19. D-region blunt probe data analysis using hybrid computer techniques

    NASA Technical Reports Server (NTRS)

    Burkhard, W. J.

    1973-01-01

    The feasibility of performing data reduction techniques with a hybrid computer was studied. The data were obtained from the flight of a parachute-borne probe through the D-region of the ionosphere. A presentation of the theory of blunt probe operation is included, with emphasis on the equations necessary to perform the analysis. This is followed by a discussion of computer program development. Included in this discussion is a comparison of computer and hand reduction results for the blunt probe launched on 31 January 1972. The comparison showed that it was both feasible and desirable to use the computer for data reduction. The results of computer data reduction performed on flight data acquired from five blunt probes are also presented.

  20. Expert systems in civil engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostem, C.N.; Maher, M.L.

    1986-01-01

    This book presents the papers given at a symposium on expert systems in civil engineering. Topics considered at the symposium included problem solving using expert system techniques, construction schedule analysis, decision making and risk analysis, seismic risk analysis systems, an expert system for inactive hazardous waste site characterization, an expert system for site selection, knowledge engineering, and knowledge-based expert systems in seismic analysis.

  1. A program to form a multidisciplinary data base and analysis for dynamic systems

    NASA Technical Reports Server (NTRS)

    Taylor, L. W.; Suit, W. T.; Mayo, M. H.

    1984-01-01

    Diverse sets of experimental data and analysis programs have been assembled for the purpose of facilitating research in systems identification, parameter estimation and state estimation techniques. The data base analysis programs are organized to make it easy to compare alternative approaches. Additional data and alternative forms of analysis will be included as they become available.

  2. Medium Caliber Lead-Free Electric Primer. Version 2

    DTIC Science & Technology

    2012-09-01

    ... Toxic Substance Control Act; TGA, Thermogravimetric Analysis; TNR, Trinitroresorcinol; V, Voltage; VDC, Voltage Direct Current; WSESRB, Weapons System... ...variety of techniques including Thermogravimetric Analysis (TGA), base hydrolysis, and Surface Area Analysis using Brunauer-Emmett-Teller (BET)... ...Distribution From Thermogravimetric Analysis; Johnson, C. E.; Fallis, S.; Chafin, A. P.; Groshens, T. J.; Higa, K. T.; Ismail, I. M. K.; and Hawkins, T. W.

  3. Approximate techniques of structural reanalysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lowder, H. E.

    1974-01-01

    A study is made of two approximate techniques for structural reanalysis. These include Taylor series expansions for response variables in terms of design variables and the reduced-basis method. In addition, modifications to these techniques are proposed to overcome some of their major drawbacks. The modifications include a rational approach to the selection of the reduced-basis vectors and the use of Taylor series approximation in an iterative process. For the reduced basis, a normalized set of vectors is chosen which consists of the original analyzed design and the first-order sensitivity analysis vectors. The use of the Taylor series approximation as a first (initial) estimate in an iterative process can lead to significant improvements in accuracy, even with one iteration cycle. Therefore, the range of applicability of the reanalysis technique can be extended. Numerical examples are presented which demonstrate the gain in accuracy obtained by using the proposed modification techniques for a wide range of variations in the design variables.
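
    To make the two approximations concrete, a sketch in generic notation (not the authors' own symbols): let u be the response vector, v = (v_1, ..., v_n) the design variables, K(v)u = p the analysis equations, and v_0 the originally analyzed design.

      \[
      \mathbf{u}(\mathbf{v}_0 + \Delta\mathbf{v}) \;\approx\;
      \mathbf{u}(\mathbf{v}_0) + \sum_{i=1}^{n}
      \left.\frac{\partial \mathbf{u}}{\partial v_i}\right|_{\mathbf{v}_0} \Delta v_i ,
      \qquad
      \mathbf{u} \;\approx\; \Gamma \boldsymbol{\psi}, \quad
      \Gamma = \left[\, \mathbf{u}(\mathbf{v}_0),\;
      \left.\frac{\partial \mathbf{u}}{\partial v_1}\right|_{\mathbf{v}_0},\;
      \dots,\;
      \left.\frac{\partial \mathbf{u}}{\partial v_n}\right|_{\mathbf{v}_0} \right].
      \]

    The first expression is the Taylor series estimate; the second is the reduced basis built from the original design and its sensitivity vectors, which turns K(v)u = p into the small system Gamma^T K(v) Gamma psi = Gamma^T p. Using the Taylor estimate as the starting point of an iterative solution is the combination the abstract credits with improved accuracy.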

  4. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA for the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing including exhaust waveform analysis.

  5. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  6. Use of communication techniques by Maryland dentists.

    PubMed

    Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V

    2013-12-01

    Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance and ordinary least squares regression analysis to examine the association of dentists' characteristics with the number of communication techniques used. They set the significance level at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical Implications. Professional education is needed both in dental school curricula and continuing education courses to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.

  7. Land Use Management for Solid Waste Programs

    ERIC Educational Resources Information Center

    Brown, Sanford M., Jr.

    1974-01-01

    The author discusses the problems of solid waste disposal and examines various land use management techniques. These include the land use plan, zoning, regionalization, land utilities, and interim use. Information concerning solid waste processing site zoning and analysis is given. Bibliography included. (MA)

  8. Technical success, technique efficacy and complications of minimally-invasive imaging-guided percutaneous ablation procedures of breast cancer: A systematic review and meta-analysis.

    PubMed

    Mauri, Giovanni; Sconfienza, Luca Maria; Pescatori, Lorenzo Carlo; Fedeli, Maria Paola; Alì, Marco; Di Leo, Giovanni; Sardanelli, Francesco

    2017-08-01

    To systematically review studies concerning imaging-guided minimally-invasive breast cancer treatments. An online database search was performed for English-language articles evaluating percutaneous breast cancer ablation. Pooled data and 95% confidence intervals (CIs) were calculated. Technical success, technique efficacy, minor and major complications were analysed, including ablation technique subgroup analysis and effect of tumour size on outcome. Forty-five studies were analysed, including 1,156 patients and 1,168 lesions. Radiofrequency (n=577; 50%), microwaves (n=78; 7%), laser (n=227; 19%), cryoablation (n=156; 13%) and high-intensity focused ultrasound (HIFU, n=129; 11%) were used. Pooled technical success was 96% (95%CI 94-97%) [laser=98% (95-99%); HIFU=96% (90-98%); radiofrequency=96% (93-97%); cryoablation=95% (90-98%); microwave=93% (81-98%)]. Pooled technique efficacy was 75% (67-81%) [radiofrequency=82% (74-88); cryoablation=75% (51-90); laser=59% (35-79); HIFU=49% (26-74)]. Major complications pooled rate was 6% (4-8). Minor complications pooled rate was 8% (5-13%). Differences between techniques were not significant for technical success (p=0.449), major complications (p=0.181) or minor complications (p=0.762), but significant for technique efficacy (p=0.009). Tumour size did not impact on variables (p>0.142). Imaging-guided percutaneous ablation techniques of breast cancer have a high rate of technical success, while technique efficacy remains suboptimal. Complication rates are relatively low. • Imaging-guided ablation techniques for breast cancer are 96% technically successful. • Overall technique efficacy rate is 75% but largely inhomogeneous among studies. • Overall major and minor complication rates are low (6-8%).
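
    The pooled proportions and confidence intervals quoted above come from a formal meta-analytic model; the abstract does not state which one, so the Python sketch below uses a simple fixed-effect, inverse-variance pooling on the logit scale purely to illustrate the arithmetic (the event counts in the example call are invented, not the study's data).

      import numpy as np
      from scipy.stats import norm

      def pooled_proportion_logit(events, totals, alpha=0.05):
          """Fixed-effect inverse-variance pooling of proportions on the logit
          scale; an illustrative stand-in, not the paper's exact model."""
          events = np.asarray(events, dtype=float)
          totals = np.asarray(totals, dtype=float)
          p = events / totals
          logit = np.log(p / (1.0 - p))                 # assumes 0 < events < totals
          var = 1.0 / events + 1.0 / (totals - events)  # variance of each logit
          w = 1.0 / var
          pooled = np.sum(w * logit) / np.sum(w)
          se = np.sqrt(1.0 / np.sum(w))
          z = norm.ppf(1.0 - alpha / 2.0)
          inv = lambda L: 1.0 / (1.0 + np.exp(-L))      # back-transform to a proportion
          return inv(pooled), (inv(pooled - z * se), inv(pooled + z * se))

      # Hypothetical per-study technical-success counts (invented numbers).
      print(pooled_proportion_logit([55, 70, 18], [58, 72, 19]))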

  9. Spatial interpolation of forest conditions using co-conditional geostatistical simulation

    Treesearch

    H. Todd Mowrer

    2000-01-01

    In recent work the author used the geostatistical Monte Carlo technique of sequential Gaussian simulation (s.G.s.) to investigate uncertainty in a GIS analysis of potential old-growth forest areas. The current study compares this earlier technique to that of co-conditional simulation, wherein the spatial cross-correlations between variables are included. As in the...

  10. Nationwide forestry applications program. Analysis of forest classification accuracy

    NASA Technical Reports Server (NTRS)

    Congalton, R. G.; Mead, R. A.; Oderwald, R. G.; Heinen, J. (Principal Investigator)

    1981-01-01

    The development of LANDSAT classification accuracy assessment techniques, and of a computerized system for assessing wildlife habitat from land cover maps are considered. A literature review on accuracy assessment techniques and an explanation for the techniques development under both projects are included along with listings of the computer programs. The presentations and discussions at the National Working Conference on LANDSAT Classification Accuracy are summarized. Two symposium papers which were published on the results of this project are appended.

  11. Empowerment in Latina Immigrant Women Recovering From Interpersonal Violence: A Concept Analysis.

    PubMed

    Page, Robin L; Chilton, Jenifer; Montalvo-Liendo, Nora; Matthews, Debra; Nava, Angeles

    2017-04-01

    Latina immigrant women are vulnerable and may experience higher levels of interpersonal or intimate partner violence (IPV) due to their immigrant status and cultural emphasis on familism. The concept of empowerment within the cultural context of Latina immigrant women experiencing IPV was analyzed using a modified version of Walker and Avant's concept analysis technique. The technique considers usage and definitions in the literature, antecedents, attributes, empirical referents, and the inclusion of a model and contrary case. This analysis encompasses a comparative approach and includes a discussion of how the definition of empowerment compares across the nursing literature. Defining attributes include reciprocal relationships, autonomy, and accountability. Antecedents comprise willingness to learn and motivation to create change. Consequences encompass self-esteem, self-efficacy, and competence for making life decisions. Empowerment has the potential to improve total well-being, having a positive and profound impact on the lives of women experiencing IPV.

  12. Code Analysis and Refactoring with Clang Tools, Version 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelley, Timothy M.

    2016-12-23

    Code Analysis and Refactoring with Clang Tools is a small set of example code that demonstrates techniques for applying tools distributed with the open source Clang compiler. Examples include analyzing where variables are used and replacing old data structures with standard structures.

  13. Microscopic Analysis of Activated Sludge. Training Manual.

    ERIC Educational Resources Information Center

    Office of Water Program Operations (EPA), Cincinnati, OH. National Training and Operational Technology Center.

    This training manual presents material on the use of a compound microscope to analyze microscope communities, present in wastewater treatment processes, for operational control. Course topics include: sampling techniques, sample handling, laboratory analysis, identification of organisms, data interpretation, and use of the compound microscope.…

  14. An advanced software suite for the processing and analysis of silicon luminescence images

    NASA Astrophysics Data System (ADS)

    Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.

    2017-06-01

    Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly for the field of photovoltaics where photoluminescence and electroluminescence imaging is routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom-built, Matlab-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including, but not limited to, point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration and iron impurity concentration mapping.
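
    The suite itself is Matlab-based and not reproduced here; as an indication of what the point-spread-function deconvolution step involves, below is a generic Richardson-Lucy iteration in Python. It is an assumed, textbook algorithm shown for illustration only, not necessarily the method the suite implements.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=30):
          """Plain Richardson-Lucy deconvolution of a luminescence image with a
          known point spread function (both 2-D arrays)."""
          image = image.astype(float)
          psf = psf / psf.sum()
          psf_mirror = psf[::-1, ::-1]
          estimate = np.full_like(image, image.mean())  # flat initial estimate
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same") + 1e-12
              estimate *= fftconvolve(image / blurred, psf_mirror, mode="same")
          return estimate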

  15. Array magnetics modal analysis for the DIII-D tokamak based on localized time-series modelling

    DOE PAGES

    Olofsson, K. Erik J.; Hanson, Jeremy M.; Shiraki, Daisuke; ...

    2014-07-14

    Time-series analysis of magnetics data in tokamaks is typically done using block-based fast Fourier transform methods. This work presents the development and deployment of a new set of algorithms for magnetic probe array analysis. The method is based on an estimation technique known as stochastic subspace identification (SSI). Compared with the standard coherence approach or the direct singular value decomposition approach, the new technique exhibits several beneficial properties. For example, the SSI method does not require that frequencies are orthogonal with respect to the timeframe used in the analysis. Frequencies are obtained directly as parameters of localized time-series models. The parameters are extracted by solving small-scale eigenvalue problems. Applications include maximum-likelihood regularized eigenmode pattern estimation, detection of neoclassical tearing modes including locked mode precursors, automatic clustering of modes, and magnetics-pattern characterization of sawtooth pre- and postcursors, edge harmonic oscillations and fishbones.
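
    For readers unfamiliar with SSI, the covariance-driven variant can be sketched in a few lines of Python: build a block Hankel matrix of output covariances, take an SVD, and read frequencies and damping from the eigenvalues of the recovered state matrix. This is a generic textbook illustration under assumed array shapes, not the algorithm actually deployed on DIII-D.

      import numpy as np

      def ssi_cov_modes(Y, fs, order=6, i_blocks=20):
          """Covariance-driven stochastic subspace identification sketch.
          Y: (n_channels, n_samples) probe signals sampled at fs Hz.
          Returns modal frequencies (Hz), damping ratios and complex mode
          shapes over the probe array."""
          n_ch, n_s = Y.shape
          Y = Y - Y.mean(axis=1, keepdims=True)
          # Output covariance matrices for lags 1 .. 2*i_blocks.
          R = [Y[:, k:] @ Y[:, :n_s - k].T / (n_s - k) for k in range(1, 2 * i_blocks + 1)]
          # Block Hankel matrix of covariances and its truncated SVD.
          H = np.block([[R[p + q] for q in range(i_blocks)] for p in range(i_blocks)])
          U, s, _ = np.linalg.svd(H, full_matrices=False)
          O = U[:, :order] * np.sqrt(s[:order])          # observability matrix
          # State matrices from the shift structure of the observability matrix.
          C = O[:n_ch, :]
          A = np.linalg.pinv(O[:-n_ch, :]) @ O[n_ch:, :]
          lam, vec = np.linalg.eig(A)
          mu = np.log(lam.astype(complex)) * fs          # continuous-time poles
          freq = np.abs(mu) / (2.0 * np.pi)
          damping = -mu.real / np.abs(mu)
          return freq, damping, C @ vec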

  16. Elemental imaging at the nanoscale: NanoSIMS and complementary techniques for element localisation in plants.

    PubMed

    Moore, Katie L; Lombi, Enzo; Zhao, Fang-Jie; Grovenor, Chris R M

    2012-04-01

    The ability to locate and quantify elemental distributions in plants is crucial to understanding plant metabolisms, the mechanisms of uptake and transport of minerals and how plants cope with toxic elements or elemental deficiencies. High-resolution secondary ion mass spectrometry (SIMS) is emerging as an important technique for the analysis of biological material at the subcellular scale. This article reviews recent work using the CAMECA NanoSIMS to determine elemental distributions in plants. The NanoSIMS is able to map elemental distributions at high resolution, down to 50 nm, and can detect very low concentrations (milligrams per kilogram) for some elements. It is also capable of mapping almost all elements in the periodic table (from hydrogen to uranium) and can distinguish between stable isotopes, which allows the design of tracer experiments. In this review, particular focus is placed upon studying the same or similar specimens with both the NanoSIMS and a wide range of complementary techniques, showing how the advantages of each technique can be combined to provide a fuller data set to address complex scientific questions. Techniques covered include optical microscopy, synchrotron techniques, including X-ray fluorescence and X-ray absorption spectroscopy, transmission electron microscopy, electron probe microanalysis, particle-induced X-ray emission and inductively coupled plasma mass spectrometry. Some of the challenges associated with sample preparation of plant material for SIMS analysis, the artefacts and limitations of the technique and future trends are also discussed.

  17. Analysis of small crack behavior for airframe applications

    NASA Technical Reports Server (NTRS)

    Mcclung, R. C.; Chan, K. S.; Hudak, S. J., Jr.; Davidson, D. L.

    1994-01-01

    The small fatigue crack problem is critically reviewed from the perspective of airframe applications. Different types of small cracks (microstructural, mechanical, and chemical) are carefully defined and relevant mechanisms identified. Appropriate analysis techniques, including both rigorous scientific and practical engineering treatments, are briefly described. Important materials data issues are addressed, including increased scatter in small crack data and recommended small crack test methods. Key problems requiring further study are highlighted.

  18. MEG-SIM: a web portal for testing MEG analysis methods using realistic simulated and empirical data.

    PubMed

    Aine, C J; Sanfratello, L; Ranken, D; Best, E; MacArthur, J A; Wallace, T; Gilliam, K; Donahue, C H; Montaño, R; Bryant, J E; Scott, A; Stephen, J M

    2012-04-01

    MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes ( http://cobre.mrn.org/megsim/ ). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis.

  19. MEG-SIM: A Web Portal for Testing MEG Analysis Methods using Realistic Simulated and Empirical Data

    PubMed Central

    Aine, C. J.; Sanfratello, L.; Ranken, D.; Best, E.; MacArthur, J. A.; Wallace, T.; Gilliam, K.; Donahue, C. H.; Montaño, R.; Bryant, J. E.; Scott, A.; Stephen, J. M.

    2012-01-01

    MEG and EEG measure electrophysiological activity in the brain with exquisite temporal resolution. Because of this unique strength relative to noninvasive hemodynamic-based measures (fMRI, PET), the complementary nature of hemodynamic and electrophysiological techniques is becoming more widely recognized (e.g., Human Connectome Project). However, the available analysis methods for solving the inverse problem for MEG and EEG have not been compared and standardized to the extent that they have for fMRI/PET. A number of factors, including the non-uniqueness of the solution to the inverse problem for MEG/EEG, have led to multiple analysis techniques which have not been tested on consistent datasets, making direct comparisons of techniques challenging (or impossible). Since each of the methods is known to have their own set of strengths and weaknesses, it would be beneficial to quantify them. Toward this end, we are announcing the establishment of a website containing an extensive series of realistic simulated data for testing purposes (http://cobre.mrn.org/megsim/). Here, we present: 1) a brief overview of the basic types of inverse procedures; 2) the rationale and description of the testbed created; and 3) cases emphasizing functional connectivity (e.g., oscillatory activity) suitable for a wide assortment of analyses including independent component analysis (ICA), Granger Causality/Directed transfer function, and single-trial analysis. PMID:22068921

  20. Geochemical Exploration Techniques Applicable in the Search for Copper Deposits

    USGS Publications Warehouse

    Chaffee, Maurice A.

    1975-01-01

    Geochemical exploration is an important part of copper-resource evaluation. A large number of geochemical exploration techniques, both proved and untried, are available to the geochemist to use in the search for new copper deposits. Analyses of whole-rock samples have been used in both regional and local geochemical exploration surveys in the search for copper. Analyses of mineral separates, such as biotite, magnetite, and sulfides, have also been used. Analyses of soil samples are widely used in geochemical exploration, especially for localized surveys. It is important to distinguish between residual and transported soil types. Orientation studies should always be conducted prior to a geochemical investigation in a given area in order to determine the best soil horizon and the best size of soil material for sampling in that area. Silty frost boils, caliche, and desert varnish are specialized types of soil samples that might be useful sampling media. Soil gas is a new and potentially valuable geochemical sampling medium, especially in exploring for buried mineral deposits in arid regions. Gaseous products in samples of soil may be related to base-metal deposits and include mercury vapor, sulfur dioxide, hydrogen sulfide, carbon oxysulfide, carbon dioxide, hydrogen, oxygen, nitrogen, the noble gases, the halogens, and many hydrocarbon compounds. Transported materials that have been used in geochemical sampling programs include glacial float boulders, glacial till, esker gravels, stream sediments, stream-sediment concentrates, and lake sediments. Stream-sediment sampling is probably the most widely used and most successful geochemical exploration technique. Hydrogeochemical exploration programs have utilized hot- and cold-spring waters and their precipitates as well as waters from lakes, streams, and wells. Organic gel found in lakes and at stream mouths is an unproved sampling medium. Suspended material and dissolved gases in any type of water may also be useful media. Samples of ice and snow have been used for limited geochemical surveys. Both geobotanical and biogeochemical surveys have been successful in locating copper deposits in many parts of the world. Micro-organisms, including bacteria and algae, are other unproved media that should be studied. Animals can be used in geochemical-prospecting programs. Dogs have been used quite successfully to sniff out hidden and exposed sulfide minerals. Termite mounds are commonly composed of subsurface material, but have not as yet proved to be useful in locating buried mineral deposits. Animal tissue and waste products are essentially unproved but potentially valuable sampling media. Knowledge of the location of areas where trace-element-associated diseases in animals and man are endemic, as well as a better understanding of these diseases, may aid in identifying regions that are enriched in or depleted of various elements, including copper. Results of analyses of gases in the atmosphere are proving valuable in mineral-exploration surveys. Studies involving metallic compounds exhaled by plants into the atmosphere, and of particulate matter suspended in the atmosphere, are reviewed; these methods may become important in the future. Remote-sensing techniques are useful for making indirect measurements of geochemical responses. Two techniques applicable to geochemical exploration are neutron-activation analysis and gamma-ray spectrometry. Aerial photography is especially useful in vegetation surveys. Radar imagery is an unproved but potentially valuable method for use in studies of vegetation in perpetually clouded regions. With the advent of modern computers, many new techniques, such as correlation analysis, regression analysis, discriminant analysis, factor analysis, cluster analysis, trend-surface analysis, and moving-average analysis, can be applied to geochemical data sets. Selective use of these techniques can provide new insights into the interpretation of these data.

  1. Using Single Drop Microextraction for Headspace Analysis with Gas Chromatography

    NASA Astrophysics Data System (ADS)

    Riccio, Daniel; Wood, Derrick C.; Miller, James M.

    2008-07-01

    Headspace (HS) gas chromatography (GC) is commonly used to analyze samples that contain non-volatiles. In 1996, a new sampling technique called single drop microextraction (SDME) was introduced, and in 2001 it was applied to HS analysis. It is a simple technique that uses equipment normally found in the undergraduate laboratory, making it ideal for instructional use, especially to illustrate HS analysis or as an alternative to solid-phase microextraction (SPME), to which it is very similar. The basic principles and practice of HS-GC using SDME are described, including a complete review of the literature. Some possible experiments are suggested using water and N-methylpyrrolidone (NMP) as solvents.

  2. Blade loss transient dynamics analysis, volume 1. Task 1: Survey and perspective. [aircraft gas turbine engines

    NASA Technical Reports Server (NTRS)

    Gallardo, V. C.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.

    1981-01-01

    An analytical technique was developed to predict the behavior of a rotor system subjected to sudden unbalance. The technique is implemented in the Turbine Engine Transient Rotor Analysis (TETRA) computer program using the component element method. The analysis was particularly aimed toward blade-loss phenomena in gas turbine engines. A dual-rotor, casing, and pylon structure can be modeled by the computer program. Blade tip rubs, Coriolis forces, and mechanical clearances are included. The analytical system was verified by modeling and simulating actual test conditions for a rig test as well as a full-engine, blade-release demonstration.

  3. Current process in hearing-aid fitting appointments: An analysis of audiologists' use of behaviour change techniques using the behaviour change technique taxonomy (v1).

    PubMed

    Barker, Fiona; Mackenzie, Emma; de Lusignan, Simon

    2016-11-01

    To observe and analyse the range and nature of behaviour change techniques (BCTs) employed by audiologists during hearing-aid fitting consultations to encourage and enable hearing-aid use. Non-participant observation and qualitative thematic analysis using the behaviour change technique taxonomy (version 1) (BCTTv1). Ten consultations across five English NHS audiology departments. Audiologists engage in behaviours to ensure the hearing-aid is fitted to prescription and is comfortable to wear. They provide information, equipment, and training in how to use a hearing-aid including changing batteries, cleaning, and maintenance. There is scope for audiologists to use additional BCTs: collaborating with patients to develop a behavioural plan for hearing-aid use that includes goal-setting, action-planning and problem-solving; involving significant others; providing information on the benefits of hearing-aid use or the consequences of non-use and giving advice about using prompts/cues for hearing-aid use. This observational study of audiologist behaviour in hearing-aid fitting consultations has identified opportunities to use additional behaviour change techniques that might encourage hearing-aid use. This information defines potential intervention targets for further research with the aim of improving hearing-aid use amongst adults with acquired hearing loss.

  4. An R package for the integrated analysis of metabolomics and spectral data.

    PubMed

    Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel

    2016-06-01

    Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks as biological and biomedical discovery, biotechnology and drug development. However, as it happens with other omics data, the analysis of metabolomics datasets provides multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, from the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple to use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  5. A Reference Model for Software and System Inspections. White Paper

    NASA Technical Reports Server (NTRS)

    He, Lulu; Shull, Forrest

    2009-01-01

    Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases throughout the development life cycle; (2) reviews are emphasized in both system and software development, and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.

  6. A comparison between DART-MS and DSA-MS in the forensic analysis of writing inks.

    PubMed

    Drury, Nicholas; Ramotowski, Robert; Moini, Mehdi

    2018-05-23

    Ambient ionization mass spectrometry is gaining momentum in forensic science laboratories because of its high speed of analysis, minimal sample preparation, and information-rich results. One such application of ambient ionization methodology includes the analysis of writing inks from questioned documents where colorants of interest may not be soluble in common solvents, rendering thin layer chromatography (TLC) and separation-mass spectrometry methods such as LC/MS (-MS) impractical. Ambient ionization mass spectrometry uses a variety of ionization techniques such as Penning ionization in Direct Analysis in Real Time (DART), atmospheric pressure chemical ionization in Direct Sample Analysis (DSA), and electrospray ionization in Desorption Electrospray Ionization (DESI). In this manuscript, two of the commonly used ambient ionization techniques are compared: Perkin Elmer DSA-MS and IonSense DART in conjunction with a JEOL AccuTOF MS. Both technologies were equally successful in analyzing writing inks and produced similar spectra. DSA-MS produced less background signal likely because of its closed source configuration; however, the open source configuration of DART-MS provided more flexibility for sample positioning for optimum sensitivity, thereby allowing smaller pieces of paper containing writing ink to be analyzed. Under these conditions, the minimum sample required for DART-MS was 1 mm strokes of ink on paper, whereas DSA-MS required a minimum of 3 mm. Moreover, both techniques showed comparable repeatability. Evaluation of the analytical figures of merit, including sensitivity, linear dynamic range, and repeatability, for DSA-MS and DART-MS analysis is provided. In the forensic context of the technique, DART-MS was applied to the analysis of United States Secret Service ink samples directly on a sampling mesh, and the results were compared with DSA-MS of the same inks on paper. Unlike analysis using separation mass spectrometry, which requires sample preparation, both DART-MS and DSA-MS successfully analyzed writing inks with minimal sample preparation. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Digital Mapping Techniques '07 - Workshop Proceedings

    USGS Publications Warehouse

    Soller, David R.

    2008-01-01

    The Digital Mapping Techniques '07 (DMT'07) workshop was attended by 85 technical experts from 49 agencies, universities, and private companies, including representatives from 27 state geological surveys. This year's meeting, the tenth in the annual series, was hosted by the South Carolina Geological Survey, from May 20-23, 2007, on the University of South Carolina campus in Columbia, South Carolina. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized: 1) methods for creating and publishing map products (here, 'publishing' includes Web-based release); 2) field data capture software and techniques, including the use of LIDAR; 3) digital cartographic techniques; 4) migration of digital maps into ArcGIS Geodatabase format; 5) analytical GIS techniques; and 6) continued development of the National Geologic Map Database.

  8. Measuring masses of large biomolecules and bioparticles using mass spectrometric techniques.

    PubMed

    Peng, Wen-Ping; Chou, Szu-Wei; Patil, Avinash A

    2014-07-21

    Large biomolecules and bioparticles play a vital role in biology, chemistry, biomedical science and physics. Mass is a critical parameter for the characterization of large biomolecules and bioparticles. To achieve mass analysis, choosing a suitable ion source is the first step and the instruments for detecting ions, mass analyzers and detectors should also be considered. Abundant mass spectrometric techniques have been proposed to determine the masses of large biomolecules and bioparticles and these techniques can be divided into two categories. The first category measures the mass (or size) of intact particles, including single particle quadrupole ion trap mass spectrometry, cell mass spectrometry, charge detection mass spectrometry and differential mobility mass analysis; the second category aims to measure the mass and tandem mass of biomolecular ions, including quadrupole ion trap mass spectrometry, time-of-flight mass spectrometry, quadrupole orthogonal time-of-flight mass spectrometry and orbitrap mass spectrometry. Moreover, algorithms for the mass and stoichiometry assignment of electrospray mass spectra are developed to obtain accurate structure information and subunit combinations.

  9. Increasing Public Library Productivity.

    ERIC Educational Resources Information Center

    Samuelson, Howard

    1981-01-01

    Suggests ways of improving productivity for public libraries faced with increased accountability, dwindling revenues, and continuing inflation. Techniques described include work simplification, work analysis, improved management, and employee motivation. (RAA)

  10. Moving beyond Univariate Post-Hoc Testing in Exercise Science: A Primer on Descriptive Discriminate Analysis

    ERIC Educational Resources Information Center

    Barton, Mitch; Yeatts, Paul E.; Henson, Robin K.; Martin, Scott B.

    2016-01-01

    There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent…
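
    As a concrete illustration of the alternative the article advocates, the short Python sketch below runs an omnibus MANOVA and then a descriptive discriminant analysis on a small, entirely hypothetical exercise-science dataset (group labels, variable names and values are invented); structure coefficients are computed as the correlations between each variable and the discriminant scores.

      import numpy as np
      import pandas as pd
      from statsmodels.multivariate.manova import MANOVA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Hypothetical data: three dependent measures for two groups of 30.
      rng = np.random.default_rng(0)
      df = pd.DataFrame({
          "group": np.repeat(["trained", "control"], 30),
          "vo2max": np.concatenate([rng.normal(52, 5, 30), rng.normal(44, 5, 30)]),
          "strength": np.concatenate([rng.normal(110, 12, 30), rng.normal(95, 12, 30)]),
          "power": np.concatenate([rng.normal(900, 80, 30), rng.normal(820, 80, 30)]),
      })

      # Omnibus multivariate test (the usual MANOVA starting point).
      print(MANOVA.from_formula("vo2max + strength + power ~ group", data=df).mv_test())

      # Descriptive discriminant analysis: one discriminant function for two groups,
      # with structure coefficients showing each variable's contribution.
      X = df[["vo2max", "strength", "power"]].to_numpy()
      scores = LinearDiscriminantAnalysis().fit(X, df["group"]).transform(X)
      structure = {v: np.corrcoef(X[:, j], scores[:, 0])[0, 1]
                   for j, v in enumerate(["vo2max", "strength", "power"])}
      print(structure)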

  11. Using Geospatial Techniques to Address Institutional Objectives: St. Petersburg College Geo-Demographic Analysis. IR Applications, Volume 27

    ERIC Educational Resources Information Center

    Morris, Phillip; Thrall, Grant

    2010-01-01

    Geographic analysis has been adopted by businesses, especially the retail sector, since the early 1990s (Thrall, 2002). Institutional research can receive the same benefits businesses have by adopting geographic analysis and technology. The commonalities between businesses and higher education institutions include the existence of trade areas, the…

  12. NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1

    NASA Technical Reports Server (NTRS)

    Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)

    1990-01-01

    The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects are presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.

  13. Applying new seismic analysis techniques to the lunar seismic dataset: New information about the Moon and planetary seismology on the eve of InSight

    NASA Astrophysics Data System (ADS)

    Dimech, J. L.; Weber, R. C.; Knapmeyer-Endrun, B.; Arnold, R.; Savage, M. K.

    2016-12-01

    The field of planetary science is poised for a major advance with the upcoming InSight mission to Mars due to launch in May 2018. Seismic analysis techniques adapted for use on planetary data are therefore highly relevant to the field. The heart of this project is in the application of new seismic analysis techniques to the lunar seismic dataset to learn more about the Moon's crust and mantle structure, with particular emphasis on 'deep' moonquakes which are situated half-way between the lunar surface and its core with no surface expression. Techniques proven to work on the Moon might also be beneficial for InSight and future planetary seismology missions which face similar technical challenges. The techniques include: (1) an event-detection and classification algorithm based on 'Hidden Markov Models' to reclassify known moonquakes and look for new ones. Apollo 17 gravimeter and geophone data will also be included in this effort. (2) Measurements of anisotropy in the lunar mantle and crust using 'shear-wave splitting'. Preliminary measurements on deep moonquakes using the MFAST program are encouraging, and continued evaluation may reveal new structural information on the Moon's mantle. (3) Probabilistic moonquake locations using NonLinLoc, a non-linear hypocenter location technique, using a modified version of the codes designed to work with the Moon's radius. Successful application may provide a new catalog of moonquake locations with rigorous uncertainty information, which would be a valuable input into: (4) new fault plane constraints from focal mechanisms using a novel approach to Bayes' theorem which factors in uncertainties in hypocenter coordinates and S-P amplitude ratios. Preliminary results, such as shear-wave splitting measurements, will be presented and discussed.
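
    To illustrate the flavour of the Hidden-Markov-Model event detection mentioned in item (1), here is a toy two-state example in Python using the third-party hmmlearn package; it is not the project's actual detector, and the window length and energy features are arbitrary assumptions.

      import numpy as np
      from hmmlearn.hmm import GaussianHMM

      def flag_event_windows(trace, fs, win_s=10.0):
          """Label each window of a seismic trace as quiet or active using a
          two-state Gaussian HMM over simple energy features."""
          n = int(win_s * fs)
          windows = trace[: len(trace) // n * n].reshape(-1, n)
          # Per-window features: log energy and a crude high-frequency proxy.
          energy = np.log(np.sum(windows ** 2, axis=1) + 1e-12)
          high = np.abs(np.diff(windows, axis=1)).sum(axis=1)
          X = np.column_stack([energy, np.log(high + 1e-12)])
          hmm = GaussianHMM(n_components=2, covariance_type="diag",
                            n_iter=200, random_state=0)
          states = hmm.fit(X).predict(X)
          # Call the state with the larger mean log-energy the 'event' state.
          event_state = np.argmax(hmm.means_[:, 0])
          return states == event_state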

  14. Single-row, double-row, and transosseous equivalent techniques for isolated supraspinatus tendon tears with minimal atrophy: A retrospective comparative outcome and radiographic analysis at minimum 2-year followup

    PubMed Central

    McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.

    2014-01-01

    Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at minimum 2-year followup was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair, using SR, DR, or TOE techniques, yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159

  15. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  16. Direct analysis of herbal powders by pipette-tip electrospray ionization mass spectrometry.

    PubMed

    Wang, Haixing; So, Pui-Kin; Yao, Zhong-Ping

    2014-01-27

    Conventional electrospray ionization mass spectrometry (ESI-MS) is widely used for analysis of solution samples. The development of solid-substrate ESI-MS allows direct ionization analysis of bulky solid samples. In this study, we developed pipette-tip ESI-MS, a technique that combines pipette tips with a syringe and syringe pump, for direct analysis of herbal powders, another common form of samples. We demonstrated that various herbal powder samples, including herbal medicines and food samples, could be readily extracted and analyzed online using this technique. Various powder samples, such as Rhizoma coptidis, lotus plumule, great burdock achene, black pepper, Panax ginseng, roasted coffee beans, Fructus Schisandrae Chinensis and Fructus Schisandrae Sphenantherae, were analyzed using pipette-tip ESI-MS, and good-quality mass spectra with stable and durable signals could be obtained. Both positive and negative ion modes were attempted and various compounds including amino acids, oligosaccharides, glycosides, alkaloids, organic acids, ginsenosides, flavonoids and lignans could be detected. Principal component analysis (PCA) based on the acquired mass spectra allowed rapid differentiation of closely related herbal species. Copyright © 2013 Elsevier B.V. All rights reserved.
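
    The species differentiation mentioned at the end rests on PCA of the acquired spectra; the following Python sketch shows the generic workflow on a hypothetical matrix of binned intensities (the data, bin count and normalization are invented for illustration and are not taken from the study).

      import numpy as np
      from sklearn.decomposition import PCA

      # Hypothetical stand-in for the study's data: rows are powder samples,
      # columns are equally spaced m/z bins of ion intensity.
      rng = np.random.default_rng(1)
      spectra = rng.random((12, 500))
      labels = (["Fructus Schisandrae Chinensis"] * 6
                + ["Fructus Schisandrae Sphenantherae"] * 6)

      # Total-ion-current normalization, then a two-component PCA score plot
      # (PCA centres the data internally).
      X = spectra / spectra.sum(axis=1, keepdims=True)
      scores = PCA(n_components=2).fit_transform(X)
      for name, (pc1, pc2) in zip(labels, scores):
          print(f"{name}: PC1={pc1:+.4f}, PC2={pc2:+.4f}")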

  17. Direct Surface and Droplet Microsampling for Electrospray Ionization Mass Spectrometry Analysis with an Integrated Dual-Probe Microfluidic Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Cong-Min; Zhu, Ying; Jin, Di-Qiong

    Ambient mass spectrometry (MS) has revolutionized the way MS analysis is performed and broadened its application in various fields. This paper describes the use of microfluidic techniques to simplify the setup and improve the functions of ambient MS by integrating the sampling probe, electrospray emitter probe, and online mixer on a single glass microchip. Two types of sampling probes, including a parallel-channel probe and a U-shaped channel probe, were designed for dry-spot and liquid-phase droplet samples, respectively. We demonstrated that the microfabrication techniques not only enhanced the capability of ambient MS methods in analysis of dry-spot samples on various surfaces, but also enabled new applications in the analysis of nanoliter-scale chemical reactions in an array of droplets. The versatility of the microchip-based ambient MS method was demonstrated in multiple different applications including evaluation of residual pesticide on fruit surfaces, sensitive analysis of low-ionizable analytes using postsampling derivatization, and high-throughput screening of Ugi-type multicomponent reactions.

  18. Characterization of electrically-active defects in ultraviolet light-emitting diodes with laser-based failure analysis techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.

    2016-01-14

    Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].

  19. Sinus tarsi approach (STA) versus extensile lateral approach (ELA) for treatment of closed displaced intra-articular calcaneal fractures (DIACF): A meta-analysis.

    PubMed

    Bai, L; Hou, Y-L; Lin, G-H; Zhang, X; Liu, G-Q; Yu, B

    2018-04-01

    The relative effectiveness of the sinus tarsi approach (STA) versus the extensile lateral approach (ELA) for treatment of closed displaced intra-articular calcaneal fractures (DIACF) is still being debated; our aim was to compare the two approaches. A thorough search was carried out in the MEDLINE, EMBASE and Cochrane Library databases from inception to December 2016. Only prospective or retrospective comparative studies were selected in this meta-analysis. Two independent reviewers conducted the literature search, data extraction and quality assessment. The primary outcomes were anatomical restoration and prevalence of complications. Secondary outcomes included operation time and functional recovery. Four randomized controlled trials involving 326 patients and three cohort studies involving 206 patients were included. The STA technique for DIACFs led to a decline in both operation time and incidence of complications. There were no significant differences between the groups in American Orthopedic Foot and Ankle Society scores, nor in changes in Böhler angle. This meta-analysis suggests that the STA technique may reduce the operation time and incidence of complications. In conclusion, the STA technique is arguably an optimal choice for DIACF. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  20. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: the infrared detection technique (IR), video-oculography (VOG), the scleral eye coil and EOG. Among those available recording techniques, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods which can capture the characteristics of the eye movement were proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 tested subjects were watching a short (>120 s) animation clip. In response to the animated clip the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
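
    As a minimal example of the kind of quantitative feature such a study might extract from a calibrated EOG channel, here is a simple velocity-threshold saccade detector in Python; the threshold and the assumption of a signal already calibrated to degrees are illustrative, and this is not the feature set used by the authors.

      import numpy as np

      def detect_saccades(eog, fs, vel_thresh=30.0):
          """Velocity-threshold saccade detector for a calibrated EOG channel
          (degrees). Returns approximate (onset, offset) sample index pairs."""
          velocity = np.gradient(eog) * fs              # deg/s
          fast = np.abs(velocity) > vel_thresh
          # Collapse consecutive supra-threshold samples into onset/offset pairs.
          edges = np.flatnonzero(np.diff(fast.astype(int)))
          if fast[0]:
              edges = np.r_[0, edges]
          if fast[-1]:
              edges = np.r_[edges, len(fast) - 1]
          return list(zip(edges[0::2], edges[1::2]))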

  1. Review of Telemicrobiology

    PubMed Central

    Rhoads, Daniel D.; Mathison, Blaine A.; Bishop, Henry S.; da Silva, Alexandre J.; Pantanowitz, Liron

    2016-01-01

    Context Microbiology laboratories are continually pursuing means to improve quality, rapidity, and efficiency of specimen analysis in the face of limited resources. One means by which to achieve these improvements is through the remote analysis of digital images. Telemicrobiology enables the remote interpretation of images of microbiology specimens. To date, the practice of clinical telemicrobiology has not been thoroughly reviewed. Objective Identify the various methods that can be employed for telemicrobiology, including emerging technologies that may provide value to the clinical laboratory. Data Sources Peer-reviewed literature, conference proceedings, meeting presentations, and expert opinions pertaining to telemicrobiology have been evaluated. Results A number of modalities have been employed for telemicroscopy, including static capture techniques, whole slide imaging, video telemicroscopy, mobile devices, and hybrid systems. Telemicrobiology has been successfully implemented for applications including routine primary diagnosis, expert teleconsultation, and proficiency testing. Emerging areas include digital culture plate reading, mobile health applications, and computer-augmented analysis of digital images. Conclusions Static image capture techniques to date have been the most widely used modality for telemicrobiology, despite the fact that other newer technologies are available and may produce better quality interpretations. Increased adoption of telemicrobiology offers added value, quality, and efficiency to the clinical microbiology laboratory. PMID:26317376

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year 1990. The ACL has four technical groups -- Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis. The Chemical Analysis Group uses wet-chemical and instrumental methods for elemental, compositional, and isotopic analyses of solid, liquid, and gaseous samples and provides specialized analytical services. The Instrumental Analysis Group uses nuclear counting techniques in radiochemical analyses over a wide range of sample types from low-level environmental samples to samples of high radioactivity. The Organic Analysis Group uses a number of complementary techniques to separate and to quantitatively and qualitatively analyze complex organic mixtures and compounds at the trace level, including synthetic fuels, toxic substances, fossil-fuel residues and emissions, pollutants, biologically active compounds, pesticides, and drugs. The Environmental Analysis Group performs analyses of inorganic environmental and hazardous waste and coal samples.

  3. A review of DTCA techniques: Appraising their success and potential impact on medication users.

    PubMed

    Babar, Zaheer-Ud-Din; Siraj, Ashna Medina; Curley, Louise

    2018-03-01

    Direct-to-consumer advertising (DTCA) has been present in some countries for nearly two decades. Its success and ramifications have been examined but have not recently been cataloged in a comprehensive manner. The objective was to review the existing literature on DTCA techniques, providing an analysis of the current methods used by drug marketers to enhance the effect of pharmaceutical product promotion and its success, and to examine ramifications for the drug use process. A search of 7 electronic databases including MEDLINE and SCOPUS was conducted in December 2015 and updated until February 2016. A review of the literature (2008-2015) was performed to identify and collate information from relevant, peer-reviewed original study articles investigating DTCA techniques commonly employed in pharmaceutical promotion. A thematic analysis was undertaken to categorize techniques of drug promotion and to assess their salience and impact. Nineteen original study articles were included in this review. All articles were based in the U.S. and New Zealand, where DTCA is legal. After reviewing all the articles, 4 themes with 11 subcategories were generated. These themes included disease mongering and medicalization, drug references, advertisement strategies, and eDTCA. The themes describe different categories of techniques used to augment DTC advertisements to increase their impact and overall success in promoting a pharmaceutical product. Many DTCA techniques utilized by pharmaceutical marketers are beneficial to the success of DTC promotion of a drug. These techniques include the use of drug efficacy information, comparative claims, non-branded help-seeking advertisements, formatted risk information, celebrity or expert endorsers, and website trust factors. Through their use, public perception of the drug is made more favorable, increased attention is drawn to the advertisement, and the pharmaceutical product gains greater credibility and subsequent success in sales. However, some techniques, although beneficial to pharmaceutical promotion, need to be monitored by policymakers and regulatory advisors, as they have the potential to negatively impact consumer health knowledge. Overall, this review makes evident that there are a number of techniques employed by pharmaceutical marketers to augment the success of pharmaceutical promotion. While these techniques may be beneficial to pharmaceutical companies and might increase awareness among consumers, it is important to be critical of them, as they have the potential to be exploited by pharmaceutical marketers. This review indicated that although some techniques are successful and appear to be satisfactory in providing information to consumers, other techniques need to be appraised more closely. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. On finite element implementation and computational techniques for constitutive modeling of high temperature composites

    NASA Technical Reports Server (NTRS)

    Saleeb, A. F.; Chang, T. Y. P.; Wilt, T.; Iskovitz, I.

    1989-01-01

    The research work performed during the past year on finite element implementation and computational techniques pertaining to high temperature composites is outlined. In the present research, two main issues are addressed: efficient geometric modeling of composite structures and expedient numerical integration techniques dealing with constitutive rate equations. In the first issue, mixed finite elements for modeling laminated plates and shells were examined in terms of numerical accuracy, locking property and computational efficiency. Element applications include (currently available) linearly elastic analysis and future extension to material nonlinearity for damage predictions and large deformations. On the material level, various integration methods to integrate nonlinear constitutive rate equations for finite element implementation were studied. These include explicit, implicit and automatic subincrementing schemes. In all cases, examples are included to illustrate the numerical characteristics of various methods that were considered.

  5. Image Analysis via Fuzzy-Reasoning Approach: Prototype Applications at NASA

    NASA Technical Reports Server (NTRS)

    Dominguez, Jesus A.; Klinko, Steven J.

    2004-01-01

    A set of imaging techniques based on a Fuzzy Reasoning (FR) approach was built for NASA at Kennedy Space Center (KSC) to perform complex real-time visual-related safety prototype tasks, such as detection and tracking of moving Foreign Object Debris (FOD) during the NASA Space Shuttle liftoff and visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad. The system has also proved its potential in enhancing X-ray images used to screen hard-covered items, leading to better visualization. The system capability was used as well during the imaging analysis of the Space Shuttle Columbia accident. These FR-based imaging techniques include novel proprietary adaptive image segmentation, image edge extraction, and image enhancement. A Probabilistic Neural Network (PNN) scheme available from the NeuroShell(TM) Classifier and optimized via a Genetic Algorithm (GA) was also used along with this set of novel imaging techniques to add powerful learning and image classification capabilities. Prototype applications built using these techniques have received NASA Space Awards, including a Board Action Award, and are currently being filed for patents by NASA; they are being offered for commercialization through the Research Triangle Institute (RTI), an internationally recognized corporation in scientific research and technology development. Companies from different fields, including security, medical, text digitization, and aerospace, are currently in the process of licensing these technologies from NASA.

  6. A study for hypergolic vapor sensor development

    NASA Technical Reports Server (NTRS)

    Stetter, J. R.

    1977-01-01

    The use of an electrochemical technique for MMH and NO2 measurement was investigated. Specific MMH and NO2 electrochemical sensors were developed. Experimental techniques for preparation, handling, and analysis of hydrazine vapor mixtures at ppb and ppm levels were developed. Two approaches to NO2 instrument design were evaluated, including specific adsorption and specific electrochemical reduction. Two approaches to hydrazine monitoring were evaluated, including catalytic conversion to NO with subsequent NO detection, and direct specific electrochemical oxidation. Two engineering prototype MMH/NO2 monitors were designed and constructed.

  7. A technique for the assessment of the visual impact of nearshore confined dredged materials and other built islands

    Treesearch

    Roy Mann

    1979-01-01

    Drilling rigs, confined dredged material disposal sites, power and sewage treatment facilities, and other built objects on or near shorelines have often created appreciable impacts on the aesthetic perceptions of residents and recreational users. Techniques for assessing such impacts that are reviewed in this paper include viewscape analysis for large-scale shore...

  8. Proceedings of Damping 1993, volume 3

    NASA Astrophysics Data System (ADS)

    Portis, Bonnie L.

    1993-06-01

    Presented are individual papers of Damping '93, held 24-26 February 1993 in San Francisco. The subjects included: passive damping concepts; passive damping analysis and design techniques; optimization; damped control/structure interaction; viscoelastic material testing and characterization; highly damped materials; vibration suppression techniques; damping identification and dynamic testing; applications to aircraft, space structures, marine structures, and commercial products; defense applications; and payoffs of vibration suppression.

  9. Proceedings of Damping 1993, volume 1

    NASA Astrophysics Data System (ADS)

    Portis, Bonnie L.

    1993-06-01

    Presented are individual papers of Damping '93 held 24-26 February, 1993, in San Francisco. The subjects included: passive damping concepts; passive damping analysis and design techniques; optimization; damped control/structure interaction; viscoelastic material testing and characterization; highly damped materials; vibration suppression techniques; damping identification and dynamic testing; application to aircraft; space structures; marine structures; commercial products; defense applications; and payoffs of vibration suppression.

  10. Recent trends in particle size analysis techniques

    NASA Technical Reports Server (NTRS)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed according to three operating principles, including particle size and shape descriptions. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data processing systems have mainly been adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  11. Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.

    2004-05-01

    Tissue engineering attempts to address the ever widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phase is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
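    To make the evaluation procedure concrete, the sketch below applies an Otsu-style clustering threshold (one of the gray-level clustering methods alluded to above) to a synthetic two-phase image and scores it against a known ground truth with the misclassification-error criterion. The image, noise level, and phase intensities are hypothetical placeholders; this is not the authors' pipeline or data.

      import numpy as np

      def otsu_threshold(img, nbins=256):
          """Global threshold maximizing between-class variance (Otsu's criterion)."""
          hist, edges = np.histogram(img.ravel(), bins=nbins)
          hist = hist.astype(float) / hist.sum()
          centers = 0.5 * (edges[:-1] + edges[1:])
          w0 = np.cumsum(hist)                                  # weight of the lower class
          w1 = 1.0 - w0                                         # weight of the upper class
          mu0 = np.cumsum(hist * centers) / np.maximum(w0, 1e-12)
          mu_total = (hist * centers).sum()
          mu1 = (mu_total - np.cumsum(hist * centers)) / np.maximum(w1, 1e-12)
          between_var = w0 * w1 * (mu0 - mu1) ** 2
          return centers[np.argmax(between_var)]

      def misclassification_error(binary, ground_truth):
          """Fraction of pixels assigned to the wrong phase (lower is better)."""
          return np.mean(binary != ground_truth)

      # Synthetic "scaffold" image: porous phase ~0.2, polymeric phase ~0.7, plus noise.
      rng = np.random.default_rng(0)
      truth = rng.random((128, 128)) > 0.5                      # hypothetical ground-truth polymer mask
      img = np.where(truth, 0.7, 0.2) + 0.1 * rng.standard_normal((128, 128))

      t = otsu_threshold(img)
      seg = img > t
      print(f"threshold = {t:.3f}, misclassification error = {misclassification_error(seg, truth):.4f}")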

  12. External rhinoplasty: a critical analysis of 500 cases.

    PubMed

    Foda, Hossam M T

    2003-06-01

    The study presents a comprehensive statistical analysis of a series of 500 consecutive rhinoplasties, of which 380 (76 per cent) were primary and 120 (24 per cent) were secondary cases. All cases were operated upon using the external rhinoplasty technique; simultaneous septal surgery was performed in 350 (70 per cent) of the cases. Deformities of the upper two-thirds of the nose that occurred significantly more often in the secondary cases included dorsal saddling, dorsal irregularities, valve collapse, open roof, and pollybeak deformities. In the lower third of the nose, secondary cases showed significantly higher incidences of depressed tip, tip over-rotation, tip asymmetry, retracted columella, and alar notching. Suturing techniques were used significantly more in primary cases, while grafting techniques were used significantly more in secondary cases. The complications encountered intra-operatively included septal flap tears (2.8 per cent) and alar cartilage injury (1.8 per cent), while post-operative complications included nasal trauma (one per cent), epistaxis (two per cent), infection (2.4 per cent), prolonged oedema (17 per cent), and nasal obstruction (0.8 per cent). The overall patient satisfaction rate was 95.6 per cent, and the transcolumellar scar was found to be unacceptable in only 0.8 per cent of the patients.

  13. INVESTIGATION OF THE USE OF STATISTICS IN COUNSELING STUDENTS.

    ERIC Educational Resources Information Center

    HEWES, ROBERT F.

    THE OBJECTIVE WAS TO EMPLOY TECHNIQUES OF PROFILE ANALYSIS TO DEVELOP THE JOINT PROBABILITY OF SELECTING A SUITABLE SUBJECT MAJOR AND OF ASSURING TO A HIGH DEGREE GRADUATION FROM COLLEGE WITH THAT MAJOR. THE SAMPLE INCLUDED 1,197 MIT FRESHMEN STUDENTS IN 1952-53, AND THE VALIDATION GROUP INCLUDED 699 ENTRANTS IN 1954. DATA INCLUDED SECONDARY…

  14. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  15. MSC/NASTRAN DMAP Alter Used for Closed-Form Static Analysis With Inertia Relief and Displacement-Dependent Loads

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Solving for the displacements of free-free coupled systems acted upon by static loads is a common task in the aerospace industry. Often, these problems are solved by static analysis with inertia relief. This technique allows for a free-free static analysis by balancing the applied loads with the inertia loads generated by the applied loads. For some engineering applications, the displacements of the free-free coupled system induce additional static loads. Hence, the applied loads are equal to the original loads plus the displacement-dependent loads. A launch vehicle being acted upon by an aerodynamic loading can have such applied loads. The final displacements of such systems are commonly determined with iterative solution techniques. Unfortunately, these techniques can be time consuming and labor intensive. Because the coupled system equations for free-free systems with displacement-dependent loads can be written in closed form, it is advantageous to solve for the displacements in this manner. Implementing closed-form equations in static analysis with inertia relief is analogous to implementing transfer functions in dynamic analysis. An MSC/NASTRAN (MacNeal-Schwendler Corporation/NASA Structural Analysis) DMAP (Direct Matrix Abstraction Program) Alter was used to include displacement-dependent loads in static analysis with inertia relief. It efficiently solved a common aerospace problem that typically has been solved with an iterative technique.
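    The core idea, that loads proportional to the unknown displacements can be moved to the left-hand side and solved in closed form rather than iterated, can be illustrated with a few lines of Python. The small stiffness matrix K, base load F0, and load-coupling matrix C below are hypothetical placeholders, and the inertia-relief bookkeeping and the actual DMAP Alter are not reproduced.

      import numpy as np

      # Hypothetical small stiffness matrix K, base load F0, and a matrix C such that
      # the displacement-dependent load is C @ u (e.g., a linearized aerodynamic load).
      K = np.array([[ 4.0, -1.0,  0.0],
                    [-1.0,  4.0, -1.0],
                    [ 0.0, -1.0,  4.0]])
      C = 0.3 * np.eye(3)
      F0 = np.array([1.0, 0.0, 2.0])

      # Closed-form solution:  K u = F0 + C u   =>   (K - C) u = F0
      u_closed = np.linalg.solve(K - C, F0)

      # Equivalent fixed-point iteration (the traditional approach the closed form replaces):
      u = np.zeros(3)
      for _ in range(50):
          u = np.linalg.solve(K, F0 + C @ u)

      print("closed form :", u_closed)
      print("iterative   :", u)   # converges to the same displacements

    For a set of load cases, one factorization of (K - C) replaces the entire iteration loop, which is the efficiency argument made above.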

  16. Separation techniques: Chromatography

    PubMed Central

    Coskun, Ozlem

    2016-01-01

    Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406

  17. Using dynamic mode decomposition for real-time background/foreground separation in video

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kutz, Jose Nathan; Grosek, Jacob; Brunton, Steven

    The technique of dynamic mode decomposition (DMD) is disclosed herein for the purpose of robustly separating video frames into background (low-rank) and foreground (sparse) components in real-time. Foreground/background separation is achieved at the computational cost of just one singular value decomposition (SVD) and one linear equation solve, thus producing results orders of magnitude faster than robust principal component analysis (RPCA). Additional techniques, including techniques for analyzing the video for multi-resolution time-scale components, and techniques for reusing computations to allow processing of streaming video in real time, are also described herein.
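    A minimal NumPy sketch of the idea, one SVD, one small eigenproblem, and one least-squares solve, with the near-zero-frequency DMD modes taken as the low-rank background, is shown below. The frame matrix, truncation rank, and frequency cutoff are placeholders chosen for illustration; this is not the disclosed real-time implementation.

      import numpy as np

      def dmd_background(frames, rank=10, dt=1.0):
          """frames: (pixels, time) matrix. Returns the low-rank background estimate."""
          X1, X2 = frames[:, :-1], frames[:, 1:]
          U, s, Vh = np.linalg.svd(X1, full_matrices=False)        # the single SVD
          U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
          Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
          lam, W = np.linalg.eig(Atilde)
          Phi = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W             # DMD modes
          omega = np.log(lam.astype(complex)) / dt                  # continuous-time frequencies
          b = np.linalg.lstsq(Phi, frames[:, 0], rcond=None)[0]     # one linear (least-squares) solve
          bg = np.abs(omega) < 1e-2                                 # near-stationary modes = background
          t = np.arange(frames.shape[1]) * dt
          background = (Phi[:, bg] * b[bg]) @ np.exp(np.outer(omega[bg], t))
          return background.real

      # Demo on random data standing in for a (pixels x frames) video matrix.
      rng = np.random.default_rng(0)
      video = rng.random((64 * 48, 120))
      low_rank = dmd_background(video, rank=10)
      sparse = video - low_rank                                     # foreground estimate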

  18. Overview of Sparse Graph for Multiple Access in Future Mobile Networks

    NASA Astrophysics Data System (ADS)

    Lei, Jing; Li, Baoguo; Li, Erbao; Gong, Zhenghui

    2017-10-01

    Multiple access via sparse graphs, such as low density signature (LDS) and sparse code multiple access (SCMA), is a promising technique for future wireless communications. This survey presents an overview of the developments in this burgeoning field, including transmitter structures, extrinsic information transfer (EXIT) chart analysis, and comparisons with existing multiple access techniques. The technique enables multiple access under overloaded conditions while achieving satisfactory performance. A message passing algorithm is utilized for multi-user detection at the receiver, and structures of the sparse graph are illustrated in detail. Outlooks and challenges of this technique are also presented.

  19. Development progress of the Materials Analysis and Particle Probe

    NASA Astrophysics Data System (ADS)

    Lucia, M.; Kaita, R.; Majeski, R.; Bedoya, F.; Allain, J. P.; Boyle, D. P.; Schmitt, J. C.; Onge, D. A. St.

    2014-11-01

    The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.

  20. Development progress of the Materials Analysis and Particle Probe.

    PubMed

    Lucia, M; Kaita, R; Majeski, R; Bedoya, F; Allain, J P; Boyle, D P; Schmitt, J C; Onge, D A St

    2014-11-01

    The Materials Analysis and Particle Probe (MAPP) is a compact in vacuo surface science diagnostic, designed to provide in situ surface characterization of plasma facing components in a tokamak environment. MAPP has been implemented for operation on the Lithium Tokamak Experiment at Princeton Plasma Physics Laboratory (PPPL), where all control and analysis systems are currently under development for full remote operation. Control systems include vacuum management, instrument power, and translational/rotational probe drive. Analysis systems include onboard Langmuir probes and all components required for x-ray photoelectron spectroscopy, low-energy ion scattering spectroscopy, direct recoil spectroscopy, and thermal desorption spectroscopy surface analysis techniques.

  1. Impact of platform switching on marginal peri-implant bone-level changes. A systematic review and meta-analysis

    PubMed Central

    Strietzel, Frank Peter; Neumann, Konrad; Hertel, Moritz

    2015-01-01

    Objective To address the focused question: is there an impact of platform switching (PS) on marginal bone level (MBL) changes around endosseous implants compared to implants with platform-matching (PM) implant-abutment configurations? Material and methods A systematic literature search was conducted using the electronic databases PubMed, Web of Science, Journals@Ovid Full Text and Embase, supplemented by a manual search, for human randomized clinical trials (RCTs) and prospective clinical controlled cohort studies (PCCS) reporting on MBL changes at implants with PS- compared with PM-implant-abutment connections, published between 2005 and June 2013. Results Twenty-two publications were eligible for the systematic review. The qualitative analysis of 15 RCTs and seven PCCS revealed more studies (13 RCTs and three PCCS) showing significantly lower mean marginal bone loss around implants with PS- compared to PM-implant-abutment connections, indicating a clear tendency favoring the PS technique. A meta-analysis including 13 RCTs revealed a significantly smaller mean MBL change (0.49 mm [CI95% 0.38; 0.60]) at PS implants, compared with PM implants (1.01 mm [CI95% 0.62; 1.40]) (P < 0.0001). Conclusions The meta-analysis revealed a significantly smaller mean MBL change at implants with a PS compared to a PM implant-abutment configuration. The studies included herein mostly showed an unclear or high risk of bias, and relatively short follow-up periods. The qualitative analysis revealed a tendency favoring the PS technique to prevent or minimize peri-implant marginal bone loss compared with the PM technique. Due to heterogeneity of the included studies, their results require cautious interpretation. PMID:24438506
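    For readers unfamiliar with how a pooled mean difference and its confidence interval are obtained, the sketch below shows generic fixed-effect inverse-variance pooling in Python. The per-study values are fabricated placeholders, not data from the trials in this review; a random-effects model (e.g., DerSimonian-Laird) would additionally add a between-study variance term to the weights.

      import numpy as np

      # Hypothetical per-study mean differences in MBL change (mm) and their standard errors.
      md = np.array([0.45, 0.60, 0.38, 0.52])
      se = np.array([0.10, 0.15, 0.12, 0.09])

      w = 1.0 / se**2                       # inverse-variance weights
      pooled = np.sum(w * md) / np.sum(w)   # fixed-effect pooled mean difference
      se_pooled = np.sqrt(1.0 / np.sum(w))
      ci_low, ci_high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

      print(f"pooled MD = {pooled:.2f} mm, 95% CI [{ci_low:.2f}; {ci_high:.2f}]")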

  2. New approaches to wipe sampling methods for antineoplastic and other hazardous drugs in healthcare settings.

    PubMed

    Connor, Thomas H; Smith, Jerome P

    2016-09-01

    At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings, to discuss recent advances in this area, and to provide some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastics. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine and less costly, and may provide a shorter response time than the classical analytical techniques now in use.

  3. Contemporary management of frontal sinus mucoceles: a meta-analysis.

    PubMed

    Courson, Andy M; Stankiewicz, James A; Lal, Devyani

    2014-02-01

    To analyze trends in the surgical management of frontal and fronto-ethmoid mucoceles through meta-analysis. Meta-analysis and case series. A systematic literature review on surgical management of frontal and fronto-ethmoid mucoceles was conducted. Studies were divided into historical (1975-2001) and contemporary (2002-2012) groups. A meta-analysis of these studies was performed. The historical and contemporary cohorts were compared (surgical approach, recurrence, and complications). To study the evolution in surgical management, a senior surgeon's experience over 28 years was analyzed separately. Thirty-one studies were included for meta-analysis. The historical cohort included 425 mucoceles from 11 studies. The contemporary cohort included 542 mucoceles from 20 studies. More endoscopic techniques were used in the contemporary versus the historical cohort (53.9% vs. 24.7%; P < 0.001). In the authors' series, a higher percentage was treated endoscopically (82.8% of 122 mucoceles). Recurrence (P = 0.20) and major complication (P = 0.23) rates were similar between cohorts. Minor complication rates were superior for endoscopic techniques in both cohorts (P = 0.02 historical; P < 0.001 contemporary). In the historical cohort, higher recurrence was noted in the external group (P = 0.03). Results from endoscopic and open approaches are comparable. Although endoscopic techniques are being increasingly adopted, comparison with our series shows that more cases could potentially be treated endoscopically. Frequent use of open approaches may reflect efficacy, or perhaps a lack of the expertise and equipment required for endoscopic management. Most contemporary authors favor endoscopic management, limiting open approaches to specific indications (unfavorable anatomy, lateral disease, and scarring). N/A. Copyright © 2013 The American Laryngological, Rhinological and Otological Society, Inc.

  4. An investigation of combustion and entropy noise

    NASA Technical Reports Server (NTRS)

    Strahle, W. C.

    1977-01-01

    The relative importance of entropy and direct combustion noise in turbopropulsion systems and the parameters upon which these noise sources depend were studied. Theory and experiment were employed to determine that at least with the apparatus used here, entropy noise can dominate combustion noise if there is a sufficient pressure gradient terminating the combustor. Measurements included combustor interior fluctuating pressure, near and far field fluctuating pressure, and combustor exit plane fluctuating temperatures, as well as mean pressures and temperatures. Analysis techniques included spectral, cross-correlation, cross power spectra, and ordinary and partial coherence analysis. Also conducted were combustor liner modification experiments to investigate the origin of the frequency content of combustion noise. Techniques were developed to extract nonpropagational pseudo-sound and the heat release fluctuation spectra from the data.
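    The ordinary coherence analysis mentioned above can be illustrated with SciPy's spectral estimators. The signals below are synthetic stand-ins for the combustor interior and far-field pressures, with an assumed sampling rate; they are not the experimental data.

      import numpy as np
      from scipy.signal import coherence

      fs = 2000.0                               # assumed sampling rate, Hz
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(1)
      source = rng.standard_normal(t.size)      # broadband stand-in for the combustion source
      interior = source + 0.2 * rng.standard_normal(t.size)
      farfield = np.roll(source, 40) + 0.8 * rng.standard_normal(t.size)  # delayed, noisier copy

      f, Cxy = coherence(interior, farfield, fs=fs, nperseg=1024)
      # Cxy near 1 at a given frequency indicates the far-field pressure there is linearly
      # related to the interior pressure; partial coherence additionally removes the
      # influence of a third measured signal (e.g., exit-plane temperature fluctuations).
      print(f[np.argmax(Cxy)], Cxy.max())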

  5. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  6. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    ERIC Educational Resources Information Center

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  7. What School Financial Reports Reveal and Hide.

    ERIC Educational Resources Information Center

    Walters, Donald L.

    The problem of full disclosure of the financial operation and position of a school system is discussed in this paper. Techniques described for analyzing revenue and expenditure patterns include percentage changes and index numbers for horizontal analysis and proportions for vertical analysis. Also discussed are how financial reports are affected…

  8. Treating technology as a luxury? 10 necessary tools.

    PubMed

    Berger, Steven H

    2007-02-01

    Technology and techniques that every hospital should acquire and use for effective financial management include: daily dashboards; balanced scorecards; benchmarking; flexible budgeting and monitoring; labor management systems; nonlabor management analysis; service-line, physician, and patient-level reporting and analysis; cost accounting technology; contract management technology; and denials management software.

  9. Analysis and Identification of Acid-Base Indicator Dyes by Thin-Layer Chromatography

    ERIC Educational Resources Information Center

    Clark, Daniel D.

    2007-01-01

    Thin-layer chromatography (TLC) is a very simple and effective technique that is used by chemists for different purposes, including monitoring the progress of a reaction. TLC can also be easily used for the analysis and identification of various acid-base indicator dyes.

  10. Optical analysis of crystal growth

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Passeur, Andrea; Harper, Sabrina

    1994-01-01

    Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.

  11. Highway Vehicle Retrofit Evaluation : Phase I. Analysis and Preliminary Evaluation Results. Volume 2. Sections 4 through 13 and Appendix.

    DOT National Transportation Integrated Search

    1975-11-01

    More than 20 representative classes of retrofit devices/concepts/techniques, including more than 130 specific items, were examined in the course of the study. A major portion of the analysis effort was directed to the evaluation of 16 advanced, novel...

  12. The NOAA Local Climate Analysis Tool - An Application in Support of a Weather Ready Nation

    NASA Astrophysics Data System (ADS)

    Timofeyeva, M. M.; Horsfall, F. M.

    2012-12-01

    Citizens across the U.S., including decision makers from the local to the national level, have a multitude of questions about climate, such as the current state and how that state fits into the historical context, and more importantly, how climate will impact them, especially with regard to linkages to extreme weather events. Developing answers to these types of questions for locations has typically required extensive work to gather data, conduct analyses, and generate relevant explanations and graphics. Too frequently providers don't have ready access to or knowledge of reliable, trusted data sets, nor sound, scientifically accepted analysis techniques such that they can provide a rapid response to queries they receive. In order to support National Weather Service (NWS) local office forecasters with information they need to deliver timely responses to climate-related questions from their customers, we have developed the Local Climate Analysis Tool (LCAT). LCAT uses the principles of artificial intelligence to respond to queries, in particular, through use of machine technology that responds intelligently to input from users. A user translates customer questions into primary variables and issues and LCAT pulls the most relevant data and analysis techniques to provide information back to the user, who in turn responds to their customer. Most responses take on the order of 10 seconds, which includes providing statistics, graphical displays of information, translations for users, metadata, and a summary of the user request to LCAT. Applications in Phase I of LCAT, which is targeted for the NWS field offices, include Climate Change Impacts, Climate Variability Impacts, Drought Analysis and Impacts, Water Resources Applications, Attribution of Extreme Events, and analysis techniques such as time series analysis, trend analysis, compositing, and correlation and regression techniques. Data accessed by LCAT are homogenized historical COOP and Climate Prediction Center climate division data available at NCDC. Applications for other NOAA offices and Federal agencies are currently being investigated, such as incorporation of tidal data, fish stocks, sea surface temperature, health-related data, and analyses relevant to those datasets. We will describe LCAT, its basic functionality, examples of analyses, and progress being made to provide the tool to a broader audience in support of ocean, fisheries, and health applications.
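    As a concrete illustration of one of the listed techniques, the sketch below fits a least-squares trend to an annual temperature series and forms a simple composite. The series is a synthetic placeholder; LCAT's internal implementation is not described in this abstract and is not reproduced here.

      import numpy as np

      # Hypothetical annual mean temperatures (deg F) for a climate division, 1981-2010.
      years = np.arange(1981, 2011)
      rng = np.random.default_rng(42)
      temps = 55.0 + 0.03 * (years - years[0]) + rng.normal(0.0, 0.6, years.size)

      # Linear trend (deg F per year) via least squares; report as deg F per decade.
      slope, intercept = np.polyfit(years, temps, 1)
      print(f"trend = {slope * 10:.2f} deg F per decade")

      # A simple composite: mean anomaly in years above the median temperature.
      anomaly = temps - temps.mean()
      print("warm-year composite:", anomaly[temps > np.median(temps)].mean())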

  13. Application of remote sensing to land and water resource planning: The Pocomoke River Basin, Maryland

    NASA Technical Reports Server (NTRS)

    Wildesen, S. E.; Phillips, E. P.

    1981-01-01

    Because of the size of the Pocomoke River Basin, the inaccessibility of certain areas, and study time constraints, several remote sensing techniques were used to collect base information on the river corridor, (a 23.2 km channel) and on a 1.2 km wooded floodplain. This information provided an adequate understanding of the environment and its resources, thus enabling effective management options to be designed. The remote sensing techniques used for assessment included manual analysis of high altitude color-infrared photography, computer-assisted analysis of LANDSAT-2 imagery, and the application of airborne oceanographic Lidar for topographic mapping. Results show that each technique was valuable in providing the base data necessary for resource planning.

  14. Verification of Orthogrid Finite Element Modeling Techniques

    NASA Technical Reports Server (NTRS)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process but still adequately capture the actual hardware behavior. The accuracy of such 'short cuts' is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam, a shell, and a mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and mixed beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  15. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  16. Radar polarimetry - Analysis tools and applications

    NASA Technical Reports Server (NTRS)

    Evans, Diane L.; Farr, Tom G.; Van Zyl, Jakob J.; Zebker, Howard A.

    1988-01-01

    The authors have developed several techniques to analyze polarimetric radar data from the NASA/JPL airborne SAR for earth science applications. The techniques determine the heterogeneity of scatterers within subregions, optimize the return power from these areas, and identify probable scattering mechanisms for each pixel in a radar image. These techniques are applied to the discrimination and characterization of geologic surfaces and vegetation cover, and it is found that their utility varies depending on the terrain type. It is concluded that there are several classes of problems amenable to single-frequency polarimetric data analysis, including characterization of surface roughness and vegetation structure, and estimation of vegetation density. Polarimetric radar remote sensing can thus be a useful tool for monitoring a set of earth science parameters.

  17. Acoustic emission and nondestructive evaluation of biomaterials and tissues.

    PubMed

    Kohn, D H

    1995-01-01

    Acoustic emission (AE) is an acoustic wave generated by the release of energy from localized sources in a material subjected to an externally applied stimulus. This technique may be used nondestructively to analyze tissues, materials, and biomaterial/tissue interfaces. Applications of AE include use as an early warning tool for detecting tissue and material defects and incipient failure, monitoring damage progression, predicting failure, characterizing failure mechanisms, and serving as a tool to aid in understanding material properties and structure-function relations. All these applications may be performed in real time. This review discusses general principles of AE monitoring and the use of the technique in 3 areas of importance to biomedical engineering: (1) analysis of biomaterials, (2) analysis of tissues, and (3) analysis of tissue/biomaterial interfaces. Focus in these areas is on detection sensitivity, methods of signal analysis in both the time and frequency domains, the relationship between acoustic signals and microstructural phenomena, and the uses of the technique in establishing a relationship between signals and failure mechanisms.

  18. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross sections evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring the activity of the sample. The fundamental activation equation connects several physical parameters, including the cross section, and is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. This approach, called the absolute method, allows a measurement as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those from the relative method, which requires the use of a standard sample for each element to be quantified.
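    For reference, a commonly used textbook form of the activation equation behind the absolute method is sketched below; the exact notation and correction factors used in this particular work may differ.

      A = \frac{m\, N_A\, \theta}{M}\,\varphi\,\sigma\,\left(1 - e^{-\lambda t_{\mathrm{irr}}}\right)e^{-\lambda t_d},
      \qquad
      m = \frac{M\, C\, \lambda}{N_A\,\theta\,\varphi\,\sigma\,\varepsilon\, I_\gamma\,\left(1 - e^{-\lambda t_{\mathrm{irr}}}\right)e^{-\lambda t_d}\left(1 - e^{-\lambda t_c}\right)}

    where A is the induced activity, m the mass of the element of interest, N_A Avogadro's number, θ the isotopic abundance, M the molar mass, φ the neutron flux, σ the activation cross section, λ the decay constant, t_irr, t_d and t_c the irradiation, decay and counting times, C the net peak counts, ε the detection efficiency, and I_γ the gamma emission probability.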

  19. Design study for a high reliability five-year spacecraft tape transport

    NASA Technical Reports Server (NTRS)

    Benn, G. S. L.; Eshleman, R. L.

    1971-01-01

    Following the establishment of the overall transport concept, all of the life-limiting constraints associated with the transport were analyzed using modeling techniques. These design techniques included: (1) a response analysis from which the performance of the transport could be determined under operating conditions for a variety of conceptual variations both in a new and aged condition; (2) an analysis of a double cone guidance technique which yielded an optimum design for maximum guidance with minimum tape degradation; (3) an analysis of the tape pack design to eliminate spoking caused by negative tangential stress within the pack; (4) an evaluation of the stress levels experienced by the magnetic tape throughout the system; (5) a general review of the bearing and lubrication technology as applied to satellite recorders and hence the recommendation for using standard load carrying antifriction ball bearings; and (6) a kinetic analysis to determine the change in kinetic properties of the transport during operation.

  20. PARENT Quick Blind Round-Robin Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.

    The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques to find flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks has been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.

  1. Recent trends in atomic fluorescence spectrometry towards miniaturized instrumentation-A review.

    PubMed

    Zou, Zhirong; Deng, Yujia; Hu, Jing; Jiang, Xiaoming; Hou, Xiandeng

    2018-08-17

    Atomic fluorescence spectrometry (AFS), as one of the common atomic spectrometric techniques with high sensitivity, simple instrumentation, and low acquisition and running cost, has been widely used in various fields for trace elemental analysis, notably the determination of hydride-forming elements by hydride generation atomic fluorescence spectrometry (HG-AFS). In recent years, the soaring demand for field analysis has significantly promoted the miniaturization of analytical atomic spectrometers or at least of instrumental components. Various techniques have also been developed to approach the goal of portable/miniaturized AFS instrumentation for field analysis. In this review, potentially portable/miniaturized AFS techniques, primarily involving advanced instrumental components and whole instrumentation with references since 2000, are summarized and discussed. The discussion mainly includes five aspects: radiation source, atomizer, detector, sample introduction, and miniaturized atomic fluorescence spectrometer/system. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Enzyme-based electrochemical biosensors for determination of organophosphorus and carbamate pesticides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Everett, W.R.; Rechnitz, G.A.

    1999-01-01

    A mini review of enzyme-based electrochemical biosensors for inhibition analysis of organophosphorus and carbamate pesticides is presented. The discussion includes the most recent literature to present advances in detection limits, selectivity, and real sample analysis. Recent reviews on the monitoring of pesticides and their residues suggest that the classical analytical techniques of gas and liquid chromatography are the most widely used methods of detection. These techniques, although very accurate in their determinations, can be quite time consuming and expensive and usually require extensive sample clean-up and pre-concentration. For these and many other reasons, the classical techniques are very difficult to adapt for field use. Numerous researchers, in the past decade, have developed and made improvements on biosensors for use in pesticide analysis. This mini review will focus on recent advances made in enzyme-based electrochemical biosensors for the determination of organophosphorus and carbamate pesticides.

  3. Nonlinear earthquake analysis of reinforced concrete frames with fiber and Bernoulli-Euler beam-column element.

    PubMed

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on the Euler-Bernoulli beam theory is investigated for nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for dynamic analyses of RC frames. A predictor-corrector form of the Bossak-α method is applied for the dynamic integration scheme. A comparison of experimental data for an RC column element with numerical results obtained from the proposed solution technique is presented for verification of the numerical solutions. Furthermore, nonlinear cyclic analysis results for a portal reinforced concrete frame are obtained to compare the proposed solution technique with a fiber element based on the flexibility method. In addition, seismic damage analyses of an 8-story RC frame structure with a soft story are investigated for cases of lumped/distributed mass and load. Damage regions, propagation, and intensities according to both approaches are examined.
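    To make the time-integration step concrete, the following Python sketch applies a predictor-corrector Bossak-α (WBZ-α) scheme to a linear single-degree-of-freedom oscillator. The parameters, the test problem, and the scalar form are assumptions for illustration only; the paper's implementation operates on fiber-discretized Bernoulli-Euler frame elements with a nonlinear substructure procedure, which is not reproduced here.

      import numpy as np

      # Linear SDOF test problem: m*a + c*v + k*d = f(t)
      m, c, k = 1.0, 0.05, 40.0
      f = lambda t: np.sin(5.0 * t)

      alpha = -0.1                              # Bossak parameter (alpha <= 0)
      gamma = 0.5 - alpha                       # Newmark parameters chosen for unconditional
      beta = 0.25 * (1.0 - alpha) ** 2          # stability and second-order accuracy

      dt, nsteps = 0.01, 2000
      d, v = 0.0, 0.0
      a = (f(0.0) - c * v - k * d) / m          # consistent initial acceleration

      for n in range(nsteps):
          t1 = (n + 1) * dt
          # Predictor step
          d_pred = d + dt * v + dt**2 * (0.5 - beta) * a
          v_pred = v + dt * (1.0 - gamma) * a
          # Solve the Bossak-modified equation of motion for the new acceleration
          lhs = (1.0 - alpha) * m + gamma * dt * c + beta * dt**2 * k
          rhs = f(t1) - alpha * m * a - c * v_pred - k * d_pred
          a_new = rhs / lhs
          # Corrector step
          d = d_pred + beta * dt**2 * a_new
          v = v_pred + gamma * dt * a_new
          a = a_new

      print(f"displacement at t = {nsteps * dt:.1f} s: {d:.5f}")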

  4. Multivariate analysis of progressive thermal desorption coupled gas chromatography-mass spectrometry.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel

    Thermal decomposition of poly(dimethylsiloxane) compounds, Sylgard® 184 and 186, was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry (TD/GC-MS) is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas, including materials analysis, sports medicine, the detection of designer drugs, and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. It also has demonstrated potential for success in finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief that this multivariate analysis will enable superior differentiation capabilities. In addition, noise and system artifacts challenge the analysis of GC-MS data collected on lower cost equipment, ubiquitous in commercial laboratories. This research has the potential to affect many areas of analytical chemistry including materials analysis, medical testing, and environmental surveillance. It could also provide a method to measure adsorption parameters for chemical interactions on various surfaces by measuring desorption as a function of temperature for mixtures. We have presented results of a novel method for examining offgas products of a common PDMS material. Our method involves utilizing a stepped TD/GC-MS data acquisition scheme that may be almost totally automated, coupled with multivariate analysis schemes. This method of data generation and analysis can be applied to a number of materials aging and thermal degradation studies.
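    A minimal sketch of the multivariate step, unfolding the three-way (temperature step x retention time x m/z) data cube and running PCA via an SVD, is shown below with random placeholder data; the varimax rotation and multivariate curve resolution used in the report are not reproduced.

      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical stepped TD/GC-MS data cube: 8 temperature steps x 500 retention
      # times x 120 m/z channels.
      cube = rng.random((8, 500, 120))

      # Unfold to a two-way matrix (steps x features), mean-center, and take the SVD.
      X = cube.reshape(cube.shape[0], -1)
      Xc = X - X.mean(axis=0)
      U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

      scores = U * s                    # PCA scores: how each temperature step loads on each component
      explained = s**2 / np.sum(s**2)   # fraction of variance explained by each component
      print("variance explained by first 3 PCs:", explained[:3])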

  5. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    PubMed

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

    Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76 %) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50 %) and uniquely accounting for 15 (44 %). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
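    The set-theoretic measures behind statements such as "necessary but not sufficient", namely consistency and coverage, can be computed for crisp-set data in a few lines of Python. The membership data below are fabricated for illustration and are not the 60 studies analyzed here.

      import numpy as np

      # Hypothetical crisp-set data: rows are studies, columns mark whether a behavior
      # change technique was present (1) and whether adherence improved (1).
      knowledge     = np.array([1, 1, 1, 0, 1, 1, 0, 1])
      self_efficacy = np.array([1, 0, 1, 0, 1, 1, 0, 0])
      improved      = np.array([1, 0, 1, 0, 1, 1, 0, 1])

      config = knowledge & self_efficacy          # the configuration "knowledge AND self-efficacy"

      consistency = (config & improved).sum() / config.sum()      # sufficiency: is X mostly inside Y?
      coverage    = (config & improved).sum() / improved.sum()    # how much of Y the configuration accounts for
      necessity   = (knowledge & improved).sum() / improved.sum() # necessity of knowledge: is Y mostly inside X?

      print(f"consistency={consistency:.2f}, coverage={coverage:.2f}, necessity of knowledge={necessity:.2f}")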

  6. Organization Domain Modeling. Volume 1. Conceptual Foundations, Process and Workproduct Description

    DTIC Science & Technology

    1993-07-31

    Excerpts from this record describe an approach to domain analysis (DA) and modeling, including a structured set of workproducts, a tailorable process model, and a set of modeling techniques and guidelines, and cite the Feature-Oriented Domain Analysis (FODA) Feasibility Study by J.A. Hess, W.E. Novak, and A.S. Peterson, Technical Report CMU/SEI-90-TR-21.

  7. Optical modeling of waveguide coupled TES detectors towards the SAFARI instrument for SPICA

    NASA Astrophysics Data System (ADS)

    Trappe, N.; Bracken, C.; Doherty, S.; Gao, J. R.; Glowacka, D.; Goldie, D.; Griffin, D.; Hijmering, R.; Jackson, B.; Khosropanah, P.; Mauskopf, P.; Morozov, D.; Murphy, A.; O'Sullivan, C.; Ridder, M.; Withington, S.

    2012-09-01

    The next generation of space missions targeting far-infrared wavelengths will require large-format arrays of extremely sensitive detectors. Transition Edge Sensor (TES) array technology is being developed for future Far-Infrared (FIR) space applications such as the SAFARI instrument for SPICA, where low noise and high sensitivity are required to achieve ambitious science goals. In this paper we describe a modal analysis of multi-moded horn antennas feeding integrating cavities housing TES detectors with superconducting film absorbers. In high sensitivity TES detector technology the ability to control the electromagnetic and thermo-mechanical environment of the detector is critical. Simulating and understanding the optical behaviour of such detectors at far IR wavelengths is difficult and requires development of existing analysis tools. The proposed modal approach offers a computationally efficient technique to describe the partially coherent response of the full pixel in terms of optical efficiency and power leakage between pixels. Initial work carried out as part of an ESA technical research project on optical analysis is described, and a prototype SAFARI pixel design is analyzed in which the optical coupling between the incoming field and the pixel, comprising the horn, a cavity with an air gap, and a thin absorber layer, is fully included in the model to allow a comprehensive optical characterization. The modal approach described is based on the mode matching technique, where the horn and cavity are described in the traditional way while a technique to include the absorber was developed. Radiation leakage between pixels is also included, making this a powerful analysis tool.

  8. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identifying weaknesses of opposing teams or assessing the performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Analysts can also rely on techniques from Information Visualization to depict, e.g., player or ball trajectories. However, video analysis is typically a time-consuming process in which the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is no longer directly linked to the observed movement context. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of the underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event, and player analysis in the case of soccer. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  9. Recent Advances in Techniques for Starch Esters and the Applications: A Review

    PubMed Central

    Hong, Jing; Zeng, Xin-An; Brennan, Charles S.; Brennan, Margaret; Han, Zhong

    2016-01-01

    Esterification is one of the most important methods to alter the structure of starch granules and improve its applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring excess reagent, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary factor because it quantifies the substituted groups of starch esters. In order to improve detection accuracy and production efficiency, different detection techniques, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR) and headspace gas chromatography (HS-GC), have been developed for determining DS. This paper gives a comprehensive overview of the recent advances in DS analysis and starch esterification techniques. Additionally, the advantages, limitations, and some perspectives on future trends of these techniques, as well as the applications of their derivatives in the food industry, are presented. PMID:28231145

  10. High-resolution measurements of the multilayer ultra-structure of articular cartilage and their translational potential

    PubMed Central

    2014-01-01

    Current musculoskeletal imaging techniques usually target the macro-morphology of articular cartilage or use histological analysis. These techniques are able to reveal advanced osteoarthritic changes in articular cartilage but fail to give detailed information to distinguish early osteoarthritis from healthy cartilage, and this necessitates high-resolution imaging techniques measuring cells and the extracellular matrix within the multilayer structure of articular cartilage. This review provides a comprehensive exploration of the cellular components and extracellular matrix of articular cartilage as well as high-resolution imaging techniques, including magnetic resonance imaging, electron microscopy, confocal laser scanning microscopy, second harmonic generation microscopy, and laser scanning confocal arthroscopy, in the measurement of multilayer ultra-structures of articular cartilage. This review also provides an overview of micro-structural analysis of the main components of normal or osteoarthritic cartilage and discusses the potential and challenges associated with developing non-invasive high-resolution imaging techniques for both research and clinical diagnosis of early to late osteoarthritis. PMID:24946278

  11. Microextraction techniques combined with capillary electrophoresis in bioanalysis.

    PubMed

    Kohler, Isabelle; Schappler, Julie; Rudaz, Serge

    2013-01-01

    Over the past two decades, many environmentally sustainable sample-preparation techniques have been proposed, with the objective of reducing the use of toxic organic solvents or substituting these with environmentally friendly alternatives. Microextraction techniques (MEs), in which only a small amount of organic solvent is used, have several advantages, including reduced sample volume, analysis time, and operating costs. Thus, MEs are well adapted in bioanalysis, in which sample preparation is mandatory because of the complexity of a sample that is available in small quantities (mL or even μL only). Capillary electrophoresis (CE) is a powerful and efficient separation technique in which no organic solvents are required for analysis. Combination of CE with MEs is regarded as a very attractive environmentally sustainable analytical tool, and numerous applications have been reported over the last few decades for bioanalysis of low-molecular-weight compounds or for peptide analysis. In this paper we review the use of MEs combined with CE in bioanalysis. The review is divided into two sections: liquid and solid-based MEs. A brief practical and theoretical description of each ME is given, and the techniques are illustrated by relevant applications.

  12. Practical semen analysis: from A to Z

    PubMed Central

    Brazil, Charlene

    2010-01-01

    Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076

  13. Nonlinear analysis of structures. [within framework of finite element method

    NASA Technical Reports Server (NTRS)

    Armen, H., Jr.; Levine, H.; Pifko, A.; Levy, A.

    1974-01-01

    The development of nonlinear analysis techniques within the framework of the finite-element method is reported. Although the emphasis is on nonlinearities associated with material behavior, a general treatment of geometric nonlinearity, alone or in combination with plasticity, is included, and applications are presented for a class of problems categorized as axisymmetric shells of revolution. The scope of the nonlinear analysis capabilities includes: (1) a membrane stress analysis, (2) bending and membrane stress analysis, (3) analysis of thick and thin axisymmetric bodies of revolution, (4) a general three-dimensional analysis, and (5) analysis of laminated composites. Applications of the methods are made to a number of sample structures. Correlation with available analytic or experimental data ranges from good to excellent.

  14. Seeing through the Glitz: Commercial Literacy for Students.

    ERIC Educational Resources Information Center

    Hillyer, Kathryn Oliver

    Television advertising aimed at children is explored, including its regulation, techniques, and research on its effects. Particular attention is given to sexual stereotypes in commercials, including an analysis of certain commercials. A commercial literacy unit is presented for use with fourth graders. The history of advertising targeted at…

  15. The Time-Out and Seclusion Continuum: A Systematic Analysis of the Case Law

    ERIC Educational Resources Information Center

    Bon, Susan C.; Zirkel, Perry A.

    2014-01-01

    During the past two decades, scholars, educators, and special interest organizations, including advocacy groups, have critically examined and debated the ethical and legal use of aversive interventions with individuals with disabilities. These interventions comprise a broad spectrum of behavior management techniques including but not at all…

  16. A selected bibliography: Remote sensing applications in agriculture

    USGS Publications Warehouse

    Draeger, William C.; McClelland, David T.

    1977-01-01

    The bibliography contains nearly 300 citations of selected publications and technical reports dealing with the application of remote-sensing techniques to the collection and analysis of agricultural information. Most of the items included were published between January 1968 and December 1975, although some earlier works of continuing interest are included.

  17. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes net and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. By using Bayes net, nine syndromes of GAD were abstracted based on the symptoms. Eight syndromes were abstracted by cluster analysis. After screening for duplicate syndromes and combining the experts' experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on the results, the draft of Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes net and cluster analysis have certain future potential for establishing syndrome models and analyzing syndrome discipline, thus they are suitable for the research of syndrome differentiation.
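
    As a concrete illustration of the clustering half of this workflow, the short Python sketch below groups synthetic binary symptom profiles with k-means and lists the most frequent symptoms per cluster; the data, cluster count, and use of scikit-learn are assumptions for illustration, not the study's Bayes net or its clinical data.

      # Illustrative k-means clustering of binary symptom profiles (synthetic data),
      # sketching how candidate syndromes can be abstracted from symptom patterns.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n_patients, n_symptoms = 200, 20

      # Synthetic presence/absence matrix generated from three latent groupings.
      profiles = rng.random((3, n_symptoms)) < 0.3
      labels_true = rng.integers(0, 3, n_patients)
      X = (rng.random((n_patients, n_symptoms))
           < np.where(profiles[labels_true], 0.8, 0.1)).astype(float)

      km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
      for k in range(3):
          top = np.argsort(km.cluster_centers_[k])[::-1][:5]
          print(f"cluster {k}: most frequent symptom indices -> {top.tolist()}")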

  18. Novel CE-MS technique for detection of high explosives using perfluorooctanoic acid as a MEKC and mass spectrometric complexation reagent.

    PubMed

    Brensinger, Karen; Rollman, Christopher; Copper, Christine; Genzman, Ashton; Rine, Jacqueline; Lurie, Ira; Moini, Mehdi

    2016-01-01

    To address the need for the forensic analysis of high explosives, a novel capillary electrophoresis mass spectrometry (CE-MS) technique has been developed for high resolution, sensitivity, and mass accuracy detection of these compounds. The technique uses perfluorooctanoic acid (PFOA) as both a micellar electrokinetic chromatography (MEKC) reagent for separation of neutral explosives and as the complexation reagent for mass spectrometric detection of PFOA-explosive complexes in the negative ion mode. High explosives that formed complexes with PFOA included RDX, HMX, tetryl, and PETN. Some nitroaromatics were detected as molecular ions. Detection limits in the high parts per billion range and linear calibration responses over two orders of magnitude were obtained. For proof of concept, the technique was applied to the quantitative analysis of high explosives in sand samples. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Robotic nephroureterectomy: a simplified approach requiring no patient repositioning or robot redocking.

    PubMed

    Zargar, Homayoun; Krishnan, Jayram; Autorino, Riccardo; Akca, Oktay; Brandao, Luis Felipe; Laydner, Humberto; Samarasekera, Dinesh; Ko, Oliver; Haber, Georges-Pascal; Kaouk, Jihad H; Stein, Robert J

    2014-10-01

    Robotic technology is increasingly adopted in urologic surgery and a variety of techniques has been described for minimally invasive treatment of upper tract urothelial cancer (UTUC). To describe a simplified surgical technique of robot-assisted nephroureterectomy (RANU) and to report our single-center surgical outcomes. Patients with history of UTUC treated with this modality between April 2010 and August 2013 were included in the analysis. Institutional review board approval was obtained. Informed consent was signed by all patients. A simplified single-step RANU not requiring repositioning or robot redocking. Lymph node dissection was performed selectively. Descriptive analysis of patients' characteristics, perioperative outcomes, histopathology, and short-term follow-up data was performed. The analysis included 31 patients (mean age: 72.4±10.6 yr; mean body mass index: 26.6±5.1kg/m(2)). Twenty-six of 30 tumors (86%) were high grade. Mean tumor size was 3.1±1.8cm. Of the 31 patients, 13 (42%) had pT3 stage disease. One periureteric positive margin was noted in a patient with bulky T3 disease. The mean number of lymph nodes removed was 9.4 (standard deviation: 5.6; range: 3-21). Two of 14 patients (14%) had positive lymph nodes on final histology. No patients required a blood transfusion. Six patients experienced complications postoperatively, with only one being a high grade (Clavien 3b) complication. Median hospital stay was 5 d. Within the follow-up period, seven patients experienced bladder recurrences and four patients developed metastatic disease. Our RANU technique eliminates the need for patient repositioning or robot redocking. This technique can be safely reproduced, with surgical outcomes comparable to other established techniques. We describe a surgical technique using the da Vinci robot for a minimally invasive treatment of patients presenting with upper tract urothelial cancer. This technique can be safely implemented with good surgical outcomes. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.

  20. Remote sensing science for the Nineties; Proceedings of IGARSS '90 - 10th Annual International Geoscience and Remote Sensing Symposium, University of Maryland, College Park, May 20-24, 1990. Vols. 1, 2, & 3

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Various papers on remote sensing (RS) for the nineties are presented. The general topics addressed include: subsurface methods, radar scattering, oceanography, microwave models, atmospheric correction, passive microwave systems, RS in tropical forests, moderate resolution land analysis, SAR geometry and SNR improvement, image analysis, inversion and signal processing for geoscience, surface scattering, rain measurements, sensor calibration, wind measurements, terrestrial ecology, agriculture, geometric registration, subsurface sediment geology, radar modulation mechanisms, radar ocean scattering, SAR calibration, airborne radar systems, water vapor retrieval, forest ecosystem dynamics, land analysis, multisensor data fusion. Also considered are: geologic RS, RS sensor optical measurements, RS of snow, temperature retrieval, vegetation structure, global change, artificial intelligence, SAR processing techniques, geologic RS field experiment, stochastic modeling, topography and Digital Elevation model, SAR ocean waves, spaceborne lidar and optical, sea ice field measurements, millimeter waves, advanced spectroscopy, spatial analysis and data compression, SAR polarimetry techniques. Also discussed are: plant canopy modeling, optical RS techniques, optical and IR oceanography, soil moisture, sea ice back scattering, lightning cloud measurements, spatial textural analysis, SAR systems and techniques, active microwave sensing, lidar and optical, radar scatterometry, RS of estuaries, vegetation modeling, RS systems, EOS/SAR Alaska, applications for developing countries, SAR speckle and texture.

  1. Efficient morse decompositions of vector fields.

    PubMed

    Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene

    2008-01-01

    Existing topology-based vector field analysis techniques rely on the ability to extract individual trajectories such as fixed points, periodic orbits, and separatrices that are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for the applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach in constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces including engine simulation data sets.
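
    One way to make the Morse decomposition idea concrete: discretize the tau-map as a directed graph over mesh cells (an edge u -> v when the image of cell u after time tau overlaps cell v); Morse sets are then approximated by the strongly connected components containing recurrence, and the MCG is the reachability relation between them. The Python sketch below runs this on a tiny hypothetical cell-transition graph using networkx; the edge list is invented for illustration, not derived from real flow data.

      # Morse decomposition sketch from a combinatorial outer approximation of a
      # tau-map. Nodes are mesh cells; an edge u -> v means the tau-image of cell u
      # overlaps cell v. The edge list below is a toy example.
      import networkx as nx

      edges = [(0, 1), (1, 2), (2, 0),      # recurrent set (e.g., a periodic orbit)
               (2, 3), (3, 4),
               (4, 4),                      # fixed-point-like cell mapping into itself
               (1, 5), (5, 6), (6, 5)]      # another recurrent set
      G = nx.DiGraph(edges)

      # Morse sets: strongly connected components that contain recurrence.
      morse_sets = [scc for scc in nx.strongly_connected_components(G)
                    if len(scc) > 1 or any(G.has_edge(v, v) for v in scc)]
      print("Morse sets:", morse_sets)

      # Morse connection graph: Morse set A connects to B if the flow can reach B from A.
      C = nx.condensation(G)
      morse_nodes = [n for n, d in C.nodes(data=True) if set(d["members"]) in morse_sets]
      mcg = [(sorted(C.nodes[a]["members"]), sorted(C.nodes[b]["members"]))
             for a in morse_nodes for b in morse_nodes
             if a != b and nx.has_path(C, a, b)]
      print("MCG edges:", mcg)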

  2. Influence of bicortical techniques in internal connection placed in premaxillary area by 3D finite element analysis.

    PubMed

    Verri, Fellippo Ramos; Cruz, Ronaldo Silva; Lemos, Cleidiel Aparecido Araújo; de Souza Batista, Victor Eduardo; Almeida, Daniel Augusto Faria; Verri, Ana Caroline Gonçales; Pellizzer, Eduardo Piza

    2017-02-01

    The aim of this study was to evaluate the stress distribution in implant-supported prostheses and peri-implant bone using internal hexagon (IH) implants in the premaxillary area, varying surgical techniques (conventional, bicortical, and bicortical in association with nasal floor elevation) and loading directions (0°, 30° and 60°), by three-dimensional (3D) finite element analysis. Three models were designed with Invesalius, Rhinoceros 3D and Solidworks software. Each model contained a bone block of the premaxillary area including an implant (IH, Ø4 × 10 mm) supporting a metal-ceramic crown. A load of 178 N was applied at different inclinations (0°, 30°, 60°). The results were analyzed by von Mises, maximum principal stress, microstrain and displacement maps, including the ANOVA statistical test for some situations. Von Mises maps of the implant, screws and abutment showed increased stress concentration with increased loading inclination. Bicortical techniques showed a reduction of stress in the implant apical area and in the head of the fixation screws. Bicortical techniques showed a slight increase in stress in cortical bone in the maximum principal stress and microstrain maps under 60° loading. No differences in bone tissue regarding surgical techniques were observed. In conclusion, non-axial loads increased stress concentration in all maps. Bicortical techniques showed lower stress for the implant and screw; however, there was slightly higher stress on cortical bone only under loads of higher inclination (60°).
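
    The von Mises maps referred to above reduce the six Cauchy stress components at each element to a single equivalent stress. A generic sketch of that formula in Python follows; the input values are illustrative, not results from these models.

      # von Mises equivalent stress from Cauchy stress components (MPa), the scalar
      # plotted in typical finite element stress maps; values are illustrative.
      import numpy as np

      def von_mises(sx, sy, sz, txy, tyz, tzx):
          return np.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                         + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

      # Example: an element under combined normal and shear stress.
      print(f"{von_mises(120.0, 40.0, 10.0, 25.0, 5.0, 0.0):.1f} MPa")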

  3. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, A; Samost, A; Viswanathan, A

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. These results suggest that STPA can be successfully used to analyze safety in brachytherapy and may prove to be an alternative to other hazard analysis techniques.
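
    The STPA step that generates candidate unsafe control actions examines every (controller, control action) pair against four standard hazard patterns: not provided when needed, provided when unsafe, provided with wrong timing or order, and stopped too soon or applied too long. The Python sketch below enumerates those candidates for a small hypothetical slice of a control structure; the controllers and actions shown are illustrative, not the ten-controller brachytherapy model from the study.

      # Sketch of STPA candidate unsafe control action (UCA) enumeration.
      # Controllers and actions are illustrative placeholders.
      from itertools import product

      control_structure = {
          "physicist": ["approve treatment plan", "release source for delivery"],
          "physician": ["insert applicator", "prescribe dose"],
      }
      uca_types = [
          "not provided when needed",
          "provided when unsafe",
          "provided too early, too late, or out of order",
          "stopped too soon or applied too long",
      ]

      candidates = [
          {"controller": c, "action": a, "type": t}
          for (c, actions), t in product(control_structure.items(), uca_types)
          for a in actions
      ]
      for uca in candidates[:4]:
          print(uca)
      print(f"... {len(candidates)} candidate unsafe control actions to assess")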

  4. Systematic Review of Anatomic Single- Versus Double-Bundle Anterior Cruciate Ligament Reconstruction: Does Femoral Tunnel Drilling Technique Matter?

    PubMed

    Zhang, Yang; Xu, Caiqi; Dong, Shiqui; Shen, Peng; Su, Wei; Zhao, Jinzhong

    2016-09-01

    To provide an up-to-date assessment of the difference between anatomic double-bundle anterior cruciate ligament (ACL) reconstruction (DB-ACLR) and anatomic single-bundle ACL reconstruction (SB-ACLR). We hypothesized that anatomic SB-ACLR using an independent femoral drilling technique would achieve kinematic stability comparable to anatomic DB-ACLR. A comprehensive Internet search was performed to identify all therapeutic trials of anatomic DB-ACLR versus anatomic SB-ACLR. Only clinical studies of Level I and II evidence were included. The comparative outcomes were instrument-measured anterior laxity, Lachman test, pivot shift, clinical outcomes including objective/subjective International Knee Documentation Committee (IKDC) score, Lysholm score, and Tegner activity scale, and complication rates of extension/flexion deficits, graft failure, and early osteoarthritis. Subgroup analyses were performed for femoral tunnel drilling techniques, including independent drilling and transtibial (TT) drilling. Twenty-two clinical trials of 2,261 anatomically ACL-reconstructed patients were included in the meta-analysis. Via the TT drilling technique, anatomic DB-ACLR led to improved instrument-measured anterior laxity with a standardized mean difference (SMD) of -0.42 (95% confidence interval [CI] = -0.81 to -0.02), less rotational instability measured by pivot shift (SMD = 2.76, 95% CI = 1.24 to 6.16), and a higher objective IKDC score with an odds ratio (OR) of 2.28 (95% CI = 1.19 to 4.36). Via the independent drilling technique, anatomic DB-ACLR yielded a better pivot shift (SMD = 2.04, 95% CI = 1.36 to 3.05). Anatomic DB-ACLR also revealed statistical significance in subjective IKDC score compared with anatomic SB-ACLR (SMD = 0.27, 95% CI = 0.05 to 0.49). Anatomic DB-ACLR showed better anterior and rotational stability and a higher objective IKDC score than anatomic SB-ACLR via the TT drilling technique. Via the independent drilling technique, however, anatomic DB-ACLR only showed superior rotational stability. All clinical function outcomes except the subjective IKDC score were not significantly different between anatomic DB-ACLR and SB-ACLR. Level II, meta-analysis of Level I and II studies. Copyright © 2016 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.
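
    The per-study effect size pooled in this kind of meta-analysis is the standardized mean difference. A minimal Python sketch of Hedges' g with a 95% confidence interval for a single two-arm comparison is shown below; the summary statistics are hypothetical, not values from the included trials.

      # Standardized mean difference (Hedges' g) with a 95% CI for one two-arm
      # comparison. Inputs are hypothetical summary statistics.
      import math

      def hedges_g(m1, sd1, n1, m2, sd2, n2):
          sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
          d = (m1 - m2) / sp                       # Cohen's d
          j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # small-sample correction
          g = j * d
          var_d = (n1 + n2) / (n1 * n2) + d**2 / (2.0 * (n1 + n2))
          se_g = j * math.sqrt(var_d)
          return g, (g - 1.96 * se_g, g + 1.96 * se_g)

      # Example: side-to-side laxity difference (mm), DB vs. SB reconstruction.
      g, ci = hedges_g(m1=1.1, sd1=0.9, n1=40, m2=1.5, sd2=1.0, n2=38)
      print(f"Hedges' g = {g:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")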

  5. Determining the Population Size of Pond Phytoplankton.

    ERIC Educational Resources Information Center

    Hummer, Paul J.

    1980-01-01

    Discusses methods for determining the population size of pond phytoplankton, including water sampling techniques, laboratory analysis of samples, and additional studies worthy of investigation in class or as individual projects. (CS)

  6. Older driver highway design handbook

    DOT National Transportation Integrated Search

    1998-01-01

    This project included literature reviews and research syntheses, using meta-analytic techniques where appropriate, in the areas of age-related (diminished) functional capabilities, and human factors and highway safety. A User-Requirements Analysi...

  7. Splash evaluation of SRB designs

    NASA Technical Reports Server (NTRS)

    Counter, D. N.

    1974-01-01

    A technique is developed to optimize the shuttle solid rocket booster (SRB) design for water impact loads. The SRB is dropped by parachute and recovered at sea for reuse. Loads experienced at water impact are design critical. The probability of each water impact load is determined using a Monte Carlo technique and an aerodynamic analysis of the SRB parachute system. Meteorological effects are included and four configurations are evaluated.
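
    A generic Monte Carlo sketch of this kind of load-probability estimate is shown below: uncertain impact conditions are sampled and pushed through a load model, and the exceedance probability of a design allowable is counted. The distributions and the load expression are placeholders for illustration, not the SRB parachute and aerodynamic analysis.

      # Generic Monte Carlo sketch: sample uncertain water-impact conditions and
      # estimate the probability that a placeholder load model exceeds an allowable.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      vertical_velocity = rng.normal(25.0, 3.0, n)    # m/s at impact (assumed)
      horizontal_velocity = rng.normal(5.0, 2.0, n)   # m/s wind drift (assumed)
      impact_angle = rng.normal(0.0, 5.0, n)          # deg off vertical (assumed)

      # Placeholder load model: peak load grows with velocity and off-axis angle
      # (illustrative scaling only, not a validated SRB water-impact model).
      load = (0.5 * vertical_velocity**2 * (1.0 + 0.02 * np.abs(impact_angle))
              + 2.0 * horizontal_velocity**2)

      allowable = 450.0   # arbitrary design allowable in the same units
      print(f"estimated exceedance probability: {np.mean(load > allowable):.4f}")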

  8. A Radial Basis Function Approach to Financial Time Series Analysis

    DTIC Science & Technology

    1993-12-01

    ...including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the data... collection of practical techniques to address these issues for a modeling methodology, Radial Basis Function networks. These techniques include efficient... methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes...
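
    For orientation, the sketch below is a minimal radial basis function network for one-step-ahead time-series prediction: Gaussian basis functions on lagged inputs with ridge-regularized linear output weights, where the ridge penalty plays the complexity-versus-reliability role noted above. The data are synthetic, and the center-selection and width heuristics are generic assumptions, not the report's estimation and pruning methods.

      # Minimal RBF network for one-step-ahead prediction on a synthetic series.
      import numpy as np

      rng = np.random.default_rng(1)
      t = np.arange(600)
      series = np.sin(0.07 * t) + 0.1 * rng.standard_normal(t.size)

      lags = 5
      X = np.stack([series[i:i + lags] for i in range(series.size - lags)])
      y = series[lags:]

      centers = X[rng.choice(len(X), size=30, replace=False)]   # random subset as centers
      width = np.median(np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2))

      def design(Z):
          d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
          return np.exp(-d2 / (2.0 * width ** 2))

      lam = 1e-3   # ridge penalty: the complexity/reliability trade-off knob
      Phi = design(X)
      w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

      pred = design(X[-50:]) @ w
      print("RMSE on last 50 points:", float(np.sqrt(np.mean((pred - y[-50:]) ** 2))))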

  9. Proceedings of Damping 1993, volume 2

    NASA Astrophysics Data System (ADS)

    Portis, Bonnie L.

    1993-06-01

    Presented are individual papers of Damping '93, held 24-26 Feb. 1993 in San Francisco. The subjects included the following: passive damping concepts; passive damping analysis and design techniques; optimization; damped control/structure interaction; viscoelastic material testing and characterization; highly damped materials; vibration suppression techniques; damping identification and dynamic testing; applications to aircraft, space structures, marine structures, and commercial products; defense applications; and payoffs of vibration suppression.

  10. PASCO: Structural panel analysis and sizing code: Users manual - Revised

    NASA Technical Reports Server (NTRS)

    Anderson, M. S.; Stroud, W. J.; Durling, B. J.; Hennessy, K. W.

    1981-01-01

    A computer code denoted PASCO is described for analyzing and sizing uniaxially stiffened composite panels. Buckling and vibration analyses are carried out with a linked plate analysis computer code denoted VIPASA, which is included in PASCO. Sizing is based on nonlinear mathematical programming techniques and employs a computer code denoted CONMIN, also included in PASCO. Design requirements considered are initial buckling, material strength, stiffness and vibration frequency. A user's manual for PASCO is presented.
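
    In the spirit of the sizing loop described above (an analysis embedded in a nonlinear programming search), the Python sketch below minimizes panel mass over skin thickness subject to a buckling constraint. The classical simply supported plate buckling formula stands in for the VIPASA analysis, and SciPy's SLSQP stands in for CONMIN; material properties and loads are illustrative.

      # Sizing-by-optimization sketch: minimize mass/area subject to a buckling
      # margin. The plate-buckling formula and numbers are illustrative stand-ins.
      import numpy as np
      from scipy.optimize import minimize

      E, nu, rho = 70e9, 0.33, 2700.0   # aluminum-like material (Pa, -, kg/m^3)
      b = 0.15                          # plate width between stiffeners (m)
      Nx = 2.0e5                        # compressive load per unit width (N/m)
      k = 4.0                           # buckling coefficient, simply supported edges

      def mass_per_area(x):
          return rho * x[0]

      def buckling_margin(x):
          t = x[0]
          sigma_cr = k * np.pi**2 * E / (12.0 * (1.0 - nu**2)) * (t / b) ** 2
          sigma_applied = Nx / t
          return (sigma_cr - sigma_applied) / 1e6   # in MPa; must stay >= 0

      res = minimize(mass_per_area, x0=[0.01], method="SLSQP",
                     bounds=[(0.001, 0.05)],
                     constraints=[{"type": "ineq", "fun": buckling_margin}])
      print(f"skin thickness: {res.x[0]*1000:.2f} mm, mass/area: {res.fun:.1f} kg/m^2")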

  11. Reliability of diagnosis and clinical efficacy of visceral osteopathy: a systematic review.

    PubMed

    Guillaud, Albin; Darbois, Nelly; Monvoisin, Richard; Pinsault, Nicolas

    2018-02-17

    In 2010, the World Health Organization published benchmarks for training in osteopathy in which osteopathic visceral techniques are included. The purpose of this study was to identify and critically appraise the scientific literature concerning the reliability of diagnosis and the clinical efficacy of techniques used in visceral osteopathy. Databases MEDLINE, OSTMED.DR, the Cochrane Library, Osteopathic Research Web, Google Scholar, Journal of American Osteopathic Association (JAOA) website, International Journal of Osteopathic Medicine (IJOM) website, and the catalog of Académie d'ostéopathie de France website were searched through December 2017. Only inter-rater reliability studies including at least two raters or the intra-rater reliability studies including at least two assessments by the same rater were included. For efficacy studies, only randomized-controlled-trials (RCT) or crossover studies on unhealthy subjects (any condition, duration and outcome) were included. Risk of bias was determined using a modified version of the quality appraisal tool for studies of diagnostic reliability (QAREL) in reliability studies. For the efficacy studies, the Cochrane risk of bias tool was used to assess their methodological design. Two authors performed data extraction and analysis. Eight reliability studies and six efficacy studies were included. The analysis of reliability studies shows that the diagnostic techniques used in visceral osteopathy are unreliable. Regarding efficacy studies, the least biased study shows no significant difference for the main outcome. The main risks of bias found in the included studies were due to the absence of blinding of the examiners, an unsuitable statistical method or an absence of primary study outcome. The results of the systematic review lead us to conclude that well-conducted and sound evidence on the reliability and the efficacy of techniques in visceral osteopathy is absent. The review is registered PROSPERO 12th of December 2016. Registration number is CRD4201605286 .

  12. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    NASA Technical Reports Server (NTRS)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes by using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm is described and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  13. Advanced numerical technique for analysis of surface and bulk acoustic waves in resonators using periodic metal gratings

    NASA Astrophysics Data System (ADS)

    Naumenko, Natalya F.

    2014-09-01

    A numerical technique characterized by a unified approach for the analysis of different types of acoustic waves utilized in resonators in which a periodic metal grating is used for excitation and reflection of such waves is described. The combination of the Finite Element Method analysis of the electrode domain with the Spectral Domain Analysis (SDA) applied to the adjacent upper and lower semi-infinite regions, which may be multilayered and include air as a special case of a dielectric material, enables rigorous simulation of the admittance in resonators using surface acoustic waves, Love waves, plate modes including Lamb waves, Stoneley waves and other waves propagating along the interface between two media, and waves with transient structure between the mentioned types. The matrix formalism with improved convergence incorporated into SDA provides fast and robust simulation for multilayered structures with arbitrary thickness of each layer. The described technique is illustrated by a few examples of its application to various combinations of LiNbO3, isotropic silicon dioxide and silicon with a periodic array of Cu electrodes. The wave characteristics extracted from the admittance functions change continuously with the variation of the film and plate thicknesses over wide ranges, even when the wave nature changes. The transformation of the wave nature with the variation of the layer thicknesses is illustrated by diagrams and contour plots of the displacements calculated at resonant frequencies.

  14. Clinical benefit using sperm hyaluronic acid binding technique in ICSI cycles: a systematic review and meta-analysis.

    PubMed

    Beck-Fruchter, Ronit; Shalev, Eliezer; Weiss, Amir

    2016-03-01

    The human oocyte is surrounded by hyaluronic acid, which acts as a natural selector of spermatozoa. Human sperm that express hyaluronic acid receptors and bind to hyaluronic acid have normal shape, minimal DNA fragmentation and low frequency of chromosomal aneuploidies. Use of hyaluronic acid binding assays in intracytoplasmic sperm injection (ICSI) cycles to improve clinical outcomes has been studied, although none of these studies had sufficient statistical power. In this systematic review and meta-analysis, electronic databases were searched up to June 2015 to identify studies of ICSI cycles in which spermatozoa able to bind hyaluronic acid was selected. The main outcomes were fertilization rate and clinical pregnancy rate. Secondary outcomes included cleavage rate, embryo quality, implantation rate, spontaneous abortion and live birth rate. Seven studies and 1437 cycles were included. Use of hyaluronic acid binding sperm selection technique yielded no improvement in fertilization and pregnancy rates. A meta-analysis of all available studies showed an improvement in embryo quality and implantation rate; an analysis of prospective studies only showed an improvement in embryo quality. Evidence does not support routine use of hyaluronic acid binding assays in all ICSI cycles. Identification of patients that might benefit from this technique needs further study. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  15. Evaluation and study of advanced optical contamination, deposition, measurement, and removal techniques. [including computer programs and ultraviolet reflection analysis

    NASA Technical Reports Server (NTRS)

    Linford, R. M. F.; Allen, T. H.; Dillow, C. F.

    1975-01-01

    A program is described to design, fabricate and install an experimental work chamber assembly (WCA) to provide a wide range of experimental capability. The WCA incorporates several techniques for studying the kinetics of contaminant films and their effect on optical surfaces. It incorporates the capability for depositing both optical and contaminant films on temperature-controlled samples, and for in-situ measurements of the vacuum ultraviolet reflectance. Ellipsometer optics are mounted on the chamber for film thickness determinations, and other features include access ports for radiation sources and instrumentation. Several supporting studies were conducted to define specific chamber requirements, to determine the sensitivity of the measurement techniques to be incorporated in the chamber, and to establish procedures for handling samples prior to their installation in the chamber. A bibliography and literature survey of contamination-related articles is included.

  16. Multivariate Analysis of Seismic Field Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alam, M. Kathleen

    1999-06-01

    This report includes the details of the model building procedure and prediction of seismic field data. Principal Components Regression, a multivariate analysis technique, was used to model seismic data collected as two pieces of equipment were cycled on and off. Models built that included only the two pieces of equipment of interest had trouble predicting data containing signals not included in the model. Evidence for poor predictions came from the prediction curves as well as spectral F-ratio plots. Once the extraneous signals were included in the model, predictions improved dramatically. While Principal Components Regression performed well for the present data sets, the present data analysis suggests further work will be needed to develop more robust modeling methods as the data become more complex.
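
    A minimal sketch of Principal Components Regression itself: center the predictor matrix, keep the leading principal components, and regress the response on the component scores. The synthetic matrix below stands in for the seismic signatures, and the number of retained components is an arbitrary illustration.

      # Principal components regression on synthetic data.
      import numpy as np

      rng = np.random.default_rng(3)
      n, p, k = 120, 50, 3                       # samples, variables, components kept

      latent = rng.standard_normal((n, k))
      X = latent @ rng.standard_normal((k, p)) + 0.05 * rng.standard_normal((n, p))
      y = latent @ np.array([2.0, -1.0, 0.5]) + 0.05 * rng.standard_normal(n)

      Xc = X - X.mean(axis=0)
      U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
      scores = Xc @ Vt[:k].T                     # scores on the leading components

      beta, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)
      y_hat = scores @ beta + y.mean()
      print("R^2:", round(1 - np.sum((y - y_hat)**2) / np.sum((y - y.mean())**2), 3))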

  17. Surgical Techniques for the Reconstruction of Medial Collateral Ligament and Posteromedial Corner Injuries of the Knee: A Systematic Review.

    PubMed

    DeLong, Jeffrey M; Waterman, Brian R

    2015-11-01

    To systematically review reconstruction techniques of the medial collateral ligament (MCL) and associated medial structures of the knee (e.g., posterior oblique ligament). A systematic review of Medline/PubMed Database (1966 to November 2013), reference list scanning and citation searches of included articles, and manual searches of high-impact journals (2000 to July 2013) and conference proceedings (2009 to July 2013) were performed to identify publications describing MCL reconstruction techniques of the knee. Exclusion criteria included (1) MCL primary repair techniques or advancement procedures, (2) lack of clear description of MCL reconstruction technique, (3) animal models, (4) nonrelevant study design, (5) and foreign language articles without available translation. After review of 4,600 references, 25 publications with 359 of 388 patients (92.5%) were isolated for analysis, including 18 single-bundle MCL and 10 double-bundle reconstruction techniques. Only 2 techniques were classified as anatomic reconstructions, and clinical and objective outcomes (n = 28; 100% <3 mm side-to-side difference [SSD]) were superior to those with nonanatomic reconstruction (n = 182; 79.1% <3 mm SSD) and tendon transfer techniques (n = 114; 52.6% <3 mm SSD). This systematic review demonstrated that numerous medial reconstruction techniques have been used in the treatment of isolated and combined medial knee injuries in the existent literature. Many variations exist among reconstruction techniques and may differ by graft choices, method of fixation, number of bundles, tensioning protocol, and degree of anatomic restoration of medial and posteromedial corner knee restraints. Further studies are required to better ascertain the comparative clinical outcomes with anatomic, non-anatomic, and tendon transfer techniques for medial knee reconstruction. Level IV, systematic review of level IV studies and surgical techniques. Published by Elsevier Inc.

  18. Further SEASAT SAR coastal ocean wave analysis

    NASA Technical Reports Server (NTRS)

    Kasischke, E. S.; Shuchman, R. A.; Meadows, G. A.; Jackson, P. L.; Tseng, Y.

    1981-01-01

    Analysis techniques used to exploit SEASAT synthetic aperture radar (SAR) data of gravity waves are discussed and the SEASAT SAR's ability to monitor large scale variations in gravity wave fields in both deep and shallow water is evaluated. The SAR analysis techniques investigated included motion compensation adjustments and the semicausal model for spectral analysis of SAR wave data. It was determined that spectra generated from fast Fourier transform analysis (FFT) of SAR wave data were not significantly altered when either range telerotation adjustments or azimuth focus shifts were used during processing of the SAR signal histories, indicating that SEASAT imagery of gravity waves is not significantly improved or degraded by motion compensation adjustments. Evaluation of the semicausal (SC) model using SEASAT SAR data from Rev. 974 indicates that the SC spectral estimates were not significantly better than the FFT results.

  19. Induction of lucid dreams: a systematic review of evidence.

    PubMed

    Stumbrys, Tadas; Erlacher, Daniel; Schädlich, Melanie; Schredl, Michael

    2012-09-01

    In lucid dreams the dreamer is aware of dreaming and often able to influence the ongoing dream content. Lucid dreaming is a learnable skill and a variety of techniques is suggested for lucid dreaming induction. This systematic review evaluated the evidence for the effectiveness of induction techniques. A comprehensive literature search was carried out in biomedical databases and specific resources. Thirty-five studies were included in the analysis (11 sleep laboratory and 24 field studies), of which 26 employed cognitive techniques, 11 external stimulation and one drug application. The methodological quality of the included studies was relatively low. None of the induction techniques were verified to induce lucid dreams reliably and consistently, although some of them look promising. On the basis of the reviewed studies, a taxonomy of lucid dream induction methods is presented. Several methodological issues are discussed and further directions for future studies are proposed. Copyright © 2012 Elsevier Inc. All rights reserved.

  20. Considerations and techniques for incorporating remotely sensed imagery into the land resource management process.

    NASA Technical Reports Server (NTRS)

    Brooner, W. G.; Nichols, D. A.

    1972-01-01

    Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management program applications. The scheme utilizes remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.

  1. Optimal wavelength selection for noncontact reflection photoplethysmography

    NASA Astrophysics Data System (ADS)

    Corral Martinez, Luis F.; Paez, Gonzalo; Strojnik, Marija

    2011-08-01

    In this work, we obtain backscattered signals from the human forehead for wavelengths from 380 to 980 nm. The results reveal bands with strong pulsatile signals that carry useful information. We identify those bands as the most suitable wavelengths in the visible and NIR regions from which heart and respiratory rate parameters can be derived using long-distance, non-contact reflection photoplethysmography analysis. These results show the feasibility of a novel technique for remote detection of vital signs in humans. This technique, which may include morphological analysis or maps of tissue oxygenation, is a further step toward truly non-invasive remote monitoring of patients.
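
    A hedged sketch of how a heart-rate parameter can be pulled from such a pulsatile band: take the spectrum of a (synthetic) photoplethysmographic trace and locate the dominant peak inside a plausible cardiac frequency band. The sampling rate, band limits, and signal model are assumptions for illustration only.

      # Estimate heart rate from a synthetic reflection-PPG trace via the dominant
      # spectral peak in an assumed cardiac band (0.7-3.5 Hz, i.e. ~42-210 bpm).
      import numpy as np

      fs = 100.0                                   # sampling rate (Hz), assumed
      t = np.arange(0, 30, 1.0 / fs)
      ppg = (0.02 * np.sin(2 * np.pi * 1.2 * t)    # pulsatile (cardiac) component
             + 0.3 * np.sin(2 * np.pi * 0.25 * t)  # respiratory modulation
             + 1.0                                 # slowly varying baseline
             + 0.01 * np.random.default_rng(0).standard_normal(t.size))

      spectrum = np.abs(np.fft.rfft(ppg - ppg.mean()))
      freqs = np.fft.rfftfreq(ppg.size, d=1.0 / fs)
      band = (freqs > 0.7) & (freqs < 3.5)
      hr_hz = freqs[band][np.argmax(spectrum[band])]
      print(f"estimated heart rate: {hr_hz * 60:.1f} bpm")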

  2. Cluster analysis and quality assessment of logged water at an irrigation project, eastern Saudi Arabia.

    PubMed

    Hussain, Mahbub; Ahmed, Syed Munaf; Abderrahman, Walid

    2008-01-01

    A multivariate statistical technique, cluster analysis, was used to assess the logged surface water quality at an irrigation project at Al-Fadhley, Eastern Province, Saudi Arabia. The principal idea behind using the technique was to utilize all available hydrochemical variables in the quality assessment including trace elements and other ions which are not considered in conventional techniques for water quality assessments like Stiff and Piper diagrams. Furthermore, the area belongs to an irrigation project where water contamination associated with the use of fertilizers, insecticides and pesticides is expected. This quality assessment study was carried out on a total of 34 surface/logged water samples. To gain a greater insight in terms of the seasonal variation of water quality, 17 samples were collected from both summer and winter seasons. The collected samples were analyzed for a total of 23 water quality parameters including pH, TDS, conductivity, alkalinity, sulfate, chloride, bicarbonate, nitrate, phosphate, bromide, fluoride, calcium, magnesium, sodium, potassium, arsenic, boron, copper, cobalt, iron, lithium, manganese, molybdenum, nickel, selenium, mercury and zinc. Cluster analysis in both Q and R modes was used. Q-mode analysis resulted in three distinct water types for both the summer and winter seasons. Q-mode analysis also showed the spatial as well as temporal variation in water quality. R-mode cluster analysis led to the conclusion that there are two major sources of contamination for the surface/shallow groundwater in the area: fertilizers, micronutrients, pesticides, and insecticides used in agricultural activities, and non-point natural sources.
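
    The Q-mode/R-mode split can be sketched directly with hierarchical clustering: Q-mode groups the standardized samples to suggest water types, while R-mode groups the parameters by correlation to suggest common sources. The Python sketch below uses SciPy on a synthetic hydrochemical matrix; the parameter list, cluster counts, and linkage choices are illustrative assumptions, not the study's settings.

      # Q-mode (samples) and R-mode (variables) hierarchical clustering of a
      # standardized hydrochemical matrix; synthetic values, illustrative settings.
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import pdist

      rng = np.random.default_rng(7)
      params = ["TDS", "Cl", "SO4", "NO3", "PO4", "Fe", "Mn", "Zn"]
      X = rng.lognormal(mean=1.0, sigma=0.6, size=(34, len(params)))   # 34 samples

      Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each parameter

      # Q-mode: group samples (Euclidean distance, Ward linkage) into water types.
      water_types = fcluster(linkage(pdist(Z), method="ward"), t=3, criterion="maxclust")
      print("samples per water type:", np.bincount(water_types)[1:])

      # R-mode: group parameters by correlation distance to suggest common sources.
      groups = fcluster(linkage(pdist(Z.T, metric="correlation"), method="average"),
                        t=2, criterion="maxclust")
      print(dict(zip(params, groups.tolist())))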

  3. Improved motors for utility applications: Volume 6, Squirrel-cage rotor analysis: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffith, J.W.; McCoy, R.M.

    1986-11-01

    An analysis of squirrel cage induction motor rotors was undertaken in response to an Industry Assessment Study finding 10% of motor failures to be rotor related. The analysis focuses on evaluating rotor design life. The evaluation combines state-of-the-art electromagnetic, thermal, and structural solution techniques into an integrated analysis and presents a simple summary. Finite element techniques are central tools in the analysis. The analysis is applied to a specific forced-draft fan drive design. Fans as a category of application have a higher failure rate than other categories of power station auxiliary motor applications. Forced-draft fan drives are one of the major fan drives which accelerate a relatively high value of rotor load inertia. Various starting and operating conditions are studied for this forced-draft fan drive motor including a representative application duty cycle.

  4. Modification and evaluation of a Barnes-type objective analysis scheme for surface meteorological data

    NASA Technical Reports Server (NTRS)

    Smith, D. R.

    1982-01-01

    The Purdue Regional Objective Analysis of the Mesoscale (PROAM) is a Barnes-type scheme for the analysis of surface meteorological data. Modifications are introduced to the original version in order to increase its flexibility and to permit greater ease of use. The code was rewritten for an interactive computer environment. Furthermore, a multiple-iteration technique suggested by Barnes was implemented for greater accuracy. PROAM was subjected to a series of experiments in order to evaluate its performance under a variety of analysis conditions. The tests include use of a known analytic temperature distribution in order to quantify error bounds for the scheme. Similar experiments were conducted using actual atmospheric data. Results indicate that the multiple-iteration technique increases the accuracy of the analysis. Furthermore, the tests verify appropriate values for the analysis parameters in resolving meso-beta scale phenomena.
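
    A minimal Barnes-type pass over scattered stations looks like the sketch below: a Gaussian-weighted first estimate on the grid, followed by a correction pass on the station residuals with a reduced length scale (the multiple-iteration refinement evaluated above). Station data are synthetic and the kappa/gamma values are generic choices, not the tuned PROAM parameters.

      # Minimal Barnes-type objective analysis with one correction pass.
      import numpy as np

      rng = np.random.default_rng(5)
      stations = rng.uniform(0, 100, size=(60, 2))                  # station x, y (km)
      obs = 20 + 0.05 * stations[:, 0] + rng.normal(0, 0.3, 60)     # observed values

      def weights(points, obs_xy, kappa):
          d2 = ((points[:, None, :] - obs_xy[None, :, :]) ** 2).sum(axis=2)
          return np.exp(-d2 / kappa)

      def barnes(points, obs_xy, obs_val, kappa=200.0, gamma=0.3):
          w0 = weights(points, obs_xy, kappa)
          first = (w0 @ obs_val) / w0.sum(axis=1)
          # Residuals at the stations, then a sharper correction pass on the grid.
          w_stn = weights(obs_xy, obs_xy, kappa)
          resid = obs_val - (w_stn @ obs_val) / w_stn.sum(axis=1)
          w1 = weights(points, obs_xy, gamma * kappa)
          return first + (w1 @ resid) / w1.sum(axis=1)

      gx, gy = np.meshgrid(np.linspace(0, 100, 21), np.linspace(0, 100, 21))
      grid = np.column_stack([gx.ravel(), gy.ravel()])
      analysis = barnes(grid, stations, obs).reshape(gx.shape)
      print("analysis range:", analysis.min().round(2), "to", analysis.max().round(2))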

  5. Digital Mapping Techniques '08—Workshop Proceedings, Moscow, Idaho, May 18–21, 2008

    USGS Publications Warehouse

    Soller, David R.

    2009-01-01

    The Digital Mapping Techniques '08 (DMT'08) workshop was attended by more than 100 technical experts from 40 agencies, universities, and private companies, including representatives from 24 State geological surveys. This year's meeting, the twelfth in the annual series, was hosted by the Idaho Geological Survey, from May 18-21, 2008, on the University of Idaho campus in Moscow, Idaho. Each DMT workshop has been coordinated by the U.S. Geological Survey's National Geologic Map Database Project and the Association of American State Geologists (AASG). As in previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.

  6. Archer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atzeni, Simone; Ahn, Dong; Gopalakrishnan, Ganesh

    2017-01-12

    Archer is built on top of the LLVM/Clang compilers that support OpenMP. It applies static and dynamic analysis techniques to detect data races in OpenMP programs while generating very low runtime and memory overhead. Static analyses identify data-race-free OpenMP regions and exclude them from runtime analysis, which is performed by ThreadSanitizer, included in LLVM/Clang.

  7. Static Analysis of Programming Exercises: Fairness, Usefulness and a Method for Application

    ERIC Educational Resources Information Center

    Nutbrown, Stephen; Higgins, Colin

    2016-01-01

    This article explores the suitability of static analysis techniques based on the abstract syntax tree (AST) for the automated assessment of early/mid degree level programming. Focus is on fairness, timeliness and consistency of grades and feedback. Following investigation into manual marking practises, including a survey of markers, the assessment…
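
    As a small illustration of the kind of AST-based feature extraction such assessment rests on, the Python sketch below parses a toy submission and reports structural features (function count, docstring coverage, loop nesting depth). The feature set and the embedded submission are hypothetical, not the article's marking rubric.

      # AST-based static analysis of a toy student submission.
      import ast
      import textwrap

      submission = textwrap.dedent('''
          def mean(values):
              """Return the arithmetic mean of a non-empty list."""
              total = 0
              for v in values:
                  total += v
              return total / len(values)
      ''')

      tree = ast.parse(submission)
      functions = [n for n in ast.walk(tree) if isinstance(n, ast.FunctionDef)]
      documented = [f for f in functions if ast.get_docstring(f)]

      def max_loop_depth(node, depth=0):
          is_loop = isinstance(node, (ast.For, ast.While))
          children = [max_loop_depth(c, depth + is_loop) for c in ast.iter_child_nodes(node)]
          return max([depth + is_loop] + children)

      print("functions defined:   ", len(functions))
      print("functions documented:", len(documented))
      print("max loop nesting:    ", max_loop_depth(tree))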

  8. Integrating Elemental Analysis and Chromatography Techniques by Analyzing Metal Oxide and Organic UV Absorbers in Commercial Sunscreens

    ERIC Educational Resources Information Center

    Quiñones, Rosalynn; Bayline, Jennifer Logan; Polvani, Deborah A.; Neff, David; Westfall, Tamara D.; Hijazi, Abdullah

    2016-01-01

    A series of undergraduate laboratory experiments that utilize reversed-phase HPLC separation, inductively coupled plasma spectroscopy (ICP), and scanning electron microscopy with energy dispersive spectroscopy (SEM-EDS) are described for the analysis of commercial sunscreens. The active ingredients of many sunscreen brands include zinc or titanium…

  9. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  10. A Selected Annotated Bibliography on the Analysis of Water Resource Systems.

    ERIC Educational Resources Information Center

    Gysi, Marshall; And Others

    Presented is an annotated bibliography of some selected publications pertaining to the application of systems analysis techniques to water resource problems. The majority of the references included in this bibliography have been published within the last five years. About half of the entries have informative abstracts and keywords following the…

  11. Market Analysis. What Is It? How Does It Fit into Comprehensive Institutional Planning?

    ERIC Educational Resources Information Center

    Groff, Warren

    The basic principles of market analysis are examined in this paper especially as they relate to institutional planning. Introductory material presents background information, including: (1) a description of two projects undertaken to implement modern management techniques at small colleges; (2) an examination of three marketing philosophies; and…

  12. Set of new draft methods for the analysis of organic disinfection by-products, including 551 and 552. Draft report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-01-01

    The set of documents discusses the new draft methods (EPA method 551, EPA method 552) for the analysis of disinfection byproducts contained in drinking water. The methods use the techniques of liquid/liquid extraction and gas chromatography with electron capture detection.

  13. Toward exploratory analysis of diversity unified across fields of study: an information visualization approach

    Treesearch

    Tuan Pham; Julia Jones; Ronald Metoyer; Frederick Colwell

    2014-01-01

    The study of the diversity of multivariate objects shares common characteristics and goals across disciplines, including ecology and organizational management. Nevertheless, subject-matter experts have adopted somewhat separate diversity concepts and analysis techniques, limiting the potential for sharing and comparing across disciplines. Moreover, while large and...

  14. Current trends in sample preparation for cosmetic analysis.

    PubMed

    Zhong, Zhixiong; Li, Gongke

    2017-01-01

    The widespread applications of cosmetics in modern life make their analysis particularly important from a safety point of view. There is a wide variety of restricted ingredients and prohibited substances that primarily influence the safety of cosmetics. Sample preparation for cosmetic analysis is a crucial step as the complex matrices may seriously interfere with the determination of target analytes. In this review, some new developments (2010-2016) in sample preparation techniques for cosmetic analysis, including liquid-phase microextraction, solid-phase microextraction, matrix solid-phase dispersion, pressurized liquid extraction, cloud point extraction, ultrasound-assisted extraction, and microwave digestion, are presented. Furthermore, the research and progress in sample preparation techniques and their applications in the separation and purification of allowed ingredients and prohibited substances are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. An image analysis of TLC patterns for quality control of saffron based on soil salinity effect: A strategy for data (pre)-processing.

    PubMed

    Sereshti, Hassan; Poursorkh, Zahra; Aliakbarzadeh, Ghazaleh; Zarre, Shahin; Ataolahi, Sahar

    2018-01-15

    Quality of saffron, a valuable food additive, could considerably affect consumers' health. In this work, a novel preprocessing strategy for image analysis of saffron thin-layer chromatographic (TLC) patterns was introduced. This includes performing a series of image pre-processing techniques on TLC images, such as compression, inversion, elimination of the general baseline (using asymmetric least squares (AsLS)), removal of spot shifts and concavity (by correlation optimized warping (COW)), and finally conversion to RGB chromatograms. Subsequently, an unsupervised multivariate data analysis including principal component analysis (PCA) and k-means clustering was utilized to investigate the effect of soil salinity, as a cultivation parameter, on saffron TLC patterns. This method was used as a rapid and simple technique to obtain chemical fingerprints of saffron TLC images. Finally, the separated TLC spots were chemically identified using high-performance liquid chromatography-diode array detection (HPLC-DAD). Accordingly, saffron quality from different areas of Iran was evaluated and classified. Copyright © 2017 Elsevier Ltd. All rights reserved.
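
    The AsLS baseline step mentioned above can be sketched in a few lines: iteratively fit a smooth curve by penalized least squares, down-weighting points that sit above the current baseline estimate so that peaks are ignored. The trace below is synthetic and the lam/p values are typical textbook choices, not the settings used for the saffron TLC images.

      # Asymmetric least squares (AsLS) baseline estimation on a synthetic trace.
      import numpy as np
      from scipy import sparse
      from scipy.sparse.linalg import spsolve

      def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
          L = y.size
          D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(L - 2, L))  # 2nd differences
          w = np.ones(L)
          for _ in range(n_iter):
              W = sparse.diags(w)
              z = spsolve((W + lam * D.T @ D).tocsc(), w * y)
              w = np.where(y > z, p, 1.0 - p)   # points above the baseline get low weight
          return z

      x = np.linspace(0, 1, 500)
      peaks = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((x - 0.7) / 0.02) ** 2)
      y = peaks + (0.5 + 0.8 * x) + 0.01 * np.random.default_rng(0).standard_normal(x.size)

      corrected = y - asls_baseline(y)
      print("max corrected peak height:", round(float(corrected.max()), 2))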

  16. Glyphosate analysis using sensors and electromigration separation techniques as alternatives to gas or liquid chromatography.

    PubMed

    Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin

    2018-01-01

    Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethyl phosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar, and depending on pH even ionic, functional groups and its lack of a chromophore, it is difficult to analyze with chromatographic techniques. Its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also on derivatization reactions. Its ability to form chelates with metal cations is another obstacle for precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.

  17. Review of Congenital Mitral Valve Stenosis: Analysis, Repair Techniques and Outcomes.

    PubMed

    Baird, Christopher W; Marx, Gerald R; Borisuk, Michele; Emani, Sitram; del Nido, Pedro J

    2015-06-01

    The spectrum of congenital mitral valve stenosis (MS) consists of a complex of defects that result in obstruction to left ventricular inflow. This spectrum ranges from patients with underdeveloped left heart structures (Fig. 1) to those with isolated congenital MS. The specific mitral valve defects can further be divided into categories based on their relationship to the mitral valve annulus, including valvar, supravalvar and subvalvar components. Clinically, these patients present based on the degree of obstruction, associated mitral regurgitation, secondary pulmonary hypertension, associated lung disease and/or associated cardiac lesions. There are a number of factors that contribute to successful outcomes in these patients, including pre-operative imaging, aggressive surgical techniques and peri-operative management.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.

    The application of statistics to environmental pollution monitoring studies requires a knowledge of statistical analysis methods particularly well suited to pollution data. This book fills that need by providing sampling plans, statistical tests, parameter estimation procedures, and references to pertinent publications. Most of the statistical techniques are relatively simple, and examples, exercises, and case studies are provided to illustrate procedures. The book is logically divided into three parts. Chapters 1, 2, and 3 are introductory chapters. Chapters 4 through 10 discuss field sampling designs and Chapters 11 through 18 deal with a broad range of statistical analysis procedures. Some statistical techniques given here are not commonly seen in statistics books. For example, see methods for handling correlated data (Sections 4.5 and 11.12), for detecting hot spots (Chapter 10), and for estimating a confidence interval for the mean of a lognormal distribution (Section 13.2). Also, Appendix B lists a computer code that estimates and tests for trends over time at one or more monitoring stations using nonparametric methods (Chapters 16 and 17). Unfortunately, some important topics could not be included because of their complexity and the need to limit the length of the book. For example, only brief mention could be made of time series analysis using Box-Jenkins methods and of kriging techniques for estimating spatial and spatial-time patterns of pollution, although multiple references on these topics are provided. Also, no discussion of methods for assessing risks from environmental pollution could be included.
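
    The nonparametric trend tests referenced above (Chapters 16 and 17 and the Appendix B code) are not reproduced in this record; as an illustration of the kind of method involved, the Python sketch below implements a basic Mann-Kendall trend test for a single monitoring station, ignoring ties and using invented yearly means.

      # Hedged sketch of a Mann-Kendall trend test, a standard nonparametric
      # method for detecting a monotonic trend at a monitoring station.
      # Ties are ignored for brevity; the example data are invented.
      import math

      def mann_kendall(x):
          n = len(x)
          # S statistic: concordant minus discordant pairs
          s = sum(
              (x[j] > x[i]) - (x[j] < x[i])
              for i in range(n - 1)
              for j in range(i + 1, n)
          )
          var_s = n * (n - 1) * (2 * n + 5) / 18.0   # variance without tie correction
          if s > 0:
              z = (s - 1) / math.sqrt(var_s)
          elif s < 0:
              z = (s + 1) / math.sqrt(var_s)
          else:
              z = 0.0
          # two-sided p-value from the normal approximation
          p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
          return s, z, p

      concentrations = [3.1, 3.4, 3.3, 3.9, 4.2, 4.0, 4.6, 4.8]  # yearly means (invented)
      print(mann_kendall(concentrations))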

  19. Characterization of Metal Powders Used for Additive Manufacturing.

    PubMed

    Slotwinski, J A; Garboczi, E J; Stutzman, P E; Ferraris, C F; Watson, S S; Peltz, M A

    2014-01-01

    Additive manufacturing (AM) techniques can produce complex, high-value metal parts, with potential applications as critical parts, such as those found in aerospace components. The production of AM parts with consistent and predictable properties requires input materials (e.g., metal powders) with known and repeatable characteristics, which in turn requires standardized measurement methods for powder properties. First, based on our previous work, we assess the applicability of current standardized methods for powder characterization for metal AM powders. Then we present the results of systematic studies carried out on two different powder materials used for additive manufacturing: stainless steel and cobalt-chrome. The characterization of these powders is important in NIST efforts to develop appropriate measurements and standards for additive materials and to document the property of powders used in a NIST-led additive manufacturing material round robin. An extensive array of characterization techniques was applied to these two powders, in both virgin and recycled states. The physical techniques included laser diffraction particle size analysis, X-ray computed tomography for size and shape analysis, and optical and scanning electron microscopy. Techniques sensitive to structure and chemistry, including X-ray diffraction, energy dispersive analytical X-ray analysis using the X-rays generated during scanning electron microscopy, and X-Ray photoelectron spectroscopy were also employed. The results of these analyses show how virgin powder changes after being exposed to and recycled from one or more Direct Metal Laser Sintering (DMLS) additive manufacturing build cycles. In addition, these findings can give insight into the actual additive manufacturing process.

  20. A Novel Microcharacterization Technique in the Measurement of Strain and Orientation Gradient in Advanced Materials

    NASA Technical Reports Server (NTRS)

    Garmestai, H.; Harris, K.; Lourenco, L.

    1997-01-01

    Representation of morphology and evolution of the microstructure during processing and their relation to properties requires proper experimental techniques. Residual strains, lattice distortion, and texture (micro-texture) at the interface and the matrix of a layered structure or a functionally gradient material and their variation are among parameters important in materials characterization but hard to measure with present experimental techniques. Current techniques available to measure changes in internal material parameters (residual stress, micro-texture, microplasticity) produce results which are either qualitative or unreliable. This problem becomes even more complicated in the case of a temperature variation. These parameters affect many of the mechanical properties of advanced materials including stress-strain relation, ductility, creep, and fatigue. A review of some novel experimental techniques using recent advances in electron microscopy is presented here to measure internal stress, (micro)texture, interfacial strength and (sub)grain formation and realignment. Two of these techniques are combined in the chamber of an Environmental Scanning Electron Microscope to measure strain and orientation gradients in advanced materials. These techniques, which include Backscattered Kikuchi Diffractometry (BKD) and Microscopic Strain Field Analysis, are used to characterize metallic and intermetallic matrix composites and superplastic materials. These techniques are compared with the more conventional x-ray diffraction and indentation techniques.

  1. Plain Language to Communicate Physical Activity Information: A Website Content Analysis.

    PubMed

    Paige, Samantha R; Black, David R; Mattson, Marifran; Coster, Daniel C; Stellefson, Michael

    2018-04-01

    Plain language techniques are health literacy universal precautions intended to enhance health care system navigation and health outcomes. Physical activity (PA) is a popular topic on the Internet, yet it is unknown if information is communicated in plain language. This study examined how plain language techniques are included in PA websites, and if the use of plain language techniques varies according to search procedures (keyword, search engine) and website host source (government, commercial, educational/organizational). Three keywords ("physical activity," "fitness," and "exercise") were independently entered into three search engines (Google, Bing, and Yahoo) to locate a nonprobability sample of websites (N = 61). Fourteen plain language techniques were coded within each website to examine content formatting, clarity and conciseness, and multimedia use. Approximately half (M = 6.59; SD = 1.68) of the plain language techniques were included in each website. Keyword physical activity resulted in websites with fewer clear and concise plain language techniques (p < .05), whereas fitness resulted in websites with more clear and concise techniques (p < .01). Plain language techniques did not vary by search engine or the website host source. Accessing PA information that is easy to understand and behaviorally oriented may remain a challenge for users. Transdisciplinary collaborations are needed to optimize plain language techniques while communicating online PA information.

  2. Quantitative analysis of ribosome–mRNA complexes at different translation stages

    PubMed Central

    Shirokikh, Nikolay E.; Alkalaeva, Elena Z.; Vassilenko, Konstantin S.; Afonina, Zhanna A.; Alekhina, Olga M.; Kisselev, Lev L.; Spirin, Alexander S.

    2010-01-01

    Inhibition of primer extension by ribosome–mRNA complexes (toeprinting) is a proven and powerful technique for studying mechanisms of mRNA translation. Here we have assayed an advanced toeprinting approach that employs fluorescently labeled DNA primers, followed by capillary electrophoresis utilizing standard instruments for sequencing and fragment analysis. We demonstrate that this improved technique is not merely fast and cost-effective, but also brings the primer extension inhibition method up to the next level. The electrophoretic pattern of the primer extension reaction can be characterized with a precision unattainable by the common toeprint analysis utilizing radioactive isotopes. This method allows us to detect and quantify stable ribosomal complexes at all stages of translation, including initiation, elongation and termination, generated during the complete translation process in both the in vitro reconstituted translation system and the cell lysate. We also point out the unique advantages of this new methodology, including the ability to assay sites of the ribosomal complex assembly on several mRNA species in the same reaction mixture. PMID:19910372

  3. An introduction to autonomous control systems

    NASA Technical Reports Server (NTRS)

    Antsaklis, Panos J.; Passino, Kevin M.; Wang, S. J.

    1991-01-01

    The functions, characteristics, and benefits of autonomous control are outlined. An autonomous control functional architecture for future space vehicles that incorporates the concepts and characteristics described is presented. The controller is hierarchical, with an execution level (the lowest level), coordination level (middle level), and management and organization level (highest level). The general characteristics of the overall architecture, including those of the three levels, are explained, and an example to illustrate their functions is given. Mathematical models for autonomous systems, including 'logical' discrete event system models, are discussed. An approach to the quantitative, systematic modeling, analysis, and design of autonomous controllers is also discussed. It is a hybrid approach since it uses conventional analysis techniques based on difference and differential equations and new techniques for the analysis of systems described with a symbolic formalism such as finite automata. Some recent results from the areas of planning and expert systems, machine learning, artificial neural networks, and the area of restructurable controls are briefly outlined.

  4. Fluorescence fluctuation spectroscopy for clinical applications

    NASA Astrophysics Data System (ADS)

    Olson, Eben

    Fluorescence correlation spectroscopy (FCS) and the related techniques of brightness analysis have become standard tools in biological and biophysical research. By analyzing the statistics of fluorescence emitted from a restricted volume, a number of parameters including concentrations, diffusion coefficients and chemical reaction rates can be determined. The single-molecule sensitivity, spectral selectivity, small sample volume and non-perturbative measurement mechanism of FCS make it an excellent technique for the study of molecular interactions. However, its adoption outside of the research laboratory has been limited. Potential reasons for this include the cost and complexity of the required apparatus. In this work, the application of fluorescence fluctuation analysis to several clinical problems is considered. Optical designs for FCS instruments which reduce the cost and increase alignment tolerance are presented. Brightness analysis of heterogeneous systems, with application to the characterization of protein aggregates and multimer distributions, is considered. Methods for FCS-based assays of two clinically relevant proteins, von Willebrand factor and haptoglobin, are presented as well.
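
    The fluctuation analysis at the heart of FCS reduces to computing a normalized autocorrelation of the detected intensity. The Python sketch below shows that core step on a synthetic intensity trace; it is a minimal illustration, not the instrument software described in the work.

      # Minimal sketch of the fluctuation analysis underlying FCS: compute the
      # normalized autocorrelation G(tau) = <dF(t) dF(t+tau)> / <F>^2 of a
      # recorded intensity trace. The trace here is synthetic stand-in data.
      import numpy as np

      def fcs_autocorrelation(intensity, max_lag):
          f = np.asarray(intensity, dtype=float)
          mean = f.mean()
          df = f - mean
          g = np.empty(max_lag)
          for lag in range(1, max_lag + 1):
              g[lag - 1] = np.mean(df[:-lag] * df[lag:]) / mean**2
          return g

      rng = np.random.default_rng(1)
      trace = 100 + rng.normal(0, 10, size=100_000)   # stand-in photon-count trace
      g = fcs_autocorrelation(trace, max_lag=50)
      print(g[:5])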

  5. Optimization Techniques for Clustering,Connectivity, and Flow Problems in Complex Networks

    DTIC Science & Technology

    2012-10-01

    discrete optimization and for analysis of performance of algorithm portfolios; introducing a metaheuristic framework of variable objective search that ... The results of empirical evaluation of the proposed algorithm are also included. 1.3 Theoretical analysis of heuristics and designing new metaheuristic ... analysis of heuristics for inapproximable problems and designing new metaheuristic approaches for the problems of interest; (IV) Developing new models

  6. Advanced study of global oceanographic requirements for EOS A/B: Appendix volume

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Tables and graphs are presented for a review of oceanographic studies using satellite-borne instruments. The topics considered include sensor requirements, error analysis for wind determination from glitter pattern measurements, coverage frequency plots, ground station rise and set times, a technique for reduction and analysis of ocean spectral data, rationale for the selection of a 2 PM descending orbit, and a priority analysis.

  7. Comprehensive Flood Plain Studies Using Spatial Data Management Techniques.

    DTIC Science & Technology

    1978-06-01

    Hydrologic Engineering Center computer programs that forecast urban storm water quality and dynamic in-stream water quality response to waste ... determination. Water Quality: The water quality analysis planned for the pilot study includes urban storm water quality forecasting and in-stream ... analysis is performed under the direction of Tony Thomas, Chief, Research Branch, by Jess Abbott for storm water quality analysis, R. G. Willey for

  8. An Introduction to Benefit-Cost Analysis for Evaluating Public Expenditure Alternatives. Learning Packages in the Policy Sciences, PS-22.

    ERIC Educational Resources Information Center

    LaPlante, Josephine M.; Durham, Taylor R.

    A revised edition of PS-14, "An Introduction to Benefit-Cost Analysis for Evaluating Public Programs," presents concepts and techniques of benefit-cost analysis as tools that can be used to assist in deciding between alternatives. The goals of the new edition include teaching students to think about the possible benefits and costs of each…
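
    As a minimal numerical illustration of the benefit-cost reasoning the package teaches, the Python sketch below discounts invented benefit and cost streams for two hypothetical alternatives and compares their net present values and benefit-cost ratios; all figures, program names, and the 5% discount rate are made up for the example.

      # Hedged illustration of a basic benefit-cost calculation: discount each
      # year's benefits and costs to present value, then compare net present
      # value (NPV) and benefit-cost ratio across alternatives (invented data).
      def present_value(flows, rate):
          return sum(v / (1 + rate) ** t for t, v in enumerate(flows))

      alternatives = {
          # name: (annual benefits, annual costs) over a 5-year horizon
          "program A": ([0, 40, 40, 40, 40], [100, 5, 5, 5, 5]),
          "program B": ([0, 25, 30, 35, 40], [60, 10, 10, 10, 10]),
      }

      rate = 0.05
      for name, (benefits, costs) in alternatives.items():
          pv_b = present_value(benefits, rate)
          pv_c = present_value(costs, rate)
          print(f"{name}: NPV = {pv_b - pv_c:8.1f}, B/C ratio = {pv_b / pv_c:4.2f}")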

  9. Flight flutter testing technology at Grumman. [automated telemetry station for on line data reduction

    NASA Technical Reports Server (NTRS)

    Perangelo, H. J.; Milordi, F. W.

    1976-01-01

    Analysis techniques used in the automated telemetry station (ATS) for on-line data reduction are encompassed in a broad range of software programs. Concepts that form the basis for the algorithms used are mathematically described. The control available to the user when interfacing with the various on-line programs is discussed. The various programs are applied to an analysis of flight data which includes unimodal and bimodal response signals excited via a swept frequency shaker and/or random aerodynamic forces. A nonlinear response error modeling analysis approach is described. Preliminary results in the analysis of a hard spring nonlinear resonant system are also included.

  10. What If They Just Want To Write?

    ERIC Educational Resources Information Center

    Gilmar, Sybil

    1979-01-01

    Writing workshops are held for gifted students (7 to 15 years old) and include journalism, guidebook, and fiction work with critical analysis of each other's writing. Sample exercises and brainstorming techniques are discussed. (CL)

  11. A Marketing Case History Profile

    ERIC Educational Resources Information Center

    Weirick, Margaret C.

    1978-01-01

    A current marketing plan from Temple University illustrates many marketing techniques, including those dealing with enrollment objectives, market objectives, demographic characteristics of Temple students, market share analysis, and the marketing plan. Specific guidelines are provided. (LBH)

  12. Determination of Sulfur in Fuel Oils: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Graham, Richard C.; And Others

    1982-01-01

    Chromatographic techniques are used in conjunction with a Parr oxygen combustion bomb to determine sulfur in fuel oils. Experimental procedures and results are discussed including an emphasis on safety considerations. (SK)

  13. General Analytical Schemes for the Characterization of Pectin-Based Edible Gelled Systems

    PubMed Central

    Haghighi, Maryam; Rezaei, Karamatollah

    2012-01-01

    Pectin-based gelled systems have gained increasing attention for the design of newly developed food products. For this reason, the characterization of such formulas is a necessity in order to present scientific data and to introduce an appropriate finished product to the industry. Various analytical techniques are available for the evaluation of the systems formulated on the basis of pectin and the designed gel. In this paper, general analytical approaches for the characterization of pectin-based gelled systems were categorized into several subsections including physicochemical analysis, visual observation, textural/rheological measurement, microstructural image characterization, and psychorheological evaluation. Three-dimensional trials to assess correlations among microstructure, texture, and taste were also discussed. Practical examples of advanced objective techniques including experimental setups for small and large deformation rheological measurements and microstructural image analysis were presented in more details. PMID:22645484

  14. Image analysis software for following progression of peripheral neuropathy

    NASA Astrophysics Data System (ADS)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, which include diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.
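
    The linear regression analysis mentioned above relates an image-derived fiber measure to EMG-determined severity. A hedged Python sketch of that kind of regression is given below; the paired values are invented placeholders and SciPy is assumed to be available.

      # Sketch of the kind of linear regression mentioned above: regress an
      # image-derived nerve-fiber density against an EMG-derived severity
      # grade. The paired values are invented placeholders, not study data.
      from scipy import stats

      fiber_density = [28.0, 25.5, 22.1, 19.8, 15.2, 12.7, 10.3]   # fibers/mm^2 (invented)
      emg_severity = [0, 0, 1, 1, 2, 2, 3]                          # ordinal grade (invented)

      result = stats.linregress(emg_severity, fiber_density)
      print(f"slope={result.slope:.2f}, r^2={result.rvalue**2:.2f}, p={result.pvalue:.3f}")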

  15. Noise analysis for CCD-based ultraviolet and visible spectrophotometry.

    PubMed

    Davenport, John J; Hodgkinson, Jane; Saffell, John R; Tatam, Ralph P

    2015-09-20

    We present the results of a detailed analysis of the noise behavior of two CCD spectrometers in common use, an AvaSpec-3648 CCD UV spectrometer and an Ocean Optics S2000 Vis spectrometer. Light sources used include a deuterium UV/Vis lamp and UV and visible LEDs. Common noise phenomena include source fluctuation noise, photoresponse nonuniformity, dark current noise, fixed pattern noise, and read noise. These were identified and characterized by varying light source, spectrometer settings, or temperature. A number of noise-limiting techniques are proposed, demonstrating a best-case spectroscopic noise equivalent absorbance of 3.5×10⁻⁴ AU for the AvaSpec-3648 and 5.6×10⁻⁴ AU for the Ocean Optics S2000 over a 30 s integration period. These techniques can be used on other CCD spectrometers to optimize performance.
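
    One common way to arrive at a noise equivalent absorbance figure like those quoted is to convert repeated blank spectra to absorbance and take the pixel-wise standard deviation. The Python sketch below illustrates that calculation on synthetic CCD counts; it is a generic estimate, not the authors' exact procedure.

      # Sketch of one way to estimate a noise-equivalent absorbance (NEA):
      # record repeated spectra of the same blank, form A = -log10(I / I0)
      # against the reference, and take the standard deviation per pixel.
      # The synthetic intensities stand in for CCD counts.
      import numpy as np

      rng = np.random.default_rng(2)
      reference = 50_000 * np.ones(2048)                         # idealized blank spectrum
      repeats = reference + rng.normal(0, 60, size=(100, 2048))  # 100 noisy acquisitions

      absorbance = -np.log10(repeats / reference)   # should scatter around zero
      nea_per_pixel = absorbance.std(axis=0)
      print(f"median NEA over the detector: {np.median(nea_per_pixel):.2e} AU")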

  16. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, Synthetic Minority Over-sampling technique was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
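
    A hedged Python sketch of the configuration reported as optimal above (PCA against feature redundancy, SMOTE against class imbalance, and a random forest classifier) is given below; the feature matrix and labels are random stand-ins, and scikit-learn plus imbalanced-learn are assumed to be installed.

      # Sketch of the modelling configuration described above: PCA to reduce
      # redundant radiomic features, SMOTE to rebalance the minority endpoint,
      # and a random forest classifier, evaluated with cross-validation.
      # The features and labels are random stand-ins for real radiomic data.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score, StratifiedKFold
      from imblearn.over_sampling import SMOTE
      from imblearn.pipeline import Pipeline

      rng = np.random.default_rng(3)
      X = rng.normal(size=(112, 400))                  # 112 patients x 400 features (stand-in)
      y = rng.choice([0, 1], size=112, p=[0.8, 0.2])   # imbalanced outcome (stand-in)

      model = Pipeline([
          ("pca", PCA(n_components=10)),
          ("smote", SMOTE(random_state=0)),
          ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
      ])
      scores = cross_val_score(model, X, y, cv=StratifiedKFold(n_splits=5), scoring="roc_auc")
      print(f"mean AUC: {scores.mean():.2f}")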

  17. Including the effect of motion artifacts in noise and performance analysis of dual-energy contrast-enhanced mammography

    NASA Astrophysics Data System (ADS)

    Allec, N.; Abbaszadeh, S.; Scott, C. C.; Lewin, J. M.; Karim, K. S.

    2012-12-01

    In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined, however the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.

  18. Including the effect of motion artifacts in noise and performance analysis of dual-energy contrast-enhanced mammography.

    PubMed

    Allec, N; Abbaszadeh, S; Scott, C C; Lewin, J M; Karim, K S

    2012-12-21

    In contrast-enhanced mammography (CEM), the dual-energy dual-exposure technique, which can leverage existing conventional mammography infrastructure, relies on acquiring the low- and high-energy images using two separate exposures. The finite time between image acquisition leads to motion artifacts in the combined image. Motion artifacts can lead to greater anatomical noise in the combined image due to increased mismatch of the background tissue in the images to be combined, however the impact has not yet been quantified. In this study we investigate a method to include motion artifacts in the dual-energy noise and performance analysis. The motion artifacts are included via an extended cascaded systems model. To validate the model, noise power spectra of a previous dual-energy clinical study are compared to that of the model. The ideal observer detectability is used to quantify the effect of motion artifacts on tumor detectability. It was found that the detectability can be significantly degraded when motion is present (e.g., detectability of 2.5 mm radius tumor decreased by approximately a factor of 2 for translation motion on the order of 1000 μm). The method presented may be used for a more comprehensive theoretical noise and performance analysis and fairer theoretical performance comparison between dual-exposure techniques, where motion artifacts are present, and single-exposure techniques, where low- and high-energy images are acquired simultaneously and motion artifacts are absent.
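
    For illustration, the Python sketch below shows a generic dual-energy weighted log subtraction, the kind of low/high-energy image combination whose noise the paper models, with a crude pixel shift standing in for inter-exposure motion; the weight, the images, and the shift are invented, and this is not the authors' cascaded-systems model.

      # Illustrative sketch of a dual-energy combination step: a weighted log
      # subtraction of low- and high-energy images intended to cancel
      # background tissue contrast. A simple pixel shift of the low-energy
      # image mimics motion between the two exposures (all values invented).
      import numpy as np

      rng = np.random.default_rng(4)
      low = 1000 + 50 * rng.random((256, 256))    # stand-in low-energy image
      high = 800 + 40 * rng.random((256, 256))    # stand-in high-energy image

      w = 0.6                                     # hypothetical tissue-cancellation weight
      combined = np.log(high) - w * np.log(low)   # dual-energy image without motion

      low_shifted = np.roll(low, shift=2, axis=1)            # crude 2-pixel motion model
      combined_motion = np.log(high) - w * np.log(low_shifted)

      print("added variance from motion:",
            float(combined_motion.var() - combined.var()))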

  19. Higher-Order Spectral Analysis of F-18 Flight Flutter Data

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Dunn, Shane

    2005-01-01

    Royal Australian Air Force (RAAF) F/A-18 flight flutter test data is presented and analyzed using various techniques. The data includes high-quality measurements of forced responses and limit cycle oscillation (LCO) phenomena. Standard correlation and power spectral density (PSD) techniques are applied to the data and presented. Novel applications of experimentally-identified impulse responses and higher-order spectral techniques are also applied to the data and presented. The goal of this research is to develop methods that can identify the onset of nonlinear aeroelastic phenomena, such as LCO, during flutter testing.
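
    As an illustration of the standard PSD step applied to a measured response channel, the Python sketch below uses Welch's method from SciPy on a synthetic decaying oscillation; the sample rate and signal are assumptions, not flight data.

      # Sketch of a standard power spectral density (PSD) estimate of a flight
      # response channel via Welch's method. The 7 Hz decaying oscillation is
      # synthetic and merely stands in for a measured accelerometer trace.
      import numpy as np
      from scipy import signal

      fs = 200.0                                  # sample rate, Hz (assumed)
      t = np.arange(0, 30, 1 / fs)
      response = (np.exp(-0.05 * t) * np.sin(2 * np.pi * 7.0 * t)
                  + 0.1 * np.random.default_rng(5).normal(size=t.size))

      freqs, psd = signal.welch(response, fs=fs, nperseg=1024)
      print(f"dominant frequency: {freqs[np.argmax(psd)]:.2f} Hz")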

  20. Calibrating the ChemCam LIBS for Carbonate Minerals on Mars

    DOE R&D Accomplishments Database

    Wiens, Roger C.; Clegg, Samuel M.; Ollila, Ann M.; Barefield, James E.; Lanza, Nina; Newsom, Horton E.

    2009-01-01

    The ChemCam instrument suite on board the NASA Mars Science Laboratory (MSL) rover includes the first LIBS instrument for extraterrestrial applications. Here we examine carbonate minerals in a simulated martian environment using the LIDS technique in order to better understand the in situ signature of these materials on Mars. Both chemical composition and rock type are determined using multivariate analysis (MVA) techniques. Composition is confirmed using scanning electron microscopy (SEM) techniques. Our initial results suggest that ChemCam can recognize and differentiate between carbonate materials on Mars.

  1. Frontiers of Two-Dimensional Correlation Spectroscopy. Part 1. New concepts and noteworthy developments

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    A comprehensive survey review of new and noteworthy developments, which are advancing forward the frontiers in the field of 2D correlation spectroscopy during the last four years, is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy, a number of significant conceptual developments in the field, data pretreatment methods and other pertinent topics, as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, predicting 2D correlation spectra, manipulating and comparing 2D spectra, correlation strategy based on segmented data blocks, such as moving-window analysis, features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including the correction for physical effects, background and baseline subtraction, selection of reference spectrum, normalization and scaling of data, derivatives spectra and deconvolution technique, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, display schemes, such as color coded format, slice and power spectra, tabulation, and other schemes.
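
    The techniques surveyed build on the generalized 2D correlation computation, in which synchronous and asynchronous spectra are formed from a set of perturbation-dependent spectra, the latter via the Hilbert-Noda transformation matrix. A minimal Python sketch of that core computation on synthetic spectra follows.

      # Sketch of generalized 2D correlation: synchronous and asynchronous
      # spectra of mean-centered ("dynamic") spectra, the asynchronous part
      # using the Hilbert-Noda matrix. The input spectra are synthetic.
      import numpy as np

      def two_d_correlation(dynamic):
          """dynamic: (m perturbations) x (n spectral variables), mean-centered."""
          m = dynamic.shape[0]
          sync = dynamic.T @ dynamic / (m - 1)
          # Hilbert-Noda matrix: N[j, k] = 1 / (pi * (k - j)) for k != j, else 0
          j, k = np.meshgrid(np.arange(m), np.arange(m), indexing="ij")
          with np.errstate(divide="ignore"):
              noda = np.where(j == k, 0.0, 1.0 / (np.pi * (k - j)))
          async_ = dynamic.T @ noda @ dynamic / (m - 1)
          return sync, async_

      rng = np.random.default_rng(6)
      spectra = rng.normal(size=(9, 120))        # 9 perturbation steps x 120 wavenumbers
      dynamic = spectra - spectra.mean(axis=0)   # subtract the reference (mean) spectrum
      sync, async_ = two_d_correlation(dynamic)
      print(sync.shape, async_.shape)            # both (120, 120)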

  2. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
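
    Of the three techniques named above, control charting is the simplest to illustrate. The Python sketch below derives 3-sigma limits from an invented baseline period of thermocouple readings and flags later readings that fall outside them; it is a generic Shewhart-style chart, not the NDMAS implementation.

      # Sketch of the control-charting idea: derive 3-sigma limits from a
      # baseline period of thermocouple readings and flag later readings
      # outside them (a possible sign of thermocouple drift or failure).
      # The temperature series is invented.
      import numpy as np

      rng = np.random.default_rng(7)
      baseline = 1100 + rng.normal(0, 4, size=200)          # degrees C, stable period
      center, sigma = baseline.mean(), baseline.std(ddof=1)
      ucl, lcl = center + 3 * sigma, center - 3 * sigma

      new_readings = np.array([1102.0, 1099.5, 1121.0, 1103.2, 1078.4])
      for i, temp in enumerate(new_readings):
          flag = "OUT OF CONTROL" if not (lcl <= temp <= ucl) else "ok"
          print(f"reading {i}: {temp:7.1f} C  {flag}")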

  3. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for supporting drug approval, clinical protocol and guideline formulation, and decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. For any given clinical condition, many interventions are usually available on the market, and few of them have been compared in head-to-head studies. This scenario precludes conclusions from being drawn about the comparative profiles (e.g. efficacy and safety) of all interventions. The recent development and introduction of a new technique – usually referred to as network meta-analysis, indirect meta-analysis, multiple or mixed treatment comparisons – has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on statistical methods evolution, assumptions and steps for performing the analysis. PMID:28503228
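
    The simplest building block of the indirect evidence described above is the Bucher adjusted indirect comparison, in which two effects measured against a common comparator are combined. The Python sketch below works through that calculation with invented log odds ratios and standard errors.

      # Minimal sketch of the Bucher adjusted indirect comparison: given pooled
      # effects of A vs B and C vs B against a common comparator B, the
      # indirect A vs C effect is their difference and its variance is the sum
      # of the variances. All numbers below are invented.
      import math

      log_or_ab, se_ab = -0.40, 0.15   # A vs B (e.g., from a pairwise meta-analysis)
      log_or_cb, se_cb = -0.10, 0.20   # C vs B

      log_or_ac = log_or_ab - log_or_cb
      se_ac = math.sqrt(se_ab**2 + se_cb**2)
      ci_low = math.exp(log_or_ac - 1.96 * se_ac)
      ci_high = math.exp(log_or_ac + 1.96 * se_ac)
      print(f"indirect OR (A vs C): {math.exp(log_or_ac):.2f} "
            f"(95% CI {ci_low:.2f}-{ci_high:.2f})")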

  4. Evaluation of marginal and internal gaps of metal ceramic crowns obtained from conventional impressions and casting techniques with those obtained from digital techniques.

    PubMed

    Rai, Rathika; Kumar, S Arun; Prabhu, R; Govindan, Ranjani Thillai; Tanveer, Faiz Mohamed

    2017-01-01

    Accuracy in fit of cast metal restoration has always remained as one of the primary factors in determining the success of the restoration. A well-fitting restoration needs to be accurate both along its margin and with regard to its internal surface. The aim of the study is to evaluate the marginal fit of metal ceramic crowns obtained by conventional inlay casting wax pattern using conventional impression with the metal ceramic crowns obtained by computer-aided design and computer-aided manufacturing (CAD/CAM) technique using direct and indirect optical scanning. This in vitro study on preformed custom-made stainless steel models with a former assembly that resembles prepared tooth surfaces of standardized dimensions comprised three groups: the first group included ten samples of metal ceramic crowns fabricated with conventional technique, the second group included CAD/CAM-milled direct metal laser sintering (DMLS) crowns using indirect scanning, and the third group included DMLS crowns fabricated by direct scanning of the stainless steel model. The vertical marginal gap and the internal gap were evaluated with the stereomicroscope (Zoomstar 4); post hoc Tukey's test was used for statistical analysis. One-way analysis of variance method was used to compare the mean values. Metal ceramic crowns obtained from direct optical scanning showed the least marginal and internal gap when compared to the castings obtained from inlay casting wax and indirect optical scanning. Indirect and direct optical scanning had yielded results within clinically acceptable range.

  5. Common side closure type, but not stapler brand or oversewing, influences side-to-side anastomotic leak rates.

    PubMed

    Fleetwood, V A; Gross, K N; Alex, G C; Cortina, C S; Smolevitz, J B; Sarvepalli, S; Bakhsh, S R; Poirier, J; Myers, J A; Singer, M A; Orkin, B A

    2017-03-01

    Anastomotic leak (AL) increases costs and cancer recurrence. Studies show decreased AL with side-to-side stapled anastomosis (SSA), but none identify risk factors within SSAs. We hypothesized that stapler characteristics and closure technique of the common enterotomy affect AL rates. Retrospective review of bowel SSAs was performed. Data included stapler brand, staple line oversewing, and closure method (handsewn, HC; linear stapler [Barcelona technique], BT; transverse stapler, TX). Primary endpoint was AL. Statistical analysis included Fisher's test and logistic regression. 463 patients were identified, 58.5% BT, 21.2% HC, and 20.3% TX. Covidien staplers comprised 74.9%, Ethicon 18.1%. There were no differences between stapler types (Covidien 5.8%, Ethicon 6.0%). However, AL rates varied by common side closure (BT 3.7% vs. TX 10.6%, p = 0.017), remaining significant on multivariate analysis. Closure method of the common side impacts AL rates. Barcelona technique has fewer leaks than transverse stapled closure. Further prospective evaluation is recommended. Copyright © 2017. Published by Elsevier Inc.

  6. Mild cognitive impairment and fMRI studies of brain functional connectivity: the state of the art

    PubMed Central

    Farràs-Permanyer, Laia; Guàrdia-Olmos, Joan; Peró-Cebollero, Maribel

    2015-01-01

    In the last 15 years, many articles have studied brain connectivity in Mild Cognitive Impairment patients with fMRI techniques, seemingly using different connectivity statistical models in each investigation to identify complex connectivity structures so as to recognize typical behavior in this type of patient. This diversity in statistical approaches may cause problems in results comparison. This paper seeks to describe how researchers approached the study of brain connectivity in MCI patients using fMRI techniques from 2002 to 2014. The focus is on the statistical analysis proposed by each research group in reference to the limitations and possibilities of those techniques to identify some recommendations to improve the study of functional connectivity. The included articles came from a search of Web of Science and PsycINFO using the following keywords: fMRI, MCI, and functional connectivity. Eighty-one papers were found, but two of them were discarded because of the lack of statistical analysis. Accordingly, 79 articles were included in this review. We summarized some parts of the articles, including the goal of every investigation, the cognitive paradigm and methods used, brain regions involved, and use of ROI analysis and statistical analysis, with emphasis on the connectivity estimation model used in each investigation. The present analysis allowed us to confirm the remarkable variability of the statistical analysis methods found. Additionally, the study of brain connectivity in this type of population is not providing, at the moment, any significant information or results related to clinical aspects relevant for prediction and treatment. We propose following guidelines for publishing fMRI data, which would be a good solution to the problem of study replication. The latter aspect could be important for future publications because a higher homogeneity would benefit the comparison between publications and the generalization of results. PMID:26300802

  7. The relationship between foot posture and plantar pressure during walking in adults: A systematic review.

    PubMed

    Buldt, Andrew K; Allan, Jamie J; Landorf, Karl B; Menz, Hylton B

    2018-02-23

    Foot posture is a risk factor for some lower limb injuries, however the underlying mechanism is not well understood. Plantar pressure analysis is one technique to investigate the interaction between foot posture and biomechanical function of the lower limb. The aim of this review was to investigate the relationship between foot posture and plantar pressure during walking. A systematic database search was conducted using MEDLINE, CINAHL, SPORTDiscus and Embase to identify studies that have assessed the relationship between foot posture and plantar pressure during walking. Included studies were assessed for methodological quality. Meta-analysis was not conducted due to heterogeneity between studies. Inconsistencies included foot posture classification techniques, gait analysis protocols, selection of plantar pressure parameters and statistical analysis approaches. Of the 4213 citations identified for title and abstract review, sixteen studies were included and underwent quality assessment; all were of moderate methodological quality. There was some evidence that planus feet display higher peak pressure, pressure-time integral, maximum force, force-time integral and contact area predominantly in the medial arch, central forefoot and hallux, while these variables are lower in the lateral and medial forefoot. In contrast, cavus feet display higher peak pressure and pressure-time integral in the heel and lateral forefoot, while pressure-time integral, maximum force, force-time integral and contact area are lower for the midfoot and hallux. Centre of pressure was more laterally deviated in cavus feet and more medially deviated in planus feet. Overall, effect sizes were moderate, but regression models could only explain a small amount of variance in plantar pressure variables. Despite these significant findings, future research would benefit from greater methodological rigour, particularly in relation to the use of valid foot posture measurement techniques, gait analysis protocols, and standardised approaches for analysis and reporting of plantar pressure variables. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Vapor phase diamond growth technology

    NASA Technical Reports Server (NTRS)

    Angus, J. C.

    1981-01-01

    Ion beam deposition chambers used for carbon film generation were designed and constructed. Features of the developed equipment include: (1) carbon ion energies down to approx. 50 eV; (2) in situ surface monitoring with HEED; (3) provision for flooding the surface with ultraviolet radiation; (4) infrared laser heating of substrate; (5) residual gas monitoring; (6) provision for several source gases, including diborane for doping studies; and (7) growth from either hydrocarbon source gases or from carbon/argon arc sources. Various analytical techniques for characterization of the ion-deposited carbon films, used to establish the nature of the chemical bonding and crystallographic structure of the films, are discussed. These include: H2SO4/HNO3 etch; resistance measurements; hardness tests; Fourier transform infrared spectroscopy; scanning auger microscopy; electron spectroscopy for chemical analysis; electron diffraction and energy dispersive X-ray analysis; electron energy loss spectroscopy; density measurements; secondary ion mass spectroscopy; high energy electron diffraction; and electron spin resonance. Results of the tests are summarized.

  9. Fracture control methods for space vehicles. Volume 2: Assessment of fracture mechanics technology for space shuttle applications

    NASA Technical Reports Server (NTRS)

    Ehret, R. M.

    1974-01-01

    The concepts explored in a state-of-the-art review of the engineering fracture mechanics methods considered most applicable to the space shuttle vehicle include fracture toughness, precritical flaw growth, failure mechanisms, inspection methods (including proof test logic), and crack growth predictive analysis techniques.

  10. Detection of monoclonal immunoglobulin heavy chain gene rearrangement (FR3) in Thai malignant lymphoma by High Resolution Melting curve analysis.

    PubMed

    Kummalue, Tanawan; Chuphrom, Anchalee; Sukpanichanant, Sanya; Pongpruttipan, Tawatchai; Sukpanichanant, Sathien

    2010-05-19

    Malignant lymphoma, especially non-Hodgkin lymphoma, is one of the most common hematologic malignancies in Thailand. The diagnosis of malignant lymphoma is often problematic, especially in early stages of the disease. Detection of antigen receptor gene rearrangement including T cell receptor (TCR) and immunoglobulin heavy chain (IgH) by polymerase chain reaction followed by heteroduplex analysis has currently become standard, whereas fluorescent fragment analysis (GeneScan) has been used as a confirmation test. In this study, three techniques were compared: thermocycler polymerase chain reaction (PCR) followed by heteroduplex analysis and polyacrylamide gel electrophoresis, GeneScan analysis, and real time PCR with High Resolution Melting curve analysis (HRM). The comparison was carried out with DNA extracted from paraffin-embedded tissues diagnosed as B-cell non-Hodgkin lymphoma. Specific PCR primer sequences for IgH gene variable region 3, including fluorescence-labeled IgH primers, were used and results were compared with HRM. In conclusion, the detection of IgH gene rearrangement by HRM in the LightCycler System showed potential for distinguishing monoclonality from polyclonality in B-cell non-Hodgkin lymphoma. Malignant lymphoma, especially non-Hodgkin lymphoma, is one of the most common hematologic malignancies in Thailand. The incidence rate as reported by the Ministry of Public Health is 3.1 per 100,000 population in females whereas the rate in males is 4.5 per 100,000 population [1]. At Siriraj Hospital, the new cases diagnosed as malignant lymphoma were 214.6 cases/year [2]. The diagnosis of malignant lymphoma is often problematic, especially in early stages of the disease. Therefore, detection of antigen receptor gene rearrangement including T cell receptor (TCR) and immunoglobulin heavy chain (IgH) by polymerase chain reaction (PCR) assay has recently become a standard laboratory test for discrimination of reactive from malignant clonal lymphoproliferation [3,4]. Analyzing DNA extracted from formalin-fixed, paraffin-embedded tissues by multiplex PCR techniques is more rapid, accurate and highly sensitive. Measuring the size of the amplicon from PCR analysis could be used to diagnose malignant lymphoma, with a monoclonal pattern showing specific and distinct bands detected on acrylamide gel electrophoresis. However, this technique has some limitations and some patients might require a further confirmation test such as GeneScan or fragment analysis [5,6]. The GeneScan technique, or fragment analysis, reflects the size and peak of DNA by using capillary gel electrophoresis. This technique is highly sensitive and can detect 0.5-1% of clonal lymphoid cells. It measures the amplicons by using various fluorescently labeled primers at the forward or reverse side and a specific size standard. Using a Genetic Analyzer machine and GeneMapper software (Applied Bioscience, USA), the monoclonal pattern revealed one single, sharp and high peak at the specific size corresponding to the acrylamide gel pattern, whereas the polyclonal pattern showed multiple small peaks condensed at the same size standard. This technique is the most sensitive and accurate technique; however, it usually requires high technical experience and is also of high cost [7]. Therefore, rapid and more cost-effective techniques are being sought. LightCycler PCR performs the diagnostic detection of amplicons via melting curve analysis within 2 hours with the use of a specific dye [8,9]. This dye consists of two types: one known as SYBR-Green I, which is non-specific, and the other used for High Resolution Melting analysis (HRM), which is highly sensitive, more accurate and stable. Several reports demonstrated that this new instrument combined with DNA intercalating dyes can be used to discriminate sequence changes in a PCR amplicon without manual handling of the PCR product [10,11]. Therefore, current investigations using melting curve analysis are being developed [12,13]. In this study, three different techniques were compared to evaluate the suitability of LightCycler PCR with HRM as the clonal diagnostic tool for IgH gene rearrangement in B-cell non-Hodgkin lymphoma, i.e. thermocycler PCR followed by heteroduplex analysis and PAGE, GeneScan analysis, and LightCycler PCR with HRM.

  11. Surgical management of gynecomastia: an outcome analysis.

    PubMed

    Kasielska, Anna; Antoszewski, Bogusław

    2013-11-01

    The aim of the study was to evaluate the surgical management of gynecomastia focusing on techniques, complications, and aesthetic results. The authors also proposed an evaluation scale of the cosmetic results after the treatment. We conducted a retrospective analysis of 113 patients undergoing the surgery for gynecomastia in our department. Preoperative clinical evaluation included the grade of gynecomastia, its etiology, and side, whereas postoperative analysis concerned histologic findings, complications, and cosmetic results. Operative techniques included subcutaneous mastectomy through circumareolar approach in 94 patients, subcutaneous mastectomy with skin excision in 9 patients, inverted-T reduction mastopexy with nipple-areola complex (NAC) transposition in 6 subjects, and breast amputation through inframammary fold approach with free transplantation of NAC in 4 cases. Complications occurred in a total of 25 patients and did not differ statistically within Simon stages. The operative technique appeared to be the crucial determinant of good aesthetic outcome. The postoperative result of shape and symmetry of the NAC was not as satisfactory as postoperative breast size and symmetry. We showed that subcutaneous mastectomy using a circumareolar incision without additional liposuction provides a good or very good aesthetic outcome in patients with Simon grades I to IIa gynecomastia and that it is challenging to achieve a very good or even a good aesthetic outcome in patients with Simon grades IIb to III gynecomastia.

  12. Does the piezoelectric surgical technique produce fewer postoperative sequelae after lower third molar surgery than conventional rotary instruments? A systematic review and meta-analysis.

    PubMed

    Al-Moraissi, E A; Elmansi, Y A; Al-Sharaee, Y A; Alrmali, A E; Alkhutari, A S

    2016-03-01

    A systematic review and meta-analysis was conducted to answer the clinical question "Does the piezoelectric surgical technique produce fewer postoperative sequelae after lower third molar surgery than conventional rotary instruments?" A systematic and electronic search of several databases with specific key words, a reference search, and a manual search were performed from respective dates of inception through November 2014. The inclusion criteria were clinical human studies, including randomized controlled trials (RCTs), controlled clinical trials (CCTs), and retrospective studies, with the aim of comparing the piezoelectric surgical osteotomy technique to the standard rotary instrument technique in lower third molar surgery. Postoperative sequelae (oedema, trismus, and pain), the total number of analgesics taken, and the duration of surgery were analyzed. A total of nine articles were included, six RCTs, two CCTs, and one retrospective study. Six studies had a low risk of bias and three had a moderate risk of bias. A statistically significant difference was found between piezoelectric surgery and conventional rotary instrument surgery for lower third molar extraction with regard to postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken (P=0.0001, P=0.0001, P<0.00001, and P<0.0001, respectively). However, a statistically significant increased surgery time was required in the piezoelectric osteotomy group (P<0.00001). The results of the meta-analysis showed that piezoelectric surgery significantly reduced the occurrence of postoperative sequelae (oedema, trismus, and pain) and the total number of analgesics taken compared to the conventional rotary instrument technique in lower third molar surgery, but required a longer surgery time. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Structure identification methods for atomistic simulations of crystalline materials

    DOE PAGES

    Stukowski, Alexander

    2012-05-28

    Here, we discuss existing and new computational analysis techniques to classify local atomic arrangements in large-scale atomistic computer simulations of crystalline solids. This article includes a performance comparison of typical analysis algorithms such as common neighbor analysis (CNA), centrosymmetry analysis, bond angle analysis, bond order analysis and Voronoi analysis. In addition we propose a simple extension to the CNA method that makes it suitable for multi-phase systems. Finally, we introduce a new structure identification algorithm, the neighbor distance analysis, which is designed to identify atomic structure units in grain boundaries.
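
    One of the metrics compared in the article, the centrosymmetry parameter, is easy to state: for the N nearest-neighbor vectors of an atom, sum the N/2 smallest values of |r_i + r_j|^2. The Python sketch below evaluates it for an ideal FCC first shell, where it vanishes; the lattice scale is arbitrary and the implementation is a simplified illustration, not the article's code.

      # Sketch of the centrosymmetry parameter: sum the N/2 smallest pairwise
      # values of |r_i + r_j|^2 over an atom's N nearest-neighbor vectors. It
      # vanishes in a perfect centrosymmetric environment and grows near
      # defects. The vectors below form an ideal FCC first shell (12 neighbors).
      import itertools
      import numpy as np

      def centrosymmetry(neighbor_vectors):
          pair_sums = sorted(
              float(np.dot(ri + rj, ri + rj))
              for ri, rj in itertools.combinations(neighbor_vectors, 2)
          )
          n_half = len(neighbor_vectors) // 2
          return sum(pair_sums[:n_half])

      a = 1.0  # arbitrary length scale
      fcc_shell = [np.array(v) * a for v in
                   [(1, 1, 0), (-1, -1, 0), (1, -1, 0), (-1, 1, 0),
                    (1, 0, 1), (-1, 0, -1), (1, 0, -1), (-1, 0, 1),
                    (0, 1, 1), (0, -1, -1), (0, 1, -1), (0, -1, 1)]]
      print(centrosymmetry(fcc_shell))   # ~0 for the perfect lattice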

  14. Anaesthesia Management for Awake Craniotomy: Systematic Review and Meta-Analysis

    PubMed Central

    Rossaint, Rolf; Veldeman, Michael

    2016-01-01

    Background Awake craniotomy (AC) renders an expanded role in functional neurosurgery. Yet, evidence for optimal anaesthesia management remains limited. We aimed to summarise the latest clinical evidence of AC anaesthesia management and explore the relationship of AC failures on the used anaesthesia techniques. Methods Two authors performed independently a systematic search of English articles in PubMed and EMBASE database 1/2007-12/2015. Search included randomised controlled trials (RCTs), observational trials, and case reports (n>4 cases), which reported anaesthetic approach for AC and at least one of our pre-specified outcomes: intraoperative seizures, hypoxia, arterial hypertension, nausea and vomiting, neurological dysfunction, conversion into general anaesthesia and failure of AC. Random effects meta-analysis was used to estimate event rates for four outcomes. Relationship with anaesthesia technique was explored using logistic meta-regression, calculating the odds ratios (OR) and 95% confidence intervals [95%CI]. Results We have included forty-seven studies. Eighteen reported asleep-awake-asleep technique (SAS), twenty-seven monitored anaesthesia care (MAC), one reported both and one used the awake-awake-awake technique (AAA). Proportions of AC failures, intraoperative seizures, new neurological dysfunction and conversion into general anaesthesia (GA) were 2% [95%CI:1–3], 8% [95%CI:6–11], 17% [95%CI:12–23] and 2% [95%CI:2–3], respectively. Meta-regression of SAS and MAC technique did not reveal any relevant differences between outcomes explained by the technique, except for conversion into GA. Estimated OR comparing SAS to MAC for AC failures was 0.98 [95%CI:0.36–2.69], 1.01 [95%CI:0.52–1.88] for seizures, 1.66 [95%CI:1.35–3.70] for new neurological dysfunction and 2.17 [95%CI:1.22–3.85] for conversion into GA. The latter result has to be interpreted cautiously. It is based on one retrospective high-risk of bias study and significance was abolished in a sensitivity analysis of only prospectively conducted studies. Conclusion SAS and MAC techniques were feasible and safe, whereas data for AAA technique are limited. Large RCTs are required to prove superiority of one anaesthetic regime for AC. PMID:27228013

  15. Anaesthesia Management for Awake Craniotomy: Systematic Review and Meta-Analysis.

    PubMed

    Stevanovic, Ana; Rossaint, Rolf; Veldeman, Michael; Bilotta, Federico; Coburn, Mark

    2016-01-01

    Awake craniotomy (AC) plays an expanding role in functional neurosurgery. Yet, evidence for optimal anaesthesia management remains limited. We aimed to summarise the latest clinical evidence on anaesthesia management for AC and to explore the relationship between AC failures and the anaesthesia technique used. Two authors independently performed a systematic search of English-language articles in the PubMed and EMBASE databases (1/2007-12/2015). The search included randomised controlled trials (RCTs), observational trials, and case reports (n>4 cases) that reported the anaesthetic approach for AC and at least one of our pre-specified outcomes: intraoperative seizures, hypoxia, arterial hypertension, nausea and vomiting, neurological dysfunction, conversion into general anaesthesia, and failure of AC. Random effects meta-analysis was used to estimate event rates for four outcomes. The relationship with anaesthesia technique was explored using logistic meta-regression, calculating odds ratios (OR) and 95% confidence intervals [95%CI]. We included forty-seven studies. Eighteen reported the asleep-awake-asleep technique (SAS), twenty-seven reported monitored anaesthesia care (MAC), one reported both, and one used the awake-awake-awake technique (AAA). Proportions of AC failures, intraoperative seizures, new neurological dysfunction, and conversion into general anaesthesia (GA) were 2% [95%CI:1-3], 8% [95%CI:6-11], 17% [95%CI:12-23], and 2% [95%CI:2-3], respectively. Meta-regression did not reveal any relevant differences in outcomes explained by the choice of SAS or MAC, except for conversion into GA. The estimated ORs comparing SAS to MAC were 0.98 [95%CI:0.36-2.69] for AC failures, 1.01 [95%CI:0.52-1.88] for seizures, 1.66 [95%CI:1.35-3.70] for new neurological dysfunction, and 2.17 [95%CI:1.22-3.85] for conversion into GA. The latter result has to be interpreted cautiously: it is based on one retrospective study at high risk of bias, and significance was lost in a sensitivity analysis restricted to prospectively conducted studies. The SAS and MAC techniques were feasible and safe, whereas data for the AAA technique are limited. Large RCTs are required to establish the superiority of one anaesthetic regimen for AC.

  16. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached by which protective action must begin. In keeping with the nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during a design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both design basis events and beyond design basis events. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are scarce. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test, while the analysis technique has the drawback of yielding extreme times that are not actually possible. Thus, a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, analyzing the total response time for the trip parameter, and demonstrating that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, performing the response time test, and demonstrating that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers the instrument channel from the sensor to the final actuation device. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device. As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
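
    At its core, the analysis half of the combined technique amounts to allocating a response time to every component on the critical signal path and confirming that the analyzed total stays within the analytical requirement. The sketch below illustrates only that bookkeeping step; the component names and times are hypothetical and are not plant data.

```python
# Minimal sketch of the response time analysis step described above: allocate a
# response time to each component on the critical signal path and verify that
# the analyzed total does not exceed the requirement from the safety analysis.
# Component names and times are hypothetical, not plant data.

signal_path_ms = {
    "pressure_transmitter":  300.0,
    "signal_conditioning":    50.0,
    "bistable_processor":    100.0,
    "coincidence_logic":      50.0,
    "final_actuation_relay":  80.0,
}

# Analytical response time assumed in the safety analysis (hypothetical value).
requirement_ms = 650.0

total_ms = sum(signal_path_ms.values())
margin_ms = requirement_ms - total_ms
print(f"analyzed total: {total_ms:.0f} ms, requirement: {requirement_ms:.0f} ms, margin: {margin_ms:.0f} ms")
assert total_ms <= requirement_ms, "allocated response times exceed the requirement"
```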

  17. Robotic-assisted laparoscopic repair of ureteral injury: an evidence-based review of techniques and outcomes.

    PubMed

    Tracey, Andrew T; Eun, Daniel D; Stifelman, Michael D; Hemal, Ashok K; Stein, Robert J; Mottrie, Alexandre; Cadeddu, Jeffrey A; Stolzenburg, J Uwe; Berger, Andre K; Buffi, Niccolò; Zhao, Lee C; Lee, Ziho; Hampton, Lance; Porpiglia, Francesco; Autorino, Riccardo

    2018-06-01

    Iatrogenic ureteral injuries represent a common surgical problem encountered by practicing urologists. With the rapidly expanding applications of robotic-assisted laparoscopic surgery, ureteral reconstruction has been an important field of recent advancement. This collaborative review sought to provide an evidence-based analysis of the latest surgical techniques and outcomes for robotic-assisted repair of ureteral injury. A systematic review of the literature up to December 2017 using PubMed/Medline was performed to identify relevant articles. Studies included in the systematic review were selected according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) criteria. Additionally, expert opinions were included from study authors in order to critique outcomes and elaborate on surgical techniques. A cumulative outcome analysis of comparative studies on robotic versus open ureteral repair was conducted. Thirteen case series have demonstrated the feasibility, safety, and success of robotic ureteral reconstruction. The surgical planning, timing of intervention, and various robotic reconstructive techniques need to be tailored to the specific case, depending on the location and length of the injury. Fluorescence imaging can represent a useful tool in this setting. Recently, three studies have shown the feasibility and technical success of robotic buccal mucosa grafting for ureteral repair. Soon, additional novel and experimental robotic reconstructive approaches might become available. The cumulative analysis of the three available comparative studies on robotic versus open ureteral repair showed no difference in operative time or complication rate, with a decreased blood loss and hospital length of stay favoring the robotic approach. Current evidence suggests that the robotic surgical platform facilitates complex ureteral reconstruction in a minimally invasive fashion. High success rates of ureteral repair using the robotic approach mirror those of open surgery, with the additional advantage of faster recovery. Novel techniques in development and surgical adjuncts show promise as the role of robotic surgery evolves.

  18. NASA/Howard University Large Space Structures Institute

    NASA Technical Reports Server (NTRS)

    Broome, T. H., Jr.

    1984-01-01

    Basic research on the engineering behavior of large space structures is presented. Methods of structural analysis, control, and optimization of large flexible systems are examined. Topics of investigation include the Load Correction Method (LCM) modeling technique, stabilization of flexible bodies by feedback control, mathematical refinement of analysis equations, optimization of the design of structural components, deployment dynamics, and the use of microprocessors in attitude and shape control of large space structures. Information on key personnel, budgeting, support plans and conferences is included.

  19. Performance analysis of the ascent propulsion system of the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Hooper, J. C., III

    1973-01-01

    Activities involved in the performance analysis of the Apollo lunar module ascent propulsion system are discussed. A description of the ascent propulsion system, including hardware, instrumentation, and system characteristics, is included. The methods used to predict the inflight performance and to establish performance uncertainties of the ascent propulsion system are discussed. The techniques of processing the telemetered flight data and performing postflight performance reconstruction to determine actual inflight performance are discussed. Problems that have been encountered and results from the analysis of the ascent propulsion system performance during the Apollo 9, 10, and 11 missions are presented.

  20. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupling with surface-enhanced Raman spectroscopy (SERS) was developed for the rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell including the rigid sample container and flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed large, alterable and well-controlled sampling volume, which kept the concentration of gas target in headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using bromine-thiourea and OPA-NH4+ strategy for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adapted to consequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 in real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be < 4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved in the range of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique would pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  1. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    PubMed

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupling with surface-enhanced Raman spectroscopy (SERS) was developed for the rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consisted of a specially designed sampling cell including the rigid sample container and flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC sampling technique allowed large, alterable and well-controlled sampling volume, which kept the concentration of gas target in headspace phase constant during the entire sampling process and made the sampling result more representative. Moreover, absorption and derivatization of the gas target during the LVCC sampling process were efficiently merged in one step using bromine-thiourea and OPA-NH4+ strategy for ethylene and SO2, respectively, which made the LVCC sampling technique conveniently adapted to consequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied for rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 in real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were proved to be <4.3% and 2.1%, respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved in the range of 95.0-101% and 97.0-104%, respectively. It is expected that the portable LVCC sampling technique would pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. An assessment of PERT as a technique for schedule planning and control

    NASA Technical Reports Server (NTRS)

    Sibbers, C. W.

    1982-01-01

    The PERT technique including the types of reports which can be computer generated using the NASA/LaRC PPARS System is described. An assessment is made of the effectiveness of PERT on various types of efforts as well as for specific purposes, namely, schedule planning, schedule analysis, schedule control, monitoring contractor schedule performance, and management reporting. This assessment is based primarily on the author's knowledge of the usage of PERT by NASA/LaRC personnel since the early 1960's. Both strengths and weaknesses of the technique for various applications are discussed. It is intended to serve as a reference guide for personnel performing project planning and control functions and technical personnel whose responsibilities either include schedule planning and control or require a general knowledge of the subject.

  3. Acupuncture-Related Techniques for Psoriasis: A Systematic Review with Pairwise and Network Meta-Analyses of Randomized Controlled Trials.

    PubMed

    Yeh, Mei-Ling; Ko, Shu-Hua; Wang, Mei-Hua; Chi, Ching-Chi; Chung, Yu-Chu

    2017-12-01

    There is a large body of evidence on the pharmacological treatments for psoriasis, but whether nonpharmacological interventions are effective in managing psoriasis remains largely unclear. This systematic review conducted pairwise and network meta-analyses to determine the effects of acupuncture-related techniques of acupoint stimulation for the treatment of psoriasis and to determine the order of effectiveness of these remedies. This study searched the following databases from inception to March 15, 2016: Medline, PubMed, Cochrane Central Register of Controlled Trials, EBSCO (including Academic Search Premier, American Doctoral Dissertations, and CINAHL), Airiti Library, and China National Knowledge Infrastructure. Randomized controlled trials (RCTs) on the effects of acupuncture-related techniques of acupoint stimulation as an intervention for psoriasis were independently reviewed by two researchers. A total of 13 RCTs with 1,060 participants were included. The methodological quality of the included studies was not rigorous. Acupoint stimulation, compared with nonacupoint stimulation, had a significant treatment effect on psoriasis. However, the most common adverse events were thirst and dry mouth. Subgroup analysis further confirmed that the short-term treatment effect was superior to the long-term effect in treating psoriasis. Network meta-analysis identified that acupressure and acupoint catgut embedding, compared with medication, had significant effects in improving psoriasis, and that acupressure was the most effective treatment. Acupuncture-related techniques could be considered as an alternative or adjuvant therapy for psoriasis in the short term, especially acupressure and acupoint catgut embedding. This study recommends further well-designed, methodologically rigorous, and more head-to-head randomized trials to explore the effects of acupuncture-related techniques for treating psoriasis.

  4. Analyzing thematic maps and mapping for accuracy

    USGS Publications Warehouse

    Rosenfield, G.H.

    1982-01-01

    Two problems which exist while attempting to test the accuracy of thematic maps and mapping are: (1) evaluating the accuracy of thematic content, and (2) evaluating the effects of the variables on thematic mapping. Statistical analysis techniques are applicable to both these problems and include techniques for sampling the data and determining their accuracy. In addition, techniques for hypothesis testing, or inferential statistics, are used when comparing the effects of variables. A comprehensive and valid accuracy test of a classification project, such as thematic mapping from remotely sensed data, includes the following components of statistical analysis: (1) sample design, including the sample distribution, sample size, size of the sample unit, and sampling procedure; and (2) accuracy estimation, including estimation of the variance and confidence limits. Careful consideration must be given to the minimum sample size necessary to validate the accuracy of a given classification category. The results of an accuracy test are presented in a contingency table sometimes called a classification error matrix. Usually the rows represent the interpretation, and the columns represent the verification. The diagonal elements represent the correct classifications. The remaining elements of the rows represent errors by commission, and the remaining elements of the columns represent the errors of omission. For tests of hypothesis that compare variables, the general practice has been to use only the diagonal elements from several related classification error matrices. These data are arranged in the form of another contingency table. The columns of the table represent the different variables being compared, such as different scales of mapping. The rows represent the blocking characteristics, such as the various categories of classification. The values in the cells of the tables might be the counts of correct classification or the binomial proportions of these counts divided by either the row totals or the column totals from the original classification error matrices. In hypothesis testing, when the results of tests of multiple sample cases prove to be significant, some form of statistical test must be used to separate any results that differ significantly from the others. In the past, many analyses of the data in this error matrix were made by comparing the relative magnitudes of the percentage of correct classifications, for either individual categories, the entire map, or both. More rigorous analyses have used data transformations and (or) two-way classification analysis of variance. A more sophisticated step of data analysis techniques would be to use the entire classification error matrices using the methods of discrete multivariate analysis or of multivariate analysis of variance.
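
    A minimal sketch of the accuracy computations described above (overall accuracy plus commission and omission errors read off a classification error matrix) is given below; the matrix values are illustrative only.

```python
import numpy as np

# Illustrative classification error matrix: rows = interpretation,
# columns = verification, diagonal = correct classifications.
error_matrix = np.array([
    [45,  3,  2],
    [ 4, 50,  6],
    [ 1,  7, 32],
])

total = error_matrix.sum()
correct = np.trace(error_matrix)
overall_accuracy = correct / total

# Commission error: off-diagonal share of each row (interpretation).
commission = 1 - np.diag(error_matrix) / error_matrix.sum(axis=1)
# Omission error: off-diagonal share of each column (verification).
omission = 1 - np.diag(error_matrix) / error_matrix.sum(axis=0)

print(f"overall accuracy: {overall_accuracy:.3f}")
print("commission error by class:", np.round(commission, 3))
print("omission error by class:  ", np.round(omission, 3))
```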

  5. Comparison of analysis and flight test data for a drone aircraft with active flutter suppression

    NASA Technical Reports Server (NTRS)

    Newsom, J. R.; Pototzky, A. S.

    1981-01-01

    A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.

  6. Summary of CPAS EDU Testing Analysis Results

    NASA Technical Reports Server (NTRS)

    Romero, Leah M.; Bledsoe, Kristin J.; Davidson, John.; Engert, Meagan E.; Fraire, Usbaldo, Jr.; Galaviz, Fernando S.; Galvin, Patrick J.; Ray, Eric S.; Varela, Jose

    2015-01-01

    The Orion program's Capsule Parachute Assembly System (CPAS) project is currently conducting its third generation of testing, the Engineering Development Unit (EDU) series. This series utilizes two test articles, a dart-shaped Parachute Compartment Drop Test Vehicle (PCDTV) and a capsule-shaped Parachute Test Vehicle (PTV), both of which include a full size, flight-like parachute system and require a pallet delivery system for aircraft extraction. To date, 15 tests have been completed, including six with PCDTVs and nine with PTVs. Two of the PTV tests included the Forward Bay Cover (FBC) provided by Lockheed Martin. Advancements in modeling techniques applicable to parachute fly-out, vehicle rate of descent, torque, and load train also occurred during the EDU testing series. An upgrade from a composite to an independent parachute simulation allowed parachute modeling at a higher level of fidelity than during previous generations. The complexity of separating the test vehicles from their pallet delivery systems necessitated the use of the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulator for modeling mated-vehicle aircraft extraction and separation. This paper gives an overview of each EDU test and summarizes the development of CPAS analysis tools and techniques during EDU testing.

  7. Lubrication Flows.

    ERIC Educational Resources Information Center

    Papanastasiou, Tasos C.

    1989-01-01

    Discusses fluid mechanics for undergraduates including the differential Navier-Stokes equations, dimensional analysis and simplified dimensionless numbers, control volume principles, the Reynolds lubrication equation for confined and free surface flows, capillary pressure, and simplified perturbation techniques. Provides a vertical dip coating…

  8. System data communication structures for active-control transport aircraft, volume 1

    NASA Technical Reports Server (NTRS)

    Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.

    1981-01-01

    Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.

  9. Reinventing the ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.

  10. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software [3, 4] is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove the motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
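
    The propulsion frequency and lymph velocity computations mentioned above can be illustrated with a toy calculation on synthetic intensity traces: count fluorescence peaks at one region of interest and estimate the transit lag between two regions by cross-correlation. The frame rate, ROI spacing, and waveforms below are assumptions for the sketch, not the system's actual processing pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic example: fluorescence intensity at two regions of interest (ROIs)
# along a lymphatic vessel, sampled at fps frames per second.
fps = 10.0                      # frames per second (assumed)
roi_distance_cm = 3.0           # distance between the two ROIs (assumed)
t = np.arange(0, 60, 1 / fps)
upstream = 1 + 0.5 * np.sin(2 * np.pi * 0.1 * t)            # ~6 pulses/min
downstream = 1 + 0.5 * np.sin(2 * np.pi * 0.1 * (t - 2.0))  # delayed by 2 s

# Propulsion frequency: count intensity peaks in the upstream ROI.
peaks, _ = find_peaks(upstream, distance=fps * 2)  # peaks at least 2 s apart
frequency_per_min = len(peaks) / (t[-1] / 60)

# Velocity: lag between ROIs from the cross-correlation maximum.
u = upstream - upstream.mean()
d = downstream - downstream.mean()
xcorr = np.correlate(d, u, mode="full")
lag_s = (np.argmax(xcorr) - (len(u) - 1)) / fps
velocity_cm_s = roi_distance_cm / lag_s

print(f"propulsion frequency: {frequency_per_min:.1f} per minute")
print(f"transit lag: {lag_s:.1f} s, velocity: {velocity_cm_s:.2f} cm/s")
```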

  11. Ultrasound-guided injection for MR arthrography of the hip: comparison of two different techniques.

    PubMed

    Kantarci, Fatih; Ozbayrak, Mustafa; Gulsen, Fatih; Gencturk, Mert; Botanlioglu, Huseyin; Mihmanli, Ismail

    2013-01-01

    The purpose of this study was to prospectively evaluate two different ultrasound-guided injection techniques for MR arthrography of the hip. Fifty-nine consecutive patients (21 men, 38 women) referred for MR arthrography of the hip were prospectively included in the study. Three patients underwent bilateral MR arthrography. The two injection techniques were quantitatively and qualitatively compared. Quantitative analysis was performed by comparison of the injected contrast material volume into the hip joint. Qualitative analysis was performed with regard to extraarticular leakage of contrast material into the soft tissues. Extraarticular leakage of contrast material was graded as none, minimal, moderate, or severe according to the MR images. Each patient rated discomfort after the procedure using a visual analogue scale (VAS). The injected contrast material volume was lower with the femoral head puncture technique (mean 8.9 ± 3.4 ml) than with the femoral neck puncture technique (mean 11.2 ± 2.9 ml) (p < 0.05). The chi-squared test showed significantly more contrast leakage with the femoral head puncture technique (p < 0.05). Statistical analysis showed no difference between the head and neck puncture groups in terms of feeling of pain (p = 0.744) or in the body mass index (p = 0.658) of the patients. The femoral neck injection technique provides high intraarticular contrast volume and produces less extraarticular contrast leakage than the femoral head injection technique when US guidance is used for MR arthrography of the hip.

  12. Who Should Bear the Cost of Convenience? A Cost-effectiveness Analysis Comparing External Beam and Brachytherapy Radiotherapy Techniques for Early Stage Breast Cancer.

    PubMed

    McGuffin, M; Merino, T; Keller, B; Pignol, J-P

    2017-03-01

    Standard treatment for early breast cancer includes whole breast irradiation (WBI) after breast-conserving surgery. Recently, accelerated partial breast irradiation (APBI) has been proposed for well-selected patients. A cost and cost-effectiveness analysis was carried out comparing WBI with two APBI techniques. An activity-based costing method was used to determine the treatment cost from a societal perspective of WBI, high dose rate brachytherapy (HDR) and permanent breast seed implants (PBSI). A Markov model comparing the three techniques was developed with downstream costs, utilities and probabilities adapted from the literature. Sensitivity analyses were carried out for a wide range of variables, including treatment costs, patient costs, utilities and probability of developing recurrences. Overall, HDR was the most expensive ($14 400), followed by PBSI ($8700), with WBI proving the least expensive ($6200). The least costly method to the health care system was WBI, whereas PBSI and HDR were less costly for the patient. Under cost-effectiveness analyses, downstream costs added about $10 000 to the total societal cost of the treatment. As the outcomes are very similar between techniques, WBI dominated under cost-effectiveness analyses. WBI was found to be the most cost-effective radiotherapy technique for early breast cancer. However, both APBI techniques were less costly to the patient. Although innovation may increase costs for the health care system it can provide cost savings for the patient in addition to convenience. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
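
    The Markov model at the heart of this kind of cost-effectiveness comparison can be sketched in a few lines. The three-state cohort model below uses hypothetical transition probabilities, state costs, and utilities (only the up-front treatment costs echo the figures quoted above) and simply shows how discounted costs and QALYs accumulate before computing an ICER or declaring dominance; it is not the study's model.

```python
import numpy as np

def markov_cohort(treatment_cost, p_recurrence, cycles=10, discount=0.03):
    """Tiny three-state Markov cohort model (disease-free, recurrence, dead).
    All numbers are hypothetical placeholders, not inputs from the study."""
    p_death = 0.01
    T = np.array([
        [1 - p_recurrence - p_death, p_recurrence, p_death],   # from disease-free
        [0.0,                        0.95,         0.05],      # from recurrence
        [0.0,                        0.0,          1.0],       # dead is absorbing
    ])
    state = np.array([1.0, 0.0, 0.0])              # cohort starts disease-free
    cycle_cost = np.array([500.0, 20000.0, 0.0])   # annual cost per state
    utility = np.array([0.90, 0.70, 0.0])          # QALY weight per state
    total_cost, total_qaly = treatment_cost, 0.0   # up-front treatment cost
    for k in range(cycles):
        disc = 1.0 / (1.0 + discount) ** k
        total_cost += disc * (state @ cycle_cost)
        total_qaly += disc * (state @ utility)
        state = state @ T
    return total_cost, total_qaly

# Hypothetical comparison of two radiotherapy techniques.
cost_wbi, qaly_wbi = markov_cohort(treatment_cost=6200.0, p_recurrence=0.010)
cost_apbi, qaly_apbi = markov_cohort(treatment_cost=8700.0, p_recurrence=0.012)
print(f"WBI : cost {cost_wbi:9,.0f}  QALYs {qaly_wbi:.3f}")
print(f"APBI: cost {cost_apbi:9,.0f}  QALYs {qaly_apbi:.3f}")
if cost_apbi >= cost_wbi and qaly_apbi <= qaly_wbi:
    print("APBI is dominated by WBI under these placeholder inputs")
else:
    icer = (cost_apbi - cost_wbi) / (qaly_apbi - qaly_wbi)
    print(f"ICER (APBI vs WBI): {icer:,.0f} per QALY gained")
```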

  13. Directional analysis and filtering for dust storm detection in NOAA-AVHRR imagery

    NASA Astrophysics Data System (ADS)

    Janugani, S.; Jayaram, V.; Cabrera, S. D.; Rosiles, J. G.; Gill, T. E.; Rivera Rivera, N.

    2009-05-01

    In this paper, we propose spatio-spectral processing techniques for the detection of dust storms and automatically finding its transport direction in 5-band NOAA-AVHRR imagery. Previous methods that use simple band math analysis have produced promising results but have drawbacks in producing consistent results when low signal to noise ratio (SNR) images are used. Moreover, in seeking to automate the dust storm detection, the presence of clouds in the vicinity of the dust storm creates a challenge in being able to distinguish these two types of image texture. This paper not only addresses the detection of the dust storm in the imagery, it also attempts to find the transport direction and the location of the sources of the dust storm. We propose a spatio-spectral processing approach with two components: visualization and automation. Both approaches are based on digital image processing techniques including directional analysis and filtering. The visualization technique is intended to enhance the image in order to locate the dust sources. The automation technique is proposed to detect the transport direction of the dust storm. These techniques can be used in a system to provide timely warnings of dust storms or hazard assessments for transportation, aviation, environmental safety, and public health.
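
    As a crude stand-in for the directional analysis described above (not the paper's algorithm), the sketch below estimates a dominant texture orientation from a gradient-magnitude-weighted orientation histogram, using a synthetic striped image rather than AVHRR data.

```python
import numpy as np

def dominant_orientation(image, n_bins=18):
    """Dominant texture orientation in degrees (0-180), estimated from a
    gradient-magnitude-weighted orientation histogram."""
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # The streak orientation is perpendicular to the gradient direction.
    orientation = (np.degrees(np.arctan2(gy, gx)) + 90.0) % 180.0
    hist, edges = np.histogram(orientation, bins=n_bins, range=(0, 180),
                               weights=magnitude)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers[np.argmax(hist)]

# Synthetic test image: stripes oriented at ~45 degrees to the x axis.
y, x = np.mgrid[0:256, 0:256]
stripes = np.sin(2 * np.pi * (x - y) / 32.0)
print(f"dominant orientation: {dominant_orientation(stripes):.0f} degrees")
```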

  14. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    PubMed

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset and apply our technique to the data and compare the derived trajectories and the original. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.
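
    A heavily simplified version of the idea (estimate a continuous event density and read a flow direction from it) can be sketched as follows. The gravity model, temporal smoothing, and flow visualization of the paper are not reproduced, and the event locations are synthetic.

```python
import numpy as np
from scipy.stats import gaussian_kde

def density_on_grid(points, grid_x, grid_y):
    """Kernel density estimate of 2D event locations evaluated on a grid."""
    kde = gaussian_kde(points.T)
    xx, yy = np.meshgrid(grid_x, grid_y)
    return kde(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

rng = np.random.default_rng(0)
# Synthetic geo-referenced events: a cluster that drifts east between two weeks.
week1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(300, 2))
week2 = rng.normal(loc=[1.0, 0.0], scale=0.5, size=(300, 2))

grid_x = np.linspace(-2.0, 3.0, 60)
grid_y = np.linspace(-2.0, 2.0, 50)
d1 = density_on_grid(week1, grid_x, grid_y)
d2 = density_on_grid(week2, grid_x, grid_y)

# Gravity-model-flavoured flow: at each location the flow vector points up the
# gradient of the later density field, i.e. toward the emerging hot spot.
dy, dx = np.gradient(d2, grid_y, grid_x)
i = np.argmin(np.abs(grid_y - 0.0))   # grid cell nearest the week-1 centre (0, 0)
j = np.argmin(np.abs(grid_x - 0.0))
change = d2[i, j] - d1[i, j]
angle = np.degrees(np.arctan2(dy[i, j], dx[i, j]))
print(f"density change at (0,0): {change:+.3f}; flow direction {angle:.0f} deg from +x")
```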

  15. Comparison of seven techniques for typing international epidemic strains of Clostridium difficile: restriction endonuclease analysis, pulsed-field gel electrophoresis, PCR-ribotyping, multilocus sequence typing, multilocus variable-number tandem-repeat analysis, amplified fragment length polymorphism, and surface layer protein A gene sequence typing.

    PubMed

    Killgore, George; Thompson, Angela; Johnson, Stuart; Brazier, Jon; Kuijper, Ed; Pepin, Jacques; Frost, Eric H; Savelkoul, Paul; Nicholson, Brad; van den Berg, Renate J; Kato, Haru; Sambol, Susan P; Zukowski, Walter; Woods, Christopher; Limbago, Brandi; Gerding, Dale N; McDonald, L Clifford

    2008-02-01

    Using 42 isolates contributed by laboratories in Canada, The Netherlands, the United Kingdom, and the United States, we compared the results of analyses done with seven Clostridium difficile typing techniques: multilocus variable-number tandem-repeat analysis (MLVA), amplified fragment length polymorphism (AFLP), surface layer protein A gene sequence typing (slpAST), PCR-ribotyping, restriction endonuclease analysis (REA), multilocus sequence typing (MLST), and pulsed-field gel electrophoresis (PFGE). We assessed the discriminating ability and typeability of each technique as well as the agreement among techniques in grouping isolates by allele profile A (AP-A) through AP-F, which are defined by toxinotype, the presence of the binary toxin gene, and deletion in the tcdC gene. We found that all isolates were typeable by all techniques and that discrimination index scores for the techniques tested ranged from 0.964 to 0.631 in the following order: MLVA, REA, PFGE, slpAST, PCR-ribotyping, MLST, and AFLP. All the techniques were able to distinguish the current epidemic strain of C. difficile (BI/027/NAP1) from other strains. All of the techniques showed multiple types for AP-A (toxinotype 0, binary toxin negative, and no tcdC gene deletion). REA, slpAST, MLST, and PCR-ribotyping all included AP-B (toxinotype III, binary toxin positive, and an 18-bp deletion in tcdC) in a single group that excluded other APs. PFGE, AFLP, and MLVA grouped two, one, and two different non-AP-B isolates, respectively, with their AP-B isolates. All techniques appear to be capable of detecting outbreak strains, but only REA and MLVA showed sufficient discrimination to distinguish strains from different outbreaks.
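
    The discrimination index quoted above is commonly Simpson's index of diversity as adapted by Hunter and Gaston; a minimal sketch of its calculation from type assignments is shown below. The isolate labels are made up and are not from this comparison.

```python
from collections import Counter

def discriminatory_index(type_labels):
    """Simpson's index of diversity as adapted by Hunter and Gaston: the
    probability that two isolates drawn at random belong to different types.
    1.0 = every isolate distinct, 0.0 = all isolates identical."""
    n = len(type_labels)
    counts = Counter(type_labels).values()
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# Hypothetical typing results for ten isolates (labels are illustrative).
mlva_types = ["A", "B", "C", "C", "D", "E", "F", "G", "H", "I"]
ribotypes  = ["027", "027", "027", "001", "001", "014", "014", "027", "078", "027"]
print(f"MLVA         D = {discriminatory_index(mlva_types):.3f}")
print(f"PCR-ribotype D = {discriminatory_index(ribotypes):.3f}")
```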

  16. Cytological Analysis of Meiosis in Caenorhabditis elegans

    PubMed Central

    Phillips, Carolyn M.; McDonald, Kent L.; Dernburg, Abby F.

    2011-01-01

    The nematode Caenorhabditis elegans has emerged as an informative experimental system for analysis of meiosis, in large part because of the advantageous physical organization of meiotic nuclei as a gradient of stages within the germline. Here we provide tools for detailed observational studies of cells within the worm gonad, including techniques for light and electron microscopy. PMID:19685325

  17. Introduction to LISREL: A Demonstration Using Students' Commitment to an Institution. ASHE 1987 Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Stage, Frances K.

    The nature and use of LISREL (LInear Structural RELationships) analysis are considered, including an examination of college students' commitment to a university. LISREL is a fairly new causal analysis technique that has broad application in the social sciences and that employs structural equation estimation. The application examined in this paper…

  18. An Information-Systems Program for the Language Sciences. Final Report on Survey-and-Analysis Stage, 1967-1968.

    ERIC Educational Resources Information Center

    Freeman, Robert R.; And Others

    The main results of the survey-and-analysis stage include a substantial collection of preliminary data on the language-sciences information user community, its professional specialties and information channels, its indexing tools, and its terminologies. The prospects and techniques for the development of a modern, discipline-based information…

  19. Errata: Response Analysis and Error Diagnosis Tools.

    ERIC Educational Resources Information Center

    Hart, Robert S.

    This guide to ERRATA, a set of HyperCard-based tools for response analysis and error diagnosis in language testing, is intended as a user manual and general reference and designed to be used with the software (not included here). It has three parts. The first is a brief survey of computational techniques available for dealing with student test…

  20. Recent Advances in the Measurement of Arsenic, Cadmium, and Mercury in Rice and Other Foods

    PubMed Central

    Punshon, Tracy

    2015-01-01

    Trace element analysis of foods is of increasing importance because of raised consumer awareness and the need to evaluate and establish regulatory guidelines for toxic trace metals and metalloids. This paper reviews recent advances in the analysis of trace elements in food, including challenges, state-of-the-art methods, and use of spatially resolved techniques for localizing the distribution of As and Hg within rice grains. Total elemental analysis of foods is relatively well-established, but the push for ever lower detection limits requires that methods be robust against potential matrix interferences, which can be particularly severe for food. Inductively coupled plasma mass spectrometry (ICP-MS) is the method of choice, allowing for multi-element and highly sensitive analyses. For arsenic, speciation analysis is necessary because the inorganic forms are more likely to be subject to regulatory limits. Chromatographic techniques coupled to ICP-MS are most often used for arsenic speciation and a range of methods now exist for a variety of different arsenic species in different food matrices. Speciation and spatial analysis of foods, especially rice, can also be achieved with synchrotron techniques. Sensitive analytical techniques and methodological advances provide robust methods for the assessment of several metals in animal and plant-based foods, in particular for arsenic, cadmium and mercury in rice and arsenic speciation in foodstuffs. PMID:25938012

  1. A review of recent studies on the mechanisms and analysis methods of sub-synchronous oscillation in wind farms

    NASA Astrophysics Data System (ADS)

    Wang, Chenggen; Zhou, Qian; Gao, Shuning; Luo, Jia; Diao, Junchao; Zhao, Haoran; Bu, Jing

    2018-04-01

    This paper reviews recent studies of Sub-Synchronous Oscillation (SSO) in wind farms. Mechanisms and analysis methods are the main concerns of this article. A classification method is proposed that includes new types of oscillation occurring between wind farms and HVDC systems as well as oscillation caused by Permanent Magnet Synchronous Generators (PMSGs). Characteristics of oscillation analysis techniques are summarized.

  2. OAO battery data analysis

    NASA Technical Reports Server (NTRS)

    Gaston, S.; Wertheim, M.; Orourke, J. A.

    1973-01-01

    Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.

  3. Rare cell isolation and analysis in microfluidics

    PubMed Central

    Chen, Yuchao; Li, Peng; Huang, Po-Hsun; Xie, Yuliang; Mai, John D.; Wang, Lin; Nguyen, Nam-Trung; Huang, Tony Jun

    2014-01-01

    Rare cells are low-abundance cells in a much larger population of background cells. Conventional benchtop techniques have limited capabilities to isolate and analyze rare cells because of their generally low selectivity and significant sample loss. Recent rapid advances in microfluidics have been providing robust solutions to the challenges in the isolation and analysis of rare cells. In addition to the apparent performance enhancements resulting in higher efficiencies and sensitivity levels, microfluidics provides other advanced features such as simpler handling of small sample volumes and multiplexing capabilities for high-throughput processing. All of these advantages make microfluidics an excellent platform to deal with the transport, isolation, and analysis of rare cells. Various cellular biomarkers, including physical properties, dielectric properties, as well as immunoaffinities, have been explored for isolating rare cells. In this Focus article, we discuss the design considerations of representative microfluidic devices for rare cell isolation and analysis. Examples from recently published works are discussed to highlight the advantages and limitations of the different techniques. Various applications of these techniques are then introduced. Finally, a perspective on the development trends and promising research directions in this field are proposed. PMID:24406985

  4. Histology image analysis for carcinoma detection and grading

    PubMed Central

    He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.

    2012-01-01

    This paper presents an overview of the image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research; these systems attempt to significantly reduce the labor and subjectivity of traditional manual review of histology images. The task of automated histology image analysis is usually not simple due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas of cervix, prostate, breast, and lung are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890
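
    As context for the segmentation emphasis above, the sketch below implements one of the simplest global segmentation steps a CAD pipeline might start from, Otsu's threshold, on a synthetic patch with dark nuclei-like blobs. It is a generic illustration and not a method from the reviewed systems.

```python
import numpy as np

def otsu_threshold(gray, n_bins=256):
    """Otsu's global threshold: pick the grey level that maximizes the
    between-class variance of the foreground/background split."""
    hist, edges = np.histogram(gray.ravel(), bins=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = hist / hist.sum()
    w0 = np.cumsum(p)                          # class probability below threshold
    w1 = 1.0 - w0                              # class probability above threshold
    mu = np.cumsum(p * centers)
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    between[np.isnan(between)] = 0.0
    return centers[np.argmax(between)]

# Synthetic "histology" patch: dark nuclei-like blobs on a bright background.
rng = np.random.default_rng(2)
img = rng.normal(200, 10, (128, 128))
yy, xx = np.mgrid[0:128, 0:128]
for cy, cx in rng.integers(10, 118, size=(15, 2)):
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 36] = rng.normal(80, 10)

t = otsu_threshold(img)
mask = img < t                                  # crude nuclei mask
print(f"Otsu threshold: {t:.1f}, segmented fraction: {mask.mean():.3f}")
```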

  5. Analysis of Compounds Dissolved in Nonpolar Solvents by Electrospray Ionization on Conductive Nanomaterials

    NASA Astrophysics Data System (ADS)

    Xia, Bing; Gao, Yuanji; Ji, Baocheng; Ma, Fengwei; Ding, Lisheng; Zhou, Yan

    2018-03-01

    The electrospray ionization mass spectrometry (ESI-MS) technique has limitations in the analysis of compounds that are dissolved in nonpolar solvents. In this study, ambient ionization of compounds in solvents that are not "friendly" to electrospray ionization, such as n-hexane, is achieved by conductive nanomaterials spray ionization (CNMSI) on nanomaterial emitters, including carbon nanotube paper and mesodendritic silver-covered metal, which applies high voltages to emitters made of these materials without the assistance of polar solvents. Although the time intensity curves (TIC) commonly vary from 4.5% to 23.7% over analyses, protonated molecular ions were found to be the most abundant species, demonstrating good reproducibility of the technique in terms of ionized species. Higher mass spectrometric responses are observed in analyzing nonpolar systems than polar systems. 2-Methoxyacetophenone, 4-methylacetophenone, benzothiazole, quinolone, and cycloheptanone as low as 2 pg in n-hexane can be directly detected using the developed method. The developed technique expands the analysis capability of ESI-MS for direct, online analysis of nonpolar systems, such as low-polarity extracts, normal phase liquid chromatography eluates, and synthetic mixtures.

  6. Pros and cons of conjoint analysis of discrete choice experiments to define classification and response criteria in rheumatology.

    PubMed

    Taylor, William J

    2016-03-01

    Conjoint analysis of choice or preference data has been used in marketing for over 40 years but has appeared in healthcare settings much more recently. It may be a useful technique for applications within the rheumatology field. Conjoint analysis in rheumatology contexts has mainly used the approaches implemented in 1000Minds (1000Minds Ltd, Dunedin, New Zealand) and Sawtooth Software (Orem, UT, USA). Examples include classification criteria, composite response criteria, service prioritization tools, and utilities assessment. Limitations imposed by very many attributes can be managed using new techniques. Conjoint analysis studies of classification and response criteria suggest that the assumption of equal weighting of attributes cannot be met, which challenges traditional approaches to composite criteria construction. Weights elicited through choice experiments with experts can derive more accurate classification criteria than unweighted criteria. Studies that find significant variation in attribute weights for composite response criteria for gout make construction of such criteria problematic. Better understanding of various multiattribute phenomena is likely to increase with increased use of conjoint analysis, especially when the attributes concern individual perceptions or opinions. In addition to classification criteria, some applications for conjoint analysis that are emerging in rheumatology include prioritization tools, remission criteria, and utilities for life areas.

  7. Active cycle of breathing technique for cystic fibrosis.

    PubMed

    Mckoy, Naomi A; Wilson, Lisa M; Saldanha, Ian J; Odelola, Olaide A; Robinson, Karen A

    2016-07-05

    People with cystic fibrosis experience chronic airway infections as a result of mucus build up within the lungs. Repeated infections often cause lung damage and disease. Airway clearance therapies aim to improve mucus clearance, increase sputum production, and improve airway function. The active cycle of breathing technique (also known as ACBT) is an airway clearance method that uses a cycle of techniques to loosen airway secretions including breathing control, thoracic expansion exercises, and the forced expiration technique. This is an update of a previously published review. To compare the clinical effectiveness of the active cycle of breathing technique with other airway clearance therapies in cystic fibrosis. We searched the Cochrane Cystic Fibrosis Trials Register, compiled from electronic database searches and handsearching of journals and conference abstract books. We also searched the reference lists of relevant articles and reviews.Date of last search: 25 April 2016. Randomised or quasi-randomised controlled clinical studies, including cross-over studies, comparing the active cycle of breathing technique with other airway clearance therapies in cystic fibrosis. Two review authors independently screened each article, abstracted data and assessed the risk of bias of each study. Our search identified 62 studies, of which 19 (440 participants) met the inclusion criteria. Five randomised controlled studies (192 participants) were included in the meta-analysis; three were of cross-over design. The 14 remaining studies were cross-over studies with inadequate reports for complete assessment. The study size ranged from seven to 65 participants. The age of the participants ranged from six to 63 years (mean age 22.33 years). In 13 studies, follow up lasted a single day. However, there were two long-term randomised controlled studies with follow up of one to three years. Most of the studies did not report on key quality items, and therefore, have an unclear risk of bias in terms of random sequence generation, allocation concealment, and outcome assessor blinding. Due to the nature of the intervention, none of the studies blinded participants or the personnel applying the interventions. However, most of the studies reported on all planned outcomes, had adequate follow up, assessed compliance, and used an intention-to-treat analysis.Included studies compared the active cycle of breathing technique with autogenic drainage, airway oscillating devices, high frequency chest compression devices, conventional chest physiotherapy, and positive expiratory pressure. Preference of technique varied: more participants preferred autogenic drainage over the active cycle of breathing technique; more preferred the active cycle of breathing technique over airway oscillating devices; and more were comfortable with the active cycle of breathing technique versus high frequency chest compression. No significant difference was seen in quality of life, sputum weight, exercise tolerance, lung function, or oxygen saturation between the active cycle of breathing technique and autogenic drainage or between the active cycle of breathing technique and airway oscillating devices. There was no significant difference in lung function and the number of pulmonary exacerbations between the active cycle of breathing technique alone or in conjunction with conventional chest physiotherapy. All other outcomes were either not measured or had insufficient data for analysis. 
There is insufficient evidence to support or reject the use of the active cycle of breathing technique over any other airway clearance therapy. Five studies, with data from eight different comparators, found that the active cycle of breathing technique was comparable with other therapies in outcomes such as participant preference, quality of life, exercise tolerance, lung function, sputum weight, oxygen saturation, and number of pulmonary exacerbations. Longer-term studies are needed to more adequately assess the effects of the active cycle of breathing technique on outcomes important for people with cystic fibrosis such as quality of life and preference.

  8. Linear Covariance Analysis For Proximity Operations Around Asteroid 2008 EV5

    NASA Technical Reports Server (NTRS)

    Wright, Cinnamon A.; Bhatt, Sagar; Woffinden, David; Strube, Matthew; D'Souza, Chris

    2015-01-01

    The NASA initiative to collect an asteroid, the Asteroid Robotic Redirect Mission (ARRM), is currently investigating the option of retrieving a boulder from an asteroid, demonstrating planetary defense with an enhanced gravity tractor technique, and returning it to a lunar orbit. Techniques for accomplishing this are being investigated by the Satellite Servicing Capabilities Office (SSCO) at NASA GSFC in collaboration with JPL, NASA JSC, LaRC, and Draper Laboratory, Inc. Two critical phases of the mission are the descent to the boulder and the Enhanced Gravity Tractor demonstration. A linear covariance analysis is done for these phases to assess the feasibility of these concepts with the proposed design of the sensor and actuator suite of the Asteroid Redirect Vehicle (ARV). The sensor suite for this analysis includes a wide field of view camera, LiDAR, and an IMU. The proposed asteroid of interest is currently the C-type asteroid 2008 EV5, a carbonaceous chondrite that is of high interest to the scientific community. This paper presents an overview of the linear covariance analysis techniques and simulation tool, provides sensor and actuator models, and addresses the feasibility of descending to the surface of the asteroid within allocated requirements as well as the possibility of maintaining a halo orbit to demonstrate the Enhanced Gravity Tractor technique.
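
    The core of a linear covariance analysis, propagating the state error covariance through linearized dynamics and measurement updates instead of flying many Monte Carlo trajectories, can be sketched with a toy one-dimensional example. The dynamics, sensor model, and noise levels below are illustrative and are not the ARRM/ARV models or sensor suite.

```python
import numpy as np

# Toy 1D double integrator (position, velocity) with a LiDAR-like range
# measurement of position only. All noise values are made up for the sketch.
dt = 1.0
F = np.array([[1.0, dt],
              [0.0, 1.0]])              # state transition matrix
Q = np.diag([1e-4, 1e-4])               # process noise (e.g. unmodeled accelerations)
H = np.array([[1.0, 0.0]])              # measurement maps state to range
R = np.array([[0.5 ** 2]])              # measurement noise variance

P = np.diag([100.0 ** 2, 1.0 ** 2])     # initial dispersion: 100 m, 1 m/s
for k in range(50):
    P = F @ P @ F.T + Q                 # time update of the covariance
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    P = (np.eye(2) - K @ H) @ P         # measurement update

sigma_pos, sigma_vel = np.sqrt(np.diag(P))
print(f"3-sigma position: {3 * sigma_pos:.2f} m, 3-sigma velocity: {3 * sigma_vel:.3f} m/s")
```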

  9. The Sixth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Sixth Annual Thermal and Fluids Analysis Workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  10. Analysis and application of Fourier transform spectroscopy in atmospheric remote sensing

    NASA Technical Reports Server (NTRS)

    Park, J. H.

    1984-01-01

    An analysis method for Fourier transform spectroscopy is summarized with applications to various types of distortion in atmospheric absorption spectra. This analysis method includes the fast Fourier transform method for simulating the interferometric spectrum and the nonlinear least-squares method for retrieving the information from a measured spectrum. It is shown that spectral distortions can be simulated quite well and that the correct information can be retrieved from a distorted spectrum by this analysis technique.
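
    The forward-simulation step mentioned above (computing an interferometric spectrum with the FFT) can be illustrated with a toy interferogram of two cosine components. The line positions, sampling, and apodization below are arbitrary choices for the sketch, not the paper's retrieval setup.

```python
import numpy as np

# Simulate an interferogram for a spectrum with two lines, apodize it, and
# recover the spectrum with an FFT: a toy version of the simulation step.
n = 4096
dx = 1.0 / (2 * 4000.0)                 # path-difference step for a 0-4000 cm^-1 band
x = np.arange(n) * dx                   # optical path difference (cm)
lines = [(667.0, 1.0), (2349.0, 0.6)]   # (wavenumber in cm^-1, relative intensity)

# Interferogram: sum of cosines, one per spectral line.
interferogram = sum(a * np.cos(2 * np.pi * nu * x) for nu, a in lines)
interferogram *= np.cos(np.pi * x / (2 * x[-1]))   # simple apodization to reduce ringing

spectrum = np.abs(np.fft.rfft(interferogram))
wavenumber = np.fft.rfftfreq(n, d=dx)
for nu, _ in lines:
    k = np.argmin(np.abs(wavenumber - nu))
    print(f"line at {nu} cm^-1 recovered near {wavenumber[k]:.1f} cm^-1, "
          f"relative peak {spectrum[k] / spectrum.max():.2f}")
```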

  11. BAYESIAN SEMI-BLIND COMPONENT SEPARATION FOR FOREGROUND REMOVAL IN INTERFEROMETRIC 21 cm OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Timbie, Peter T.; Bunn, Emory F.

    In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H i Expectation–Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
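
    For comparison purposes only, the sketch below implements the standard Principal Component Analysis foreground subtraction that the paper benchmarks HIEMICA against: remove the few dominant frequency-frequency covariance modes from a multi-frequency data cube. The foreground, signal, and noise models are toy placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
n_freq, n_pix = 64, 2000

# Synthetic data cube flattened to (frequency, pixel): smooth power-law
# foregrounds dominate a faint, spectrally unsmooth 21 cm signal plus noise.
freqs = np.linspace(100.0, 200.0, n_freq)                   # MHz, illustrative
foreground = 1e3 * np.outer((freqs / 150.0) ** -2.7, rng.lognormal(0, 0.3, n_pix))
signal = 1.0 * rng.standard_normal((n_freq, n_pix))
noise = 0.1 * rng.standard_normal((n_freq, n_pix))
data = foreground + signal + noise

def pca_clean(data, n_modes):
    """Subtract the n_modes largest principal components along frequency:
    the standard PCA foreground removal used as a baseline above."""
    mean = data.mean(axis=1, keepdims=True)
    resid = data - mean
    cov = resid @ resid.T / data.shape[1]       # frequency-frequency covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    modes = eigvecs[:, -n_modes:]               # dominant (foreground) modes
    return resid - modes @ (modes.T @ resid)

cleaned = pca_clean(data, n_modes=3)
print(f"rms before: {data.std():.1f}, rms after: {cleaned.std():.2f}, injected signal rms: 1.0")
```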

  12. [Authentication of Trace Material Evidence in Forensic Science Field with Infrared Microscopic Technique].

    PubMed

    Jiang, Zhi-quan; Hu, Ke-liang

    2016-03-01

    In the field of forensic science, conventional infrared spectral analysis is usually unable to meet detection requirements, because only very small amounts of trace material evidence, with diverse shapes and complex compositions, can be recovered from a crime scene. Infrared microscopy was developed by combining Fourier-transform infrared spectroscopy with optical microscopy. It offers several advantages over conventional infrared spectroscopy, such as high detection sensitivity, micro-area analysis, and nondestructive examination, and it has effectively solved the problem of authenticating trace material evidence in forensic science. In addition, almost no external interference is introduced during infrared microscopic measurements, which satisfies the requirement that trace material evidence be preserved for presentation in court. Case analyses performed in this laboratory illustrate in detail the advantages of infrared microscopy for the authentication of trace material evidence. In this paper, the vibrational features in the infrared spectra of material evidence, including paints, plastics, rubbers, fibers, drugs, and toxicants, are comparatively analyzed by infrared microscopy in order to provide robust spectroscopic evidence for the qualitative diagnosis of various criminal and traffic accident cases. The results clearly suggest that infrared microscopy offers unmatched advantages and has become an effective method for the authentication of trace material evidence in the field of forensic science.

  13. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  14. X-ray microanalysis in the scanning electron microscope.

    PubMed

    Roomans, Godfried M; Dragomir, Anca

    2014-01-01

    X-ray microanalysis conducted using the scanning electron microscope is a technique that allows the determination of chemical elements in bulk or semi-thick specimens. The lowest concentration of an element that can be detected is in the order of a few mmol/kg or a few hundred parts per million, and the smallest amount is in the order of 10⁻¹⁸ g. The spatial resolution of the analysis depends on the thickness of the specimen. For biological specimen analysis, care must be taken to prevent displacement/loss of the element of interest (usually ions). Protocols are presented for the processing of frozen-hydrated and freeze-dried specimens, as well as for the analysis of small volumes of fluid, cell cultures, and other specimens. Aspects of qualitative and quantitative analysis are covered, including limitations of the technique.

  15. X-ray microanalysis in the scanning electron microscope.

    PubMed

    Roomans, Godfried M; Dragomir, Anca

    2007-01-01

    X-ray microanalysis conducted using the scanning electron microscope is a technique that allows the determination of chemical elements in bulk or semithick specimens. The lowest concentration of an element that can be detected is in the order of a few mmol/kg or a few hundred parts per million, and the smallest amount is in the order of 10⁻¹⁸ g. The spatial resolution of the analysis depends on the thickness of the specimen. For biological specimen analysis, care must be taken to prevent displacement/loss of the element of interest (usually ions). Protocols are presented for the processing of frozen-hydrated and freeze-dried specimens, as well as for the analysis of small volumes of fluid, cell cultures and other specimens. Aspects of qualitative and quantitative analysis are covered, including limitations of the technique.

  16. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
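    As a simple concrete example of the kind of single-operator analysis discussed above, the sketch below fits a power-law learning curve, time = a * case_number^(-b), to a synthetic series of consecutive operation times by linear regression on the log-log scale; the data and the chosen curve shape are assumptions, not the report's results.

        import numpy as np

        rng = np.random.default_rng(1)
        case = np.arange(1, 191)                       # 190 consecutive procedures
        time = 120.0 * case ** -0.15 * np.exp(0.08 * rng.normal(size=case.size))

        slope, intercept = np.polyfit(np.log(case), np.log(time), 1)
        a, b = np.exp(intercept), -slope
        print("fitted initial time a = %.1f min, learning exponent b = %.3f" % (a, b))
        print("predicted time at case 100: %.1f min" % (a * 100.0 ** -b))

    Multilevel or stochastic-parameter models extend the same idea by letting a and b vary across operators.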

  17. Clinical and biological analysis in graftless maxillary sinus lift.

    PubMed

    Parra, Marcelo; Olate, Sergio; Cantín, Mario

    2017-08-01

    Maxillary sinus lift for dental implant installation is a well-known and versatile technique; new techniques are presented based on the physiology of intrasinus bone repair. The aim of this review was to determine the status of graftless maxillary sinus lift and analyze its foundations and results. A search was conducted of the literature between 1995 and 2015 in the Medline, ScienceDirect, and SciELO databases using the keywords "maxillary sinus lift," "blood clot," "graftless maxillary sinus augmentation," and "dental implant placement." Ten articles were selected for our analysis of this technique and its results. Despite the limited information, cases that were followed for at least six months and up to four years had a 90% success rate. Published techniques included a lateral window, elevation of the sinus membrane, drilling and dental implant installation, descent of the membrane with variations in the installation of the lateral wall access and suturing. The physiology behind this new bone formation response and the results of the present research were also discussed. We concluded that this is a promising and viable technique under certain inclusion criteria.

  18. Cocrystal screening of hydroxybenzamides with benzoic acid derivatives: a comparative study of thermal and solution-based methods.

    PubMed

    Manin, Alex N; Voronin, Alexander P; Drozd, Ksenia V; Manin, Nikolay G; Bauer-Brandl, Annette; Perlovich, German L

    2014-12-18

    The main problem occurring at the early stages of cocrystal search is the choice of an effective screening technique. Among the most popular techniques for obtaining cocrystals are crystallization from solution, crystallization from melt, and solvent-drop grinding. This paper presents a comparative analysis of the following screening techniques: the DSC cocrystal screening method, thermal microscopy, and the saturation temperature method. The efficiency of the different cocrystal screening techniques was evaluated in 18 systems. Benzamide and benzoic acid derivatives were chosen as model systems due to their ability to form the acid-amide supramolecular heterosynthon. The screening confirmed the formation of 6 new cocrystals. Screening by the saturation temperature method has the highest screen-out rate but the smallest range of application. DSC screening has satisfactory accuracy and allows screening over a short time. Thermal microscopy is most efficient as an additional technique used to interpret ambiguous DSC screening results. The study also included an analysis of the influence of solvent type and component solubility on cocrystal formation. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. An aircraft measurement technique for formaldehyde and soluble carbonyl compounds

    NASA Astrophysics Data System (ADS)

    Lee, Yin-Nan; Zhou, Xianliang; Leaitch, W. Richard; Banic, Catharine M.

    1996-12-01

    An aircraft technique was developed for measuring ambient concentrations of formaldehyde and a number of soluble carbonyl compounds, including glycolaldehyde, glyoxal, methylglyoxal, glyoxylic acid, and pyruvic acid. Sampling was achieved by liquid scrubbing using a glass coil scrubber in conjunction with an autosampler which collected 5-min integrated liquid samples in septum-sealed vials. Analysis was performed on the ground after flight using high-performance liquid chromatography following derivatization of the carbonyl analytes with 2,4-dinitrophenylhydrazine; the limit of detection was 0.01 to 0.02 parts per billion by volume (ppbv) in the gas phase. Although lacking a real-time capability, this technique offers the advantages of simultaneously measuring six carbonyl compounds, savings in space and power on the aircraft, and dependable ground-based analysis. This technique was deployed on the Canadian National Research Council DHC-6 Twin Otter during the 1993 summer intensive of the North Atlantic Regional Experiment. The data obtained on August 28, 1993, during a pollutant transport episode are presented as an example of the performance and capability of this technique.
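    A hedged back-of-envelope example of how a scrubbed-liquid concentration maps to a gas-phase mixing ratio is sketched below; it assumes complete collection efficiency and room-temperature molar volume, and all of the flow rates and concentrations are invented for illustration rather than taken from the paper.

        # Convert a carbonyl concentration measured in the scrubbing liquid to ppbv in air.
        c_liquid = 0.2e-6        # assumed concentration in the coil liquid [mol/L]
        f_liquid = 0.4e-3        # assumed liquid flow through the coil [L/min]
        f_air = 2.0              # assumed air sample flow [L/min]
        molar_volume = 24.45     # L/mol of air at 25 degC and 1 atm

        mol_carbonyl = c_liquid * f_liquid            # mol of analyte collected per minute
        mol_air = f_air / molar_volume                # mol of air sampled per minute
        print("mixing ratio: %.2f ppbv" % (mol_carbonyl / mol_air * 1e9))

    In practice the scrubbing efficiency of each carbonyl would be measured and folded into the conversion.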

  20. Quantitative CT: technique dependence of volume estimation on pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2012-03-01

    Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
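    To make the statistical comparison concrete, the sketch below runs a generalized estimating equation fit of the kind mentioned above on synthetic data, treating each nodule as a cluster of repeated measurements across protocols; the data, effect sizes, and exchangeable working correlation are assumptions, not the study's results.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        thicknesses = [0.625, 1.25, 2.5, 5.0]                    # slice thickness [mm]
        rows = []
        for nodule in range(20):
            offset = rng.normal(0.0, 2.0)                        # nodule-specific bias
            for t in thicknesses:
                rows.append({"nodule": nodule,
                             "thickness": t,
                             "pct_error": offset + 1.5 * t + rng.normal(0.0, 1.0)})
        df = pd.DataFrame(rows)

        gee = sm.GEE.from_formula("pct_error ~ thickness", groups="nodule", data=df,
                                  cov_struct=sm.cov_struct.Exchangeable())
        print(gee.fit().summary())

    The fitted thickness coefficient plays the role of the protocol effect on volume-estimation accuracy.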

  1. Nonlinear Earthquake Analysis of Reinforced Concrete Frames with Fiber and Bernoulli-Euler Beam-Column Element

    PubMed Central

    Karaton, Muhammet

    2014-01-01

    A beam-column element based on Euler-Bernoulli beam theory is investigated for the nonlinear dynamic analysis of reinforced concrete (RC) structural elements. The stiffness matrix of this element is obtained using the rigidity method. A solution technique that includes a nonlinear dynamic substructure procedure is developed for the dynamic analysis of RC frames, and a predictor-corrector form of the Bossak-α method is applied as the time integration scheme. Numerical results obtained with the proposed solution technique are compared against experimental data for an RC column element to verify the numerical solutions. In addition, nonlinear cyclic analysis results for a portal RC frame are obtained to compare the proposed solution technique with a fiber element based on the flexibility method. Finally, seismic damage analyses of an 8-story RC frame structure with a soft story are carried out for cases of lumped and distributed mass and load, and the damage regions, propagation, and intensities predicted by the two approaches are examined. PMID:24578667
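    As a hedged illustration of the time-integration scheme named above, the sketch below advances a single-degree-of-freedom linear oscillator with a predictor-corrector Bossak-α step; the linear spring stands in for the RC element model, and the parameter relations γ = 1/2 − α and β = (1 − α)²/4 are the commonly quoted unconditionally stable choice, used here as assumptions.

        import numpy as np

        m, c, k = 1.0, 0.2, 100.0             # mass, damping, stiffness (assumed values)
        alpha = -0.1                          # Bossak parameter
        gamma = 0.5 - alpha
        beta = 0.25 * (1.0 - alpha) ** 2
        dt, n_steps = 0.01, 500

        d, v = 0.01, 0.0                      # initial displacement [m] and velocity [m/s]
        a = (-c * v - k * d) / m              # consistent initial acceleration (no load)
        for _ in range(n_steps):
            d_pred = d + dt * v + dt ** 2 * (0.5 - beta) * a        # predictor
            v_pred = v + dt * (1.0 - gamma) * a
            lhs = (1.0 - alpha) * m + gamma * dt * c + beta * dt ** 2 * k
            rhs = -alpha * m * a - c * v_pred - k * d_pred          # zero external load
            a = rhs / lhs                                           # scalar "solve"
            d = d_pred + beta * dt ** 2 * a                         # corrector
            v = v_pred + gamma * dt * a

        print("displacement after %.1f s of free vibration: %.5f m" % (n_steps * dt, d))

    For a nonlinear RC frame the scalar solve becomes an iterative solution of the equilibrium equations with the tangent stiffness, but the predictor-corrector structure is the same.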

  2. Failure detection and fault management techniques for flush airdata sensing systems

    NASA Technical Reports Server (NTRS)

    Whitmore, Stephen A.; Moes, Timothy R.; Leondes, Cornelius T.

    1992-01-01

    Methods based on chi-squared analysis are presented for detecting system and individual-port failures in the high-angle-of-attack flush airdata sensing (HI-FADS) system on the NASA F-18 High Alpha Research Vehicle. The HI-FADS hardware is introduced, and the aerodynamic model describes measured pressure in terms of dynamic pressure, angle of attack, angle of sideslip, and static pressure. Chi-squared analysis is described in the presentation of the concept for failure detection and fault management, which includes nominal, iteration, and fault-management modes. A matrix of pressure orifices arranged in concentric circles on the nose of the aircraft provides the pressure measurements used by the regression algorithms. The sensing techniques are applied to F-18 flight data, and two examples are given of the computed angle-of-attack time histories. The failure-detection and fault-management techniques permit the matrix to be multiply redundant, and the chi-squared analysis is shown to be useful in the detection of failures.
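    A stripped-down version of the chi-squared test described above is sketched here: weighted residuals between measured and model-predicted port pressures are summed, compared against a threshold, and the worst port is isolated when the test fails. The pressure model, noise level, and threshold are illustrative assumptions, not the HI-FADS flight algorithm.

        import numpy as np

        rng = np.random.default_rng(2)
        n_ports, sigma = 11, 5.0
        predicted = 1000.0 + 50.0 * np.cos(np.linspace(0.0, np.pi, n_ports))  # model fit [Pa]
        measured = predicted + sigma * rng.normal(size=n_ports)
        measured[3] += 80.0                                # inject a failed port

        residual = measured - predicted
        chi2 = np.sum((residual / sigma) ** 2)
        threshold = 2.0 * n_ports                          # rough threshold, ~2x the port count

        if chi2 > threshold:
            worst = int(np.argmax(np.abs(residual)))
            print("fault detected: chi-squared = %.1f, isolating port %d" % (chi2, worst))
        else:
            print("system nominal: chi-squared = %.1f" % chi2)

    In a full implementation the model prediction would come from the airdata regression itself, and the test would be repeated after removing the suspect port.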

  3. Associative Memory Synthesis, Performance, Storage Capacity And Updating: New Heteroassociative Memory Results

    NASA Astrophysics Data System (ADS)

    Casasent, David; Telfer, Brian

    1988-02-01

    The storage capacity, noise performance, and synthesis of associative memories for image analysis are considered. Associative memory synthesis is shown to be very similar to that of linear discriminant functions used in pattern recognition. These lead to new associative memories and new associative memory synthesis and recollection vector encodings. Heteroassociative memories are emphasized in this paper, rather than autoassociative memories, since heteroassociative memories provide scene analysis decisions, rather than merely enhanced output images. The analysis of heteroassociative memories has been given little attention. Heteroassociative memory performance and storage capacity are shown to be quite different from those of autoassociative memories, with much more dependence on the recollection vectors used and less dependence on M/N. This allows several different and preferable synthesis techniques to be considered for associative memories. These new associative memory synthesis techniques and new techniques to update associative memories are included. We also introduce a new SNR performance measure that is preferable to conventional noise standard deviation ratios.
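    A minimal sketch of the heteroassociative idea follows, assuming a generic pseudoinverse synthesis rather than the specific encodings proposed in the paper: key vectors are mapped to short recollection (decision) vectors rather than to output images, and recall tolerates moderate input noise.

        import numpy as np

        rng = np.random.default_rng(3)
        n_key, n_pairs, n_label = 64, 5, 3               # key length, stored pairs, label length
        X = rng.normal(size=(n_key, n_pairs))            # key vectors (columns)
        Y = rng.normal(size=(n_label, n_pairs))          # associated recollection vectors

        W = Y @ np.linalg.pinv(X)                        # memory matrix via the pseudoinverse

        x_noisy = X[:, 2] + 0.2 * rng.normal(size=n_key)  # degraded version of key 2
        print("recalled:", np.round(W @ x_noisy, 2))
        print("stored:  ", np.round(Y[:, 2], 2))

    The choice of recollection-vector encoding (for example, orthogonal class labels) is the kind of design decision the paper argues dominates heteroassociative performance.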

  4. Cluster analysis based on dimensional information with applications to feature selection and classification

    NASA Technical Reports Server (NTRS)

    Eigen, D. J.; Fromm, F. R.; Northouse, R. A.

    1974-01-01

    A new clustering algorithm is presented that is based on dimensional information. The algorithm includes an inherent feature selection criterion, which is discussed. Further, a heuristic method for choosing the proper number of intervals for a frequency distribution histogram, a feature necessary for the algorithm, is presented. The algorithm, although usable as a stand-alone clustering technique, is then utilized as a global approximator. Local clustering techniques and configuration of a global-local scheme are discussed, and finally the complete global-local and feature selector configuration is shown in application to a real-time adaptive classification scheme for the analysis of remote sensed multispectral scanner data.

  5. Predicting neuropathic ulceration: analysis of static temperature distributions in thermal images

    NASA Astrophysics Data System (ADS)

    Kaabouch, Naima; Hu, Wen-Chen; Chen, Yi; Anderson, Julie W.; Ames, Forrest; Paulson, Rolf

    2010-11-01

    Foot ulcers affect millions of Americans annually. Conventional methods used to assess skin integrity, including inspection and palpation, may be valuable approaches, but they usually do not detect changes in skin integrity until an ulcer has already developed. We analyze the feasibility of thermal imaging as a technique to assess the integrity of the skin and its many layers. Thermal images are analyzed using an asymmetry analysis, combined with a genetic algorithm, to examine the infrared images for early detection of foot ulcers. Preliminary results show that the proposed technique can reliably and efficiently detect inflammation and hence effectively predict potential ulceration.
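    A toy version of the asymmetry idea is sketched below: two mock plantar thermograms are compared pixelwise after mirroring one of them, and the mean temperature difference is used as a crude inflammation flag. The genetic-algorithm component of the paper's method is not reproduced, and the image sizes, temperatures, and threshold are assumptions.

        import numpy as np

        rng = np.random.default_rng(4)
        left = 30.0 + 0.5 * rng.normal(size=(128, 64))     # mock thermograms [deg C]
        right = 30.0 + 0.5 * rng.normal(size=(128, 64))
        right[40:60, 10:25] += 2.5                         # localized hot spot on one foot

        asymmetry = np.abs(left - right[:, ::-1])          # mirror and compare contralateral sides
        score = asymmetry.mean()
        print("mean left-right asymmetry: %.2f deg C" % score)
        print("flag for follow-up:", bool(score > 0.5))    # threshold is an assumption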

  6. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to the order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  7. Finite elements: Theory and application

    NASA Technical Reports Server (NTRS)

    Dwoyer, D. L. (Editor); Hussaini, M. Y. (Editor); Voigt, R. G. (Editor)

    1988-01-01

    Recent advances in FEM techniques and applications are discussed in reviews and reports presented at the ICASE/LaRC workshop held in Hampton, VA in July 1986. Topics addressed include FEM approaches for partial differential equations, mixed FEMs, singular FEMs, FEMs for hyperbolic systems, iterative methods for elliptic finite-element equations on general meshes, mathematical aspects of FEMS for incompressible viscous flows, and gradient weighted moving finite elements in two dimensions. Consideration is given to adaptive flux-corrected FEM transport techniques for CFD, mixed and singular finite elements and the field BEM, p and h-p versions of the FEM, transient analysis methods in computational dynamics, and FEMs for integrated flow/thermal/structural analysis.

  8. Analysis and synthesis of distributed-lumped-active networks by digital computer

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The use of digital computational techniques in the analysis and synthesis of DLA (distributed lumped active) networks is considered. This class of networks consists of three distinct types of elements, namely, distributed elements (modeled by partial differential equations), lumped elements (modeled by algebraic relations and ordinary differential equations), and active elements (modeled by algebraic relations). Such a characterization is applicable to a broad class of circuits, especially including those usually referred to as linear integrated circuits, since the fabrication techniques for such circuits readily produce elements which may be modeled as distributed, as well as the more conventional lumped and active ones.

  9. Structural reliability assessment of the Oman India Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Sharif, A.M.; Preston, R.

    1996-12-31

    Reliability techniques are increasingly finding application in design. The special design conditions for the deep water sections of the Oman India Pipeline dictate their use since the experience basis for application of standard deterministic techniques is inadequate. The paper discusses the reliability analysis as applied to the Oman India Pipeline, including selection of a collapse model, characterization of the variability in the parameters that affect pipe resistance to collapse, and implementation of first and second order reliability analyses to assess the probability of pipe failure. The reliability analysis results are used as the basis for establishing the pipe wall thickness requirements for the pipeline.
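    The flavor of such an assessment can be conveyed with a crude Monte Carlo stand-in for the first- and second-order reliability analyses mentioned above: sample the collapse resistance and the external pressure from assumed distributions and count the fraction of samples in which the load exceeds the resistance. All distributions below are placeholders, not the pipeline's design values.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 1_000_000
        collapse_resistance = rng.normal(60.0, 5.0, n)    # assumed pipe collapse pressure [MPa]
        external_pressure = rng.normal(45.0, 3.0, n)      # assumed deep-water load [MPa]

        p_fail = np.count_nonzero(external_pressure >= collapse_resistance) / n
        print("estimated probability of collapse: %.1e" % p_fail)

    First- and second-order methods reach comparable answers analytically from the limit-state gradient, which matters when the target failure probabilities are too small to sample directly.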

  10. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  11. Statistical innovations in the medical device world sparked by the FDA.

    PubMed

    Campbell, Gregory; Yue, Lilly Q

    2016-01-01

    The world of medical devices, while highly diverse, is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.

  12. A ground-based technique for millimeter wave spectroscopic observations of stratospheric trace constituents

    NASA Technical Reports Server (NTRS)

    Parrish, A.; Dezafra, R. L.; Solomon, P. M.; Barrett, J. W.

    1988-01-01

    Recent concern over possible long term stratospheric changes caused by the introduction of man-made compounds has increased the need for instrumentation that can accurately measure stratospheric minor constituents. The technique of radio spectroscopy at millimeter wavelengths was first used to observe rotational transitions of stratospheric ozone nearly two decades ago, but has not been highly developed until recently. A ground-based observing technique is reported which employs a millimeter-wave superheterodyne receiver and multichannel filter spectrometer for measurements of stratospheric constituents that have peak volume mixing ratios that are less than 10 to the -9th, more than 3 orders of magnitude less than that for ozone. The technique is used for an extensive program of observations of stratospheric chlorine monoxide and also for observations of other stratospheric trace gases such as (O-16)3, vibrationally excited (O-16)3, (O-18)2(O-16), N2O, HO2, and HCN. In the present paper, analysis of the observing technique is given, including the method of calibration and analysis of sources of error. The technique is found to be a reliable means of observing and monitoring important stratospheric trace constituents.

  13. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the soil-vegetation system spectral signatures. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates, and dominant wavelength). The objective is to reveal their potential, accuracy, and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
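    As an indication of how the first of these techniques operates, the sketch below unmixes a single pixel into vegetation and soil fractions by least squares with a weighted sum-to-one constraint; the endmember spectra and pixel values are invented for illustration.

        import numpy as np

        veg = np.array([0.06, 0.04, 0.45])                # assumed vegetation endmember (G, R, NIR)
        soil = np.array([0.12, 0.18, 0.25])               # assumed soil endmember
        pixel = 0.35 * veg + 0.65 * soil + 0.005          # mixed pixel with a small offset

        E = np.column_stack([veg, soil])                  # endmember matrix (bands x members)
        A = np.vstack([E, 100.0 * np.ones((1, 2))])       # heavily weighted sum-to-one row
        b = np.concatenate([pixel, [100.0]])
        fractions, *_ = np.linalg.lstsq(A, b, rcond=None)
        fractions = np.clip(fractions, 0.0, 1.0)
        print("vegetation fraction: %.2f, soil fraction: %.2f" % tuple(fractions))

    The other estimators listed above (vegetation indices, red-edge position, colorimetric quantities) are instead related empirically to the known canopy cover of calibration plots through regression.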

  14. Potential of far-ultraviolet absorption spectroscopy as a highly sensitive qualitative and quantitative analysis method for polymer films, part I: classification of commercial food wrap films.

    PubMed

    Sato, Harumi; Higashi, Noboru; Ikehata, Akifumi; Koide, Noriko; Ozaki, Yukihiro

    2007-07-01

    The aim of the present study is to propose a totally new technique for the utilization of far-ultraviolet (UV) spectroscopy in polymer thin film analysis. Far-UV spectra in the 120-300 nm region have been measured in situ for six kinds of commercial polymer wrap films by use of a novel type of far-UV spectrometer that does not need vacuum evaporation. These films can be straightforwardly classified into three groups, polyethylene (PE) films, polyvinyl chloride (PVC) films, and polyvinylidene chloride (PVDC) films, by using the raw spectra. The differences in the wavelength of the absorption band due to the sigma-sigma* transition of the C-C bond have been used for the classification of the six kinds of films. Using this method, it was easy to distinguish the three kinds of PE films and to separate the two kinds of PVDC films. Compared with other spectroscopic methods, the advantages of this technique include nondestructive analysis, easy spectral measurement, high sensitivity, and simple spectral analysis. The present study has demonstrated that far-UV spectroscopy is a very promising technique for polymer film analysis.

  15. Sensors for ceramic components in advanced propulsion systems

    NASA Technical Reports Server (NTRS)

    Koller, A. C.; Bennethum, W. H.; Burkholder, S. D.; Brackett, R. R.; Harris, J. P.

    1995-01-01

    This report includes: (1) a survey of the current methods for the measurement of surface temperature of ceramic materials suitable for use as hot section flowpath components in aircraft gas turbine engines; (2) analysis and selection of three sensing techniques with potential to extend surface temperature measurement capability beyond current limits; and (3) design, manufacture, and evaluation of the three selected techniques which include the following: platinum rhodium thin film thermocouple on alumina and mullite substrates; doped silicon carbide thin film thermocouple on silicon carbide, silicon nitride, and aluminum nitride substrates; and long and short wavelength radiation pyrometry on the substrates listed above plus yttria stabilized zirconia. Measurement of surface emittance of these materials at elevated temperature was included as part of this effort.

  16. Carbon footprint estimator, phase II : volume II - technical appendices.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG : emissions associated with the construction and maintenance of transportation projects. This phase : of development included techniques for estimating emiss...

  17. Carbon footprint estimator, phase II : volume I - GASCAP model.

    DOT National Transportation Integrated Search

    2014-03-01

    The GASCAP model was developed to provide a software tool for analysis of the life-cycle GHG : emissions associated with the construction and maintenance of transportation projects. This phase : of development included techniques for estimating emiss...

  18. Research and Development in Very Long Baseline Interferometry (VLBI)

    NASA Technical Reports Server (NTRS)

    Himwich, William E.

    2004-01-01

    Contents include the following: 1. Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.

  19. Technology in the Assessment of Learning Disability.

    ERIC Educational Resources Information Center

    Bigler, Erin D.; Lajiness-O'Neill, Renee; Howes, Nancy-Louise

    1998-01-01

    Reviews recent neuroradiologic and brain imaging techniques in the assessment of learning disability. Technologies reviewed include computerized tomography; magnetic resonance imaging; electrophysiological and metabolic imaging; computerized electroencephalographic studies of evoked potentials, event-related potentials, spectral analysis, and…

  20. Spectral analysis of groove spacing on Ganymede

    NASA Technical Reports Server (NTRS)

    Grimm, R. E.

    1984-01-01

    The technique used to analyze groove spacing on Ganymede is presented. Data from Voyager images are used to determine the surface topography and position of the grooves. Power spectral estimates are statistically analyzed, and sample data are included.
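    A minimal version of that spectral step, with invented numbers, is sketched below: a topographic profile sampled along a line perpendicular to the grooves is Fourier transformed, and the periodogram peak gives the dominant groove spacing.

        import numpy as np

        rng = np.random.default_rng(6)
        dx = 0.5                                           # sample spacing [km] (assumed)
        x = np.arange(0.0, 256.0, dx)
        profile = 100.0 * np.sin(2 * np.pi * x / 8.0) + 20.0 * rng.normal(size=x.size)

        power = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
        freq = np.fft.rfftfreq(x.size, d=dx)               # spatial frequency [cycles/km]
        peak = np.argmax(power[1:]) + 1                    # skip the zero-frequency bin
        print("dominant groove spacing: %.1f km" % (1.0 / freq[peak]))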
