Multidisciplinary design optimization using multiobjective formulation techniques
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Pagaldipti, Narayanan S.
1995-01-01
This report addresses the development of a multidisciplinary optimization procedure using an efficient semi-analytical sensitivity analysis technique and multilevel decomposition for the design of aerospace vehicles. A semi-analytical sensitivity analysis procedure is developed for calculating computational grid sensitivities and aerodynamic design sensitivities. The accuracy and efficiency of the sensitivity analysis procedure are established through comparison of the results with those obtained using a finite difference technique. The developed sensitivity analysis technique is then used within a multidisciplinary optimization procedure for designing aerospace vehicles. The optimization problem, with the integration of aerodynamics and structures, is decomposed into two levels. Optimization is performed for improved aerodynamic performance at the first level and improved structural performance at the second level. Aerodynamic analysis is performed by solving the three-dimensional parabolized Navier-Stokes equations. A nonlinear programming technique and an approximate analysis procedure are used for optimization. The procedure developed is applied to design the wing of a high-speed aircraft. Results obtained show significant improvements in the aircraft's aerodynamic and structural performance when compared to a reference or baseline configuration. The use of the semi-analytical sensitivity technique provides significant computational savings.
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered, with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt. % range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 microns range and MDL as low as 0.01 wt. %. With these improvements, the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the 1% accuracy and precision level with light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given, along with a discussion of limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized.
Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara
2016-05-04
Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76%) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50%) and uniquely accounting for 15 (44%). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
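The necessity and sufficiency logic underlying crisp-set QCA can be illustrated with a minimal sketch. The technique names follow the abstract, but the study memberships below are invented placeholders, not data from the review:

```python
# Crisp-set QCA-style checks: a technique is "necessary" if it appears in
# every effective study; a configuration is "sufficient" if every study
# using it is effective. Hypothetical data for illustration only.

studies = [
    # (techniques used, improved adherence?)
    ({"knowledge", "self-efficacy"}, True),
    ({"knowledge", "facilitation"}, True),
    ({"knowledge", "self-efficacy", "action-control"}, True),
    ({"awareness"}, False),
    ({"knowledge"}, False),
]

def necessary(technique):
    """Technique appears in every effective study."""
    return all(technique in t for t, effective in studies if effective)

def sufficient(config):
    """Every study whose techniques include the configuration is effective."""
    covered = [eff for t, eff in studies if config <= t]
    return bool(covered) and all(covered)

print(necessary("knowledge"))                      # necessary...
print(sufficient({"knowledge"}))                   # ...but not sufficient
print(sufficient({"knowledge", "self-efficacy"}))  # this configuration is
```

Real QCA software additionally computes consistency and coverage scores over fuzzy-set memberships; this sketch shows only the crisp set logic.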
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages
Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. PMID:28109440
ERIC Educational Resources Information Center
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-01-01
This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical analysis technique, the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised, to demonstrate their ability to develop the ethical analysis skills of…
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-12
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
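One common geometric basis for this kind of sectional image analysis is that a straight cylindrical fiber cut by a plane appears as an ellipse whose axis ratio encodes the fiber's inclination. The sketch below shows only that basic relation, not the authors' specific algorithm, which adds corrections to improve estimation accuracy:

```python
import math

def fiber_inclination_deg(minor_axis, major_axis):
    """Inclination between the fiber axis and the section normal, in degrees.

    For an ideal cylindrical fiber of diameter d cut by a plane, the
    cross-section is an ellipse with minor axis d and major axis
    d / cos(theta), so theta = arccos(minor / major).
    """
    if not 0 < minor_axis <= major_axis:
        raise ValueError("expect 0 < minor <= major")
    return math.degrees(math.acos(minor_axis / major_axis))

print(fiber_inclination_deg(0.2, 0.2))  # circular section: fiber normal to the plane (0 deg)
print(fiber_inclination_deg(0.2, 0.4))  # 2:1 ellipse: about 60 deg
```

In practice, measured axis lengths are noisy near circularity, which is one reason direct per-fiber measurement can misestimate the orientation distribution.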
Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong
2016-01-01
The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis. PMID:28787839
NASA Astrophysics Data System (ADS)
Kozikowski, Raymond T.; Smith, Sarah E.; Lee, Jennifer A.; Castleman, William L.; Sorg, Brian S.; Hahn, David W.
2012-06-01
Fluorescence spectroscopy has been widely investigated as a technique for identifying pathological tissue; however, unrelated subject-to-subject variations in spectra complicate data analysis and interpretation. We describe and evaluate a new biosensing technique, differential laser-induced perturbation spectroscopy (DLIPS), based on deep ultraviolet (UV) photochemical perturbation in combination with difference spectroscopy. This technique combines sequential fluorescence probing (pre- and post-perturbation) with sub-ablative UV perturbation and difference spectroscopy to provide a new spectral dimension, facilitating two improvements over fluorescence spectroscopy. First, the differential technique eliminates significant variations in absolute fluorescence response within subject populations. Second, UV perturbations alter the extracellular matrix (ECM), directly coupling the DLIPS response to the biological structure. Improved biosensing with DLIPS is demonstrated in vivo in a murine model of chemically induced skin lesion development. Component loading analysis of the data indicates that the DLIPS technique couples to structural proteins in the ECM. Analysis of variance shows that DLIPS has a significant response to emerging pathology as opposed to other population differences. An optimal likelihood ratio classifier for the DLIPS dataset shows that this technique holds promise for improved diagnosis of epithelial pathology. Results further indicate that DLIPS may improve diagnosis of tissue by augmenting fluorescence spectra (i.e. orthogonal sensing).
Steam generator tubing NDE performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, G.; Welty, C.S. Jr.
1997-02-01
Steam generator (SG) non-destructive examination (NDE) is a fundamental element in the broader SG in-service inspection (ISI) process, a cornerstone in the management of PWR steam generators. Based on objective performance measures (tube leak forced outages and SG-related capacity factor loss), ISI performance has shown a continually improving trend over the years. Performance of the NDE element is a function of the fundamental capability of the technique and of the ability of the analysis portion of the process in field implementation of the technique. The technology continues to improve in several areas, e.g. system sensitivity, data collection rates, probe/coil design, and data analysis software. With these improvements comes the attendant requirement for qualification of the technique on the damage form(s) to which it will be applied, and for training and qualification of the data analysis element of the ISI process on the field implementation of the technique. The introduction of data transfer via fiber optic line allows for remote data acquisition and analysis, thus improving the efficiency of analysis for a limited pool of data analysts. This paper provides an overview of the current status of SG NDE and identifies several important issues to be addressed.
Advanced techniques for determining long term compatibility of materials with propellants
NASA Technical Reports Server (NTRS)
Green, R. L.
1972-01-01
The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques determined to meet, or to be adaptable to meet, the requirements. Areas of refinement or change were recommended for the improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.
Applying GRA and QFD to Improve Library Service Quality
ERIC Educational Resources Information Center
Chen, Yen-Ting; Chou, Tsung-Yu
2011-01-01
This paper applied Grey Relational Analysis (GRA) to Quality Function Deployment (QFD) to identify service improvement techniques for an academic library. First, reader needs and their importance, and satisfaction degrees were examined via questionnaires. Second, the service improvement techniques for satisfying the reader needs were developed by…
Micropowder collecting technique for stable isotope analysis of carbonates.
Sakai, Saburo; Kodan, Tsuyoshi
2011-05-15
Micromilling is a conventional technique used in the analysis of the isotopic composition of geological materials, which improves the spatial resolution of sample collection for analysis. However, a problem still remains concerning the recovery ratio of the milled sample. We constructed a simple apparatus consisting of a vacuum pump, a sintered metal filter, electrically conductive rubber stopper and a stainless steel tube for transferring the milled powder into a reaction vial. In our preliminary experiments on carbonate powder, we achieved a rapid recovery of 5 to 100 µg of carbonate with a high recovery ratio (>90%). This technique shortens the sample preparation time, improves the recovery ratio, and homogenizes the sample quantity, which, in turn, improves the analytical reproducibility. Copyright © 2011 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis which identifies the gaps between the SEI's "healthy ingredients" of a process performance model and courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
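A Monte Carlo baseline of the kind described can be sketched as follows. The driver weights and distribution parameters here are invented placeholders for illustration, not the ACSI model's actual structure:

```python
import random
import statistics

random.seed(42)

def simulate_score():
    """One simulated customer satisfaction index score on a 0-100 scale.

    Three hypothetical driver scores are drawn from normal distributions
    and combined with placeholder weights.
    """
    quality = random.gauss(80, 5)
    expectations = random.gauss(75, 6)
    value = random.gauss(78, 4)
    score = 0.5 * quality + 0.2 * expectations + 0.3 * value
    return min(100.0, max(0.0, score))

trials = sorted(simulate_score() for _ in range(10_000))
mean = statistics.mean(trials)
lo, hi = trials[250], trials[9750]  # approximate 95% interval
print(f"baseline mean {mean:.1f}, 95% range {lo:.1f}-{hi:.1f}")
```

Repeating the simulation while varying one driver's mean is the sensitivity-analysis step: it shows how much the predicted index responds to each input.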
Predicting Effective Course Conduction Strategy Using Datamining Techniques
ERIC Educational Resources Information Center
Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.
2017-01-01
Data analysis techniques can be used to analyze the pattern of data in different fields. Based on the analysis' results, it is recommended that suggestions be provided to decision making authorities. The data mining techniques can be used in educational domain to improve the outcome of the educational sectors. The authors carried out this research…
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique that is used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general purpose structural software system is presented.
Advances in the analysis and design of constant-torque springs
NASA Technical Reports Server (NTRS)
McGuire, John R.; Yura, Joseph A.
1996-01-01
In order to improve the design procedure of constant-torque springs used in aerospace applications, several new analysis techniques have been developed. These techniques make it possible to accurately construct a torque-rotation curve for any general constant-torque spring configuration. These new techniques allow for friction in the system to be included in the analysis, an area of analysis that has heretofore been unexplored. The new analysis techniques also include solutions for the deflected shape of the spring as well as solutions for drum and roller support reaction forces. A design procedure incorporating these new capabilities is presented.
An improved switching converter model using discrete and average techniques
NASA Technical Reports Server (NTRS)
Shortt, D. J.; Lee, F. C.
1982-01-01
The nonlinear modeling and analysis of dc-dc converters has been done by averaging and discrete-sampling techniques. The averaging technique is simple, but inaccurate as the modulation frequencies approach the theoretical limit of one-half the switching frequency. The discrete technique is accurate even at high frequencies, but is very complex and cumbersome. An improved model is developed by combining the aforementioned techniques. This new model is easy to implement in circuit and state variable forms and is accurate to the theoretical limit.
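The state-space averaging idea referred to here can be sketched for a buck converter: over one switching period, the on-state and off-state equations are weighted by the duty cycle d, so the switching waveform is replaced by d*Vin. The following forward-Euler simulation of the averaged model uses arbitrary illustrative component values, not values from the paper:

```python
# Averaged model of a buck converter: the duty-cycle-weighted input d*Vin
# replaces the actual switching waveform. The model is accurate only for
# modulation frequencies well below half the switching frequency, which is
# exactly the limitation the improved discrete/average model addresses.
Vin, L, C, R, d = 12.0, 100e-6, 470e-6, 5.0, 0.5
i, v = 0.0, 0.0          # inductor current (A), capacitor voltage (V)
dt = 1e-6                # integration step (s)

for _ in range(200_000):  # 0.2 s of simulated time, enough to settle
    di = (d * Vin - v) / L   # L di/dt = d*Vin - v
    dv = (i - v / R) / C     # C dv/dt = i - v/R
    i += di * dt
    v += dv * dt

print(round(v, 2))  # steady state approaches d * Vin = 6 V
```

The discrete-sampling technique would instead propagate the exact on/off state transitions each cycle; combining the two is what gives the paper's model accuracy at high modulation frequencies without the discrete model's complexity.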
Statistical Evaluation of Time Series Analysis Techniques
NASA Technical Reports Server (NTRS)
Benignus, V. A.
1973-01-01
The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.
STATISTICAL SAMPLING AND DATA ANALYSIS
Research is being conducted to develop approaches to improve soil and sediment sampling techniques, measurement design and geostatistics, and data analysis via chemometric, environmetric, and robust statistical methods. Improvements in sampling contaminated soil and other hetero...
Phospholipid Fatty Acid Analysis: Past, Present and Future
NASA Astrophysics Data System (ADS)
Findlay, R. H.
2008-12-01
With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid phase extraction technology. Improvements in the field of gas chromatography also increased accessibility of the technique and it has been widely applied to water, sediment, soil and aerosol samples. Whole cell fatty acid analysis, a related but not equal technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional group approach to interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks.
Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.
Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques
2018-04-30
Title: Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques
Subject: Monthly Progress Report
Abstract: The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional arctic sea ice
Development of Multiobjective Optimization Techniques for Sonic Boom Minimization
NASA Technical Reports Server (NTRS)
Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.
1996-01-01
A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high-speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely, gas turbine blades and high-speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions.
The multidisciplinary design optimization procedures for high-speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
Evidential Reasoning in Expert Systems for Image Analysis.
1985-02-01
techniques to image analysis (IA). There is growing evidence that these techniques offer significant improvements in image analysis, particularly in the... (2) to provide a common framework for analysis, (3) to structure the ER process for major expert-system tasks in image analysis, and (4) to identify... approaches to three important tasks for expert systems in the domain of image analysis. This segment concluded with an assessment of the strengths
Tactics, Methods and Techniques to Improve Special Forces In-Service Enlisted Recruiting
2002-06-01
Systems design analysis applied to launch vehicle configuration
NASA Technical Reports Server (NTRS)
Ryan, R.; Verderaime, V.
1993-01-01
As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost improvement techniques from design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.
The use of artificial intelligence techniques to improve the multiple payload integration process
NASA Technical Reports Server (NTRS)
Cutts, Dannie E.; Widgren, Brian K.
1992-01-01
A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.
The physical and empirical basis for a specific clear-air turbulence risk index
NASA Technical Reports Server (NTRS)
Keller, J. L.
1985-01-01
An improved operational CAT detection and forecasting technique, the specific clear air turbulence risk (SCATR) index, is developed and detailed. This index shows some promising results. The improvements seen using hand-analyzed data, a result of the more realistic representation of the vertical shear of the horizontal wind, are also realized in the data analysis used in the PROFS/CWP application. The SCATR index should improve as database enhancements such as profiler and VAS satellite data, which increase the resolution in space and time, are brought into even more sophisticated objective analysis schemes.
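The key ingredient mentioned, the vertical shear of the horizontal wind, is computed from wind components at two analysis levels. A minimal sketch of that quantity follows; the SCATR index itself combines shear with other terms the abstract does not specify, and the example values are invented:

```python
import math

def vertical_shear(u1, v1, z1, u2, v2, z2):
    """Magnitude of the vertical shear of the horizontal wind, in s^-1.

    u, v are wind components in m/s at heights z1 < z2 in meters.
    """
    dz = z2 - z1
    return math.hypot((u2 - u1) / dz, (v2 - v1) / dz)

# Example: wind veering and strengthening across a 500 m layer near jet level.
s = vertical_shear(10.0, 0.0, 9000.0, 25.0, 10.0, 9500.0)
print(f"{s:.4f} s^-1")  # larger values flag elevated turbulence risk
```

Higher-resolution inputs such as profiler data shorten dz, which is why they sharpen shear-based indices like this one.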
A comparative analysis of frequency modulation threshold extension techniques
NASA Technical Reports Server (NTRS)
Arndt, G. D.; Loch, F. J.
1970-01-01
FM threshold extension for system performance improvement, comparing impulse noise elimination, correlation detection and delta modulation signal processing techniques implemented at demodulator output
McCormick, Frank; Gupta, Anil; Bruce, Ben; Harris, Josh; Abrams, Geoff; Wilson, Hillary; Hussey, Kristen; Cole, Brian J.
2014-01-01
Purpose: The purpose of this study was to measure and compare the subjective, objective, and radiographic healing outcomes of single-row (SR), double-row (DR), and transosseous equivalent (TOE) suture techniques for arthroscopic rotator cuff repair. Materials and Methods: A retrospective comparative analysis of arthroscopic rotator cuff repairs by one surgeon from 2004 to 2010 at a minimum 2-year follow-up was performed. Cohorts were matched for age, sex, and tear size. Subjective outcome variables included ASES, Constant, SST, UCLA, and SF-12 scores. Objective outcome variables included strength and active range of motion (ROM). Radiographic healing was assessed by magnetic resonance imaging (MRI). Statistical analysis was performed using analysis of variance (ANOVA), Mann-Whitney and Kruskal-Wallis tests, and the Fisher exact probability test, with significance set at P < 0.05. Results: Sixty-three patients completed the study requirements (20 SR, 21 DR, 22 TOE). There was a clinically and statistically significant improvement in outcomes with all repair techniques (ASES mean improvement, P < 0.0001). The mean final ASES scores were: SR 83 (SD 21.4); DR 87 (SD 18.2); TOE 87 (SD 13.2) (P = 0.73). There was a statistically significant improvement in strength for each repair technique (P < 0.001). There was no significant difference between techniques across all secondary outcome assessments: ASES improvement, Constant, SST, UCLA, SF-12, ROM, strength, and MRI re-tear rates. There was a decrease in re-tear rates from single-row (22%) to double-row (18%) to transosseous equivalent (11%); however, this difference was not statistically significant (P = 0.6). Conclusions: Compared to preoperatively, arthroscopic rotator cuff repair, using SR, DR, or TOE techniques, yielded a clinically and statistically significant improvement in subjective and objective outcomes at a minimum 2-year follow-up. Level of Evidence: Therapeutic level 3. PMID:24926159
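As an illustration of the nonparametric testing named in this abstract, the sketch below computes a Kruskal-Wallis H statistic in pure Python; the cohort scores are hypothetical stand-ins, not the study's data.

```python
from itertools import chain

def rank_all(values):
    # assign 1-based ranks, averaging ranks across ties
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def kruskal_h(groups):
    # Kruskal-Wallis H statistic (no tie-variance correction)
    data = list(chain.from_iterable(groups))
    n = len(data)
    ranks = rank_all(data)
    h, start = 0.0, 0
    for g in groups:
        r = sum(ranks[start:start + len(g)])
        h += r * r / len(g)
        start += len(g)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# hypothetical outcome scores for three repair-technique cohorts
sr, dr, toe = [83, 80, 90, 75, 85], [87, 84, 91, 82, 88], [87, 86, 93, 81, 89]
h = kruskal_h([sr, dr, toe])
```

In practice a library routine with tie correction and p-value lookup would be used; the point here is only the rank-sum construction.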
Module Degradation Mechanisms Studied by a Multi-Scale Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnston, Steve; Al-Jassim, Mowafak; Hacke, Peter
2016-11-21
A key pathway to meeting the Department of Energy SunShot 2020 goals is to reduce financing costs by improving investor confidence through improved photovoltaic (PV) module reliability. A comprehensive approach to further understand and improve PV reliability includes characterization techniques and modeling from module to atomic scale. Imaging techniques, which include photoluminescence, electroluminescence, and lock-in thermography, are used to locate localized defects responsible for module degradation. Small area samples containing such defects are prepared using coring techniques and are then suitable and available for microscopic study and specific defect modeling and analysis.
Improved Cloud and Snow Screening in MAIAC Aerosol Retrievals Using Spectral and Spatial Analysis
NASA Technical Reports Server (NTRS)
Lyapustin, A.; Wang, Y.; Laszlo, I.; Kokrkin, S.
2012-01-01
An improved cloud/snow screening technique in the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm is described. It is implemented as part of MAIAC aerosol retrievals based on analysis of spectral residuals and spatial variability. Comparisons with AERONET aerosol observations and a large-scale MODIS data analysis show strong suppression of aerosol optical thickness outliers due to unresolved clouds and snow. At the same time, the developed filter does not reduce the aerosol retrieval capability at high 1 km resolution in strongly inhomogeneous environments, such as near the centers of active fires. Despite the significant improvement, optical depth outliers in high spatial resolution data remain a problem, to be addressed by application-dependent specialized filtering techniques.
Tactics, Methods and Techniques to Improve Special Forces In-Service Enlisted Recruiting
2002-06-01
Advanced statistical methods for improved data analysis of NASA astrophysics missions
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.
1992-01-01
The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.
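One of the abstracted papers compares six linear least-squares fits; the minimal sketch below, using assumed synthetic data, shows why the choice of fit matters. The OLS(Y|X), OLS(X|Y), and bisector slopes are standard variants from the astronomical-regression literature.

```python
import random

random.seed(42)
# synthetic data: y = 2x + Gaussian noise (illustrative only)
xs = [i / 10 for i in range(100)]
ys = [2 * x + random.gauss(0, 1.0) for x in xs]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
sxx = sum((x - mx) ** 2 for x in xs)
syy = sum((y - my) ** 2 for y in ys)
sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))

b_yx = sxy / sxx   # OLS(Y|X): minimizes vertical residuals
b_xy = syy / sxy   # OLS(X|Y): minimizes horizontal residuals
# bisector: slope of the line bisecting the two regression lines
b_bis = (b_yx * b_xy - 1 + ((1 + b_yx ** 2) * (1 + b_xy ** 2)) ** 0.5) / (b_yx + b_xy)
```

For imperfectly correlated data the two OLS slopes always straddle the bisector (b_yx < b_bis < b_xy for positive correlation), which is the core of the Monte Carlo comparison the abstract alludes to.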
NASA Astrophysics Data System (ADS)
Al-Saggaf, Yeslam; Burmeister, Oliver K.
2012-09-01
This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical analysis technique, the other an applied ethical analysis technique. Both techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. The skills targeted include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made.
NASA Technical Reports Server (NTRS)
1994-01-01
This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.
Aerodynamics of a linear oscillating cascade
NASA Technical Reports Server (NTRS)
Buffum, Daniel H.; Fleeter, Sanford
1990-01-01
The steady and unsteady aerodynamics of a linear oscillating cascade are investigated using experimental and computational methods. Experiments are performed to quantify the torsion mode oscillating cascade aerodynamics of the NASA Lewis Transonic Oscillating Cascade for subsonic inlet flowfields using two methods: simultaneous oscillation of all the cascaded airfoils at various values of interblade phase angle, and the unsteady aerodynamic influence coefficient technique. Analysis of these data and correlation with classical linearized unsteady aerodynamic analysis predictions indicate that the wind tunnel walls enclosing the cascade have, in some cases, a detrimental effect on the cascade unsteady aerodynamics. An Euler code for oscillating cascade aerodynamics is modified to incorporate improved upstream and downstream boundary conditions and also the unsteady aerodynamic influence coefficient technique. The new boundary conditions are shown to improve the unsteady aerodynamic predictions of the code, and the computational unsteady aerodynamic influence coefficient technique is shown to be a viable alternative for calculation of oscillating cascade aerodynamics.
An improved technique for the 2H/1H analysis of urines from diabetic volunteers
Coplen, T.B.; Harper, I.T.
1994-01-01
The H2-H2O ambient-temperature equilibration technique for the determination of 2H/1H ratios in urinary waters from diabetic subjects provides improved accuracy over the conventional Zn reduction technique. The standard deviation, approximately 1-2‰, is at least a factor of three better than that of the Zn reduction technique on urinary waters from diabetic volunteers. Experiments with pure water and solutions containing glucose, urea and albumen indicate that there is no measurable bias in the hydrogen equilibration technique.
A Bayesian technique for improving the sensitivity of the atmospheric neutrino L/E analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, A. S. T.; Chapman, J. D.; Thomson, M. A.
This paper outlines a method for improving the precision of atmospheric neutrino oscillation measurements. One experimental signature for these oscillations is an observed deficit in the rate of νμ charged-current interactions with an oscillatory dependence on Lν/Eν, where Lν is the neutrino propagation distance and Eν is the neutrino energy. For contained-vertex atmospheric neutrino interactions, the Lν/Eν resolution varies significantly from event to event. The precision of the oscillation measurement can be improved by incorporating information on Lν/Eν resolution into the oscillation analysis. In the analysis presented here, a Bayesian technique is used to estimate the Lν/Eν resolution of observed atmospheric neutrinos on an event-by-event basis. By separating the events into bins of Lν/Eν resolution in the oscillation analysis, a significant improvement in oscillation sensitivity can be achieved.
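The event-by-event resolution idea can be caricatured with inverse-variance weighting. This is not the paper's Bayesian estimator, only a toy illustration with assumed per-event resolutions of why accounting for resolution differences sharpens a measurement.

```python
import random

TRUE = 0.5                          # toy parameter being measured
sigmas = [0.05] * 100 + [0.5] * 100  # assumed per-event resolutions: sharp and poor

def unweighted(draws):
    # pooled estimate ignoring per-event resolution
    return sum(draws) / len(draws)

def inv_var(draws):
    # estimate weighting each event by 1/sigma^2
    w = [1 / s ** 2 for s in sigmas]
    return sum(wi * d for wi, d in zip(w, draws)) / sum(w)

def rms_err(est, trials=300):
    # Monte Carlo RMS error of an estimator over repeated pseudo-experiments
    tot = 0.0
    for t in range(trials):
        rng = random.Random(t)
        draws = [rng.gauss(TRUE, s) for s in sigmas]
        tot += (est(draws) - TRUE) ** 2
    return (tot / trials) ** 0.5

rms_plain, rms_weighted = rms_err(unweighted), rms_err(inv_var)
```

Resolution-aware combination gives a markedly smaller RMS error, which is the same mechanism by which binning events in Lν/Eν resolution improves oscillation sensitivity.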
Multiscale Analysis of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C. A.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is that there is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also cursed us with an increased amount of higher complexity data than previous missions. We have improved our view of the Sun yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte scaled images or a sequence of byte scaled difference images. The determination of features and structures in the images is done qualitatively by the observer. There is little quantitative and objective analysis done with these images. Many advances in image processing techniques have occurred in the past decade. Many of these methods are possibly suited for solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to formulate the human ability to view and comprehend phenomena on different scales. So these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present a preliminary analysis of multiscale techniques applied to solar image data. Specifically, we explore the use of the 2-d wavelet transform and related transforms with EIT, LASCO and TRACE images. This work was supported by NASA contract NAS5-00220.
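A minimal sketch of the 2-d wavelet analysis mentioned here, using a single-level orthonormal Haar transform on a small assumed image; real solar work would use full multiresolution libraries with smoother wavelets.

```python
from math import sqrt

def haar_1d(v):
    # one level of the orthonormal Haar transform: averages then differences
    h = 1 / sqrt(2)
    avg = [(v[2 * i] + v[2 * i + 1]) * h for i in range(len(v) // 2)]
    dif = [(v[2 * i] - v[2 * i + 1]) * h for i in range(len(v) // 2)]
    return avg + dif

def haar_2d(img):
    # separable 2-d transform: rows first, then columns
    rows = [haar_1d(r) for r in img]
    cols = [haar_1d(list(c)) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]

img = [[1, 2, 3, 4],
       [5, 6, 7, 8],
       [9, 8, 7, 6],
       [5, 4, 3, 2]]
coeffs = haar_2d(img)   # LL, LH, HL, HH quadrants
```

Because the transform is orthonormal it conserves image energy exactly, which is what makes thresholding coefficients at different scales a quantitative, reversible operation rather than a qualitative byte-scaling.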
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described.
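A sketch of how a classic FMEA risk priority number (RPN) and a grey-relational ranking might be computed side by side; the failure modes, ratings, reference sequence, and distinguishing coefficient below are invented for illustration and do not come from the article.

```python
# hypothetical failure modes with (severity, occurrence, detectability) on 1-10 scales
modes = {
    "wrong dose entry":    (9, 4, 3),
    "cable misconnection": (6, 5, 4),
    "display misread":     (7, 3, 6),
}
# classic FMEA: risk priority number = S * O * D
rpn = {m: s * o * d for m, (s, o, d) in modes.items()}

# grey relational grade against an assumed worst-case reference (10, 10, 10)
ref, rho = (10, 10, 10), 0.5   # rho: distinguishing coefficient
deltas = {m: [abs(r - x) for r, x in zip(ref, v)] for m, v in modes.items()}
dmin = min(min(d) for d in deltas.values())
dmax = max(max(d) for d in deltas.values())
grade = {m: sum((dmin + rho * dmax) / (dk + rho * dmax) for dk in d) / len(d)
         for m, d in deltas.items()}
```

The grey relational grade lies in (0, 1] and ranks a mode higher the closer its ratings are to the worst case, which can reorder priorities relative to the raw RPN product.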
Increasing Public Library Productivity.
ERIC Educational Resources Information Center
Samuelson, Howard
1981-01-01
Suggests ways of improving productivity for public libraries faced with increased accountability, dwindling revenues, and continuing inflation. Techniques described include work simplification, work analysis, improved management, and employee motivation. (RAA)
Zakaria, Ammar; Shakaff, Ali Yeon Md.; Adom, Abdul Hamid; Ahmad, Mohd Noor; Masnan, Maz Jamilah; Aziz, Abdul Hallis Abdul; Fikri, Nazifah Ahmad; Abdullah, Abu Hassan; Kamarudin, Latifah Munirah
2010-01-01
An improved classification of Orthosiphon stamineus using a data fusion technique is presented. Five different commercial sources along with freshly prepared samples were discriminated using an electronic nose (e-nose) and an electronic tongue (e-tongue). Samples from the different commercial brands were evaluated by the e-tongue and then followed by the e-nose. Applying Principal Component Analysis (PCA) separately on the respective e-tongue and e-nose data, only five distinct groups were projected. However, by employing a low level data fusion technique, six distinct groupings were achieved. Hence, this technique can enhance the ability of PCA to analyze the complex samples of Orthosiphon stamineus. Linear Discriminant Analysis (LDA) was then used to further validate and classify the samples. It was found that the LDA performance was also improved when the responses from the e-nose and e-tongue were fused together. PMID:22163381
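A minimal sketch of low-level fusion followed by PCA, with invented sensor responses standing in for the e-nose/e-tongue data; the first principal component is found by power iteration rather than a library call.

```python
import random
from math import sqrt

random.seed(1)
# toy responses: 3 e-nose channels and 2 e-tongue channels per sample,
# two samples from each of three hypothetical brands (group means 0, 1, 2)
brands = (0, 0, 1, 1, 2, 2)
enose = [[random.gauss(g, 0.1) for _ in range(3)] for g in brands]
etongue = [[random.gauss(g, 0.1) for _ in range(2)] for g in brands]

def standardize(cols):
    out = []
    for c in cols:
        m = sum(c) / len(c)
        s = sqrt(sum((x - m) ** 2 for x in c) / len(c)) or 1.0
        out.append([(x - m) / s for x in c])
    return out

def fuse(a, b):
    # low-level fusion: standardize each channel, then concatenate feature vectors
    ca = standardize([list(c) for c in zip(*a)])
    cb = standardize([list(c) for c in zip(*b)])
    return [list(r) for r in zip(*(ca + cb))]

fused = fuse(enose, etongue)   # 6 samples x 5 fused features

# first principal component by power iteration on the covariance matrix
dim = len(fused[0])
cov = [[sum(r[i] * r[j] for r in fused) / len(fused) for j in range(dim)]
       for i in range(dim)]
v = [1.0] * dim
for _ in range(200):
    w = [sum(cov[i][j] * v[j] for j in range(dim)) for i in range(dim)]
    norm = sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]
scores = [sum(r[i] * v[i] for i in range(dim)) for r in fused]
```

With the channels fused, the leading component separates the brand groups along a single score axis, which is the effect the abstract reports when PCA is run on the combined rather than the separate data sets.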
Ozone measurement systems improvements studies
NASA Technical Reports Server (NTRS)
Thomas, R. W.; Guard, K.; Holland, A. C.; Spurling, J. F.
1974-01-01
Results are summarized of an initial study of techniques for measuring atmospheric ozone, carried out as the first phase of a program to improve ozone measurement techniques. The study concentrated on two measurement systems, the electro chemical cell (ECC) ozonesonde and the Dobson ozone spectrophotometer, and consisted of two tasks. The first task consisted of error modeling and system error analysis of the two measurement systems. Under the second task a Monte-Carlo model of the Dobson ozone measurement technique was developed and programmed for computer operation.
Photographic and photometric enhancement of Lunar Orbiter products, projects A, B and C
NASA Technical Reports Server (NTRS)
1972-01-01
A detailed discussion is presented of the framelet joining, photometric data improvement, and statistical error analysis. The Lunar Orbiter film handling system, readout system, and the digitization are described, along with the technique of joining adjacent framelets by using a digital computer. Time and cost estimates are given. The problems and techniques involved in improving the digitized data are discussed. It was found that spectacular improvements are possible. Program documentations are included.
Wind profiler signal detection improvements
NASA Technical Reports Server (NTRS)
Hart, G. F.; Divis, Dale H.
1992-01-01
Research is described on potential improvements to the software used with the NASA 49.25 MHz wind profiler located at Kennedy Space Center. In particular, the analysis and results are provided of a study to (1) identify preferred mathematical techniques for the detection of atmospheric signals that provide wind velocities which are obscured by natural and man-made sources, and (2) analyze one or more preferred techniques to demonstrate proof of the capability to improve the detection of wind velocities.
Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.
Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V
2007-01-01
The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible (i) to determine natural groups or clusters of control strategies with similar behaviour, (ii) to find and interpret hidden, complex and causal relation features in the data set and (iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
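The cluster-analysis step can be sketched with single-linkage agglomeration on an assumed evaluation matrix; the strategies and criteria values below are invented, not BSM2 results.

```python
# hypothetical evaluation matrix: rows = control strategies, cols = criteria
# (effluent quality index, relative cost, fraction of time in violation)
M = {
    "A1": [5.2, 1.00, 0.10],
    "A2": [5.1, 1.02, 0.12],
    "B1": [4.0, 1.35, 0.02],
    "B2": [4.1, 1.33, 0.03],
}

def dist(u, v):
    # Euclidean distance between two criteria vectors
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

# single-linkage agglomerative clustering down to two clusters
clusters = [{k} for k in M]
while len(clusters) > 2:
    pairs = [(i, j) for i in range(len(clusters))
             for j in range(i + 1, len(clusters))]
    i, j = min(pairs, key=lambda p: min(dist(M[a], M[b])
                                        for a in clusters[p[0]]
                                        for b in clusters[p[1]]))
    clusters[i] |= clusters[j]
    del clusters[j]
```

The two natural groups that emerge (cheap-but-dirty versus costly-but-clean strategies in this toy matrix) are exactly the kind of "clusters of control strategies with similar behaviour" the abstract describes; in practice the criteria would first be scaled to comparable units.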
Dictionary-based image reconstruction for superresolution in integrated circuit imaging.
Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim
2015-06-01
Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.
Hybrid soft computing systems for electromyographic signals analysis: a review.
Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates
2014-02-03
The electromyogram (EMG) is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.
Oud, Bart; Maris, Antonius J A; Daran, Jean-Marc; Pronk, Jack T
2012-01-01
Successful reverse engineering of mutants that have been obtained by nontargeted strain improvement has long presented a major challenge in yeast biotechnology. This paper reviews the use of genome-wide approaches for analysis of Saccharomyces cerevisiae strains originating from evolutionary engineering or random mutagenesis. On the basis of an evaluation of the strengths and weaknesses of different methods, we conclude that for the initial identification of relevant genetic changes, whole genome sequencing is superior to other analytical techniques, such as transcriptome, metabolome, proteome, or array-based genome analysis. Key advantages of this technique over gene expression analysis include the independency of genome sequences on experimental context and the possibility to directly and precisely reproduce the identified changes in naive strains. The predictive value of genome-wide analysis of strains with industrially relevant characteristics can be further improved by classical genetics or simultaneous analysis of strains derived from parallel, independent strain improvement lineages. PMID:22152095
Energy resolution improvement of CdTe detectors by using the principal component analysis technique
NASA Astrophysics Data System (ADS)
Alharbi, T.
2018-02-01
In this paper, we report on the application of the Principal Component Analysis (PCA) technique for the improvement of the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge-trapping effect which is reflected in the shape of each detector pulse, thereby correcting for the charge-trapping effect. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8 % (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector which gives an energy resolution of 4.5 % (FWHM) by using the standard pulse processing method.
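The charge-trapping correction can be caricatured as follows; this sketch replaces the paper's PCA over full pulse shapes with a single assumed shape parameter and a linear de-trending, purely to illustrate the resolution gain from shape-based correction.

```python
import random

random.seed(7)
E_TRUE = 662.0   # keV, a monoenergetic gamma line

# simulate pulses: deeper interactions lose a larger fraction of charge to trapping
pulses = []
for _ in range(500):
    depth = random.random()   # shape-derived trapping proxy (assumed known per pulse)
    amp = E_TRUE * (1 - 0.05 * depth) + random.gauss(0, 2.0)
    pulses.append((depth, amp))

def fwhm(vals):
    # Gaussian-equivalent full width at half maximum from the sample spread
    m = sum(vals) / len(vals)
    return 2.355 * (sum((x - m) ** 2 for x in vals) / len(vals)) ** 0.5

raw = [a for _, a in pulses]

# least-squares slope of amplitude vs. shape parameter, then subtract the trend
n = len(pulses)
md = sum(d for d, _ in pulses) / n
ma = sum(raw) / n
slope = (sum((d - md) * (a - ma) for d, a in pulses)
         / sum((d - md) ** 2 for d, _ in pulses))
corrected = [a - slope * (d - md) for d, a in pulses]
```

Removing the shape-correlated deficit collapses the smeared line back toward the electronic-noise limit, the same qualitative effect as the reported improvement from 4.5% to 1.8% FWHM.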
Improvement of finite element meshes - Heat transfer in an infinite cylinder
NASA Technical Reports Server (NTRS)
Kittur, Madan G.; Huston, Ronald L.; Oswald, Fred B.
1989-01-01
An extension of a structural finite element mesh improvement technique to heat conduction analysis is presented. The mesh improvement concept was originally presented by Prager in studying tapered, axially loaded bars. It was further shown that an improved mesh can be obtained by minimizing the trace of the stiffness matrix. These procedures are extended and applied to the analysis of heat conduction in an infinitely long hollow circular cylinder.
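For a hollow cylinder, the trace-minimization criterion has a clean two-element illustration: minimizing the trace of the assembled conduction matrix places the single interior node at the geometric mean of the inner and outer radii. The dimensions below are assumed for the sketch.

```python
from math import log, pi

k, a, b = 1.0, 1.0, 4.0   # conductivity and inner/outer radii (illustrative)

def stiffness_trace(r):
    # two annular elements [a, r] and [r, b]; radial conductance 2*pi*k/ln(ro/ri)
    c1 = 2 * pi * k / log(r / a)
    c2 = 2 * pi * k / log(b / r)
    # each 2x2 element matrix c*[[1,-1],[-1,1]] contributes 2c to the trace
    return 2 * (c1 + c2)

# place the interior node at the radius that minimizes the trace
candidates = (a + i * (b - a) / 10000 for i in range(1, 10000))
best = min(candidates, key=stiffness_trace)
# with u = ln(r/a), the trace is proportional to 1/u + 1/(U - u), U = ln(b/a),
# whose symmetry puts the optimum at u = U/2, i.e. r = sqrt(a*b)
```

A uniform radial mesh would instead place the node at (a+b)/2 = 2.5; the trace criterion grades the mesh logarithmically, matching the logarithmic temperature profile of radial conduction.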
Improvement in finite element meshes: Heat transfer in an infinite cylinder
NASA Technical Reports Server (NTRS)
Kittur, Madan G.; Huston, Ronald L.; Oswald, Fred B.
1988-01-01
An extension of a structural finite element mesh improvement technique to heat conduction analysis is presented. The mesh improvement concept was originally presented by Prager in studying tapered, axially loaded bars. It was further shown that an improved mesh can be obtained by minimizing the trace of the stiffness matrix. These procedures are extended and applied to the analysis of heat conduction in an infinitely long hollow circular cylinder.
Rodríguez, M T Torres; Andrade, L Cristóbal; Bugallo, P M Bello; Long, J J Casares
2011-09-15
Life cycle thinking (LCT) is one of the philosophies that has recently appeared in the context of sustainable development. Some of the already existing tools and methods, as well as some of the recently emerged ones, which seek to understand, interpret and design the life of a product, can be included within the scope of the LCT philosophy. That is the case of material and energy flow analysis (MEFA), a tool derived from the industrial metabolism definition. This paper proposes a methodology combining MEFA with another technique derived from sustainable development which also fits the LCT philosophy, the BAT (best available techniques) analysis. This methodology, applied to an industrial process, seeks to identify the so-called improvable flows by MEFA, so that the appropriate candidate BAT can be selected by BAT analysis. Material and energy inputs, outputs and internal flows are quantified, and sustainable solutions are provided on the basis of industrial metabolism. The methodology has been applied to an exemplary roof tile manufacturing plant for validation. Fourteen improvable flows have been identified and seven candidate BAT have been proposed aiming to reduce these flows. The proposed methodology provides a way to detect improvable material or energy flows in a process and selects the most sustainable options to enhance them. Solutions are proposed for the detected improvable flows, taking into account their effectiveness in improving such flows. Copyright © 2011 Elsevier B.V. All rights reserved.
Static analysis of class invariants in Java programs
NASA Astrophysics Data System (ADS)
Bonilla-Quintero, Lidia Dionisia
2011-12-01
This paper presents a technique for the automatic inference of class invariants from Java bytecode. Class invariants are very important for both compiler optimization and as an aid to programmers in their efforts to reduce the number of software defects. We present the original DC-invariant analysis from Adam Webber, talk about its shortcomings and suggest several different ways to improve it. To apply the DC-invariant analysis to identify DC-invariant assertions, all that one needs is a monotonic method analysis function and a suitable assertion domain. The DC-invariant algorithm is very general; however, the method analysis can be highly tuned to the problem in hand. For example, one could choose shape analysis as the method analysis function and use the DC-invariant analysis to simply extend it to an analysis that would yield class-wide invariants describing the shapes of linked data structures. We have a prototype implementation: a system we refer to as "the analyzer" that infers DC-invariant unary and binary relations and provides them to the user in a human readable format. The analyzer uses those relations to identify unnecessary array bounds checks in Java programs and perform null-reference analysis. It uses Adam Webber's relational constraint technique for the class-invariant binary relations. Early results with the analyzer were very imprecise in the presence of "dirty-called" methods. A dirty-called method is one that is called, either directly or transitively, from any constructor of the class, or from any method of the class at a point at which a disciplined field has been altered. This result was unexpected and forced an extensive search for improved techniques. An important contribution of this paper is the suggestion of several ways to improve the results by changing the way dirty-called methods are handled. The new techniques expand the set of class invariants that can be inferred over Webber's original results. 
The technique that produces better results uses in-line analysis. Final results are promising: we can infer sound class invariants for full-scale applications, not just toy ones.
NASA Astrophysics Data System (ADS)
Ruiz-Cárcel, C.; Jaramillo, V. H.; Mba, D.; Ottewill, J. R.; Cao, Y.
2016-01-01
The detection and diagnosis of faults in industrial processes is a very active field of research due to the reduction in maintenance costs achieved by the implementation of process monitoring algorithms such as Principal Component Analysis, Partial Least Squares or more recently Canonical Variate Analysis (CVA). Typically the condition of rotating machinery is monitored separately using vibration analysis or other specific techniques. Conventional vibration-based condition monitoring techniques are based on the tracking of key features observed in the measured signal. Typically steady-state loading conditions are required to ensure consistency between measurements. In this paper, a technique based on merging process and vibration data is proposed with the objective of improving the detection of mechanical faults in industrial systems working under variable operating conditions. The capabilities of CVA for detection and diagnosis of faults were tested using experimental data acquired from a compressor test rig where different process faults were introduced. Results suggest that the combination of process and vibration data can effectively improve the detectability of mechanical faults in systems working under variable operating conditions.
NASA Technical Reports Server (NTRS)
Lindstrom, David J.; Lindstrom, Richard M.
1989-01-01
Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. It involves irradiating samples in an external neutron beam from a nuclear reactor while simultaneously counting the gamma rays produced in the sample by neutron capture. Neutron capture produces excited nuclei, which decay immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) it is nondestructive; (2) it can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) it is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.
Multiscale Image Processing of Solar Image Data
NASA Astrophysics Data System (ADS)
Young, C.; Myers, D. C.
2001-12-01
It is often said that the blessing and curse of solar physics is too much data. Solar missions such as Yohkoh, SOHO and TRACE have shown us the Sun with amazing clarity but have also increased the amount of highly complex data. We have improved our view of the Sun, yet we have not improved our analysis techniques. The standard techniques used for analysis of solar images generally consist of observing the evolution of features in a sequence of byte-scaled images or a sequence of byte-scaled difference images. The determination of features and structures in the images is done qualitatively by the observer; little quantitative and objective analysis is done with these images. Many advances in image processing techniques have occurred in the past decade, and several of these methods are well suited to solar image analysis. Multiscale/multiresolution methods are perhaps the most promising. These methods have been used to model the human ability to view and comprehend phenomena on different scales, so these techniques could be used to quantify the image processing done by the observer's eyes and brain. In this work we present several applications of multiscale techniques applied to solar image data. Specifically, we discuss uses of the wavelet, curvelet, and related transforms to define a multiresolution support for EIT, LASCO and TRACE images.
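The multiscale decompositions discussed here can be illustrated with the simplest possible case: a one-level Haar wavelet transform, which splits a signal into coarse approximation and fine detail coefficients and reconstructs it exactly. This is only a minimal sketch; the transforms actually applied to EIT, LASCO and TRACE images use far richer wavelet and curvelet bases.

```python
import math

def haar_decompose(signal):
    """One level of the orthonormal Haar wavelet transform:
    pairwise sums give the coarse approximation, pairwise
    differences give the fine detail."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert one level of the Haar transform exactly."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)
        out.append((a - d) / s)
    return out

data = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]   # illustrative 1-D scan line
a, d = haar_decompose(data)
rec = haar_reconstruct(a, d)
```

Repeating the decomposition on the approximation coefficients yields the multiresolution pyramid on which a multiresolution support is defined.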
NASA Technical Reports Server (NTRS)
Gordon, David; Ma, Chopo; MacMillan, Dan; Petrov, Leonid; Baver, Karen
2005-01-01
This report presents the activities of the GSFC VLBI Analysis Center during 2004. The GSFC Analysis Center analyzes all IVS sessions, makes regular IVS submissions of data and analysis products, and performs research and software development activities aimed at improving the VLBI technique.
NASA Technical Reports Server (NTRS)
Gordon, David; Ma, Chopo; MacMillan, Dan; Gipson, John; Bolotin, Sergei; Le Bail, Karine; Baver, Karen
2013-01-01
This report presents the activities of the GSFC VLBI Analysis Center during 2012. The GSFC VLBI Analysis Center analyzes all IVS sessions, makes regular IVS submissions of data and analysis products, and performs research and software development aimed at improving the VLBI technique.
Methods for Improving Information from ’Undesigned’ Human Factors Experiments.
Human factors engineering, Information processing, Regression analysis, Experimental design, Least squares method, Analysis of variance, Correlation techniques, Matrices (Mathematics), Multiple disciplines, Mathematical prediction
Method for improving accuracy in full evaporation headspace analysis.
Xie, Wei-Qi; Chai, Xin-Sheng
2017-05-01
We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. In conventional full evaporation headspace analysis, the pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
2004-12-03
other process improvements could also enhance DoD data practices. These include the incorporation of library science techniques as well as processes to...coalition communities as well as adapting the approaches and lessons of the library science community. Second, there is a need to generate a plan of...Best Practices (2 of 2) - Processes - Incorporate library science techniques in repository design - Improve visibility and accessibility of DoD data
An improved infrared technique for sorting pecans
NASA Astrophysics Data System (ADS)
Graeve, Thorsten; Dereniak, Eustace L.; Lamonica, John A., Jr.
1991-10-01
This paper presents the results of a study of pecan spectral reflectances. It describes an experiment for measuring the contrast between several components of raw pecan product to be sorted. An analysis of the experimental data reveals high contrast ratios in the infrared spectrum, suggesting a potential improvement in sorting efficiency when separating pecan meat from shells. It is believed that this technique has the potential to dramatically improve the efficiency of current sorting machinery, and to reduce the cost of processing pecans for the consumer market.
Man-machine analysis of translation and work tasks of Skylab films
NASA Technical Reports Server (NTRS)
Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.
1979-01-01
An objective approach to determine the concurrent validity of computer-graphic models is real time film analysis. This technique was illustrated through the procedures and results obtained in an evaluation of translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, minicomputer, and specifically supportive software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer assisted techniques for interface design and evaluation have the potential for improving the capability for human factors engineering.
Improved Technique for Finding Vibration Parameters
NASA Technical Reports Server (NTRS)
Andrew, L. V.; Park, C. C.
1986-01-01
Filtering and sample manipulation reduce noise effects. Analysis technique improves extraction of vibrational frequencies and damping rates from measurements of vibrations of complicated structure. Structural vibrations measured by accelerometers. Outputs digitized at frequency high enough to cover all modes of interest. Use of method on set of vibrational measurements from Space Shuttle raised level of coherence from previous values below 50 percent to values between 90 and 99 percent.
Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek
2017-07-04
One of the major sources of error in chemical analysis with the more conventional and established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this sample preparation stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass the sample preparation stage; in this review, we comment on some of the new direct techniques that are mass-spectrometry based. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis, without sample preparation or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS and DESI-MS techniques are discussed. These solutions have numerous applications in different fields of human activity owing to their interesting properties. The advantages and disadvantages of these techniques are presented, as are the trends in the development of direct analysis using them.
TOPICAL REVIEW: Human soft tissue analysis using x-ray or gamma-ray techniques
NASA Astrophysics Data System (ADS)
Theodorakou, C.; Farquharson, M. J.
2008-06-01
This topical review is intended to describe the x-ray techniques used for human soft tissue analysis. X-ray techniques have been applied to human soft tissue characterization and interesting results have been presented over the last few decades. The motivation behind such studies is to provide improved patient outcome by using the data obtained to better understand a disease process and improve diagnosis. An overview of theoretical background as well as a complete set of references is presented. For each study, a brief summary of the methodology and results is given. The x-ray techniques include x-ray diffraction, x-ray fluorescence, Compton scattering, Compton to coherent scattering ratio and attenuation measurements. The soft tissues that have been classified using x-rays or gamma rays include brain, breast, colon, fat, kidney, liver, lung, muscle, prostate, skin, thyroid and uterus.
An automatic step adjustment method for average power analysis technique used in fiber amplifiers
NASA Astrophysics Data System (ADS)
Liu, Xue-Ming
2006-04-01
An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits: higher-order accuracy and an ASA mechanism, so it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared with the APA technique, the proposed method increases the computing speed more than a hundredfold at the same error level. Numerical results for the model equations of erbium-doped fiber amplifiers show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method can also rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
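The ASA mechanism of the paper is specific to the APA model equations, but the underlying idea, adapting the integration step along the fiber to a local error estimate, can be sketched generically. In the sketch below, signal power grows as dP/dz = gP with a constant gain coefficient; the gain value, tolerance, and step-doubling controller are illustrative assumptions, not the authors' scheme.

```python
import math

def rk2_step(f, z, y, h):
    """One explicit midpoint (RK2) step for dy/dz = f(z, y)."""
    k1 = f(z, y)
    k2 = f(z + h / 2.0, y + h / 2.0 * k1)
    return y + h * k2

def integrate_adaptive(f, z0, y0, z1, h0=0.1, tol=1e-9):
    """Step-doubling error control: compare one step of size h with two
    steps of size h/2 and grow or shrink h to keep the relative
    difference near tol."""
    z, y, h = z0, y0, h0
    while z < z1 - 1e-12:
        h = min(h, z1 - z)
        y_big = rk2_step(f, z, y, h)
        y_half = rk2_step(f, z, y, h / 2.0)
        y_small = rk2_step(f, z + h / 2.0, y_half, h / 2.0)
        err = abs(y_small - y_big) / max(abs(y_small), 1e-12)
        if err < tol:                    # accept the more accurate result
            z, y = z + h, y_small
        # local error of RK2 scales as h^3, hence the cube root
        h *= min(2.0, max(0.2, 0.9 * (tol / (err + 1e-16)) ** (1.0 / 3.0)))
    return y

gain = 0.5                               # illustrative net gain per unit length
P_out = integrate_adaptive(lambda z, p: gain * p, 0.0, 1.0, 10.0)
P_exact = math.exp(gain * 10.0)          # analytic solution for constant gain
```

The same controller carries over when the right-hand side couples pump and signal powers, which is where automatic step adjustment pays off most.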
Recent Advances in Techniques for Starch Esters and the Applications: A Review
Hong, Jing; Zeng, Xin-An; Brennan, Charles S.; Brennan, Margaret; Han, Zhong
2016-01-01
Esterification is one of the most important methods of altering the structure of starch granules and improving their applications. Conventionally, starch esters are prepared by conventional or dual modification techniques, which have the disadvantages of being expensive, requiring reagent overdoses, and being time-consuming. In addition, the degree of substitution (DS) is often considered the primary factor in view of its contribution to estimating the substituted groups of starch esters. In order to improve detection accuracy and production efficiency, different detection techniques for DS, including titration, nuclear magnetic resonance (NMR), Fourier transform infrared spectroscopy (FT-IR), thermal gravimetric analysis/infrared spectroscopy (TGA/IR) and headspace gas chromatography (HS-GC), have been developed. This paper gives a comprehensive overview of recent advances in DS analysis and starch esterification techniques. Additionally, the advantages and limitations of these techniques, some perspectives on their future trends, and the applications of their derivatives in the food industry are also presented. PMID:28231145
Beck-Fruchter, Ronit; Shalev, Eliezer; Weiss, Amir
2016-03-01
The human oocyte is surrounded by hyaluronic acid, which acts as a natural selector of spermatozoa. Human sperm that express hyaluronic acid receptors and bind to hyaluronic acid have normal shape, minimal DNA fragmentation and a low frequency of chromosomal aneuploidies. Use of hyaluronic acid binding assays in intracytoplasmic sperm injection (ICSI) cycles to improve clinical outcomes has been studied, although none of these studies had sufficient statistical power. In this systematic review and meta-analysis, electronic databases were searched up to June 2015 to identify studies of ICSI cycles in which spermatozoa able to bind hyaluronic acid were selected. The main outcomes were fertilization rate and clinical pregnancy rate. Secondary outcomes included cleavage rate, embryo quality, implantation rate, spontaneous abortion and live birth rate. Seven studies and 1437 cycles were included. Use of the hyaluronic acid binding sperm selection technique yielded no improvement in fertilization and pregnancy rates. A meta-analysis of all available studies showed an improvement in embryo quality and implantation rate; an analysis of prospective studies only showed an improvement in embryo quality. Evidence does not support routine use of hyaluronic acid binding assays in all ICSI cycles. Identification of patients that might benefit from this technique needs further study. Copyright © 2015 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertz, P.R.
Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.
The composite sequential clustering technique for analysis of multispectral scanner data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
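The two-stage idea can be sketched in miniature: a one-pass sequential clustering proposes initial clusters, and their means seed an iterative K-means refinement. The distance-threshold criterion, radius, and data below are illustrative assumptions; Su's sequential stage uses a variance-analysis criterion rather than a fixed radius.

```python
def sequential_clusters(points, radius):
    """One-pass pre-clustering: assign each point to the nearest existing
    cluster mean if within `radius`, otherwise start a new cluster."""
    means, counts = [], []
    for p in points:
        best, bdist = -1, radius
        for i, m in enumerate(means):
            d = sum((a - b) ** 2 for a, b in zip(p, m)) ** 0.5
            if d < bdist:
                best, bdist = i, d
        if best < 0:
            means.append(list(p))
            counts.append(1)
        else:
            counts[best] += 1
            means[best] = [m + (a - m) / counts[best]
                           for m, a in zip(means[best], p)]
    return means

def kmeans(points, centroids, iters=20):
    """Lloyd's iteration, refining the sequentially proposed clusters."""
    for _ in range(iters):
        groups = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[c])))
            groups[i].append(p)
        centroids = [[sum(col) / len(g) for col in zip(*g)] if g else c
                     for g, c in zip(groups, centroids)]
    return centroids

pts = [(0.0, 0.1), (0.2, -0.1), (-0.1, 0.0),      # one cluster near the origin
       (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]        # one cluster near (5, 5)
cents = kmeans(pts, sequential_clusters(pts, radius=2.0))
```

Because the sequential pass fixes both the number of clusters and their starting positions, the K-means stage needs no random initialization, which is the practical appeal of the composite scheme.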
Practical semen analysis: from A to Z
Brazil, Charlene
2010-01-01
Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076
Acta Aeronautica et Astronautica Sinica,
1983-07-28
Contents include: multi-level substructural analysis in modal synthesis (two improved substructural assembling techniques); a 9-node quadrilateral isoparametric element; applications of laser techniques; estimating crack initiation time from service data (J. Aircraft, Vol. 15, No. 11, 1978); and free-interface modal synthesis in structural dynamic analysis (Nanjing Institute of Aeronautics and Astronautics, 1979).
A global optimization approach to multi-polarity sentiment analysis.
Li, Xinmiao; Li, Jing; Wu, Yukeng
2015-01-01
Following the rapid development of social media, sentiment analysis has become an important social media mining technique. The performance of automatic sentiment analysis primarily depends on feature selection and sentiment classification. While information gain (IG) and support vector machines (SVM) are two important techniques, few studies have optimized both approaches in sentiment analysis. The effectiveness of applying a global optimization approach to sentiment analysis remains unclear. We propose a global optimization-based sentiment analysis (PSOGO-Senti) approach to improve sentiment analysis with IG for feature selection and SVM as the learning engine. The PSOGO-Senti approach utilizes a particle swarm optimization algorithm to obtain a global optimal combination of feature dimensions and parameters in the SVM. We evaluate the PSOGO-Senti model on two datasets from different fields. The experimental results showed that the PSOGO-Senti model can improve binary and multi-polarity Chinese sentiment analysis. We compared the optimal feature subset selected by PSOGO-Senti with the features in the sentiment dictionary. The results of this comparison indicated that PSOGO-Senti can effectively remove redundant and noisy features and can select a domain-specific feature subset with higher explanatory power for a particular sentiment analysis task. The experimental results showed that the PSOGO-Senti approach is effective and robust for sentiment analysis tasks in different domains. By comparing the improvements of two-polarity, three-polarity and five-polarity sentiment analysis results, we found that the five-polarity sentiment analysis delivered the largest improvement and the two-polarity sentiment analysis the smallest. We conclude that PSOGO-Senti achieves higher improvement for a more complicated sentiment analysis task. We also compared the results of PSOGO-Senti with those of the genetic algorithm (GA) and grid search method.
From the results of this comparison, we found that PSOGO-Senti is more suitable for improving a difficult multi-polarity sentiment analysis problem.
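The particle swarm core of such an approach can be sketched independently of the SVM. In the sketch below the fitness function is only a stand-in for cross-validation error (in the paper it would wrap IG feature selection plus SVM training), and the swarm parameters are conventional textbook defaults, not the authors' settings.

```python
import random

def pso(fitness, dim, bounds, n_particles=20, iters=200, seed=0):
    """Minimize `fitness` with a basic global-best particle swarm."""
    rng = random.Random(seed)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5            # inertia, cognitive, social weights
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pval = [fitness(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            f = fitness(xs[i])
            if f < pval[i]:
                pval[i], pbest[i] = f, list(xs[i])
                if f < gval:
                    gval, gbest = f, list(xs[i])
    return gbest, gval

# Stand-in objective: pretend (x0, x1, x2) are the feature dimension and two
# SVM parameters, with minimum "error" at the invented optimum (1, 2, 3).
best, err = pso(lambda x: sum((xi - t) ** 2 for xi, t in zip(x, (1, 2, 3))),
                dim=3, bounds=(-5.0, 5.0))
```

Replacing the toy objective with a cross-validation run turns this into the global search over feature dimensions and SVM parameters that the approach describes.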
Application of neural networks and sensitivity analysis to improved prediction of trauma survival.
Hunter, A; Kennedy, L; Henry, J; Ferguson, I
2000-05-01
The performance of trauma departments is widely audited by applying predictive models that assess probability of survival, and examining the rate of unexpected survivals and deaths. Although the TRISS methodology, a logistic regression modelling technique, is still the de facto standard, it is known that neural network models perform better. A key issue when applying neural network models is the selection of input variables. This paper proposes a novel form of sensitivity analysis, which is simpler to apply than existing techniques, and can be used for both numeric and nominal input variables. The technique is applied to the audit survival problem, and used to analyse the TRISS variables. The conclusions discuss the implications for the design of further improved scoring schemes and predictive models.
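The perturbation idea behind such sensitivity analysis can be sketched with a toy fixed-weight model standing in for the trained network; the weights, data, and perturbation size below are illustrative assumptions, not TRISS variables. Each input is jittered in turn, and the mean absolute change in the model output ranks the variables by influence.

```python
import math

def model(x):
    """Stand-in for a trained predictor: logistic output of a weighted sum."""
    z = 3.0 * x[0] + 0.5 * x[1] - 0.1 * x[2]
    return 1.0 / (1.0 + math.exp(-z))

def sensitivity(model, data, delta=0.1):
    """Mean absolute change in output when each input is perturbed by delta."""
    n_vars = len(data[0])
    scores = []
    for i in range(n_vars):
        total = 0.0
        for x in data:
            xp = list(x)
            xp[i] += delta
            total += abs(model(xp) - model(x))
        scores.append(total / len(data))
    return scores

data = [(0.2, -0.1, 0.5), (-0.3, 0.4, 0.1), (0.1, 0.0, -0.2)]
scores = sensitivity(model, data)
ranking = sorted(range(3), key=lambda i: -scores[i])  # most influential first
```

For nominal variables the same scheme applies with the perturbation replaced by a swap of category, which is what makes this style of sensitivity analysis usable on mixed inputs.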
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.
Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.
1991-01-01
Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
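The object-recognition phase described above typically reduces to connected-component labeling followed by per-object measurements. The sketch below is a minimal 4-connected flood-fill version with area and centroid computation, not the thesis's exact algorithms; the test image is invented.

```python
def label_objects(img):
    """Label 4-connected foreground components in a binary image and return
    a list of (area, centroid) measurements, one per object."""
    rows, cols = len(img), len(img[0])
    labels = [[0] * cols for _ in range(rows)]
    objects = []
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not labels[r][c]:
                lab = len(objects) + 1
                stack, pixels = [(r, c)], []
                labels[r][c] = lab
                while stack:                    # iterative flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = lab
                            stack.append((ny, nx))
                area = len(pixels)
                cy = sum(p[0] for p in pixels) / area
                cx = sum(p[1] for p in pixels) / area
                objects.append((area, (cy, cx)))
    return objects

binary = [[1, 1, 0, 0],
          [1, 1, 0, 0],
          [0, 0, 0, 1],
          [0, 0, 1, 1]]
objs = label_objects(binary)   # two objects: a 2x2 square and an L-shape
```

Further geometric properties (perimeter, bounding box, moments) follow from the same per-object pixel lists.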
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
Edge enhancement and noise suppression for infrared image based on feature analysis
NASA Astrophysics Data System (ADS)
Jiang, Meng
2018-06-01
Infrared images often suffer from background noise, blurred edges, few details and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To this end, we propose a novel algorithm based on feature analysis in the shearlet domain. Firstly, we introduce the theory and advantages of the shearlet transform as a multi-scale geometric analysis (MGA) tool. Secondly, after analyzing the defects of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well, and use it to improve the traditional thresholding technique. Thirdly, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are built to improve generalized unsharp masking (GUM). Finally, experimental results with infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
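The unsharp-masking step that such feature maps modulate can be sketched in one dimension. A uniform 3-tap blur and a fixed gain are simplifications here; in the paper the gain would be adapted per-pixel by the shearlet feature maps so that edges are boosted while noise is not.

```python
def unsharp_mask(signal, amount=1.0):
    """Sharpen by adding back the difference between the signal and a
    3-tap moving-average blur: edges are exaggerated, flat regions are
    left unchanged."""
    n = len(signal)
    blurred = [(signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, n - 1)])
               / 3.0 for i in range(n)]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

step = [0.0] * 5 + [10.0] * 5        # an ideal edge of height 10
sharp = unsharp_mask(step)           # the edge transition is now steeper
```

The overshoot on either side of the step is exactly what makes naive unsharp masking amplify noise, which motivates gating the `amount` term by an edge/noise feature map.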
Sparse Image Reconstruction on the Sphere: Analysis and Synthesis.
Wallis, Christopher G R; Wiaux, Yves; McEwen, Jason D
2017-11-01
We develop techniques to solve ill-posed inverse problems on the sphere by sparse regularization, exploiting sparsity in both axisymmetric and directional scale-discretized wavelet space. Denoising, inpainting, and deconvolution problems, and combinations thereof, are considered as examples. Inverse problems are solved in both the analysis and synthesis settings, with a number of different sampling schemes. The most effective approach is that with the most restricted solution space, which depends on the interplay between the adopted sampling scheme, the selection of the analysis/synthesis problem, and any weighting of the l1 norm appearing in the regularization problem. More efficient sampling schemes on the sphere improve reconstruction fidelity by restricting the solution space and also by improving sparsity in wavelet space. We apply the technique to denoise Planck 353-GHz observations, improving the ability to extract the structure of Galactic dust emission, which is important for studying Galactic magnetism.
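The sparse-regularization machinery can be illustrated away from the sphere with a tiny synthesis-setting example: iterative soft thresholding (ISTA) minimizing 0.5*||y - Ax||^2 + lam*||x||_1 for a small random matrix. All dimensions, the step size, and the regularization weight are illustrative assumptions; the paper's solvers operate on wavelet coefficients on the sphere with more sophisticated algorithms.

```python
import random

def soft(v, t):
    """Soft-thresholding, the proximal operator of the l1 norm."""
    return (v - t) if v > t else (v + t) if v < -t else 0.0

def ista(A, y, lam, step, iters=500):
    """Iterative soft-thresholding for 0.5*||y - Ax||^2 + lam*||x||_1."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [soft(xj - step * gj, step * lam) for xj, gj in zip(x, g)]
    return x

def objective(A, y, x, lam):
    r = [sum(a * xj for a, xj in zip(row, x)) - yi for row, yi in zip(A, y)]
    return 0.5 * sum(ri * ri for ri in r) + lam * sum(abs(xj) for xj in x)

rng = random.Random(1)
m, n = 8, 16
A = [[rng.gauss(0, 1) / m ** 0.5 for _ in range(n)] for _ in range(m)]
x_true = [0.0] * n
x_true[3], x_true[11] = 2.0, -1.5                 # sparse ground truth
y = [sum(A[i][j] * x_true[j] for j in range(n)) for i in range(m)]
x_hat = ista(A, y, lam=0.01, step=0.05, iters=500)
```

The step size must stay below the reciprocal of the largest eigenvalue of AᵀA for the objective to decrease monotonically, which is the same stability constraint the full-scale solvers must respect.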
Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.
Card, Alan J; Ward, James R; Clarkson, P John
2014-01-01
After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. Outcome measures were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.
Automated Student Model Improvement
ERIC Educational Resources Information Center
Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.
2012-01-01
Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…
Use of Iba Techniques to Characterize High Velocity Thermal Spray Coatings
NASA Astrophysics Data System (ADS)
Trompetter, W.; Markwitz, A.; Hyland, M.
Spray coatings are being used in an increasingly wide range of industries to improve the abrasive, erosive and sliding wear of machine components. Over the past decade industries have moved to the application of supersonic high velocity thermal spray techniques. These coating techniques produce superior coating quality in comparison to other traditional techniques such as plasma spraying. To date, knowledge of the bonding processes and the structure of the particles within thermal spray coatings is very subjective. The aim of this research is to improve our understanding of these materials through the use of IBA techniques in conjunction with other materials analysis techniques. Samples were prepared by spraying a widely used commercial NiCr powder onto substrates using a HVAF (high velocity air fuel) thermal spraying technique. Detailed analysis of the composition and structure of the powder particles revealed two distinct types of particles: a majority of NiCr particles and a significant minority composed of SiO2/CrO3. When the particles were investigated both as raw powder and in the sprayed coating, it was surprising to find that the composition of the coating material remained unchanged during the coating process despite the high velocity application.
Flood frequency analysis using optimization techniques : final report.
DOT National Transportation Integrated Search
1992-10-01
this study consists of three parts. In the first part, a comprehensive investigation was made to find an improved estimation method for the log-Pearson type 3 (LP3) distribution by using optimization techniques. Ninety sets of observed Louisiana floo...
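As a baseline for the optimization-based estimators studied, the conventional fit of the log-Pearson type 3 (LP3) distribution starts from the mean, standard deviation, and skew of the log-transformed annual peak flows (the method-of-moments approach of USGS Bulletin 17B). The flow values below are invented for illustration; the study's improved estimators replace these moment estimates with optimized ones.

```python
import math

def lp3_moments(flows):
    """Sample statistics of log10 flows: the three inputs to a conventional
    method-of-moments log-Pearson type 3 fit."""
    logs = [math.log10(q) for q in flows]
    n = len(logs)
    mean = sum(logs) / n
    var = sum((x - mean) ** 2 for x in logs) / (n - 1)
    std = math.sqrt(var)
    # standard bias-corrected sample skew
    skew = (n * sum((x - mean) ** 3 for x in logs)
            / ((n - 1) * (n - 2) * std ** 3))
    return mean, std, skew

annual_peaks = [1200.0, 850.0, 2300.0, 640.0, 1900.0,
                1500.0, 780.0, 3100.0, 1100.0, 950.0]   # illustrative peaks
m, s, g = lp3_moments(annual_peaks)
```

Flood quantiles then follow from the frequency-factor relation log10(Q_T) = m + K(g, T) * s, and it is the sensitivity of K to the skew estimate g that motivates searching for better estimators via optimization.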
Matrix Synthesis and Characterization
NASA Technical Reports Server (NTRS)
1984-01-01
The role of NASA in the area of composite material synthesis; evaluation techniques; prediction analysis techniques; solvent-resistant tough composite matrix; resistance to paint strippers; acceptable processing temperature and pressure for thermoplastics; and the role of computer modeling and fiber interface improvement were discussed.
Digital Dental X-ray Database for Caries Screening
NASA Astrophysics Data System (ADS)
Rad, Abdolvahab Ehsani; Rahim, Mohd Shafry Mohd; Rehman, Amjad; Saba, Tanzila
2016-06-01
A standard database is an essential requirement for comparing the performance of image analysis techniques; hence a main issue in dental image analysis is the lack of such an available image database, which this paper provides. Periapical dental X-ray images, suitable for analysis and approved by many dental experts, were collected. This type of dental radiograph imaging is common and inexpensive, and is normally used for dental disease diagnosis and abnormality detection. The database contains 120 various periapical X-ray images covering the top and bottom jaws. This digital dental database is constructed to provide a source for researchers to use and compare image analysis techniques and to improve the performance of each technique.
Statistical analysis of RHIC beam position monitors performance
NASA Astrophysics Data System (ADS)
Calaga, R.; Tomás, R.
2004-04-01
A detailed statistical analysis of beam position monitors (BPM) performance at RHIC is a critical factor in improving regular operations and future runs. Robust identification of malfunctioning BPMs plays an important role in any orbit or turn-by-turn analysis. Singular value decomposition and Fourier transform methods, which have evolved as powerful numerical techniques in signal processing, will aid in such identification from BPM data. This is the first attempt at RHIC to use a large set of data to statistically enhance the capability of these two techniques and determine BPM performance. A comparison from run 2003 data shows striking agreement between the two methods and hence can be used to improve BPM functioning at RHIC and possibly other accelerators.
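The SVD-based screening described above can be sketched as follows. This is a minimal, NumPy-only illustration on synthetic turn-by-turn data, not the RHIC analysis itself; the matrix layout (turns × BPMs), the single retained mode, and the residual threshold are assumptions made for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n_turns, n_bpms = 512, 16

# Synthetic turn-by-turn orbit data: one coherent betatron-like mode
# seen by every BPM, plus small noise.  BPM 5 is "malfunctioning":
# it returns pure noise uncorrelated with the beam motion.
turns = np.arange(n_turns)
mode = np.sin(2 * np.pi * 0.31 * turns)
amplitudes = rng.uniform(0.5, 1.5, n_bpms)
data = np.outer(mode, amplitudes) + 0.02 * rng.standard_normal((n_turns, n_bpms))
data[:, 5] = 0.5 * rng.standard_normal(n_turns)

# SVD: the few dominant singular vectors capture the collective beam
# motion.  A healthy BPM is well explained by them; a faulty one is not.
U, s, Vt = np.linalg.svd(data, full_matrices=False)
k = 1                                   # number of physical modes kept
reconstructed = (U[:, :k] * s[:k]) @ Vt[:k, :]
residual_rms = np.sqrt(np.mean((data - reconstructed) ** 2, axis=0))
signal_rms = np.sqrt(np.mean(data ** 2, axis=0))
residual_frac = residual_rms / signal_rms

# Flag BPMs that keep most of their signal in the residual.
faulty = np.where(residual_frac > 0.5)[0]
print("flagged BPMs:", faulty)
```

A Fourier-transform check would proceed analogously: a healthy BPM shows a sharp tune line in its spectrum, while a faulty one shows only a flat noise floor.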
2013-06-01
In this research, we examine the Naval Sea Logistics Command's Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method's effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the
Solid oxide fuel cell simulation and design optimization with numerical adjoint techniques
NASA Astrophysics Data System (ADS)
Elliott, Louie C.
This dissertation reports on the application of numerical optimization techniques as applied to fuel cell simulation and design. Due to the "multi-physics" inherent in a fuel cell, which results in a highly coupled and non-linear behavior, an experimental program to analyze and improve the performance of fuel cells is extremely difficult. This program applies new optimization techniques with computational methods from the field of aerospace engineering to the fuel cell design problem. After an overview of fuel cell history, importance, and classification, a mathematical model of solid oxide fuel cells (SOFC) is presented. The governing equations are discretized and solved with computational fluid dynamics (CFD) techniques including unstructured meshes, non-linear solution methods, numerical derivatives with complex variables, and sensitivity analysis with adjoint methods. Following the validation of the fuel cell model in 2-D and 3-D, the results of the sensitivity analysis are presented. The sensitivity derivative for a cost function with respect to a design variable is found with three increasingly sophisticated techniques: finite difference, direct differentiation, and adjoint. A design cycle is performed using a simple optimization method to improve the value of the implemented cost function. The results from this program could improve fuel cell performance and lessen the world's dependence on fossil fuels.
NASA Technical Reports Server (NTRS)
Gaston, S.; Wertheim, M.; Orourke, J. A.
1973-01-01
Summary, consolidation and analysis of specifications, manufacturing process and test controls, and performance results for OAO-2 and OAO-3 lot 20 Amp-Hr sealed nickel cadmium cells and batteries are reported. Correlation of improvements in control requirements with performance is a key feature. Updates for a cell/battery computer model to improve performance prediction capability are included. Applicability of regression analysis computer techniques to relate process controls to performance is checked.
Figure Analysis: A Teaching Technique to Promote Visual Literacy and Active Learning
ERIC Educational Resources Information Center
Wiles, Amy M.
2016-01-01
Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based…
Large space antennas: A systems analysis case history
NASA Technical Reports Server (NTRS)
Keafer, Lloyd S. (Compiler); Lovelace, U. M. (Compiler)
1987-01-01
The value of systems analysis and engineering is aptly demonstrated by the work on Large Space Antennas (LSA) by the NASA Langley Spacecraft Analysis Branch. This work was accomplished over the last half-decade by augmenting traditional system engineering, analysis, and design techniques with computer-aided engineering (CAE) techniques using the Langley-developed Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. This report chronicles the research highlights and special systems analyses that focused the LSA work on deployable truss antennas. It notes developmental trends toward greater use of CAE techniques in their design and analysis. A look to the future envisions the application of improved systems analysis capabilities to advanced space systems such as an advanced space station or to lunar and Martian missions and human habitats.
Review and classification of variability analysis techniques with clinical applications.
Bravi, Andrea; Longtin, André; Seely, Andrew J E
2011-10-10
Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and to promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357
Balbekova, Anna; Lohninger, Hans; van Tilborg, Geralda A F; Dijkhuizen, Rick M; Bonta, Maximilian; Limbeck, Andreas; Lendl, Bernhard; Al-Saad, Khalid A; Ali, Mohamed; Celikic, Minja; Ofner, Johannes
2018-02-01
Microspectroscopic techniques are widely used to complement histological studies. Due to recent developments in the field of chemical imaging, combined chemical analysis has become attractive. This technique facilitates a deepened analysis compared to single techniques or side-by-side analysis. In this study, rat brains harvested one week after induction of photothrombotic stroke were investigated. Adjacent thin cuts from rats' brains were imaged using Fourier transform infrared (FT-IR) microspectroscopy and laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). The LA-ICP-MS data were normalized using an internal standard (a thin gold layer). The acquired hyperspectral data cubes were fused and subjected to multivariate analysis. Brain regions affected by stroke as well as unaffected gray and white matter were identified and classified using a model based on either partial least squares discriminant analysis (PLS-DA) or random decision forest (RDF) algorithms. The RDF algorithm demonstrated the best results for classification. Improved classification was observed in the case of fused data in comparison to individual data sets (either FT-IR or LA-ICP-MS). Variable importance analysis demonstrated that both molecular and elemental content contribute to the improved RDF classification. Univariate spectral analysis identified biochemical properties of the assigned tissue types. Classification of multisensor hyperspectral data sets using an RDF algorithm allows access to a novel and in-depth understanding of biochemical processes and solid chemical allocation of different brain regions.
NASA Astrophysics Data System (ADS)
Saetchnikov, Vladimir A.; Tcherniavskaia, Elina A.; Saetchnikov, Anton V.; Schweiger, Gustav; Ostendorf, Andreas
2014-05-01
Experimental data are presented on the detection and identification of a variety of biochemical agents, such as proteins, microelements, and antibiotics of different generations, in both single- and multi-component solutions over a wide range of concentrations, analyzed from the light-scattering parameters of a sensor based on whispering-gallery-mode optical resonance. Multiplexing over parameters and components was realized using a developed fluidic sensor cell, with dielectric microspheres fixed in an adhesive layer, together with data processing. Biochemical components were identified using the network analysis techniques developed by the authors. The approach is shown to be applicable to both single-agent and multi-component biochemical analysis. A novel technique based on optical resonance in microring structures, plasmon resonance, and identification tools has been developed. To improve the sensitivity of the microring structures, the microspheres fixed by adhesive were pretreated with a gold nanoparticle solution. Another variant used thin gold films deposited on the substrate below the adhesive. Both biomolecule and nanoparticle injections caused considerable changes in the optical resonance spectra. Plasmonic gold layers of optimized thickness also improved the parameters of the optical resonance spectra. The advantages of plasmon-enhanced optical microcavity resonance, combined with multiparameter identification tools, are thus used to develop a new platform for an ultrasensitive label-free biomedical sensor.
The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre
NASA Astrophysics Data System (ADS)
Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.
2005-09-01
To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technologies and to deliver high-yielding products with the best cycle times and at competitive pricing. As technology complexity increases, the need for physical characterization support also increases; however, many of the existing techniques are no longer adequate to effectively support 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both technical and economical in nature. TEM microscopy is needed for high-quality, high-volume analytical support, but several physical and practical hurdles have to be overcome. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other, more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in fields such as SIMS and ToF-SIMS. Techniques that previously were used only sporadically, like EBSD and XRD, have become a `must' to properly support backend process development. On the bright side, thanks to major technical advances, techniques that previously were practiced only at laboratory level can now be used effectively for at-line fab metrology: Voltage Contrast based defectivity control, XPS based gate dielectric metrology, and XRD based control of copper metallization processes are practical examples. In this paper the capabilities and shortcomings of several techniques and corresponding equipment are presented, with practical illustrations of use in our Crolles facilities.
Evans, Luke; Manley, Kate
2016-06-01
Single-incision laparoscopic surgery represents an evolution of minimally invasive techniques, but has been a controversial development. A cosmetic advantage is stated by many authors, but has not been found to be universally present or even of considerable importance by patients. This systematic review and meta-analysis demonstrates that there is a cosmetic advantage of the technique regardless of the operation type. The treatment effect in terms of cosmetic improvement is of the order of 0.63.
A method for nonlinear exponential regression analysis
NASA Technical Reports Server (NTRS)
Junkin, B. G.
1971-01-01
A computer-oriented technique is presented for performing a nonlinear exponential regression analysis on decay-type experimental data. The technique involves the least-squares procedure, wherein the nonlinear problem is linearized by expansion in a Taylor series. A linear curve-fitting procedure for determining initial nominal estimates of the unknown exponential model parameters is included as an integral part of the technique. A correction matrix is derived and applied to the nominal estimates to produce an improved set of model parameters. The solution cycle is repeated until a predetermined criterion is satisfied.
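The procedure can be sketched for the model y = a·exp(b·t): a linear fit of ln(y) supplies the nominal estimates, and repeated Gauss-Newton corrections (the least-squares solution of the Taylor-linearized problem) play the role of the correction matrix. The synthetic data, noise level, and convergence tolerance below are illustrative assumptions:

```python
import numpy as np

# Synthetic decay data for the model y = a * exp(b * t), b < 0.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 40)
a_true, b_true = 3.0, -0.8
y = a_true * np.exp(b_true * t) + 0.01 * rng.standard_normal(t.size)

# Step 1: nominal estimates from a linear fit of ln(y) vs t
# (only points with y > 0 are usable for the log transform).
mask = y > 0
b0, log_a0 = np.polyfit(t[mask], np.log(y[mask]), 1)
a, b = np.exp(log_a0), b0

# Step 2: Gauss-Newton -- linearize by a first-order Taylor expansion
# and iterate least-squares corrections until they are negligible.
for _ in range(50):
    f = a * np.exp(b * t)
    residual = y - f
    J = np.column_stack([np.exp(b * t), a * t * np.exp(b * t)])  # df/da, df/db
    delta, *_ = np.linalg.lstsq(J, residual, rcond=None)
    a, b = a + delta[0], b + delta[1]
    if np.max(np.abs(delta)) < 1e-10:   # predetermined convergence criterion
        break

print(f"a = {a:.3f}, b = {b:.3f}")   # close to the true 3.0 and -0.8
```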
Coplen, T.B.; Wildman, J.D.; Chen, J.
1991-01-01
Improved precision in the H2-H2O equilibration method for δD analysis has been achieved in an automated system. Reduction of the 1-σ standard deviation of a single mass-spectrometer analysis to 1.3‰ is achieved by (1) bonding catalyst to glass rods and assigning use to specific equilibration chambers to monitor catalyst performance, (2) improving the apparatus design, and (3) reducing the H3+ contribution of the mass-spectrometer ion source. For replicate analyses of a water sample, the standard deviation improved to 0.8‰. H2S-bearing samples and samples as small as 0.1 mL can be analyzed routinely with this method.
Search automation of the generalized method of device operational characteristics improvement
NASA Astrophysics Data System (ADS)
Petrova, I. Yu; Puchkova, A. A.; Zaripova, V. M.
2017-01-01
The article presents brief results of an analysis of existing methods for finding the closest patents, which can be applied to determine generalized methods of improving device operational characteristics. The most widespread clustering algorithms, and metrics for determining the degree of proximity between two documents, are reviewed. The article proposes a technique for determining generalized methods; it has two implementation variants and consists of seven steps. This technique has been implemented in the “Patents search” subsystem of the “Intellect” system. The article also gives an example of the use of the proposed technique.
Group decision-making techniques for natural resource management applications
Coughlan, Beth A.K.; Armour, Carl L.
1992-01-01
This report is an introduction to decision analysis and problem-solving techniques for professionals in natural resource management. Although these managers are often called upon to make complex decisions, their training in the natural sciences seldom provides exposure to the decision-making tools developed in management science. Our purpose is to begin to fill this gap. We present a general analysis of the pitfalls of group problem solving and suggestions for improved interactions, followed by the specific techniques. Selected techniques are illustrated. The material is easy to understand and apply without previous training or excessive study, and is applicable to natural resource management issues.
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system is analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique in which fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for system modeling, the lambda-tau method is utilized to formulate mathematical expressions for the failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of the robotic system are computed and the results are compared with those of the existing technique. The components of the robotic system follow exponential distributions, i.e., constant failure rates. Sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) is addressed by varying the other reliability parameters. Based on the analysis, some influential suggestions are given to improve system performance.
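The data-fuzzification step can be illustrated with triangular fuzzy numbers represented as (left, mode, right) triples. This is an interval-arithmetic sketch for a simple series system whose failure rates add; the component values and spreads are invented for illustration, and the full lambda-tau/genetic-algorithm machinery of the paper is not reproduced:

```python
# Triangular fuzzy number (TFN) as (left, mode, right): membership
# rises linearly from left to mode and falls from mode to right.

def tfn_add(x, y):
    """Sum of two triangular fuzzy numbers (exact for TFNs)."""
    return tuple(a + b for a, b in zip(x, y))

def alpha_cut(x, alpha):
    """Interval of values with membership >= alpha, 0 <= alpha <= 1."""
    l, m, r = x
    return (l + alpha * (m - l), r - alpha * (r - m))

# Component failure rates (per hour) with +/-15% spreads, as a system
# expert might supply them (illustrative numbers, not from the paper).
components = [(0.85e-4, 1.0e-4, 1.15e-4),
              (1.70e-4, 2.0e-4, 2.30e-4),
              (0.425e-4, 0.5e-4, 0.575e-4)]

# Series system: failure rates add, so the system rate is a TFN too.
lam_sys = components[0]
for c in components[1:]:
    lam_sys = tfn_add(lam_sys, c)

# Fuzzy MTBF at a given alpha-cut: invert the interval end points.
lo, hi = alpha_cut(lam_sys, 0.5)
mtbf_interval = (1.0 / hi, 1.0 / lo)
print("system failure rate (TFN):", lam_sys)
print("MTBF interval at alpha=0.5:", mtbf_interval)
```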
A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.
Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian
2018-01-19
This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during sampling has a great impact on measurement precision in conventional headspace analysis (i.e., the single sealing technique). The results (using an ethanol solution as the model sample) show that the present technique is effective in minimizing this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Brooks, Roger C.
This report describes a program designed to improve science achievement among students in grades 4-6 in a New Hampshire school. The areas of improvement included physical, earth, and life sciences. Analysis of the problem indicated a need for improved teaching techniques and for additional materials related to the instructional strategies. The…
NASA Technical Reports Server (NTRS)
Klein, M.; Reynolds, J.; Ricks, E.
1989-01-01
Load and stress recovery from transient dynamic studies is improved by using an extended acceleration vector in the modal acceleration technique applied to structural analysis. An extension of the normal LTM (load transformation matrix) stress recovery to automatically compute margins of safety is presented, with an application to the Hubble Space Telescope.
Ping Kong; Patricia A. Richardson; Chuanxue Hong; Thomas L. Kubisiak
2006-01-01
At the first Sudden Oak Death Science Symposium, we reported on the use of a single strand conformation polymorphism (SSCP) analysis for rapid identification of Phytophthora ramorum in culture. We have since assessed and improved the fingerprinting technique for detecting this pathogen directly from plant tissues. The improved SSCP protocol uses a...
Automation of energy demand forecasting
NASA Astrophysics Data System (ADS)
Siddique, Sanzad
Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
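The search-based idea, stripped to its core, is a loop over a candidate model space scored on held-out data. The candidate space below (polynomial trend order crossed with an optional log transform) and the synthetic demand series are assumptions for illustration, not the thesis's econometric models:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monthly demand: exponential growth plus multiplicative
# noise, so a log transform with a linear trend should score well.
t = np.arange(120, dtype=float)
demand = 100.0 * np.exp(0.01 * t) * (1.0 + 0.02 * rng.standard_normal(t.size))

train, valid = slice(0, 96), slice(96, 120)

def fit_score(order, use_log):
    """Fit a polynomial trend (optionally in log space) on the training
    window; return the validation mean squared error."""
    y = np.log(demand) if use_log else demand
    coeffs = np.polyfit(t[train], y[train], order)
    pred = np.polyval(coeffs, t[valid])
    if use_log:
        pred = np.exp(pred)
    return np.mean((demand[valid] - pred) ** 2)

# Candidate model space: trend order x transform.  The "search" is a
# plain argmin over validation scores -- no manual intervention.
candidates = [(order, use_log) for order in (1, 2, 3)
              for use_log in (False, True)]
best = min(candidates, key=lambda c: fit_score(*c))
print("selected model (order, log-transform):", best)
```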
Rochat, Lucien; Manolov, Rumen; Billieux, Joël
2018-06-01
Metacognitive therapy and one of its treatment components, the attention training technique, are increasingly being delivered to improve mental health. We examined the efficacy of metacognitive therapy and/or attention training technique on mental health outcomes from single-case studies. A total of 14 studies (53 patients) were included. We used the d-statistic for multiple baseline data and the percentage change index to compute the effect sizes. Metacognitive therapy has a large effect on depression, anxiety, other psychopathological symptoms, and all outcomes together. Effect sizes were significantly moderated by the number of sessions, the severity and duration of symptoms, and patient gender, but not by study quality or attention training technique when used as a stand-alone treatment. At the follow-up, 77.36% of the individuals were considered recovered or had maintained improvement. Metacognitive therapy and attention training technique strongly contribute to improving mental health outcomes. This study effectively informs evidence-based practice in the clinical milieu. © 2017 Wiley Periodicals, Inc.
Improving the analysis of slug tests
McElwee, C.D.
2002-01-01
This paper examines several techniques that have the potential to improve the quality of slug test analysis. These techniques are applicable in the range from low hydraulic conductivities with overdamped responses to high hydraulic conductivities with nonlinear oscillatory responses. Four techniques for improving slug test analysis will be discussed: use of an extended-capability nonlinear model, sensitivity analysis, correction for acceleration and velocity effects, and use of multiple slug tests. The four-parameter nonlinear slug test model used in this work is shown to allow accurate analysis of slug tests with widely differing character. The parameter β represents a correction to the water column length caused primarily by radius variations in the wellbore and is most useful in matching the oscillation frequency and amplitude. The water column velocity at slug initiation (V0) is an additional model parameter, which would ideally be zero but may not be due to the initiation mechanism. The remaining two model parameters are A (parameter for nonlinear effects) and K (hydraulic conductivity). Sensitivity analysis shows that in general β and V0 have the lowest sensitivity and K usually has the highest. However, for very high K values the sensitivity to A may surpass the sensitivity to K. Oscillatory slug tests involve higher accelerations and velocities of the water column; thus, the pressure transducer responses are affected by these factors and the model response must be corrected to allow maximum accuracy for the analysis. The performance of multiple slug tests will allow some statistical measure of the experimental accuracy and of the reliability of the resulting aquifer parameters. © 2002 Elsevier Science B.V. All rights reserved.
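The sensitivity analysis can be illustrated generically: the normalized sensitivity p·∂h/∂p of the modeled response to each parameter, computed by central differences, shows which parameters the data can resolve. The damped-cosine response below is a stand-in for an oscillatory slug test, not McElwee's four-parameter model, and the parameter values are invented:

```python
import numpy as np

def response(t, params):
    """Stand-in oscillatory response (illustrative only): an
    exponentially damped cosine whose decay rate and frequency play
    the roles real model parameters would play."""
    H0, gamma, omega = params
    return H0 * np.exp(-gamma * t) * np.cos(omega * t)

def normalized_sensitivity(t, params, i, rel_step=1e-6):
    """p_i * d(response)/d(p_i) by central differences, so sensitivities
    of differently scaled parameters are directly comparable."""
    p_hi, p_lo = list(params), list(params)
    p_hi[i] *= 1.0 + rel_step
    p_lo[i] *= 1.0 - rel_step
    return (response(t, p_hi) - response(t, p_lo)) / (2.0 * rel_step)

t = np.linspace(0.0, 20.0, 400)
params = [1.0, 0.15, 1.3]          # initial head, damping, angular frequency

# RMS sensitivity over the record, per parameter: parameters with
# larger values dominate what a fit to the data can resolve.
rms = [np.sqrt(np.mean(normalized_sensitivity(t, params, i) ** 2))
       for i in range(3)]
print("RMS normalized sensitivities [H0, gamma, omega]:", rms)
```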
Preliminary Evaluation of BIM-based Approaches for Schedule Delay Analysis
NASA Astrophysics Data System (ADS)
Chou, Hui-Yu; Yang, Jyh-Bin
2017-10-01
The problem of schedule delay commonly occurs in construction projects. The quality of delay analysis depends on the availability of schedule-related information and delay evidence; more information used in delay analysis usually produces more accurate and fairer analytical results. How to use innovative techniques to improve the quality of schedule delay analysis results has received much attention recently. As the Building Information Modeling (BIM) technique has developed quickly, the use of BIM and 4D simulation techniques has been proposed and implemented, with obvious benefits achieved especially in identifying and solving construction sequence problems in advance of construction. This study performs an intensive literature review to discuss the problems encountered in schedule delay analysis and the possibility of using BIM as a tool in developing a BIM-based approach for schedule delay analysis. This study believes that most of the identified problems can be dealt with by the BIM technique. The research results could serve as a foundation for developing new approaches for resolving schedule delay disputes.
Real-time emergency forecasting technique for situation management systems
NASA Astrophysics Data System (ADS)
Kopytov, V. V.; Kharechkin, P. V.; Naumenko, V. V.; Tretyak, R. S.; Tebueva, F. B.
2018-05-01
The article describes a real-time emergency forecasting technique that increases the accuracy and reliability of the forecasting results of any emergency computational model applied for decision making in situation management systems. The computational models are improved by an improved Brown's method that applies fractal dimension to forecast short time-series data received from sensors and control systems. The reliability of the emergency forecasting results is ensured by filtering invalid sensor data using correlation analysis methods.
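Classic Brown's double exponential smoothing, the starting point of the improved method, can be sketched in a few lines. The fractal-dimension refinement of the paper is not reproduced, and the sample series is invented:

```python
def brown_forecast(series, alpha, horizon=1):
    """Brown's double exponential smoothing: two smoothing passes give
    level and trend estimates, then a linear extrapolation forecast."""
    s1 = s2 = series[0]
    for x in series:
        s1 = alpha * x + (1 - alpha) * s1       # first smoothing
        s2 = alpha * s1 + (1 - alpha) * s2      # second smoothing
    level = 2 * s1 - s2
    trend = (alpha / (1 - alpha)) * (s1 - s2)
    return level + horizon * trend

# Short sensor-style series with a linear trend: the one-step-ahead
# forecast should track the trend.
data = [10.0, 12.1, 13.9, 16.2, 18.0, 20.1]
print(brown_forecast(data, alpha=0.5, horizon=1))
```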
Wang, Xinyu; Gao, Jing-Lin; Du, Chaohui; An, Jing; Li, MengJiao; Ma, Haiyan; Zhang, Lina; Jiang, Ye
2017-01-01
Interest in biosafety risks in clinical bioanalysis is growing, and a safe, simple, effective preparation method is urgently needed. To improve the biosafety of clinical analysis, we used the antiviral drugs adefovir and tenofovir as model drugs and developed a safe pretreatment method combining a sealing technique with a direct injection technique. The inter- and intraday precision (RSD%) of the method were <4%, and the extraction recoveries ranged from 99.4 to 100.7%. The results also showed that a standard solution could be used to prepare the calibration curve instead of spiked plasma, giving more accurate results. Compared with traditional methods, the novel method not only improved the biosecurity of the pretreatment significantly, but also achieved higher precision, favorable sensitivity, and satisfactory recovery. With these highly practical and desirable characteristics, the novel method may become a feasible platform in bioanalysis.
An unsupervised classification technique for multispectral remote sensing data.
NASA Technical Reports Server (NTRS)
Su, M. Y.; Cummings, R. E.
1973-01-01
Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.
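The two-part idea can be sketched as follows: a single sequential pass opens a new cluster whenever a sample lies too far from every existing cluster mean (a crude stand-in for the sequential variance analysis), and a generalized K-means pass then refines the result. The distance threshold and synthetic data are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "multispectral" samples: three well separated classes in 2-D.
centers = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0]])
X = np.vstack([c + 0.5 * rng.standard_normal((40, 2)) for c in centers])

# Part (a): sequential clustering -- scan the samples once, assign each
# to the nearest existing cluster mean, or open a new cluster if none
# is within the threshold.  Output: a set of initial cluster means.
threshold = 3.5
means, counts = [X[0].copy()], [1]
for x in X[1:]:
    d = [np.linalg.norm(x - m) for m in means]
    j = int(np.argmin(d))
    if d[j] < threshold:
        counts[j] += 1
        means[j] += (x - means[j]) / counts[j]   # running mean update
    else:
        means.append(x.copy())
        counts.append(1)
means = np.array(means)

# Part (b): generalized K-means -- iteratively reassign samples and
# recompute means, starting from the sequential pass's clusters.
for _ in range(20):
    labels = np.argmin(np.linalg.norm(X[:, None, :] - means[None], axis=2), axis=1)
    means = np.array([X[labels == k].mean(axis=0) for k in range(len(means))])

print("clusters found:", len(means))
```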
DOT National Transportation Integrated Search
2014-04-01
Risk management techniques are used to analyze fluctuations in uncontrollable variables and keep those fluctuations from impeding : the core function of a system or business. Examples of this are making sure that volatility in copper and aluminum pri...
On-Error Training (Book Excerpt).
ERIC Educational Resources Information Center
Fukuda, Ryuji
1985-01-01
This excerpt from "Managerial Engineering: Techniques for Improving Quality and Productivity in the Workplace" describes the development, objectives, and use of On-Error Training (OET), a method which trains workers to learn from their errors. Also described is New Joharry's Window, a performance-error data analysis technique used in…
Instructional Materials for Improved Job Performance.
ERIC Educational Resources Information Center
Foley, John P., Jr.
1978-01-01
Instructional materials developed in military research to improve performance of electromechanical maintenance tasks are described, with implications for teacher education. The materials require task analysis, job task relevance, and task-oriented training. Although many industries have implemented these techniques, teacher training institutions…
Helping agencies improve their planning analysis techniques.
DOT National Transportation Integrated Search
2011-11-18
This report summarizes the results of a peer review of the AZTDM. The peer review was : supported by the Travel Model Improvement Program (TMIP), which is sponsored by FHWA. : The peer review of a travel model can serve multiple purposes, including i...
A case study of the sensitivity of forecast skill to data and data analysis techniques
NASA Technical Reports Server (NTRS)
Baker, W. E.; Atlas, R.; Halem, M.; Susskind, J.
1983-01-01
A series of experiments have been conducted to examine the sensitivity of forecast skill to various data and data analysis techniques for the 0000 GMT case of January 21, 1979. These include the individual components of the FGGE observing system, the temperatures obtained with different satellite retrieval methods, and the method of vertical interpolation between the mandatory pressure analysis levels and the model sigma levels. It is found that NESS TIROS-N infrared retrievals seriously degrade a rawinsonde-only analysis over land, resulting in a poorer forecast over North America. Less degradation in the 72-hr forecast skill at sea level and some improvement at 500 mb is noted, relative to the control with TIROS-N retrievals produced with a physical inversion method which utilizes a 6-hr forecast first guess. NESS VTPR oceanic retrievals lead to an improved forecast over North America when added to the control.
Hamrin Senorski, Eric; Sundemo, David; Murawski, Christopher D; Alentorn-Geli, Eduard; Musahl, Volker; Fu, Freddie; Desai, Neel; Stålman, Anders; Samuelsson, Kristian
2017-12-01
The purpose of this study was to investigate how different techniques of single-bundle anterior cruciate ligament (ACL) reconstruction affect subjective knee function via the Knee injury and Osteoarthritis Outcome Score (KOOS) evaluation 2 years after surgery. It was hypothesized that the surgical techniques of single-bundle ACL reconstruction would result in equivalent results with respect to subjective knee function 2 years after surgery. This cohort study was based on data from the Swedish National Knee Ligament Register during the 10-year period of 1 January 2005 through 31 December 2014. Patients who underwent primary single-bundle ACL reconstruction with hamstring tendon autograft were included. Details on surgical technique were collected using a web-based questionnaire comprising essential AARSC items, including utilization of accessory medial portal drilling, anatomic tunnel placement, and visualization of insertion sites and landmarks. A repeated measures ANOVA and an additional linear mixed model analysis were used to investigate the effect of surgical technique on the KOOS 4 from the pre-operative period to 2-year follow-up. A total of 13,636 patients who had undergone single-bundle ACL reconstruction comprised the study group for this analysis. A repeated measures ANOVA determined that mean subjective knee function differed between the pre-operative time period and 2-year follow-up (p < 0.001). No differences were found with respect to the interaction between KOOS 4 and surgical technique or gender. Additionally, the linear mixed model adjusted for age at reconstruction, gender, and concomitant injuries showed no difference between surgical techniques in KOOS 4 improvement from baseline to 2-year follow-up. However, KOOS 4 improved significantly in patients for all surgical techniques of single-bundle ACL reconstruction (p < 0.001); the largest improvement was seen between the pre-operative time period and 1-year follow-up.
Surgical techniques of primary single-bundle ACL reconstruction did not demonstrate differences in the improvement in baseline subjective knee function as measured with the KOOS 4 during the first 2 years after surgery. However, subjective knee function improved from pre-operative baseline to 2-year follow-up independently of surgical technique.
Usage of information safety requirements in improving tube bending process
NASA Astrophysics Data System (ADS)
Livshitz, I. I.; Kunakov, E.; Lontsikh, P. A.
2018-05-01
This article addresses the improvement of technological process analysis through the implementation of information security requirements. The aim of this research is to analyze the increase in competitiveness of aircraft industry enterprises due to information technology implementation, using the tube bending technological process as an example. The article analyzes the kinds of tube bending and the current technique. In addition, a potential risk analysis of the tube bending technological process is carried out in terms of information security.
On the next generation of reliability analysis tools
NASA Technical Reports Server (NTRS)
Babcock, Philip S., IV; Leong, Frank; Gai, Eli
1987-01-01
The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
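The Markov reliability models described above can be illustrated with a minimal sketch. The two-state model, failure rate, repair rate, and time stepping below are illustrative assumptions for the example, not drawn from the report:

```python
import numpy as np

# Hypothetical two-state Markov availability model (state 0 = up, state 1 = down).
lam, mu = 1e-3, 1e-1   # assumed failure and repair rates (per hour)

# Generator matrix Q of the continuous-time Markov chain (rows sum to zero).
Q = np.array([[-lam, lam],
              [mu, -mu]])

def transient(p0, t, steps=20000):
    """State probabilities at time t via simple first-order (Euler) stepping."""
    dt = t / steps
    P = np.eye(2) + Q * dt          # one-step transition matrix, rows sum to 1
    p = np.array(p0, dtype=float)
    for _ in range(steps):
        p = p @ P
    return p

p = transient([1.0, 0.0], t=1000.0)          # long enough to reach steady state
steady = np.array([mu, lam]) / (lam + mu)    # analytic steady-state distribution
```

For this long horizon the transient solution converges to the analytic steady-state availability mu/(lam + mu); an automatically constructed model of a real system would have many more states, but the generator-matrix construction is the same.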
The use of interactive graphic displays for interpretation of surface design parameters
NASA Technical Reports Server (NTRS)
Talcott, N. A., Jr.
1981-01-01
An interactive computer graphics technique known as the Graphic Data Display method has been developed to provide a convenient means for rapidly interpreting large amounts of surface design data. The display technique should prove valuable in such disciplines as aerodynamic analysis, structural analysis, and experimental data analysis. To demonstrate the system's features, an example is presented of the Graphic Data Display method used as an interpretive tool for radiation equilibrium temperature distributions over the surface of an aerodynamic vehicle. Color graphic displays were also examined as a logical extension of the technique to improve its clarity and to allow the presentation of greater detail in a single display.
Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén
2006-04-21
A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.
Six Sigma Approach to Improve Stripping Quality of Automotive Electronics Component – a case study
NASA Astrophysics Data System (ADS)
Razali, Noraini Mohd; Murni Mohamad Kadri, Siti; Con Ee, Toh
2018-03-01
A lack of problem-solving techniques and poor cooperation between support groups are two obstacles commonly faced on an actual production line. Inadequate detailed analysis and an inappropriate problem-solving technique may cause repeat issues that affect the organization's performance. This study utilizes a well-structured Six Sigma DMAIC approach in combination with other problem-solving tools to solve a product quality problem in the manufacturing of an automotive electronics component. The study concentrates on the stripping process, a critical process step with the highest rejection rate, which contributes to scrap and rework performance. A detailed analysis is conducted in the Analyze phase to identify the actual root cause of the problem. Several improvement activities are then implemented, and the results show that the rejection rate due to stripping defects decreased tremendously while the process capability index improved from 0.75 to 1.67. These results prove that the Six Sigma approach used to tackle the quality problem is substantially effective.
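The process capability index cited in the stripping-process study can be sketched directly from its definition; the specification limits and sample measurements below are invented for the example, not taken from the study:

```python
import statistics

# Cpk = min(USL - mean, mean - LSL) / (3 * sigma): how many "3-sigma widths"
# of the process fit inside the nearer specification limit.
def cpk(samples, lsl, usl):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)   # sample standard deviation
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Illustrative measurements against hypothetical spec limits 9.0-11.0.
value = cpk([9.8, 10.0, 10.2, 10.0, 9.9, 10.1], lsl=9.0, usl=11.0)
```

An improvement from 0.75 to 1.67, as reported, corresponds to the process spread shrinking (or centering) so that the nearest spec limit moves from under 3 sigma away to over 5 sigma away.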
Hirokawa, Yuusuke; Isoda, Hiroyoshi; Maetani, Yoji S; Arizono, Shigeki; Shimada, Kotaro; Okada, Tomohisa; Shibata, Toshiya; Togashi, Kaori
2009-05-01
To evaluate the effectiveness of the periodically rotated overlapping parallel lines with enhanced reconstruction (PROPELLER) technique for superparamagnetic iron oxide (SPIO)-enhanced T2-weighted magnetic resonance (MR) imaging with respiratory compensation with the prospective acquisition correction (PACE) technique in the detection of hepatic lesions. The institutional human research committee approved this prospective study, and all patients provided written informed consent. Eighty-one patients (mean age, 58 years) underwent hepatic 1.5-T MR imaging. Fat-saturated T2-weighted turbo spin-echo images were acquired with the PACE technique and with and without the PROPELLER method after administration of SPIO. Images were qualitatively evaluated for image artifacts, depiction of liver edge and intrahepatic vessels, overall image quality, and presence of lesions. Three radiologists independently assessed these characteristics with a five-point confidence scale. Diagnostic performance was assessed with receiver operating characteristic (ROC) curve analysis. Quantitative analysis was conducted by measuring the liver signal-to-noise ratio (SNR) and the lesion-to-liver contrast-to-noise ratio (CNR). The Wilcoxon signed rank test and two-tailed Student t test were used, and P < .05 indicated a significant difference. MR imaging with the PROPELLER and PACE techniques resulted in significantly improved image quality, higher sensitivity, and greater area under the ROC curve for hepatic lesion detection than did MR imaging with the PACE technique alone (P < .001). The mean liver SNR and the lesion-to-liver CNR were higher with the PROPELLER technique than without it (P < .001). T2-weighted MR imaging with the PROPELLER and PACE technique and SPIO enhancement is a promising method with which to improve the detection of hepatic lesions. (c) RSNA, 2009.
ERIC Educational Resources Information Center
Bagaiolo, Leila F.; Mari, Jair de J.; Bordini, Daniela; Ribeiro, Tatiane C.; Martone, Maria Carolina C.; Caetano, Sheila C.; Brunoni, Decio; Brentani, Helena; Paula, Cristiane S.
2017-01-01
Video modeling using applied behavior analysis techniques is one of the most promising and cost-effective ways to improve social skills for parents of children with autism spectrum disorder. The main objectives were: (1) to elaborate/describe videos to improve eye contact and joint attention, and to decrease disruptive behaviors of autism spectrum…
Potvin, Christopher M; Zhou, Hongde
2011-11-01
The objective of this study was to demonstrate the effects of complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP) including proteins, humics, carbohydrates, and polysaccharides in activated sludge samples. Emphasis was placed on comparison of the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA provided a great improvement in compensating for SMP recovery and thus improving measurement accuracy by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed due to extraction techniques, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1d. Copyright © 2011 Elsevier Ltd. All rights reserved.
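The standard addition (SA) technique compared above can be sketched as a linear fit whose x-intercept recovers the analyte concentration already present in the sample. The spike levels and instrument responses below are synthetic:

```python
import numpy as np

# Standard addition: spike the sample with known analyte amounts, fit
# response vs. spike, and extrapolate back to zero response. Because the
# calibration is built inside the sample matrix itself, matrix effects on
# the slope cancel out of the concentration estimate.
spike = np.array([0.0, 1.0, 2.0, 3.0])      # added analyte (e.g. mg/L)
resp = np.array([0.50, 0.75, 1.00, 1.25])   # synthetic measured responses

slope, intercept = np.polyfit(spike, resp, 1)
c0 = intercept / slope                      # original concentration in sample
```

Here the synthetic data were generated with an original concentration of 2.0 mg/L, which the x-intercept extrapolation recovers exactly.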
Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime
2010-01-01
This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including the background analysis using sensors, limitations and advances. The performance and reliability of BLDC motor drivers have been improved because the conventional control and sensing techniques have been improved through sensorless technology. Then, in this paper sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which includes Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. Also, the most relevant techniques based on estimation and models are briefly analysed, such as Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive observers (Full-order and Pseudoreduced-order) and Artificial Neural Networks.
NASA Technical Reports Server (NTRS)
Moses, J. Daniel
1989-01-01
Three improvements in photographic x-ray imaging techniques for solar astronomy are presented. The testing and calibration of a new film processor was conducted; the resulting product will allow photometric development of sounding rocket flight film immediately upon recovery at the missile range. Two fine grained photographic films were calibrated and flight tested to provide alternative detector choices when the need for high resolution is greater than the need for high sensitivity. An analysis technique used to obtain the characteristic curve directly from photographs of UV solar spectra were applied to the analysis of soft x-ray photographic images. The resulting procedure provides a more complete and straightforward determination of the parameters describing the x-ray characteristic curve than previous techniques. These improvements fall into the category of refinements instead of revolutions, indicating the fundamental suitability of the photographic process for x-ray imaging in solar astronomy.
Synchronous Stroboscopic Electronic Speckle Pattern Interferometry
NASA Astrophysics Data System (ADS)
Soares, Oliverio D. D.
1986-10-01
Electronic Speckle Pattern Interferometry (E.S.P.I.), often called Electronic Holography, is a practical and powerful technique in non-destructive testing. The practical capabilities of the technique have been improved by fringe betterment and by control of the analysis in the time domain, in particular the scanning of the vibration cycle, through the introduction of: synchronized amplitude- and phase-modulated pulse illumination, microcomputer control, fibre optics design, and moire evaluation techniques.
Covariate selection with iterative principal component analysis for predicting physical
USDA-ARS?s Scientific Manuscript database
Local and regional soil data can be improved by coupling new digital soil mapping techniques with high resolution remote sensing products to quantify both spatial and absolute variation of soil properties. The objective of this research was to advance data-driven digital soil mapping techniques for ...
Health Lifestyles: Audience Segmentation Analysis for Public Health Interventions.
ERIC Educational Resources Information Center
Slater, Michael D.; Flora, June A.
This paper is concerned with the application of market research techniques to segment large populations into homogeneous units in order to improve the reach, utilization, and effectiveness of health programs. The paper identifies seven distinctive patterns of health attitudes, social influences, and behaviors using cluster analytic techniques in a…
NASA Technical Reports Server (NTRS)
Niell, Arthur; Cappallo, Roger; Corey, Brian; Titus, Mike
2013-01-01
Analysis activities at Haystack Observatory are directed towards improving the accuracy of geodetic measurements, whether these are from VLBI, GNSS, SLR, or any other technique. Those analysis activities that are related to technology development are reported elsewhere in this volume. In this report, a preliminary analysis of the first geodetic sessions with the new broadband geodetic VLBI system is reported.
POD/MAC-Based Modal Basis Selection for a Reduced Order Nonlinear Response Analysis
NASA Technical Reports Server (NTRS)
Rizzi, Stephen A.; Przekop, Adam
2007-01-01
A feasibility study was conducted to explore the applicability of a POD/MAC basis selection technique to a nonlinear structural response analysis. For the case studied the application of the POD/MAC technique resulted in a substantial improvement of the reduced order simulation when compared to a classic approach utilizing only low frequency modes present in the excitation bandwidth. Further studies are aimed to expand application of the presented technique to more complex structures including non-planar and two-dimensional configurations. For non-planar structures the separation of different displacement components may not be necessary or desirable.
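The POD side of the basis selection discussed above can be sketched as a singular value decomposition of a response snapshot matrix; the snapshot data below are random stand-ins, and the 99% energy cutoff is an assumed, not reported, threshold:

```python
import numpy as np

# Hypothetical snapshot matrix: rows are degrees of freedom, columns are
# displacement snapshots sampled in time from a nonlinear response.
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((50, 200))

# POD modes are the left singular vectors; each squared singular value
# measures that mode's share of the response "energy".
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
energy = s**2 / np.sum(s**2)

# Retain the leading modes capturing (up to) 99% of the energy as the
# reduced-order basis; MAC against FE mode shapes would then pair them.
basis = U[:, energy.cumsum() <= 0.99]
```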
NASA Astrophysics Data System (ADS)
Liu, Qiang; Chattopadhyay, Aditi
2000-06-01
Aeromechanical stability plays a critical role in helicopter design and lead-lag damping is crucial to this design. In this paper, the use of segmented constrained damping layer (SCL) treatment and composite tailoring is investigated for improved rotor aeromechanical stability using formal optimization technique. The principal load-carrying member in the rotor blade is represented by a composite box beam, of arbitrary thickness, with surface bonded SCLs. A comprehensive theory is used to model the smart box beam. A ground resonance analysis model and an air resonance analysis model are implemented in the rotor blade built around the composite box beam with SCLs. The Pitt-Peters dynamic inflow model is used in air resonance analysis under hover condition. A hybrid optimization technique is used to investigate the optimum design of the composite box beam with surface bonded SCLs for improved damping characteristics. Parameters such as stacking sequence of the composite laminates and placement of SCLs are used as design variables. Detailed numerical studies are presented for aeromechanical stability analysis. It is shown that optimum blade design yields significant increase in rotor lead-lag regressive modal damping compared to the initial system.
NASA Technical Reports Server (NTRS)
1985-01-01
Topics covered include: data systems and quality; analysis and assimilation techniques; impacts on forecasts; tropical forecasts; analysis intercomparisons; improvements in predictability; and heat sources and sinks.
Zakaria, Ammar; Shakaff, Ali Yeon Md; Masnan, Maz Jamilah; Saad, Fathinul Syahir Ahmad; Adom, Abdul Hamid; Ahmad, Mohd Noor; Jaafar, Mahmad Nor; Abdullah, Abu Hassan; Kamarudin, Latifah Munirah
2012-01-01
In recent years, there have been a number of reported studies on the use of non-destructive techniques to evaluate and determine mango maturity and ripeness levels. However, most of these reported works were conducted using single-modality sensing systems, either using an electronic nose, acoustics or other non-destructive measurements. This paper presents the work on the classification of mangoes (Mangifera indica cv. Harumanis) maturity and ripeness levels using fusion of the data of an electronic nose and an acoustic sensor. Three groups of samples each from two different harvesting times (week 7 and week 8) were evaluated by the e-nose and then followed by the acoustic sensor. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) were able to discriminate the mango harvested at week 7 and week 8 based solely on the aroma and volatile gases released from the mangoes. However, when six different groups of different maturity and ripeness levels were combined in one classification analysis, both PCA and LDA were unable to discriminate the age difference of the Harumanis mangoes. Instead of six different groups, only four were observed using the LDA, while PCA showed only two distinct groups. By applying a low level data fusion technique on the e-nose and acoustic data, the classification for maturity and ripeness levels using LDA was improved. However, no significant improvement was observed using PCA with data fusion technique. Further work using a hybrid LDA-Competitive Learning Neural Network was performed to validate the fusion technique and classify the samples. It was found that the LDA-CLNN was also improved significantly when data fusion was applied. PMID:22778629
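The low-level data fusion step described above can be sketched as standardizing each sensor block and concatenating the features before a multivariate projection. The arrays below are synthetic stand-ins for the e-nose and acoustic measurements, and PCA stands in for the full PCA/LDA analysis:

```python
import numpy as np

# Synthetic stand-ins: 30 mango samples, 8 e-nose channels, 4 acoustic features.
rng = np.random.default_rng(1)
enose = rng.standard_normal((30, 8))
acoustic = rng.standard_normal((30, 4))

def zscore(X):
    """Standardize each feature so neither sensor modality dominates."""
    return (X - X.mean(axis=0)) / X.std(axis=0)

# Low-level (feature-level) fusion: concatenate the standardized blocks.
fused = np.hstack([zscore(enose), zscore(acoustic)])

# PCA via SVD of the mean-centered fused matrix; first two PC scores.
U, s, Vt = np.linalg.svd(fused - fused.mean(axis=0), full_matrices=False)
scores = U[:, :2] * s[:2]
```

The classifier (LDA, or the hybrid LDA-CLNN in the paper) would then be trained on the fused feature matrix rather than on either modality alone.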
Cognitive task analysis of network analysts and managers for network situational awareness
NASA Astrophysics Data System (ADS)
Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn
2010-01-01
The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process to enable decision makers to choose better actions. To this end, we put extensive effort into ensuring we had feedback from network analysts and managers and understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire feedback from the analysts. This paper also provides the details we acquired from the analysts on their processes, goals, concerns, etc. A final result we describe is the generation of a task-flow diagram.
ERIC Educational Resources Information Center
Butler, Stephanie K.; Harvey, Robert J.
1988-01-01
Examined technique for improving cost-effectiveness of Position Analysis Questionnaire (PAQ) in job analysis. Professional job analysts, industrial psychology graduate students familiar with PAQ, and PAQ-unfamiliar undergraduates made direct holistic ratings of PAQ dimensions for four familiar jobs. Comparison of holistic ratings with decomposed…
An Analysis of and a Prescription for the Capital Improvement Programming Process for Small Cities.
1980-12-01
thorough analysis and discussion in texts relating to business finance, managerial finance, and management accounting. The most notable (that is, the... Theory of the Firm, Prentice-Hall, 1963. DeMoville, W., "Capital Budgeting in Municipalities," Management Accounting, v. 59, no. 1, p. 17-20, 28, July... involves the use of the present value technique. This technique is adequately explained in the literature of basic business finance and management
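The present value technique mentioned in these excerpts reduces to discounting each period's cash flow back to time zero; a minimal sketch with illustrative figures (the rate and cash flows are invented, not from the report):

```python
# Net present value: discount each cash flow by (1 + rate)^t and sum.
# cashflows[0] occurs at time 0 (typically the negative capital outlay).
def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical capital project: $100 outlay, $60 returned in each of 2 years,
# evaluated at a 10% discount rate.
project_value = npv(0.10, [-100.0, 60.0, 60.0])
```

A positive NPV (here about +4.13) indicates the project returns more than the discount rate, which is the standard acceptance criterion in the capital budgeting literature the report cites.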
Unsupervised classification of earth resources data.
NASA Technical Reports Server (NTRS)
Su, M. Y.; Jayroe, R. R., Jr.; Cummings, R. E.
1972-01-01
A new clustering technique is presented. It consists of two parts: (a) a sequential statistical clustering which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by the existing supervised maximum-likelihood classification technique.
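Stage (b) of the composite technique, the generalized K-means refinement of the initial clusters from stage (a), can be sketched as follows; the data points and initial centers are invented for the example:

```python
import numpy as np

# K-means refinement: alternately (1) assign each point to its nearest
# center and (2) move each center to the mean of its assigned points.
# `centers` plays the role of the initial clusters produced by stage (a).
def kmeans(X, centers, iters=20):
    centers = np.array(centers, dtype=float)
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        for k in range(len(centers)):
            if np.any(labels == k):
                centers[k] = X[labels == k].mean(axis=0)
    return labels, centers

# Two obvious groups; rough initial centers stand in for stage (a)'s output.
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.1, 4.9]])
labels, centers = kmeans(X, [[1.0, 1.0], [4.0, 4.0]])
```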
Malo, Sergio; Fateri, Sina; Livadas, Makis; Mares, Cristinel; Gan, Tat-Hean
2017-07-01
Ultrasonic guided wave testing is a technique used successfully in many industrial scenarios worldwide. For many complex applications, the dispersive nature and multimode behavior of the technique still pose a challenge for correct defect detection. In order to improve the performance of the guided waves, a 2-D compressed pulse analysis is presented in this paper. This novel technique combines the use of pulse compression and dispersion compensation in order to improve the signal-to-noise ratio (SNR) and temporal-spatial resolution of the signals. The ability of the technique to discriminate different wave modes is also highlighted. In addition, an iterative algorithm is developed to identify the wave modes of interest using adaptive peak detection to enable automatic wave mode discrimination. The employed algorithm is developed in order to pave the way for further in situ applications. The performance of Barker-coded and chirp waveforms is studied in a multimodal scenario where longitudinal and flexural wave packets are superposed. The technique is tested in both synthetic and experimental conditions. The enhancements in SNR and temporal resolution are quantified, as well as the ability to accurately calculate the propagation distance for different wave modes.
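The pulse compression step studied above can be sketched as matched filtering with a Barker-13 code (one of the coded waveforms the paper considers); the echo delay and noise-free received signal below are illustrative assumptions:

```python
import numpy as np

# Barker-13 code: its autocorrelation has a mainlobe of 13 and sidelobes
# of magnitude at most 1, giving a 13:1 compression gain.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

# Synthetic received trace: a clean echo of the code starting at sample 40.
delay = 40
rx = np.zeros(100)
rx[delay:delay + 13] = barker13

# Pulse compression = cross-correlation with the transmitted code
# (matched filter); the echo collapses into a single narrow peak.
compressed = np.correlate(rx, barker13, mode="valid")
peak = int(np.argmax(np.abs(compressed)))
```

The peak position recovers the echo delay, which is what allows the propagation distance of each wave packet to be estimated; with noise added, the 13:1 mainlobe-to-sidelobe ratio is the SNR gain pulse compression provides.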
2012-06-01
Conducting metrology, surface analysis, and metallography/fractography interrogations of samples to correlate microstructure with friction... are examined using a variety of methods such as metallography, chemical analysis, fractography, and hardness measurements. These methods assist in
Analysis of Learning Curve Fitting Techniques.
1987-09-01
1986. 15. Neter, John and others. Applied Linear Regression Models. Homewood IL: Irwin, 19-33. 16. SAS User's Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston... lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter, et al., Applied
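The ordinary least-squares fit of an improvement (learning) curve referenced in these excerpts is linear in log space, since the power-law model y = a·x^b gives log y = log a + b·log x; a minimal sketch with synthetic data generated from an 80% learning curve:

```python
import numpy as np

# Synthetic unit data from an 80% learning curve: each doubling of units
# multiplies hours by 0.8, so the exponent is b = log2(0.8) ~ -0.322.
units = np.array([1, 2, 4, 8, 16], dtype=float)
hours = 100.0 * units ** -0.322

# OLS fit in log space recovers the exponent b and first-unit cost a.
b, log_a = np.polyfit(np.log(units), np.log(hours), 1)
a = np.exp(log_a)
```

Because the synthetic data lie exactly on the model, the fit recovers b = -0.322 and a = 100 to machine precision; with real lot data the residual scatter is where the normally-distributed-error assumption noted above comes in.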
The Need For Dedicated Bifurcation Stents: A Critical Analysis
Lesiak, Maciej
2016-01-01
There is growing evidence that optimally performed two-stent techniques may provide similar or better results compared with the simple techniques for bifurcation lesions, with an observed trend towards improvements in clinical and/or angiographic outcomes with a two-stent strategy. Yet, provisional stenting remains the treatment of choice. Here, the author discusses the evidence – and controversies – concerning when and how to use complex techniques. PMID:29588719
Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis
NASA Astrophysics Data System (ADS)
Awrangjeb, M.; Fraser, C. S.; Lu, G.
2015-08-01
Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid under segmentation errors. Then, a new building change detection technique from LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. However, for demolished or extended building parts, a connected component analysis algorithm is applied and for each connected component its area, width and height are estimated in order to ascertain if it can be considered as a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to update detected changes to the existing building map. Experimental results show that the improved building detection technique can offer not only higher performance in terms of completeness and correctness, but also a lower number of undersegmentation errors as compared to its original counterpart. The proposed change detection technique produces no omission errors and thus it can be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his/her decision with a minimum number of mouse clicks.
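The connected component analysis step described above can be sketched as labeling 4-connected regions of a binary change mask and filtering them by size. The mask and the area threshold below are illustrative, and the paper's separate area/width/height tests are reduced here to a single area test:

```python
import numpy as np

# Label 4-connected regions of a boolean mask with an explicit-stack flood
# fill (avoids external dependencies). Returns the label image and count.
def label_components(mask):
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i, j in zip(*np.nonzero(mask)):
        if labels[i, j]:
            continue
        current += 1
        stack = [(i, j)]
        while stack:
            y, x = stack.pop()
            if (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                    and mask[y, x] and not labels[y, x]):
                labels[y, x] = current
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

# Toy change mask: two 2-pixel components and one isolated pixel.
mask = np.array([[1, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 1, 0, 1]], dtype=bool)
labels, n = label_components(mask)

# Keep only components large enough to be a plausible building part.
big = [k for k in range(1, n + 1) if (labels == k).sum() >= 2]
```

Each surviving component would then be classified as a demolished or new building part and pushed to the GUI for the operator's accept/reject decision.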
Nicolodelli, Gustavo; Senesi, Giorgio Saverio; de Oliveira Perazzoli, Ivan Luiz; Marangoni, Bruno Spolon; De Melo Benites, Vinícius; Milori, Débora Marcondes Bastos Pereira
2016-09-15
Organic fertilizers are obtained from waste of plant or animal origin. One of the advantages of organic fertilizers is that composting recycles organic waste of urban and agricultural origin, whose disposal would otherwise cause environmental impacts. The need for fast and accurate analysis of both major and minor/trace elements contained in new-generation organic, mineral and inorganic fertilizers has promoted the application of modern analytical techniques. In particular, laser induced breakdown spectroscopy (LIBS) is proving to be a very promising, quick and practical technique to detect and measure contaminants and nutrients in fertilizers. Although this technique has some limitations, such as a low sensitivity compared to other spectroscopic techniques, the use of double pulse (DP) LIBS is an alternative to conventional single pulse (SP) LIBS. The macronutrients (Ca, Mg, K, P), micronutrients (Cu, Fe, Na, Mn, Zn) and the contaminant (Cr) in fertilizer were evaluated using LIBS in SP and DP configurations. A comparative study of both configurations was performed using key parameters optimized to improve LIBS performance. The limit of detection (LOD) values obtained by DP LIBS improved up to seven times compared to SP LIBS. In general, the marked improvement obtained when using the DP system for simultaneous quantitative LIBS determination in fertilizer analysis can be ascribed to the larger ablated mass of the sample. The results presented in this study show the promising potential of the DP LIBS technique for qualitative analysis of fertilizers, without requiring sample preparation with chemical reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
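The limit of detection values compared above are commonly computed with the 3-sigma convention; a minimal sketch with illustrative blank noise and calibration slope (this is the standard textbook criterion, not necessarily the exact procedure used in the study):

```python
# 3-sigma limit of detection: LOD = 3 * s_blank / m, where s_blank is the
# standard deviation of repeated blank measurements and m the slope of the
# calibration curve (signal per unit concentration). A steeper slope or
# quieter blank, as in DP LIBS with its larger ablated mass, lowers the LOD.
def lod(blank_std, slope):
    return 3.0 * blank_std / slope

# Illustrative numbers: blank noise 0.02 a.u., calibration slope 4.0 a.u. per wt.%.
detection_limit = lod(0.02, 4.0)   # wt.%
```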
NASA Technical Reports Server (NTRS)
Towner, Robert L.; Band, Jonathan L.
2012-01-01
An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis), and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
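The Modal Assurance Criterion mentioned above has a standard definition, MAC(i, j) = |φᵢᵀφⱼ|² / ((φᵢᵀφᵢ)(φⱼᵀφⱼ)). A minimal sketch of MAC-based mode pairing follows; the greedy best-match assignment is an assumption for illustration, and the report's adaptive tracking algorithm is not reproduced here.

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion matrix between two mode-shape sets (columns = modes)."""
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.einsum('ij,ij->j', phi_a, phi_a),
                   np.einsum('ij,ij->j', phi_b, phi_b))
    return num / den

def track_modes(phi_ref, phi_new):
    """Pair each reference mode with the new mode of highest MAC (greedy matching)."""
    m = mac(phi_ref, phi_new)
    pairs = []
    for i in range(m.shape[0]):
        j = int(np.argmax(m[i]))
        pairs.append((i, j, float(m[i, j])))
    return pairs
```

For two models whose modes are simply reordered, each pairing recovers the permutation with MAC values of 1.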
Freire, Carmen S. R.; Coutinho, João A. P.; Silvestre, Armando J. D.; Freire, Mara G.
2016-01-01
Due to their unique properties, in recent years, ionic liquids (ILs) have been largely investigated in the field of analytical chemistry. Particularly during the last sixteen years, they have been successfully applied in the chromatographic and electrophoretic analysis of value-added compounds extracted from biomass. Considering the growing interest in the use of ILs in this field, this critical review provides a comprehensive overview on the improvements achieved using ILs as constituents of mobile or stationary phases in analytical techniques, namely in capillary electrophoresis and its different modes, in high performance liquid chromatography, and in gas chromatography, for the separation and analysis of natural compounds. The impact of the IL chemical structure and the influence of secondary parameters, such as the IL concentration, temperature, pH, voltage and analysis time (when applied), are also critically addressed regarding the achieved separation improvements. Major conclusions on the role of ILs in the separation mechanisms and the performance of these techniques in terms of efficiency, resolution and selectivity are provided. Based on a critical analysis of all published results, some target-oriented ILs are suggested. Finally, current drawbacks and future challenges in the field are highlighted. In particular, the design and use of more benign and effective ILs as well as the development of integrated (and thus more sustainable) extraction–separation processes using IL aqueous solutions are suggested within a green chemistry perspective. PMID:27667965
Borràs, Eva; Ferré, Joan; Boqué, Ricard; Mestres, Montserrat; Aceña, Laura; Calvo, Angels; Busto, Olga
2016-07-15
Three instrumental techniques, headspace-mass spectrometry (HS-MS), mid-infrared spectroscopy (MIR) and UV-visible spectrophotometry (UV-vis), have been combined to classify virgin olive oil samples based on the presence or absence of sensory defects. The reference sensory values were provided by an official taste panel. Different data fusion strategies were studied to improve the discrimination capability compared to using each instrumental technique individually. A general model was applied to discriminate high-quality non-defective olive oils (extra-virgin) and the lowest-quality olive oils considered non-edible (lampante). A specific identification of key off-flavours, such as musty, winey, fusty and rancid, was also studied. The data fusion of the three techniques improved the classification results in most of the cases. Low-level data fusion was the best strategy to discriminate musty, winey and fusty defects, using HS-MS, MIR and UV-vis, and the rancid defect using only HS-MS and MIR. The mid-level data fusion approach using partial least squares-discriminant analysis (PLS-DA) scores was found to be the best strategy for defective vs non-defective and edible vs non-edible oil discrimination. However, the data fusion did not sufficiently improve the results obtained by a single technique (HS-MS) to classify non-defective classes. These results indicate that instrumental data fusion can be useful for the identification of sensory defects in virgin olive oils. Copyright © 2016 Elsevier Ltd. All rights reserved.
Further Developments of the Fringe-Imaging Skin Friction Technique
NASA Technical Reports Server (NTRS)
Zilliac, Gregory C.
1996-01-01
Various aspects and extensions of the Fringe-Imaging Skin Friction technique (FISF) have been explored through the use of several benchtop experiments and modeling. The technique has been extended to handle three-dimensional flow fields with mild shear gradients. The optical and imaging system has been refined and a PC-based application has been written that has made it possible to obtain high resolution skin friction field measurements in a reasonable period of time. The improved method was tested on a wingtip and compared with Navier-Stokes computations. Additionally, a general approach to interferogram-fringe spacing analysis has been developed that should have applications in other areas of interferometry. A detailed error analysis of the FISF technique is also included.
Simultaneous F0-F1 modifications of Arabic for the improvement of natural-sounding
NASA Astrophysics Data System (ADS)
Ykhlef, F.; Bensebti, M.
2013-03-01
Pitch (F0) modification is one of the most important problems in the area of speech synthesis. Several techniques have been developed in the literature to achieve this goal. The main restrictions of these techniques concern the modification range and the synthesised speech quality, intelligibility and naturalness. The control of formants in a spoken language can significantly improve the naturalness of the synthesised speech. This improvement is mainly dependent on the control of the first formant (F1). Inspired by this observation, this article proposes a new approach that modifies both F0 and F1 of Arabic voiced sounds in order to improve the naturalness of the pitch-shifted speech. The developed strategy takes a parallel processing approach, in which the analysis segments are decomposed into sub-bands in the wavelet domain, modified in the desired sub-band by using a resampling technique and reconstructed without affecting the remaining sub-bands. Pitch marking and voicing detection are performed in the frequency decomposition step based on the comparison of the multi-level approximation and detail signals. The performance of the proposed technique is evaluated by listening tests and compared to the pitch synchronous overlap and add (PSOLA) technique at the third approximation level. Experimental results have shown that manipulating F0 in conjunction with F1 in the wavelet domain yields more natural-sounding synthesised speech than the classical pitch modification technique. The improvement held even for large pitch modifications.
Energy and technology review: Engineering modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cabayan, H.S.; Goudreau, G.L.; Ziolkowski, R.W.
1986-10-01
This report presents information concerning: Modeling Canonical Problems in Electromagnetic Coupling Through Apertures; Finite-Element Codes for Computing Electrostatic Fields; Finite-Element Modeling of Electromagnetic Phenomena; Modeling Microwave-Pulse Compression in a Resonant Cavity; Lagrangian Finite-Element Analysis of Penetration Mechanics; Crashworthiness Engineering; Computer Modeling of Metal-Forming Processes; Thermal-Mechanical Modeling of Tungsten Arc Welding; Modeling Air Breakdown Induced by Electromagnetic Fields; Iterative Techniques for Solving Boltzmann's Equations for p-Type Semiconductors; Semiconductor Modeling; and Improved Numerical-Solution Techniques in Large-Scale Stress Analysis.
Net present value analysis: appropriate for public utilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, W.N. III
1980-08-28
The net-present-value technique widely used by unregulated companies for capital budgeting can also apply to regulated public utilities. Used to decide whether an investment is worthwhile, the NPV technique discounts an investment's expected future cash flows and compares their present value with the initial outlay or cost. The type of project most appropriate for an NPV analysis is one designed to lower costs. Efficiency-improving investments can be adequately evaluated by the NPV method, which in certain cases is easier to use than some of the more complicated revenue-requirement computer models.
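The NPV calculation itself is standard: discount each future cash flow by (1 + r)^t and sum, with the initial outlay entering as a negative flow at t = 0. A minimal sketch, using hypothetical numbers for a cost-lowering project:

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) initial outlay at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical example: a $1000 efficiency project saving $400/year for 3 years,
# discounted at 8%. A positive NPV means the investment is worthwhile.
value = npv(0.08, [-1000, 400, 400, 400])
```

At an 8% discount rate this hypothetical project has a slightly positive NPV; at a 20% rate the same cash flows would be rejected.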
Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane
2008-02-01
MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims to identify all peptides present in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.
Kwon, Oh Kee; Han, Young Tak; Baek, Yong Soon; Chung, Yun C
2012-05-21
We present and demonstrate a simple and cost-effective technique for improving the modulation bandwidth of an electroabsorption-modulated laser (EML). This technique utilizes the RF resonance caused by the EML chip (i.e., junction capacitance) and bonding wire (i.e., wire inductance). We analyze the effects of the lengths of the bonding wires on the frequency response of the EML by using an equivalent circuit model. To verify this analysis, we package a lumped EML chip on the sub-mount and measure its frequency response. The results show that, by using the proposed technique, we can increase the modulation bandwidth of the EML from ~16 GHz to ~28 GHz.
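The resonance exploited here is the series LC resonance of the wire inductance with the junction capacitance, f = 1/(2π√(LC)): shortening the bonding wire lowers L and pushes the resonance peak to higher frequency. A small numeric sketch (the component values below are illustrative assumptions, not measurements from the paper):

```python
import math

def resonance_ghz(l_nh, c_pf):
    """Series LC resonance frequency in GHz, for inductance in nH and capacitance in pF."""
    l, c = l_nh * 1e-9, c_pf * 1e-12
    return 1.0 / (2 * math.pi * math.sqrt(l * c)) / 1e9

# Hypothetical values: a 0.6 nH bonding wire with a 0.3 pF junction capacitance.
f_long_wire = resonance_ghz(0.6, 0.3)
f_short_wire = resonance_ghz(0.3, 0.3)  # halving L raises the resonance frequency
```

The equivalent-circuit model in the paper would place this resonance so that the peaking extends the modulator's usable bandwidth.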
MeV ion-beam analysis of optical data storage films
NASA Technical Reports Server (NTRS)
Leavitt, J. A.; Mcintyre, L. C., Jr.; Lin, Z.
1993-01-01
Our objectives are threefold: (1) to accurately characterize optical data storage films by MeV ion-beam analysis (IBA) for ODSC collaborators; (2) to develop new and/or improved analysis techniques; and (3) to expand the capabilities of the IBA facility itself. Using H-1(+), He-4(+), and N-15(++) ion beams in the 1.5 MeV to 10 MeV energy range from a 5.5 MV Van de Graaff accelerator, film thickness (in atoms/sq cm), stoichiometry, impurity concentration profiles, and crystalline structure were determined by Rutherford backscattering (RBS), high-energy backscattering, channeling, nuclear reaction analysis (NRA) and proton induced X-ray emission (PIXE). Most of these techniques are discussed in detail in the ODSC Annual Report (February 17, 1987), p. 74. The PIXE technique is briefly discussed in the ODSC Annual Report (March 15, 1991), p. 23.
Biomedical application of MALDI mass spectrometry for small-molecule analysis.
van Kampen, Jeroen J A; Burgers, Peter C; de Groot, Ronald; Gruters, Rob A; Luider, Theo M
2011-01-01
Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) is an emerging analytical tool for the analysis of molecules with molar masses below 1,000 Da; that is, small molecules. This technique offers rapid analysis, high sensitivity, low sample consumption, a relative high tolerance towards salts and buffers, and the possibility to store sample on the target plate. The successful application of the technique is, however, hampered by low molecular weight (LMW) matrix-derived interference signals and by poor reproducibility of signal intensities during quantitative analyses. In this review, we focus on the biomedical application of MALDI-MS for the analysis of small molecules and discuss its favorable properties and its challenges as well as strategies to improve the performance of the technique. Furthermore, practical aspects and applications are presented. © 2010 Wiley Periodicals, Inc.
LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation
NASA Technical Reports Server (NTRS)
Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.
1997-01-01
This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.
ERIC Educational Resources Information Center
Lindle, Jane Clark; Stalion, Nancy; Young, Lu
2005-01-01
Kentucky's accountability system includes a school-processes audit known as Standards and Indicators for School Improvement (SISI), which is in a nascent stage of validation. Content validity methods include comparison to instruments measuring similar constructs as well as other techniques such as job analysis. This study used a two-phase process…
West Coast tree improvement programs: a break-even, cost-benefit analysis
F. Thomas Ledig; Richard L Porterfield
1981-01-01
Three tree improvement programs were analyzed by break-even, cost-benefit technique: one for ponderosa pine in the Pacific Northwest, and two for Douglas-fir in the Pacific Northwest-one of low intensity and the other of high intensity. A return of 8 percent on investment appears feasible by using short rotations or by accompanying tree improvement with thinning....
Extension of vibrational power flow techniques to two-dimensional structures
NASA Technical Reports Server (NTRS)
Cuschieri, Joseph M.
1988-01-01
In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or finite element analysis (FEA) is generally used. However, an alternative method is the use of vibrational power flow techniques, which can be especially useful in the mid-frequencies between the optimum frequency regimes for SEA and FEA. Power flow analysis has in general been used on 1-D beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to 2-D plate-like structures joined along a common edge, without frequency or spatial averaging of the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA at low frequencies and SEA at high frequencies. The agreement with FEA results is good, but the power flow technique has improved computational efficiency. Compared to the SEA results, the power flow results more closely represent the actual response of the structure.
Extension of vibrational power flow techniques to two-dimensional structures
NASA Technical Reports Server (NTRS)
Cuschieri, J. M.
1987-01-01
In the analysis of the vibration response and structure-borne vibration transmission between elements of a complex structure, statistical energy analysis (SEA) or Finite Element Analysis (FEA) is generally used. However, an alternative method is the use of vibrational power flow techniques, which can be especially useful in the mid-frequencies between the optimum frequency regimes for FEA and SEA. Power flow analysis has in general been used on one-dimensional beam-like structures or between structures with point joints. In this paper, the power flow technique is extended to two-dimensional plate-like structures joined along a common edge, without frequency or spatial averaging of the results, such that the resonant response of the structure is determined. The power flow results are compared to results obtained using FEA at low frequencies and SEA at high frequencies. The agreement with FEA results is good, but the power flow technique has improved computational efficiency. Compared to the SEA results, the power flow results more closely represent the actual response of the structure.
Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...
2014-10-03
Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.
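For reference, the acid-number arithmetic behind ASTM D664-style titrations is simple: milligrams of KOH (molar mass ≈ 56.1 g/mol) consumed per gram of sample, corrected by a blank titration. A minimal sketch with hypothetical titration values; the AMTAN aqueous-extraction step itself is not reproduced here.

```python
def total_acid_number(v_sample_ml, v_blank_ml, normality, sample_g):
    """Total Acid Number in mg KOH per g of sample (ASTM D664-style calculation).

    v_sample_ml : titrant volume to the endpoint for the sample (mL)
    v_blank_ml  : titrant volume for the blank (mL)
    normality   : titrant concentration (eq/L)
    sample_g    : sample mass (g)
    """
    return (v_sample_ml - v_blank_ml) * normality * 56.1 / sample_g

# Hypothetical run: 2.5 mL of 0.1 N KOH to the endpoint, 0.1 mL blank, 1.2 g sample.
tan = total_acid_number(2.5, 0.1, 0.1, 1.2)
```

The aqueous modification changes the solvent system, not this underlying calculation.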
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
Placement of riparian forest buffers to improve water quality
Mark D. Tomer; Michael G. Dosskey; Michael R. Burkart; David E. James; Matthew J. Helmers; Dean E. Eisenhauer
2005-01-01
Riparian forest buffers can improve stream water quality, provided they intercept and remove contaminants from surface runoff and/or shallow groundwater. Soils, topography, hydrology, and surficial geology determine the capability of forest buffers to intercept and treat these flows. This paper describes landscape analysis techniques for identifying and mapping...
Assessing Institutional Ineffectiveness: A Strategy for Improvement.
ERIC Educational Resources Information Center
Cameron, Kim S.
1984-01-01
Based on the theory that institutional change and improvement are motivated more by knowledge of problems than by knowledge of successes, a fault tree analysis technique using Boolean logic for assessing institutional ineffectiveness by determining weaknesses in the system is presented. Advantages and disadvantages of focusing on weakness rather…
Risk Management Technique for design and operation of facilities and equipment
NASA Technical Reports Server (NTRS)
Fedor, O. H.; Parsons, W. N.; Coutinho, J. De S.
1975-01-01
The Risk Management System collects information from engineering, operating, and management personnel to identify potentially hazardous conditions. This information is used in risk analysis, problem resolution, and contingency planning. The resulting hazard accountability system enables management to monitor all identified hazards. Data from this system are examined in project reviews so that management can decide to eliminate or accept these risks. This technique is particularly effective in improving the management of risks in large, complex, high-energy facilities. These improvements are needed for increased cooperation among industry, regulatory agencies, and the public.
The improving efficiency frontier of inpatient rehabilitation hospitals.
Harrison, Jeffrey P; Kirkpatrick, Nicole
2011-01-01
This study uses a linear programming technique called data envelopment analysis to identify changes in the efficiency frontier of inpatient rehabilitation hospitals after implementation of the prospective payment system. The study provides a time series analysis of the efficiency frontier for inpatient rehabilitation hospitals in 2003 immediately after implementation of PPS and then again in 2006. Results indicate that the efficiency frontier of inpatient rehabilitation hospitals increased from 84% in 2003 to 85% in 2006. Similarly, an analysis of slack or inefficiency shows improvements in output efficiency over the study period. This clearly documents that efficiency in the inpatient rehabilitation hospital industry after implementation of PPS is improving. Hospital executives, health care policymakers, taxpayers, and other stakeholders benefit from studies that improve health care efficiency.
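Full data envelopment analysis solves a linear program per hospital; in the degenerate single-input, single-output, constant-returns case the CCR efficiency score reduces to each unit's productivity ratio normalized by the best observed ratio. A minimal sketch of that special case only (hypothetical data; the study's multi-input, multi-output model requires an LP solver):

```python
def dea_ccr_ratio(inputs, outputs):
    """CCR efficiency scores for single-input, single-output decision-making units:
    each unit's output/input ratio divided by the best observed ratio, so the
    frontier units score 1.0 and all others score strictly less."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical hospitals: same output, but hospital B consumes twice the input.
scores = dea_ccr_ratio([1.0, 2.0], [1.0, 1.0])
```

A score of 0.5 means the unit could, in principle, produce its output with half its input if it operated on the efficiency frontier.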
Jain, Mamta; Kumar, Anil; Choudhary, Rishabh Charan
2017-06-01
In this article, we have proposed an improved diagonal queue medical image steganography for patient secret medical data transmission using a chaotic standard map, a linear feedback shift register, and the Rabin cryptosystem, improving on a previous technique (Jain and Lenka in Springer Brain Inform 3:39-51, 2016). The proposed algorithm comprises four stages: generation of pseudo-random sequences (by the linear feedback shift register and the standard chaotic map), permutation and XORing using the pseudo-random sequences, encryption using the Rabin cryptosystem, and steganography using the improved diagonal queues. Security analysis has been carried out. Performance is assessed using MSE, PSNR and maximum embedding capacity, as well as by histogram analysis between various brain-disease stego and cover images.
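One of the pseudo-random generators named above, the linear feedback shift register, is simple to sketch. The register width, tap positions, and seed below are illustrative assumptions, not the parameters of the proposed scheme; the XOR masking step shows why such a keystream is useful, since applying it twice restores the data.

```python
def lfsr_states(seed, taps, nbits, length):
    """Fibonacci LFSR: the feedback bit is the XOR of the tap positions
    (1-indexed from the least significant bit); the register shifts left each step."""
    state, states = seed, []
    mask = (1 << nbits) - 1
    for _ in range(length):
        fb = 0
        for t in taps:
            fb ^= (state >> (t - 1)) & 1
        state = ((state << 1) | fb) & mask
        states.append(state)
    return states

def xor_mask(bits, keystream):
    """XOR data bits with the low bit of each keystream state (self-inverse)."""
    return [b ^ (k & 1) for b, k in zip(bits, keystream)]
```

With taps at positions 4 and 3, a 4-bit register cycles through all 15 nonzero states before repeating, the maximal period.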
Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.
Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini
2016-01-01
This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and Tukey test (α = .05). The one-piece casted frameworks presented significantly higher vertical misfit values than those found for framework cemented on prepared abutments and laser welding techniques (P < .001 and P < .003, respectively). Laser welding and framework cemented on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.
[Comparison of 2 lacrimal punctal occlusion methods].
Shalaby, O; Rivas, L; Rivas, A I; Oroza, M A; Murube, J
2001-09-01
To study and compare two methods for canalicular occlusion: cautery and punctal patch. The study included forty patients divided into two groups of 20 patients. The end point was 4 occluded puncta. The first group underwent deep cauterization resulting in occlusion of the full vertical aspect of the canaliculus. The second group underwent the punctal patch technique for canalicular occlusion. The differential parameters were: time of intervention, ease of use, risks and precision. Postoperatively, discomfort, subjective and objective improvement of the ocular surface, and the long-term result of each technique were analysed. Time of intervention was longer for punctal patch than for cautery. Both methods exhibited similar ease of use and improvement of the ocular surface. Precision was high with the punctal patch technique, showing complete and final occlusion with no punctum needing reopening, while the cautery technique presented a 20% rate of reopening intervention. Postoperative discomfort and irritation were markedly evident with the punctal patch technique, while minimal with the cautery technique. Survival analysis after one year of follow-up showed a higher rate of advantages for the punctal patch technique over the cautery technique.
Problem Analysis: Application in Developing Marketing Strategies for Colleges.
ERIC Educational Resources Information Center
Martin, John; Moore, Thomas
1991-01-01
The problem analysis technique can help colleges understand students' salient needs in a competitive market. A preliminary study demonstrates the usefulness of the approach for developing strategies aimed at maintaining student loyalty and improving word-of-mouth promotion to other prospective students. (Author/MSE)
NEW APPROACHES IN RISK ANALYSIS OF ENVIRONMENTAL STRESSORS TO HUMAN AND ECOLOGICAL SYSTEMS
We explore the application of novel techniques for improving and integrating risk analysis of environmental stressors to human and ecological systems. Environmental protection decisions are guided by risk assessments serving as tools to develop regulatory policy and other relate...
Quality Assessment of College Admissions Processes.
ERIC Educational Resources Information Center
Fisher, Caroline; Weymann, Elizabeth; Todd, Amy
2000-01-01
This study evaluated the admissions process for a Master's in Business Administration Program using such quality improvement techniques as customer surveys, benchmarking, and gap analysis. Analysis revealed that student dissatisfaction with the admissions process may be a factor influencing declining enrollment. Cycle time and number of student…
Vetter, Jeffrey S.
2005-02-01
The method and system described herein present a technique for performance analysis that helps users understand the communication behavior of their message passing applications. The method and system described herein may automatically classify individual communication operations and reveal the cause of communication inefficiencies in the application. This classification allows the developer to quickly focus on the culprits of truly inefficient behavior, rather than manually foraging through massive amounts of performance data. Specifically, the method and system described herein trace the message operations of Message Passing Interface (MPI) applications and then classify each individual communication event using a supervised learning technique: decision tree classification. The decision tree may be trained using microbenchmarks that demonstrate both efficient and inefficient communication. Since the method and system described herein adapt to the target system's configuration through these microbenchmarks, they simultaneously automate the performance analysis process and improve classification accuracy. The method and system described herein may improve the accuracy of performance analysis and dramatically reduce the amount of data that users must encounter.
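As a toy illustration of the classification idea (not the system's actual tree), a depth-1 decision tree can be trained on microbenchmark-labeled examples of a single hypothetical feature, such as the fraction of an operation's time spent blocked:

```python
def train_threshold(values, inefficient):
    """Depth-1 decision tree: choose the split threshold that minimizes
    misclassifications, predicting 'inefficient' when the feature exceeds it.
    values      : one feature per observed communication event
    inefficient : 0/1 labels supplied by the training microbenchmarks
    """
    xs = sorted(set(values))
    candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])] + [xs[0] - 1]
    best_t, best_err = xs[0] - 1, len(values) + 1
    for t in candidates:
        err = sum((x > t) != y for x, y in zip(values, inefficient))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def classify(threshold, x):
    """Flag a new communication event as inefficient."""
    return x > threshold
```

Real decision-tree learners split recursively over many features; training on microbenchmarks, as here, is what lets the classifier adapt to the target system.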
Dyer, Bryce; Hassani, Hossein; Shadi, Mehran
2016-01-01
The format of cycling time trials in England, Wales and Northern Ireland involves riders competing individually over several fixed race distances of 10-100 miles in length and over time-constrained formats of 12 and 24 h in duration. Drawing on data provided by the national governing body that covers the regions of England and Wales, an analysis of six male competition record progressions was undertaken to illustrate their progression. Future forecasts are then projected through use of the Singular Spectrum Analysis technique. This method has not been applied to sport-based time series data before. All six records have seen progressive improvement and are non-linear in nature. Five records saw their highest level of record change during the 1950-1969 period. Whilst the frequency of new records has generally declined since this period, the magnitude of performance improvement has generally increased. The Singular Spectrum Analysis technique successfully provided forecasted projections in the short to medium term with a high level of fit to the time series data.
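The decomposition step of basic Singular Spectrum Analysis can be sketched as follows: embed the series in a Hankel trajectory matrix, truncate its SVD, and diagonally average back to a series. This shows only the decomposition/reconstruction stage; the recurrent forecasting stage used in the study is omitted, and the window length and rank below are arbitrary illustrative choices.

```python
import numpy as np

def ssa_reconstruct(series, window, rank):
    """Basic SSA: Hankel embedding, rank-truncated SVD, diagonal averaging."""
    s = np.asarray(series, dtype=float)
    n = len(s)
    k = n - window + 1
    # Trajectory (Hankel) matrix: column i is the lagged window starting at i.
    X = np.column_stack([s[i:i + window] for i in range(k)])
    U, sig, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * sig[:rank]) @ Vt[:rank]
    # Diagonal averaging: average entries sharing i + j to recover a series.
    rec = np.zeros(n)
    cnt = np.zeros(n)
    for i in range(window):
        for j in range(k):
            rec[i + j] += Xr[i, j]
            cnt[i + j] += 1
    return rec / cnt
```

A purely linear trend produces a rank-2 trajectory matrix, so a rank-2 reconstruction recovers it essentially exactly; on noisy record data the truncation instead separates trend from noise.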
Requirements analysis, domain knowledge, and design
NASA Technical Reports Server (NTRS)
Potts, Colin
1988-01-01
Two improvements to current requirements analysis practices are suggested: domain modeling, and the systematic application of analysis heuristics. Domain modeling is the representation of relevant application knowledge prior to requirements specification. Artificial intelligence techniques may eventually be applicable for domain modeling. In the short term, however, restricted domain modeling techniques, such as that in JSD, will still be of practical benefit. Analysis heuristics are standard patterns of reasoning about the requirements. They usually generate questions of clarification or issues relating to completeness. Analysis heuristics can be represented and therefore systematically applied in an issue-based framework. This is illustrated by an issue-based analysis of JSD's domain modeling and functional specification heuristics. They are discussed in the context of the preliminary design of simple embedded systems.
Improving sensor data analysis through diverse data source integration
NASA Astrophysics Data System (ADS)
Casper, Jennifer; Albuquerque, Ronald; Hyland, Jeremy; Leveille, Peter; Hu, Jing; Cheung, Eddy; Mauer, Dan; Couture, Ronald; Lai, Barry
2009-05-01
Daily sensor data volumes are increasing from gigabytes to multiple terabytes. The manpower and resources needed to analyze the increasing amount of data are not growing at the same rate. Current volumes of diverse data, both live streaming and historical, are not fully analyzed. Analysts are left mostly to analyze the individual data sources manually. This is both time consuming and mentally exhausting. Expanding data collections only exacerbate this problem. Improved data management techniques and analysis methods are required to process the increasing volumes of historical and live streaming data sources simultaneously. Improved techniques are needed to reduce an analyst's decision response time and to enable more intelligent and immediate situation awareness. This paper describes the Sensor Data and Analysis Framework (SDAF) system built to provide analysts with the ability to pose integrated queries on diverse live and historical data sources, and to plug in needed algorithms for upstream processing and filtering. The SDAF system was inspired by input and feedback from field analysts and experts. This paper presents SDAF's capabilities, implementation, and the reasoning behind implementation decisions. Finally, lessons learned from preliminary tests and deployments are captured for future work.
A study of data analysis techniques for the multi-needle Langmuir probe
NASA Astrophysics Data System (ADS)
Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.
2018-06-01
In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique compares well with measurements from incoherent scatter radar and other in situ instruments, instrument performance could be improved with longer probes that can be cleaned during operation. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
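As a rough illustration of the linear fitting idea (not the authors' exact implementation): in orbital-motion-limited theory, the square of the electron saturation current collected by a cylindrical probe grows linearly with bias voltage, so a linear fit of I² against V yields a slope proportional to the square of the electron density. A minimal numpy sketch with synthetic probe currents; the bias values, the offset V0 = 0.5, and the omitted physical constants are assumptions:

```python
import numpy as np

# Synthetic bias voltages (V) for four needle probes and OML-like currents:
# I is proportional to sqrt(V + V0), so I**2 is linear in V.
V = np.array([2.0, 3.0, 4.0, 5.0])
I = 1e-6 * np.sqrt(V + 0.5)

# Linear fit of I**2 vs V; the slope is proportional to the square of
# the electron density (instrument and physical constants omitted here).
slope, intercept = np.polyfit(V, I**2, 1)
relative_density = np.sqrt(slope)   # density up to an instrument constant
```

Because the synthetic currents follow the assumed law exactly, the fitted slope recovers the generating constant; with real probe data the quality of this fit is precisely what distinguishes the two techniques the paper compares.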
ASTM clustering for improving coal analysis by near-infrared spectroscopy.
Andrés, J M; Bona, M T
2006-11-15
Multivariate analysis techniques have been applied to near-infrared (NIR) spectra of coals to investigate the relationship between nine coal properties (moisture (%), ash (%), volatile matter (%), fixed carbon (%), heating value (kcal/kg), carbon (%), hydrogen (%), nitrogen (%) and sulphur (%)) and the corresponding predictor variables. In this work, a whole set of coal samples was grouped into six more homogeneous clusters following the ASTM reference method for classification prior to the application of calibration methods to each coal set. The results obtained showed a considerable improvement in the determination error compared with the calibration for the whole sample set. For some groups, the established calibrations approached the quality required by the ASTM/ISO norms for laboratory analysis. To predict property values for a new coal sample, it is first necessary to assign that sample to its respective group. Thus, the discrimination and classification ability of coal samples by Diffuse Reflectance Infrared Fourier Transform Spectroscopy (DRIFTS) in the NIR range was also studied by applying Soft Independent Modelling of Class Analogy (SIMCA) and Linear Discriminant Analysis (LDA) techniques. Modelling of the groups by SIMCA led to overlapping models that cannot discriminate for unique classification. On the other hand, the application of Linear Discriminant Analysis improved the classification of the samples, but not enough to be satisfactory for every group considered.
Sentence Similarity Analysis with Applications in Automatic Short Answer Grading
ERIC Educational Resources Information Center
Mohler, Michael A. G.
2012-01-01
In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…
Old Wine in New Bottles: The Quality of Work Life in Schools and School Districts.
ERIC Educational Resources Information Center
Bacharach, Samuel B.; Mitchell, Stephen M.
This essay reviews quality of work life as a management technique and argues that quality-of-work-life programs, conceptualized multidimensionally, offer a unique mechanism for improving working conditions in schools and within districts. A brief analysis of major management ideologies concludes that some techniques advocated under the label of…
An improved large-field focusing schlieren system
NASA Technical Reports Server (NTRS)
Weinstein, Leonard M.
1991-01-01
The analysis and performance of a high-brightness large-field focusing schlieren system is described. The system can be used to examine complex two- and three-dimensional flows. Techniques are described to obtain focusing schlieren through distorting optical elements, to use multiple colors in a time multiplexing technique, and to use diffuse screen holography for three-dimensional photographs.
Regression and Geostatistical Techniques: Considerations and Observations from Experiences in NE-FIA
Rachel Riemann; Andrew Lister
2005-01-01
Maps of forest variables improve our understanding of the forest resource by allowing us to view and analyze it spatially. The USDA Forest Service's Northeastern Forest Inventory and Analysis unit (NE-FIA) has used geostatistical techniques, particularly stochastic simulation, to produce maps and spatial data sets of FIA variables. That work underscores the...
Destruction or Loss of School Property: Analysis and Suggestions for Improvement of School Security.
ERIC Educational Resources Information Center
Nelken, Ira; Kline, Sam
In recent years the costs of school vandalism and the incidence of vandalism in the public schools have been rising. The study concerns itself with the application of production functions, Monte Carlo techniques, and Shannon's model of information theory to determine the most efficient use of preventive vandalism techniques in a large school…
Triangular covariance factorizations for Kalman filtering. Ph.D. Thesis - Calif. Univ.
NASA Technical Reports Server (NTRS)
Thornton, C. L.
1976-01-01
An improved computational form of the discrete Kalman filter is derived using an upper triangular factorization of the error covariance matrix. The covariance P is factored such that P = UDU^T, where U is unit upper triangular and D is diagonal. Recursions are developed for propagating the U-D covariance factors together with the corresponding state estimate. The resulting algorithm, referred to as the U-D filter, combines the superior numerical precision of square root filtering techniques with an efficiency comparable to that of Kalman's original formulation. Moreover, this method is easily implemented and involves no more computer storage than the Kalman algorithm. These characteristics make the U-D method an attractive real-time filtering technique. A new covariance error analysis technique is obtained from an extension of the U-D filter equations. This evaluation method is flexible and efficient and may provide significantly improved numerical results. Cost comparisons show that for a large class of problems the U-D evaluation algorithm is noticeably less expensive than conventional error analysis methods.
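The factorization at the heart of the U-D filter can be sketched directly. The routine below is an illustrative numpy implementation (not Thornton's production algorithm): it factors a symmetric positive-definite covariance P into a unit upper triangular U and a diagonal D such that P = U D U^T, working from the bottom-right corner upward.

```python
import numpy as np

def udu_factor(P):
    """Factor symmetric positive-definite P as P = U @ diag(d) @ U.T,
    with U unit upper triangular and d the (positive) diagonal of D."""
    P = P.astype(float).copy()
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):
        d[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / d[j]
            # subtract this column's contribution from the remaining block
            for k in range(i + 1):
                P[k, i] -= U[k, j] * d[j] * U[i, j]
    return U, d

# Demo: factor a random SPD covariance and keep the factors for checking.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
P = A @ A.T + 4 * np.eye(4)
U, d = udu_factor(P)
```

Propagating U and d instead of P is what gives the filter its square-root-like numerical robustness while keeping near-Kalman cost.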
Improved key-rate bounds for practical decoy-state quantum-key-distribution systems
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Zhao, Qi; Razavi, Mohsen; Ma, Xiongfeng
2017-01-01
The decoy-state scheme is the most widely implemented quantum-key-distribution protocol in practice. In order to account for the finite-size key effects on the achievable secret key generation rate, a rigorous statistical fluctuation analysis is required. Originally, a heuristic Gaussian-approximation technique was used for this purpose, which, despite its analytical convenience, was not sufficiently rigorous. The fluctuation analysis has recently been made rigorous by using the Chernoff bound. There is a considerable gap, however, between the key-rate bounds obtained from these techniques and that obtained from the Gaussian assumption. Here we develop a tighter bound for the decoy-state method, which yields a smaller failure probability. This improvement results in a higher key rate and increases the maximum distance over which secure key exchange is possible. By optimizing the system parameters, our simulation results show that our method almost closes the gap between the two previously proposed techniques and achieves a performance similar to that of conventional Gaussian approximations.
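To see why moving from a Gaussian approximation to a Chernoff-type bound matters, consider the simpler task of bounding the deviation of an estimated probability from N trials at a target failure probability ε. The sketch below is a generic statistical illustration, not the paper's decoy-state estimators: it compares the heuristic Gaussian deviation with the rigorous Chernoff-Hoeffding deviation.

```python
import math
from statistics import NormalDist

def gaussian_deviation(p_hat, n, eps):
    """Deviation under the (non-rigorous) Gaussian approximation."""
    z = NormalDist().inv_cdf(1 - eps)
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

def hoeffding_deviation(n, eps):
    """Deviation guaranteed rigorously by the Chernoff-Hoeffding inequality."""
    return math.sqrt(math.log(1 / eps) / (2 * n))

n, eps = 10**6, 1e-10
g = gaussian_deviation(0.5, n, eps)   # optimistic, heuristic
h = hoeffding_deviation(n, eps)       # rigorous, slightly looser
```

The gap between g and h at a fixed failure probability is the kind of gap that tighter finite-key analyses, such as the one described above, aim to close.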
Position and Speed Control of Brushless DC Motors Using Sensorless Techniques and Application Trends
Gamazo-Real, José Carlos; Vázquez-Sánchez, Ernesto; Gómez-Gil, Jaime
2010-01-01
This paper provides a technical review of position and speed sensorless methods for controlling Brushless Direct Current (BLDC) motor drives, including background on sensor-based analysis, limitations, and advances. The performance and reliability of BLDC motor drives have improved as conventional control and sensing techniques have been augmented by sensorless technology. Sensorless advances are reviewed and recent developments in this area are introduced with their inherent advantages and drawbacks, including the analysis of practical implementation issues and applications. The study includes a deep overview of state-of-the-art back-EMF sensing methods, which include Terminal Voltage Sensing, Third Harmonic Voltage Integration, Terminal Current Sensing, Back-EMF Integration and PWM strategies. Also, the most relevant techniques based on estimation and models are briefly analysed, such as the Sliding-mode Observer, Extended Kalman Filter, Model Reference Adaptive System, Adaptive Observers (full-order and pseudo-reduced-order) and Artificial Neural Networks. PMID:22163582
Novel method for the rapid isolation of RPE cells specifically for RNA extraction and analysis
Wang, Cynthia Xin-Zhao; Zhang, Kaiyan; Aredo, Bogale; Lu, Hua; Ufret-Vincenty, Rafael L.
2012-01-01
RPE cells are involved in the pathogenesis of many retinal diseases. Accurate analysis of RPE gene expression profiles in different scenarios will increase our understanding of disease mechanisms. Our objective in this study was to develop an improved method for the isolation of RPE cells, specifically for RNA analysis. Mouse RPE cells were isolated using different techniques, including mechanical dissociation techniques and a new technique we refer to here as “Simultaneous RPE cell Isolation and RNA Stabilization” (SRIRS method). RNA was extracted from the RPE cells. An RNA bioanalyzer was used to determine the quantity and quality of RNA. qPCR was used to determine contamination with non-RPE-derived RNA. Several parameters with a potential impact on the isolation protocol were studied and optimized. A marked improvement in the quantity and quality of RPE-derived RNA was obtained with the SRIRS technique. We could get the RPE in direct contact with the RNA protecting agent within 1 minute of enucleation, and the RPE isolated within 11 minutes of enucleation. There was no significant contamination with vascular, choroidal or scleral-derived RNA. We have developed a fast, easy and reliable method for the isolation of RPE cells that leads to a high yield of RPE-derived RNA while preserving its quality. We believe this technique will be useful for future studies looking at gene expression profiles of RPE cells and their role in the pathophysiology of retinal diseases. PMID:22721721
NASA Astrophysics Data System (ADS)
Okawa, Tsutomu; Kaminishi, Tsukasa; Kojima, Yoshiyuki; Hirabayashi, Syuichi; Koizumi, Hisao
Business process modeling (BPM) is gaining attention as a means of analyzing and improving the business process. BPM analyzes the current business process as an AS-IS model and solves problems to improve the current business; moreover, it aims to create a business process that produces value, as a TO-BE model. However, research on techniques that seamlessly connect the business process improvements obtained through BPM to the implementation of the information system is rarely reported. If the business model obtained by BPM is converted into UML, and the implementation is carried out with UML techniques, we can expect improved efficiency in information system implementation. In this paper, we describe a method of system development that converts the process model obtained by BPM into UML; the method is evaluated by modeling a prototype of a parts procurement system. In the evaluation, a comparison is made with the case where the system is implemented by the conventional UML technique without going via BPM.
A FMEA clinical laboratory case study: how to make problems and improvements measurable.
Capunzo, Mario; Cavallo, Pierpaolo; Boccia, Giovanni; Brunetti, Luigi; Pizzuti, Sante
2004-01-01
The authors have experimented with the application of the Failure Mode and Effect Analysis (FMEA) technique in a clinical laboratory. The FMEA technique allows one: a) to evaluate and measure the hazards of a process malfunction, b) to decide where to execute improvement actions, and c) to measure the outcome of those actions. A small sample of analytes was studied: the causes of possible malfunctions of the analytical process were determined, and the risk probability index (RPI) was calculated, with a value between 1 and 1,000. Only for the cases of RPI > 400 were improvement actions implemented; these achieved a reduction of RPI values of between 25% and 70%, with a cost increment of < 1%. The FMEA technique can be applied to the processes of a clinical laboratory, even one of small dimensions, and offers a high potential for improvement. Nevertheless, such activity needs thorough planning because it is complex, even if the laboratory already operates an ISO 9000 Quality Management System.
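The RPI arithmetic described above can be illustrated in a few lines. In common FMEA practice the index is the product of three ratings on a 1-10 scale (severity, occurrence, detectability), which gives the 1 to 1,000 range the abstract mentions; the failure modes and ratings below are invented for illustration, not taken from the study:

```python
def rpi(severity, occurrence, detectability):
    """Risk probability index: product of three 1-10 ratings (range 1-1,000)."""
    for v in (severity, occurrence, detectability):
        if not 1 <= v <= 10:
            raise ValueError("each rating must be between 1 and 10")
    return severity * occurrence * detectability

# Hypothetical laboratory failure modes: (severity, occurrence, detectability)
failure_modes = {
    "mislabelled sample": (9, 7, 8),
    "reagent degradation": (6, 4, 5),
}

# Apply the paper's decision rule: act only where RPI exceeds 400.
needs_action = {name: rpi(*r) for name, r in failure_modes.items() if rpi(*r) > 400}
```

Re-scoring the same failure modes after the improvement actions, and recomputing the RPI, is what makes the improvement measurable.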
Statistical process management: An essential element of quality improvement
NASA Astrophysics Data System (ADS)
Buckner, M. R.
Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people, and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing an outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis and concurrent engineering are important elements of the systematic planning and analysis needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within the members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.
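A minimal instance of the SPC side of SPM is the Shewhart individuals chart: establish limits of mean ± 3 standard deviations from an in-control baseline, then flag any subsequent measurement outside them. This is a generic textbook sketch (not a Westinghouse procedure), using only the standard library and made-up readings:

```python
import statistics

def control_limits(baseline, k=3.0):
    """Shewhart-style limits from an in-control baseline: mean +/- k*sd."""
    mean = statistics.fmean(baseline)
    sd = statistics.stdev(baseline)
    return mean - k * sd, mean + k * sd

# Baseline measurements from a process believed to be in control
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 10.0]
lo, hi = control_limits(baseline)

# New readings: index 1 is an out-of-control excursion
new_readings = [10.1, 10.9, 9.9]
flagged = [i for i, x in enumerate(new_readings) if not lo <= x <= hi]
```

Estimating the limits from a separate baseline, rather than from the data being judged, is what lets the chart separate the process's intrinsic capability from special-cause variation.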
A Universal Tare Load Prediction Algorithm for Strain-Gage Balance Calibration Data Analysis
NASA Technical Reports Server (NTRS)
Ulbrich, N.
2011-01-01
An algorithm is discussed that may be used to estimate tare loads of wind tunnel strain-gage balance calibration data. The algorithm was originally developed by R. Galway of IAR/NRC Canada and has been described in the literature for the iterative analysis technique. Basic ideas of Galway's algorithm, however, are universally applicable and work for both the iterative and the non-iterative analysis technique. A recent modification of Galway's algorithm is presented that improves the convergence behavior of the tare load prediction process if it is used in combination with the non-iterative analysis technique. The modified algorithm allows an analyst to use an alternate method for the calculation of intermediate non-linear tare load estimates whenever Galway's original approach does not lead to a convergence of the tare load iterations. It is also shown in detail how Galway's algorithm may be applied to the non-iterative analysis technique. Hand load data from the calibration of a six-component force balance is used to illustrate the application of the original and modified tare load prediction method. During the analysis of the data both the iterative and the non-iterative analysis technique were applied. Overall, predicted tare loads for combinations of the two tare load prediction methods and the two balance data analysis techniques showed excellent agreement as long as the tare load iterations converged. The modified algorithm, however, appears to have an advantage over the original algorithm when absolute voltage measurements of gage outputs are processed using the non-iterative analysis technique. In these situations only the modified algorithm converged because it uses an exact solution of the intermediate non-linear tare load estimate for the tare load iteration.
Selectivity/Specificity Improvement Strategies in Surface-Enhanced Raman Spectroscopy Analysis
Wang, Feng; Cao, Shiyu; Yan, Ruxia; Wang, Zewei; Wang, Dan; Yang, Haifeng
2017-01-01
Surface-enhanced Raman spectroscopy (SERS) is a powerful technique for the discrimination, identification, and potential quantification of certain compounds/organisms. However, its real application is challenging due to the multiple interference from the complicated detection matrix. Therefore, selective/specific detection is crucial for the real application of SERS technique. We summarize in this review five selective/specific detection techniques (chemical reaction, antibody, aptamer, molecularly imprinted polymers and microfluidics), which can be applied for the rapid and reliable selective/specific detection when coupled with SERS technique. PMID:29160798
Development and evaluation of an automatic labeling technique for spring small grains
NASA Technical Reports Server (NTRS)
Crist, E. P.; Malila, W. A. (Principal Investigator)
1981-01-01
A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analysts provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvements and implications of the test results are discussed.
Precise terrestrial time: A means for improved ballistic missile guidance analysis
NASA Technical Reports Server (NTRS)
Ehrsam, E. E.; Cresswell, S. A.; Mckelvey, G. R.; Matthews, F. L.
1978-01-01
An approach developed to improve ground instrumentation time-tagging accuracy and adapted to support the Minuteman ICBM program is described. The Timing Insertion Unit (TIU) technique produces a telemetry data time-tagging resolution of one tenth of a microsecond, with relative intersite accuracy maintained after corrections. Position and velocity data (range, azimuth, elevation, and range rate) also used in missile guidance system analysis can be correlated to within ten microseconds of the telemetry guidance data. This requires precise timing synchronization between the metric and telemetry instrumentation sites. The timing synchronization can be achieved by using the radar automatic phasing system time correlation methods. Other time correlation techniques, such as Television (TV) Line-10 and the Geostationary Operational Environmental Satellite (GOES) terrestrial timing receivers, are also considered.
Cost collection and analysis for health economic evaluation.
Smith, Kristine A; Rudmik, Luke
2013-08-01
To improve the understanding of common health care cost collection, estimation, analysis, and reporting methodologies. Ovid MEDLINE (1947 to December 2012), Cochrane Central register of Controlled Trials, Database of Systematic Reviews, Health Technology Assessment, and National Health Service Economic Evaluation Database. This article discusses the following cost collection methods: defining relevant resources, quantification of consumed resources, and resource valuation. It outlines the recommendations for cost reporting in economic evaluations and reviews the techniques on how to handle cost data uncertainty. Last, it discusses the controversial topics of future costs and patient productivity losses. Health care cost collection and estimation can be challenging, and an organized approach is required to optimize accuracy of economic evaluation outcomes. Understanding health care cost collection and estimation techniques will improve both critical appraisal and development of future economic evaluations.
ERIC Educational Resources Information Center
Barton, Mitch; Yeatts, Paul E.; Henson, Robin K.; Martin, Scott B.
2016-01-01
There has been a recent call to improve data reporting in kinesiology journals, including the appropriate use of univariate and multivariate analysis techniques. For example, a multivariate analysis of variance (MANOVA) with univariate post hocs and a Bonferroni correction is frequently used to investigate group differences on multiple dependent…
Monzani, Lucas; Espí-López, Gemma Victoria; Zurriaga, Rosario; Andersen, Lars L
2016-04-01
The objective of this research is to evaluate the efficacy of manual therapy for tension-type headache (TTH) in restoring workers' quality of work life, and how work presenteeism affects this relation. This study is a secondary analysis of a factorial, randomized clinical trial on manual therapy interventions. Altogether, 80 patients (85% women) with TTH and without current symptoms of any other concomitant disease participated. An experienced therapist delivered the treatment: myofascial inhibitory technique (IT), articulatory technique (AT), combined technique (IT and AT), and control group (no treatment). In general, all treatments had a large effect (f ≥ .69) on the improvement of participants' quality of work life compared to the control group. Work presenteeism interacted with the efficacy of TTH treatment type on participants' quality of work life. The inhibitory technique led to higher reported quality of work life than other treatment options only for participants with a very low frequency of work presenteeism. In turn, the articulatory treatment techniques resulted in higher reported quality of work life at high to very high work presenteeism frequencies. The articulatory manipulation technique is the more efficient treatment to improve quality of work life when the frequency of work presenteeism is high. Implications for future research and practice are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Brain Jogging Training to Improve Motivation and Learning Result of Tennis Skills
NASA Astrophysics Data System (ADS)
Tafaqur, M.; Komarudin; Mulyana; Saputra, M. Y.
2017-03-01
This research aims to determine the effect of brain jogging training on the improvement of motivation and the learning of tennis skills. The method used is the experimental method. The population is 15 tennis athletes of Core Siliwangi Bandung Tennis Club; purposive sampling yielded a sample of 10 of those athletes. The design used is a pretest-posttest group design. Data were analyzed with a t-test: motivation was measured using The Sport Motivation Scale questionnaire (SMS-28), and learning results were measured with tennis skills tests comprising (1) a forehand test, (2) a backhand test, and (3) a service placement test. The results show that brain jogging significantly improved both motivation and the learning of tennis skills.
Kumar, Keshav
2018-03-01
Excitation-emission matrix fluorescence (EEMF) and total synchronous fluorescence spectroscopy (TSFS) are the two fluorescence techniques commonly used for the analysis of multifluorophoric mixtures. These two techniques are conceptually different and provide certain advantages over each other. Manually analyzing such large volumes of highly correlated EEMF and TSFS data to develop a calibration model is difficult. Partial least squares (PLS) analysis can analyze large EEMF and TSFS data sets by finding important factors that maximize the correlation between the spectral and concentration information for each fluorophore. However, applying PLS analysis to entire data sets often does not provide a robust calibration model and requires a suitable pre-processing step. The present work evaluates the application of genetic algorithm (GA) analysis prior to PLS analysis on EEMF and TSFS data sets towards improving the precision and accuracy of the calibration model. The GA essentially combines the advantages provided by stochastic methods with those provided by deterministic approaches and can find the set of EEMF and TSFS variables that correlate well with the concentration of each of the fluorophores present in the multifluorophoric mixtures. The utility of GA-assisted PLS analysis is successfully validated using (i) EEMF data sets acquired for dilute aqueous mixtures of four biomolecules and (ii) TSFS data sets acquired for dilute aqueous mixtures of four carcinogenic polycyclic aromatic hydrocarbons (PAHs). It is shown that the GA makes it possible to significantly improve the accuracy and precision of the PLS calibration models developed for both the EEMF and TSFS data sets. Hence, the GA should be considered a useful pre-processing technique when developing EEMF and TSFS calibration models.
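The GA-plus-regression idea can be sketched generically: encode each candidate variable subset as a bitmask, score it with a regression fit plus a parsimony penalty, and evolve the population by selection, crossover, and mutation. The toy below substitutes ordinary least squares for PLS and uses synthetic noise-free data in which only the first three of ten variables are informative; it illustrates the pre-processing strategy, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
n_samples, n_vars = 60, 10
X = rng.standard_normal((n_samples, n_vars))
y = X[:, 0] + 2 * X[:, 1] - X[:, 2]      # only variables 0-2 carry signal

def fitness(mask):
    """RMSE of an OLS fit on the selected variables, plus a parsimony penalty."""
    if not mask.any():
        return np.inf
    coef, *_ = np.linalg.lstsq(X[:, mask], y, rcond=None)
    rmse = np.sqrt(np.mean((X[:, mask] @ coef - y) ** 2))
    return rmse + 0.01 * mask.sum()       # prefer smaller variable sets

pop = rng.random((30, n_vars)) < 0.5      # random initial bitmasks
for _ in range(40):
    scores = np.array([fitness(m) for m in pop])
    pop = pop[np.argsort(scores)]         # elitist: keep the better half
    for i in range(15, 30):               # rebuild the worse half
        a, b = pop[rng.integers(15)], pop[rng.integers(15)]
        cut = rng.integers(1, n_vars)
        child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
        pop[i] = child ^ (rng.random(n_vars) < 0.05)  # bit-flip mutation

best = pop[0]
```

The parsimony penalty plays the role that PLS factor selection and cross-validation play in the real workflow: it steers the GA toward the small, well-correlated variable subset rather than any subset that merely fits.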
Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.
Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio
2017-01-01
We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (IM) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process a number of boosting-technique-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique that offered results from 33.31% to 40.82% without sampling window and from 33.44% to 41.07% with sampling window, respectively. We can thus conclude that iQSA is better suited to develop real-time applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hettiarachchi, Ganga M.; Donner, Erica; Doelsch, Emmanuel
To understand the biogeochemistry of nutrients and contaminants in environmental media, their speciation and behavior under different conditions and at multiple scales must be determined. Synchrotron radiation-based X-ray techniques allow scientists to elucidate the underlying mechanisms responsible for nutrient and contaminant mobility, bioavailability, and behavior. The continuous improvement of synchrotron light sources and X-ray beamlines around the world has led to a profound transformation in the field of environmental biogeochemistry and, subsequently, to significant scientific breakthroughs. Following this introductory paper, this special collection includes 10 papers that either present targeted reviews of recent advancements in spectroscopic methods that are applicable to environmental biogeochemistry or describe original research studies conducted on complex environmental samples that have been significantly enhanced by incorporating synchrotron radiation-based X-ray technique(s). We believe that the current focus on improving the speciation of ultra-dilute elements in environmental media through the ongoing optimization of synchrotron technologies (e.g., brighter light sources, improved monochromators, more efficient detectors) will help to significantly push back the frontiers of environmental biogeochemistry research. As many of the relevant techniques produce extremely large datasets, we also identify ongoing improvements in data processing and analysis (e.g., software improvements and harmonization of analytical methods) as a significant requirement for environmental biogeochemists to maximize the information that can be gained using these powerful tools.
Wille, M-L; Zapf, M; Ruiter, N V; Gemmeke, H; Langton, C M
2015-06-21
The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals, making it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active set deconvolution to derive a transit time spectrum from a coded excitation chirp signal and the measured output signal. The ultrasound wave travels in a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering has better accuracy (standard deviations of 0.13 μs versus 0.18 μs), deconvolution has a 3.5 times improved side-lobe to main-lobe ratio. Higher side-lobe suppression is important to further improve image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.
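The matched-filtering step can be illustrated with a synthetic version of the problem: a chirp excitation arrives over a direct and a reflected path, and cross-correlating the received signal with the known chirp compresses each overlapping copy into a peak at its transit time. An idealized, noise-free numpy sketch; the sampling rate, chirp parameters, delays, and reflection amplitude are invented, not the paper's setup:

```python
import numpy as np

fs = 10e6                                   # 10 MHz sampling rate (assumed)
t = np.arange(200) / fs                     # 20 us excitation window
f0, f1 = 0.5e6, 2.5e6                       # linear chirp, 0.5 -> 2.5 MHz
chirp = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * t[-1]) * t**2))

# Received signal: direct path delayed 50 samples, reflection delayed 80,
# so the two chirps overlap in time.
rx = np.zeros(400)
rx[50:250] += chirp
rx[80:280] += 0.6 * chirp

mf = np.correlate(rx, chirp, mode="valid")  # matched filter = cross-correlation
p1 = int(np.argmax(mf))                     # strongest arrival
win = mf.copy()
win[max(0, p1 - 10):p1 + 11] = -np.inf      # suppress the first main lobe
p2 = int(np.argmax(win))                    # second arrival
transit_samples = sorted((p1, p2))          # expected near samples 50 and 80
```

The residual side lobes around each compressed peak are exactly what the deconvolution technique suppresses better, at some cost in peak-position accuracy.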
Capacitor Technologies, Applications and Reliability
NASA Technical Reports Server (NTRS)
1981-01-01
Various aspects of capacitor technologies and applications are discussed. Major emphasis is placed on: the causes of failures; accelerated testing; screening tests; destructive physical analysis; applications techniques; and improvements in capacitor capabilities.
Cone Analysis of Southern Pines - A Guidebook
D.L. Bramlett; E.W. Belcher; G.L. DeBarr; G.D. Hertel; Robert P. Karrfalt; C.W. Lantz; T. Miller; K.D. Ware; H.O. Yates
1977-01-01
Southern pine tree improvement programs require an ample supply of improved seeds, but production from southern pine seed orchards has often been disappointing. If high production is to be maintained, yields must be monitored and causes of seed losses must be identified. Techniques for determining seed efficiency were first used for red pine, Pinus resinosa...
Methods to prioritize placement of riparian buffers for improved water quality
Mark D. Tomer; Michael G. Dosskey; Michael R. Burkart; David E. James; Matthew J. Helmers; Dean E. Eisenhauer
2008-01-01
Agroforestry buffers in riparian zones can improve stream water quality, provided they intercept and remove contaminants from surface runoff and/or shallow groundwater. Soils, topography, surficial geology, and hydrology determine the capability of forest buffers to intercept and treat these flows. This paper describes two landscape analysis techniques for identifying...
Quantitative analysis on electrooculography (EOG) for neurodegenerative disease
NASA Astrophysics Data System (ADS)
Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.
2007-11-01
Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may play an important role in measuring the progress of neurodegenerative diseases and the state of alertness in healthy individuals. There are several techniques for measuring eye movement: infrared detection (IR), video-oculography (VOG), scleral eye coil, and EOG. Among these recording techniques, EOG is a major source for monitoring abnormal eye movements. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement are proposed to accurately categorize the state of neurodegenerative subjects. The EOG recordings were taken while 5 subjects watched a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including smooth vertical pursuit (SVP), smooth horizontal pursuit (HVP) and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of pathology in these disorders.
Tunable laser techniques for improving the precision of observational astronomy
NASA Astrophysics Data System (ADS)
Cramer, Claire E.; Brown, Steven W.; Lykke, Keith R.; Woodward, John T.; Bailey, Stephen; Schlegel, David J.; Bolton, Adam S.; Brownstein, Joel; Doherty, Peter E.; Stubbs, Christopher W.; Vaz, Amali; Szentgyorgyi, Andrew
2012-09-01
Improving the precision of observational astronomy requires not only new telescopes and instrumentation, but also advances in observing protocols, calibrations and data analysis. The Laser Applications Group at the National Institute of Standards and Technology in Gaithersburg, Maryland has been applying advances in detector metrology and tunable laser calibrations to problems in astronomy since 2007. Using similar measurement techniques, we have addressed a number of seemingly disparate issues: precision flux calibration for broad-band imaging, precision wavelength calibration for high-resolution spectroscopy, and precision PSF mapping for fiber spectrographs of any resolution. In each case, we rely on robust, commercially-available laboratory technology that is readily adapted to use at an observatory. In this paper, we give an overview of these techniques.
Ultrasonic non invasive techniques for microbiological instrumentation
NASA Astrophysics Data System (ADS)
Elvira, L.; Sierra, C.; Galán, B.; Resa, P.
2010-01-01
Non-invasive techniques based on ultrasound have advantageous features for studying, characterizing and monitoring microbiological and enzymatic reactions. These processes may change the sound speed, viscosity or particle size distribution of the medium in which they take place, which makes their analysis possible using ultrasonic techniques. In this work, two different ultrasound-based systems for the analysis of microbiological liquid media are presented. First, an industrial application based on an ultrasonic monitoring technique for detecting microbiological growth in milk is shown. Such a system may improve quality control strategies in food production factories by decreasing the time required to detect possible contamination in packed products. Second, a study of the growth of Escherichia coli DH5α under different conditions is presented. It is shown that the use of ultrasonic non-invasive characterization techniques, in combination with other conventional measurements such as optical density, provides complementary information about the metabolism of these bacteria.
Dipeptide Sequence Determination: Analyzing Phenylthiohydantoin Amino Acids by HPLC
NASA Astrophysics Data System (ADS)
Barton, Janice S.; Tang, Chung-Fei; Reed, Steven S.
2000-02-01
Amino acid composition and sequence determination, important techniques for characterizing peptides and proteins, are essential for predicting conformation and studying sequence alignment. This experiment presents improved, fundamental methods of sequence analysis for an upper-division biochemistry laboratory. Working in pairs, students use the Edman reagent to prepare phenylthiohydantoin derivatives of amino acids for determination of the sequence of an unknown dipeptide. With a single HPLC technique, students identify both the N-terminal amino acid and the composition of the dipeptide. This method yields good precision of retention times and allows use of a broad range of amino acids as components of the dipeptide. Students learn fundamental principles and techniques of sequence analysis and HPLC.
DOT National Transportation Integrated Search
2014-03-01
Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) and SafetyAnalys...
The Effect of Literature Circles on Text Analysis and Reading Desire
ERIC Educational Resources Information Center
Karatay, Halit
2017-01-01
In order to make teaching activities more appealing, different techniques and strategies have been constantly employed. This study utilized the strategy of "literature circles" to improve the text-analysis skills, reading desires, and interests of prospective teachers of Turkish. "Literature circles" was not chosen to be used…
USDA-ARS?s Scientific Manuscript database
As sample preparation and analytical techniques have improved, data handling has become the main limitation in automated high-throughput analysis of targeted chemicals in many applications. Conventional chromatographic peak integration functions rely on complex software and settings, but untrustwor...
2013-02-11
calibration curves was ±5%. Ion chromatography (IC) was used for analysis of perchlorate and other ionic targets. Analysis was carried out on a... The methods utilize liquid or gas chromatography, techniques that do not lend themselves well to portable devices and methods. Portable methods are...
Analysis in Motion Initiative – Summarization Capability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Dustin; Pirrung, Meg; Jasper, Rob
2017-06-22
Analysts are tasked with integrating information from multiple data sources for important and timely decision making. What if sense making and overall situation awareness could be improved through visualization techniques? The Analysis in Motion initiative is advancing the ability to summarize and abstract multiple streams and static data sources over time.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version.
Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries.
The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches, [10, 12], as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
Scientific analysis of satellite ranging data
NASA Technical Reports Server (NTRS)
Smith, David E.
1994-01-01
A network of satellite laser ranging (SLR) tracking systems with continuously improving accuracies is challenging the modelling capabilities of analysts worldwide. Various data analysis techniques have yielded many advances in the development of orbit, instrument and Earth models. The direct measurement of the distance to the satellite provided by the laser ranges has given us a simple metric which links the results obtained by diverse approaches. Different groups have used SLR data, often in combination with observations from other space geodetic techniques, to improve models of the static geopotential, the solid Earth, ocean tides, and atmospheric drag models for low Earth satellites. Radiation pressure models and other non-conservative forces for satellite orbits above the atmosphere have been developed to exploit the full accuracy of the latest SLR instruments. SLR is the baseline tracking system for the altimeter missions TOPEX/Poseidon, and ERS-1 and will play an important role in providing the reference frame for locating the geocentric position of the ocean surface, in providing an unchanging range standard for altimeter calibration, and for improving the geoid models to separate gravitational from ocean circulation signals seen in the sea surface. However, even with the many improvements in the models used to support the orbital analysis of laser observations, there remain systematic effects which limit the full exploitation of SLR accuracy today.
Shi, Chaoyang; Kojima, Masahiro; Tercero, Carlos; Najdovski, Zoran; Ikeda, Seiichi; Fukuda, Toshio; Arai, Fumihito; Negoro, Makoto
2014-12-01
There are several complications associated with Stent-assisted Coil Embolization (SACE) in cerebral aneurysm treatments, due to damaging operations by surgeons and undesirable mechanical properties of stents. Therefore, it is necessary to develop an in vitro simulator that provides both training and research for evaluating the mechanical properties of stents. A new in vitro simulator for three-dimensional digital subtraction angiography was constructed, followed by aneurysm models fabricated with new materials. Next, this platform was used to provide training and to conduct photoelastic stress analysis to evaluate the SACE technique. The average interaction stress increasingly varied for the two different stents. Improvements for the Maximum-Likelihood Expectation-Maximization method were developed to reconstruct cross-sections with both thickness and stress information. The technique presented can improve a surgeon's skills and quantify the performance of stents to improve mechanical design and classification. This method can contribute to three-dimensional stress and volume variation evaluation and assess a surgeon's skills. Copyright © 2013 John Wiley & Sons, Ltd.
Iglesias, Juan Eugenio; Sabuncu, Mert Rory; Van Leemput, Koen
2013-10-01
Many segmentation algorithms in medical image analysis use Bayesian modeling to augment local image appearance with prior anatomical knowledge. Such methods often contain a large number of free parameters that are first estimated and then kept fixed during the actual segmentation process. However, a faithful Bayesian analysis would marginalize over such parameters, accounting for their uncertainty by considering all possible values they may take. Here we propose to incorporate this uncertainty into Bayesian segmentation methods in order to improve the inference process. In particular, we approximate the required marginalization over model parameters using computationally efficient Markov chain Monte Carlo techniques. We illustrate the proposed approach using a recently developed Bayesian method for the segmentation of hippocampal subfields in brain MRI scans, showing a significant improvement in an Alzheimer's disease classification task. As an additional benefit, the technique also allows one to compute informative "error bars" on the volume estimates of individual structures. Copyright © 2013 Elsevier B.V. All rights reserved.
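The core idea, drawing MCMC samples of a model parameter from its posterior and averaging a downstream quantity over the samples instead of fixing the parameter at a point estimate, can be illustrated with a toy model (a Gaussian likelihood with an unknown mean; the paper's segmentation model is far richer, and all numbers here are invented):

```python
# Toy sketch of marginalizing over a model parameter with MCMC: a random-walk
# Metropolis sampler draws `theta` from its posterior, and we average over the
# samples rather than committing to a single point estimate.
import math, random

random.seed(0)
data = [1.8, 2.1, 2.4, 1.9, 2.2]            # hypothetical observations
def log_post(theta):                         # Gaussian likelihood, flat prior
    return -0.5 * sum((x - theta) ** 2 for x in data)

theta, samples = 0.0, []
for step in range(5000):
    prop = theta + random.gauss(0.0, 0.5)    # random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                         # accept; otherwise keep theta
    if step >= 1000:                         # discard burn-in
        samples.append(theta)

# Marginalized estimate: posterior mean of theta over the retained samples
post_mean = sum(samples) / len(samples)
```

In the segmentation setting, each sampled parameter value would yield a segmentation, and averaging over them both improves inference and yields the "error bars" the abstract mentions.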
Improved nonlinear prediction method
NASA Astrophysics Data System (ADS)
Adenan, Nur Hamiza; Md Noorani, Mohd Salmi
2014-06-01
The analysis and prediction of time series data have been widely addressed by researchers. Many techniques have been developed for application in various areas, such as weather forecasting, financial markets and hydrological phenomena, involving data that are contaminated by noise. Accordingly, various techniques have been introduced to improve the analysis and prediction of time series data. Given the importance of analysis and the accuracy of prediction, a study was undertaken to test the effectiveness of the improved nonlinear prediction method on data that contain noise. The improved nonlinear prediction method involves the formation of composite serial data based on the successive differences of the time series. Phase space reconstruction is then performed on the (one-dimensional) composite data to reconstruct a number of space dimensions. Finally, the local linear approximation method is employed to make a prediction based on the phase space. This improved method was tested with logistic map data series containing 0%, 5%, 10%, 20% and 30% noise. The results show that with the improved method, the predictions were in close agreement with the observed values. The correlation coefficient was close to one when the improved method was applied to data with up to 10% noise. Thus, a method to analyze and predict noisy time series data without any separate noise reduction step was introduced.
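The phase-space reconstruction and local prediction steps can be sketched on noiseless logistic map data. This is an illustrative simplification (nearest-neighbour prediction rather than the paper's composite-series and local linear approximation steps, with hypothetical embedding parameters):

```python
# Sketch of delay-embedding prediction on logistic map data: embed the series
# in delay vectors, find the past vector nearest to the current state, and
# predict its successor.

def logistic_series(n, x0=0.4, r=4.0):
    """Generate n points of the logistic map x_{k+1} = r x_k (1 - x_k)."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def predict_next(series, dim=2):
    """Predict the next value from the successor of the nearest past
    delay vector (a crude local approximation)."""
    target = series[-dim:]
    best_i, best_d = None, float("inf")
    for i in range(len(series) - dim):           # candidate delay vectors
        d = sum((series[i + j] - target[j]) ** 2 for j in range(dim))
        if d < best_d:
            best_i, best_d = i, d
    return series[best_i + dim]                  # successor of best match

xs = logistic_series(2000)
pred = predict_next(xs[:-1])                     # predict the held-out point
err = abs(pred - xs[-1])                         # small on this chaotic map
```

A local linear fit over several neighbours, as in the paper, typically improves on this single-neighbour prediction, especially for noisy data.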
NASA Technical Reports Server (NTRS)
1975-01-01
A system analysis of the shuttle orbiter baseline system management (SM) computer function is performed. This analysis results in an alternative SM design which is also described. The alternative design exhibits several improvements over the baseline, some of which are increased crew usability, improved flexibility, and improved growth potential. The analysis consists of two parts: an application assessment and an implementation assessment. The former is concerned with the SM user needs and design functional aspects. The latter is concerned with design flexibility, reliability, growth potential, and technical risk. The system analysis is supported by several topical investigations. These include: treatment of false alarms, treatment of off-line items, significant interface parameters, and a design evaluation checklist. An in-depth formulation of techniques, concepts, and guidelines for design of automated performance verification is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Cory Thomas
2008-01-01
The focus of this dissertation is the development of techniques with which to enhance the existing abilities of inductively coupled plasma mass spectrometry (ICP-MS). ICP-MS is a powerful technique for trace metal analysis in samples of many types, but like any technique it has certain strengths and weaknesses. Attempts are made to improve upon those strengths and to overcome certain weaknesses.
Unknown sequence amplification: Application to in vitro genome walking in Chlamydia trachomatis L2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copley, C.G.; Boot, C.; Bundell, K.
1991-01-01
A recently described technique, Chemical Genetics' unknown sequence amplification method, which requires only one specific oligonucleotide, has broadened the applicability of the polymerase chain reaction to DNA of unknown sequence. The authors have adapted this technique to the study of the genome of Chlamydia trachomatis, an obligate intracellular bacterium, and describe modifications that significantly improve the utility of this approach. These techniques allow for rapid genomic analysis entirely in vitro, using DNA of limited quantity or purity.
Spacecraft Charging Calculations: NASCAP-2K and SEE Spacecraft Charging Handbook
NASA Technical Reports Server (NTRS)
Davis, V. A.; Neergaard, L. F.; Mandell, M. J.; Katz, I.; Gardner, B. M.; Hilton, J. M.; Minor, J.
2002-01-01
For fifteen years, the NASA and Air Force Charging Analyzer Program for Geosynchronous Orbits (NASCAP/GEO) has been the workhorse of spacecraft charging calculations. Two new tools, the Space Environment and Effects (SEE) Spacecraft Charging Handbook (recently released) and Nascap-2K (under development), use improved numeric techniques and modern user interfaces to tackle the same problem. The SEE Spacecraft Charging Handbook provides first-order, lower-resolution solutions, while Nascap-2K provides higher-resolution results appropriate for detailed analysis. This paper illustrates how the improvements in the numeric techniques affect the results.
NASA Technical Reports Server (NTRS)
Capo, M. A.; Disney, R. K.
1971-01-01
The work performed in the following areas is summarized: (1) A realistic nuclear-propelled vehicle was analyzed using the Marshall Space Flight Center computer code package. This code package includes one- and two-dimensional discrete ordinate transport, point kernel, and single scatter techniques, as well as cross section preparation and data processing codes. (2) Techniques were developed to improve the automated data transfer in the coupled computation method of the computer code package and to improve the utilization of this code package on the Univac-1108 computer system. (3) The MSFC master data libraries were updated.
Uranium Detection - Technique Validation Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colletti, Lisa Michelle; Garduno, Katherine; Lujan, Elmer J.
As a LANL activity for DOE/NNSA in support of SHINE Medical Technologies™ 'Accelerator Technology', we have been investigating the application of UV-vis spectroscopy for uranium analysis in solution. While the technique has been developed specifically for sulfate solutions, the proposed SHINE target solutions, it can be adapted to a range of different solution matrices. The FY15 work scope incorporated technical development to improve accuracy, specificity, linearity and range, precision and ruggedness, and comparative analysis. Significant progress was achieved throughout FY15 in addressing these technical challenges, as summarized in this report. In addition, comparative analysis of unknown samples using the Davies-Gray titration technique highlighted the importance of controlling temperature during analysis (which impacts both technique accuracy and linearity/range). To fully understand the impact of temperature, additional experimentation and data analyses were performed during FY16. The results from this FY15/FY16 work were presented in a detailed presentation, LA-UR-16-21310, and an update of that presentation is included with this short report summarizing the key findings. The technique is based on analysis of the most intense U(VI) absorbance band in the visible region of the uranium spectrum in 1 M H2SO4, at λmax = 419.5 nm.
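The quantification principle behind such a UV-vis method is the Beer-Lambert law, A = ε·l·c, applied at the fixed analytical wavelength (the abstract cites the U(VI) band at 419.5 nm). A minimal sketch, with a purely hypothetical absorptivity value:

```python
# Beer-Lambert quantification sketch: A = epsilon * l * c  ->  c = A/(eps*l).
# The absorptivity below is a made-up illustration, not a measured value.

def concentration_from_absorbance(A, epsilon, path_cm=1.0):
    """Return concentration (mol/L) from absorbance A, molar absorptivity
    epsilon (L mol^-1 cm^-1), and optical path length (cm)."""
    return A / (epsilon * path_cm)

eps = 8.0            # hypothetical absorptivity at 419.5 nm
A = 0.40             # measured absorbance
c = concentration_from_absorbance(A, eps)
print(c)             # 0.05 mol/L
```

In practice, ε·l is absorbed into an empirical calibration slope fitted from standards, which is where the temperature sensitivity discussed in the report enters.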
Space crew radiation exposure analysis system based on a commercial stand-alone CAD system
NASA Technical Reports Server (NTRS)
Appleby, Matthew H.; Golightly, Michael J.; Hardy, Alva C.
1992-01-01
Major improvements have recently been completed in the approach to spacecraft shielding analysis. A Computer-Aided Design (CAD)-based system has been developed for determining the shielding provided to any point within or external to the spacecraft. Shielding analysis is performed using a commercially available stand-alone CAD system and a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design projects such as a Mars transfer habitat, pressurized lunar rover, and the redesigned Space Station. Results of these analyses are provided to demonstrate the applicability and versatility of the system.
Directed Incremental Symbolic Execution
NASA Technical Reports Server (NTRS)
Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz
2011-01-01
The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences, and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case-study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.
A Meta-Analytic Review of Stand-Alone Interventions to Improve Body Image
Alleva, Jessica M.; Sheeran, Paschal; Webb, Thomas L.; Martijn, Carolien; Miles, Eleanor
2015-01-01
Objective Numerous stand-alone interventions to improve body image have been developed. The present review used meta-analysis to estimate the effectiveness of such interventions, and to identify the specific change techniques that lead to improvement in body image. Methods The inclusion criteria were that (a) the intervention was stand-alone (i.e., solely focused on improving body image), (b) a control group was used, (c) participants were randomly assigned to conditions, and (d) at least one pretest and one posttest measure of body image was taken. Effect sizes were meta-analysed and moderator analyses were conducted. A taxonomy of 48 change techniques used in interventions targeted at body image was developed; all interventions were coded using this taxonomy. Results The literature search identified 62 tests of interventions (N = 3,846). Interventions produced a small-to-medium improvement in body image (d+ = 0.38), a small-to-medium reduction in beauty ideal internalisation (d+ = -0.37), and a large reduction in social comparison tendencies (d+ = -0.72). However, the effect size for body image was inflated by bias both within and across studies, and was reliable but of small magnitude once corrections for bias were applied. Effect sizes for the other outcomes were no longer reliable once corrections for bias were applied. Several features of the sample, intervention, and methodology moderated intervention effects. Twelve change techniques were associated with improvements in body image, and three techniques were contra-indicated. Conclusions The findings show that interventions engender only small improvements in body image, and underline the need for large-scale, high-quality trials in this area. The review identifies effective techniques that could be deployed in future interventions. PMID:26418470
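The pooled d+ values above come from weighting study-level effect sizes by their precision. A minimal fixed-effect inverse-variance sketch (the review's actual model, data, and bias corrections are not reproduced; the numbers below are invented):

```python
# Inverse-variance pooling sketch: each study contributes an effect size d
# with variance v, weighted by 1/v in the pooled estimate.

def pooled_effect(effects, variances):
    """Fixed-effect meta-analytic pooled estimate."""
    weights = [1.0 / v for v in variances]
    return sum(w * d for w, d in zip(weights, effects)) / sum(weights)

# Hypothetical study-level effects (Cohen's d) and sampling variances
ds = [0.45, 0.30, 0.52, 0.21]
vs = [0.04, 0.02, 0.05, 0.03]
d_pool = pooled_effect(ds, vs)
```

A random-effects model, which adds a between-study variance component to each weight, is the more common choice when interventions differ as widely as in this review.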
Short-Arc Analysis of Intersatellite Tracking Data in a Gravity Mapping Mission
NASA Technical Reports Server (NTRS)
Rowlands, David D.; Ray, Richard D.; Chinn, Douglas S.; Lemoine, Frank G.; Smith, David E. (Technical Monitor)
2001-01-01
A technique for the analysis of low-low intersatellite range-rate data in a gravity mapping mission is explored. The technique is based on standard tracking data analysis for orbit determination but uses a spherical coordinate representation of the 12 epoch state parameters describing the baseline between the two satellites. This representation of the state parameters is exploited to allow the intersatellite range-rate analysis to benefit from information provided by other tracking data types without large simultaneous multiple data type solutions. The technique appears especially valuable for estimating gravity from short arcs (e.g., less than 15 minutes) of data. Gravity recovery simulations which use short arcs are compared with those using arcs a day in length. For a high-inclination orbit, the short-arc analysis recovers low-order gravity coefficients remarkably well, although higher order terms, especially sectorial terms, are less accurate. Simulations suggest that either long or short arcs of GRACE data are likely to improve parts of the geopotential spectrum by orders of magnitude.
Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis
ERIC Educational Resources Information Center
Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.
2013-01-01
Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…
Chromatographic Techniques for Rare Earth Elements Analysis
NASA Astrophysics Data System (ADS)
Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin
2017-04-01
Present capability in rare earth element (REE) analysis has been achieved through the development of two instrumental techniques. The efficiency of spectroscopic methods has been extraordinarily improved for the detection and determination of trace REEs in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary and mobile phases can be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.
Modelling, design and stability analysis of an improved SEPIC converter for renewable energy systems
NASA Astrophysics Data System (ADS)
G, Dileep; Singh, S. N.; Singh, G. K.
2017-09-01
In this paper, a detailed modelling and analysis of a switched-inductor (SI)-based improved single-ended primary inductor converter (SEPIC) is presented. To increase the gain of the conventional SEPIC converter, the input and output side inductors are replaced with SI structures. Design and stability analysis for continuous conduction mode operation of the proposed SI-SEPIC converter are also presented. The state-space averaging technique is used to model the converter and carry out the stability analysis. Performance and stability of the closed-loop configuration are predicted by observing the open-loop behaviour using the Nyquist diagram and Nichols chart. The system was found to be stable and critically damped.
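The state-space averaging step can be sketched generically: the two switched-network matrices are duty-cycle-weighted, and the DC operating point is solved from the averaged model. This is a toy numpy sketch under idealized CCM assumptions, not the paper's full SI-SEPIC model; the `sepic_dc_gain` expression is the textbook ideal-CCM gain of a conventional SEPIC (the SI variant raises this gain, with the exact expression depending on the SI cell used):

```python
import numpy as np

def averaged_equilibrium(A_on, B_on, A_off, B_off, D, Vin):
    """State-space averaging: weight the two switched-state models by the
    duty cycle D, then solve A_avg x + B_avg Vin = 0 for the DC state."""
    A_avg = D * np.asarray(A_on) + (1.0 - D) * np.asarray(A_off)
    B_avg = D * np.asarray(B_on) + (1.0 - D) * np.asarray(B_off)
    return np.linalg.solve(A_avg, -B_avg * Vin)

def sepic_dc_gain(D):
    """Ideal-CCM DC voltage gain of a conventional SEPIC: Vout/Vin = D/(1-D)."""
    return D / (1.0 - D)
```

From the averaged small-signal model one can then form the open-loop transfer function and check stability on the Nyquist diagram or Nichols chart, as the paper does.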
Liao, Shu-Hsien; Chen, Ming-Jye; Yang, Hong-Chang; Lee, Shin-Yi; Chen, Hsin-Hsien; Horng, Herng-Er; Yang, Shieh-Yueh
2010-10-01
In this paper, instrumentation for Earth's-field nuclear magnetic resonance (EFNMR) inside a laboratory is presented. A lock-in analysis (LIA) technique was proposed to enhance the signal-to-noise ratio (SNR). An SNR of 137.8 was achieved in a single measurement for 9 ml of tap water, and the LIA technique significantly enhanced the SNR to 188 after averaging 10 measurements in a noisy laboratory environment. The proton-phosphorus coupling in trimethyl phosphate ((CH(3)O)(3)PO) with J-coupling J[H,P] = (10.99±0.013) Hz has been demonstrated. The LIA technique improves the SNR, and a 2.6-fold improvement in SNR over that of frequency-adjusted averaging is achieved. To reduce the noise in EFNMR, it was suggested that the LIA technique and a first-order gradient shim be used to achieve a subhertz linewidth.
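The lock-in principle used to pull a weak resonance signal out of laboratory noise can be illustrated digitally. This is a generic sketch, not the authors' instrument: the signal is demodulated with quadrature references at the expected frequency, and averaging acts as the low-pass filter that rejects components at all other frequencies:

```python
import numpy as np

def lock_in_amplitude(signal, fs, f_ref):
    """Digital lock-in detection: multiply by quadrature references at
    f_ref and average, recovering the amplitude of that component only."""
    t = np.arange(len(signal)) / fs
    x = np.mean(signal * np.cos(2 * np.pi * f_ref * t))  # in-phase channel
    y = np.mean(signal * np.sin(2 * np.pi * f_ref * t))  # quadrature channel
    return 2.0 * np.hypot(x, y)                          # recovered amplitude
```

Because interference at other frequencies averages toward zero over the integration window, the SNR improves relative to plain averaging, which is the effect the abstract quantifies.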
Varela, P; Silva, A; da Silva, F; da Graça, S; Manso, M E; Conway, G D
2010-10-01
The spectrogram is one of the best-known time-frequency distributions suitable to analyze signals whose energy varies both in time and frequency. In reflectometry, it has been used to obtain the frequency content of FM-CW signals for density profile inversion and also to study plasma density fluctuations from swept and fixed frequency data. Being implemented via the short-time Fourier transform, the spectrogram is limited in resolution, and for that reason several methods have been developed to overcome this problem. Among those, we focus on the reassigned spectrogram technique that is both easily automated and computationally efficient requiring only the calculation of two additional spectrograms. In each time-frequency window, the technique reallocates the spectrogram coordinates to the region that most contributes to the signal energy. The application to ASDEX Upgrade reflectometry data results in better energy concentration and improved localization of the spectral content of the reflected signals. When combined with the automatic (data driven) window length spectrogram, this technique provides improved profile accuracy, in particular, in regions where frequency content varies most rapidly such as the edge pedestal shoulder.
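The reassignment step described above, requiring only two additional spectrograms (one with a time-weighted window and one with the window's derivative), can be sketched with numpy. This is an illustrative implementation of the classical Auger-Flandrin reassignment method, not the ASDEX Upgrade analysis code:

```python
import numpy as np

def reassigned_spectrogram(x, fs, nwin=256, hop=64):
    """Compute a spectrogram plus reassigned time/frequency coordinates.
    Three STFTs are taken: with the window h, with t*h, and with dh/dt;
    their ratios move each bin toward the local centre of signal energy."""
    n = np.arange(nwin)
    h = np.hanning(nwin)                    # analysis window
    t_h = (n - nwin / 2) / fs * h           # time-weighted window
    dh = np.gradient(h) * fs                # window derivative (per second)
    S, St, Sd, times = [], [], [], []
    for i in range(0, len(x) - nwin + 1, hop):
        seg = x[i:i + nwin]
        S.append(np.fft.rfft(seg * h))
        St.append(np.fft.rfft(seg * t_h))
        Sd.append(np.fft.rfft(seg * dh))
        times.append((i + nwin / 2) / fs)   # window-centre time
    S, St, Sd = np.array(S), np.array(St), np.array(Sd)
    freqs = np.fft.rfftfreq(nwin, 1 / fs)
    eps = 1e-12
    power = np.abs(S) ** 2
    # reassigned coordinates (Auger-Flandrin ratios)
    t_hat = np.array(times)[:, None] + np.real(St * np.conj(S)) / (power + eps)
    f_hat = freqs[None, :] - np.imag(Sd * np.conj(S)) / (power + eps) / (2 * np.pi)
    return t_hat, f_hat, power
```

For a tone that falls between FFT bins, the reassigned frequency at the peak bin lands on the true frequency rather than the bin centre, which is the improved localization the abstract reports.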
Wavelet processing techniques for digital mammography
NASA Astrophysics Data System (ADS)
Laine, Andrew F.; Song, Shuwu
1992-09-01
This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale-space feature analysis. Similar to traditional coarse-to-fine matching strategies, the radiologist may first choose to look for coarse features (e.g., dominant mass) within low-frequency levels of a wavelet transform and later examine finer features (e.g., microcalcifications) at higher frequency levels. In addition, features may be extracted by applying geometric constraints within each level of the transform. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet representations, enhanced by linear, exponential and constant weight functions through scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improve quality) while requiring less time to evaluate mammograms for most patients (lower costs).
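The enhance-through-scale-space idea can be illustrated in one dimension with a Haar wavelet. This is a toy numpy sketch under simplified assumptions (1D signal of power-of-two length, plain Haar basis); the paper works on 2D mammograms with more sophisticated analyzing functions. The principle is the same: decompose, multiply each detail level by a gain, and reconstruct:

```python
import numpy as np

def haar_enhance(x, gains):
    """Multilevel Haar decomposition of a length-2^k signal, per-level
    weighting of the detail coefficients, and exact reconstruction.
    gains[0] scales the finest detail level, gains[-1] the coarsest."""
    a, details = np.asarray(x, float), []
    for _ in range(len(gains)):
        even, odd = a[0::2], a[1::2]
        details.append((even - odd) / np.sqrt(2))   # detail (fine -> coarse)
        a = (even + odd) / np.sqrt(2)               # approximation
    for g, d in zip(reversed(gains), reversed(details)):
        # inverse step with the detail band scaled by its gain
        up = np.empty(2 * len(a))
        up[0::2] = (a + g * d) / np.sqrt(2)
        up[1::2] = (a - g * d) / np.sqrt(2)
        a = up
    return a
```

With all gains equal to 1 the reconstruction is exact; raising the gain of a fine level amplifies small-scale structure (the microcalcification analogue), while raising a coarse level emphasizes large-scale features.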
NASA Astrophysics Data System (ADS)
Paek, Seung Weon; Kang, Jae Hyun; Ha, Naya; Kim, Byung-Moo; Jang, Dae-Hyun; Jeon, Junsu; Kim, DaeWook; Chung, Kun Young; Yu, Sung-eun; Park, Joo Hyun; Bae, SangMin; Song, DongSup; Noh, WooYoung; Kim, YoungDuck; Song, HyunSeok; Choi, HungBok; Kim, Kee Sup; Choi, Kyu-Myung; Choi, Woonhyuk; Jeon, JoongWon; Lee, JinWoo; Kim, Ki-Su; Park, SeongHo; Chung, No-Young; Lee, KangDuck; Hong, YoungKi; Kim, BongSeok
2012-03-01
A set of design for manufacturing (DFM) techniques has been developed and applied to 45nm, 32nm and 28nm logic process technologies. A novel methodology combined a number of potentially conflicting DFM techniques into a comprehensive solution. These techniques work in three phases for design optimization and one phase for silicon diagnostics. In the DFM prevention phase, foundation IP such as standard cells, IO, and memory, and the P&R tech file are optimized. In the DFM solution phase, which happens during the ECO step, auto fixing of process-weak patterns and advanced RC extraction are performed. In the DFM polishing phase, post-layout tuning is done to improve manufacturability. DFM analysis enables prioritization of random and systematic failures. The DFM technique presented in this paper has been silicon-proven with three successful tape-outs in Samsung 32nm processes; about 5% improvement in yield was achieved without any notable side effects. Visual inspection of silicon also confirmed the positive effect of the DFM techniques.
2011-01-01
Cardiovascular magnetic resonance (CMR) tagging has been established as an essential technique for measuring regional myocardial function. It allows quantification of local intramyocardial motion measures, e.g. strain and strain rate. CMR tagging was invented in the late 1980s, when the technique allowed, for the first time, visualization of transmural myocardial movement without having to implant physical markers. This new idea opened the door for a series of developments and improvements that continue up to the present time. Different tagging techniques are currently available that are more extensive, improved, and sophisticated than they were twenty years ago. Each of these techniques has different versions for improved resolution, signal-to-noise ratio (SNR), scan time, anatomical coverage, three-dimensional capability, and image quality. The tagging techniques covered in this article can be broadly divided into two main categories: 1) basic techniques, which include magnetization saturation, spatial modulation of magnetization (SPAMM), delay alternating with nutations for tailored excitation (DANTE), and complementary SPAMM (CSPAMM); and 2) advanced techniques, which include harmonic phase (HARP), displacement encoding with stimulated echoes (DENSE), and strain encoding (SENC). Although most of these techniques were developed by separate groups and evolved from different backgrounds, they are in fact closely related to each other, and they can be interpreted from more than one perspective. Some of these techniques even followed parallel paths of development, as illustrated in the article. As each technique has its own advantages, some efforts have been made to combine different techniques for improved image quality or composite information acquisition.
In this review, different developments in pulse sequences and related image processing techniques are described along with the necessities that led to their invention, which makes this article easy to read and the covered techniques easy to follow. Major studies that applied CMR tagging for studying myocardial mechanics are also summarized. Finally, the current article includes a plethora of ideas and techniques with over 300 references that motivate the reader to think about the future of CMR tagging. PMID:21798021
Etchepareborda, Pablo; Vadnjal, Ana Laura; Federico, Alejandro; Kaufmann, Guillermo H
2012-09-15
We evaluate the extension of the exact nonlinear reconstruction technique developed for digital holography to the phase-recovery problems presented by other optical interferometric methods, which use carrier modulation. It is shown that the introduction of an analytic wavelet analysis in the ridge of the cepstrum transformation corresponding to the analyzed interferogram can be closely related to the well-known wavelet analysis of the interferometric intensity. Subsequently, the phase-recovery process is improved. The advantages and limitations of this framework are analyzed and discussed using numerical simulations in singular scalar light fields and in temporal speckle pattern interferometry.
Overall equipment efficiency of Flexographic Printing process: A case study
NASA Astrophysics Data System (ADS)
Zahoor, S.; Shehzad, A.; Mufti, NA; Zahoor, Z.; Saeed, U.
2017-12-01
This paper reports the efficiency improvement of a flexographic printing machine by reducing breakdown time with the help of a total productive maintenance measure called overall equipment efficiency (OEE). The methodology comprises calculating the OEE of the machine before and after identifying the causes of the problems. A Pareto diagram is used to prioritize the main problem areas, and the 5-whys analysis approach is used to identify the root cause of these problems. The OEE of the process improved from 34% to 40.2% over a 30-day period. It is concluded that OEE and 5-whys analysis techniques are useful in improving the effectiveness of the equipment and for continuous process improvement as well.
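OEE itself is a simple product of three loss factors. The calculation can be sketched as follows; the 34% and 40.2% figures in the study come from the machine's own loss data, so the numbers below are illustrative only:

```python
def oee(availability, performance, quality):
    """Overall equipment efficiency = availability x performance x quality.
    availability = run time / planned production time,
    performance  = actual output rate / ideal output rate,
    quality      = good units / total units produced."""
    return availability * performance * quality

# Illustrative loss data (not the case study's actual figures):
before = oee(0.55, 0.70, 0.88)   # roughly 0.34
after = oee(0.60, 0.76, 0.88)    # roughly 0.40 after reducing breakdowns
```

Because the three factors multiply, a modest reduction in breakdown time (availability) or speed losses (performance) moves the overall figure noticeably, which is why the Pareto and 5-whys steps target the largest loss category first.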
Classroom Thought, Teacher Questions, and Student Analysis
ERIC Educational Resources Information Center
Wilen, William W.; Hogg, James
1976-01-01
Discussed is the need for teachers to improve their effectiveness in classroom skills such as questioning techniques. An instructor cognitive operation index is presented. For journal availability, see SO 505 192. (Author/DB)
Watkins, Robert G; Hanna, Robert; Chang, David; Watkins, Robert G
2014-07-01
Retrospective radiographic analysis. To determine which lumbar interbody technique is most effective for restoring lordosis, increasing disk height, and reducing spondylolisthesis. Lumbar interbody fusions are performed in hopes of increasing fusion potential, correcting deformity, and indirectly decompressing nerve roots. No published study has directly compared anterior, lateral, and transforaminal lumbar interbody fusions in terms of ability to restore lordosis, increase disk height, and reduce spondylolisthesis. Lumbar interbody fusion techniques were retrospectively compared in terms of improvement of lordosis, disk height, and spondylolisthesis between preoperative and follow-up lateral radiographs. A total of 220 consecutive patients with 309 operative levels were compared by surgery type: anterior (184 levels), lateral (86 levels), and transforaminal (39 levels). Average follow-up was 19.2 months (range, 1-56 mo), with no statistical difference between the groups. Intragroup analysis showed that the anterior (4.5 degrees) and lateral (2.2 degrees) groups significantly improved lordosis from preoperative to follow-up, whereas the transforaminal (0.8 degrees) group did not. Intergroup analysis showed that the anterior group significantly improved lordosis more than both the lateral and transforaminal groups. The anterior (2.2 mm) and lateral (2.0 mm) groups both significantly improved disk height more than the transforaminal (0.5 mm) group. All 3 groups significantly reduced spondylolisthesis, with no difference between the groups. After lumbar interbody fusion, improvement of lordosis was significant for both the anterior and lateral groups, but not the transforaminal group. Intergroup analysis showed the anterior group had significantly improved lordosis compared to both the other groups. The anterior and lateral groups had significantly increased disk height compared to the transforaminal group.
All the 3 groups significantly reduced spondylolisthesis, with no difference between the groups.
Performance Analysis of Receive Diversity in Wireless Sensor Networks over GBSBE Models
Goel, Shivali; Abawajy, Jemal H.; Kim, Tai-hoon
2010-01-01
Wireless sensor networks have attracted a lot of attention recently. In this paper, we develop a channel model based on the elliptical model for multipath components involving randomly placed scatterers in the scattering region, with sensors deployed on a field. We verify that in a sensor network, the use of receive diversity techniques improves the performance of the system. Extensive performance analysis of the system is carried out for both single and multiple antennas with the applied receive diversity techniques. Performance analyses based on variations in receiver height, maximum multipath delay and transmit power have been performed considering different numbers of antenna elements present in the receiver array. Our results show that increasing the number of antenna elements in a wireless sensor network does indeed improve the bit error rates (BER) that can be obtained. PMID:22163510
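The receive-diversity gain can be illustrated with a Monte-Carlo sketch. This is a generic numpy illustration with made-up parameters, using BPSK over flat Rayleigh fading and maximal-ratio combining rather than the paper's GBSBE channel model:

```python
import numpy as np

def mrc_ber(n_rx, snr_db, n_bits=20000, seed=0):
    """Simulated BER of BPSK over independent flat Rayleigh branches with
    maximal-ratio combining (MRC) across n_rx receive antennas."""
    rng = np.random.default_rng(seed)
    bits = rng.integers(0, 2, n_bits)
    s = 2.0 * bits - 1.0                                  # BPSK symbols
    snr = 10 ** (snr_db / 10)
    shape = (n_rx, n_bits)
    h = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)
    noise = (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2 * snr)
    r = h * s + noise                                     # received per branch
    y = np.sum(np.conj(h) * r, axis=0)                    # MRC combiner output
    return np.mean((y.real > 0) != (bits == 1))
```

Running `mrc_ber(1, 10.0)` versus `mrc_ber(4, 10.0)` shows the BER dropping by orders of magnitude as antenna elements are added, the same qualitative trend the abstract reports.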
Molinari, Filippo; Rimini, Daniele; Liboni, William; Acharya, U Rajendra; Franzini, Marianno; Pandolfi, Sergio; Ricevuti, Giovanni; Vaiano, Francesco; Valdenassi, Luigi; Simonetti, Vincenzo
2017-08-01
Ozone major autohemotherapy is effective in reducing the symptoms of multiple sclerosis (MS) patients, but its effects on the brain are still not clear. In this work, we monitored the changes in the cerebrovascular pattern of MS patients and normal subjects during major ozone autohemotherapy by using near-infrared spectroscopy (NIRS) as a functional and vascular technique. NIRS signals are analyzed using a combination of time-domain analysis, time-frequency analysis, and nonlinear analysis of the intrinsic mode function signals obtained from the empirical mode decomposition technique. Our results show that there is an improvement in the cerebrovascular pattern of all subjects, indicated by an increase in the entropy of the NIRS signals. Hence, we can conclude that ozone therapy increases brain metabolism and helps recovery from the lower activity levels that are predominant in MS patients.
Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels.
Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R
2018-01-01
Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. In this study, a deep learning (DL)-based nuclei segmentation approach is investigated, based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods.
SRB Environment Evaluation and Analysis. Volume 2: RSRB Joint Filling Test/Analysis Improvements
NASA Technical Reports Server (NTRS)
Knox, E. C.; Woods, G. Hamilton
1991-01-01
Following the Challenger accident a very comprehensive solid rocket booster (SRB) redesign program was initiated. One objective of the program was to develop expertise at NASA/MSFC in the techniques for analyzing the flow of hot gases in the SRB joints. Several test programs were undertaken to provide a data base of joint performance with manufactured defects in the joints to allow hot gases to fill the joints. This data base was used also to develop the analytical techniques. Some of the test programs were Joint Environment Simulator (JES), Nozzle Joint Environment Simulator (NJES), Transient Pressure Test Article (TPTA), and Seventy-Pound Charge (SPC). In 1988 the TPTA test hardware was moved from the Utah site to MSFC and several RSRM tests were scheduled, to be followed by tests for the ASRM program. REMTECH Inc. supported these activities with pretest estimates of the flow conditions in the test joints, and post-test analysis and evaluation of the measurements. During this support REMTECH identified deficiencies in the gas-measurement instrumentation that existed in the TPTA hardware, made recommendations for its replacement, and identified improvements to the analytical tools used in the test support. Only one test was completed under the TPTA RSRM test program, and those scheduled for the ASRM were rescheduled to a time after the expiration of this contract. The attention of this effort was directed toward improvements in the analytical techniques in preparation for when the ASRM program begins.
The endowment effect and WTA: a quasi-experimental test
H.F. MacDonald; J. Michael Bowker
1993-01-01
This paper reports a test of the endowment effect in an economic analysis of localized air pollution. Regression techniques are used to test the significance of perceived property rights on household WTP for improved air quality versus WTA compensation to forgo an improvement in air quality. Our experiment contributes to the research into WTP/WTA divergence by...
Improving Iranian High School Students' Reading Comprehension Using the Tenets of Genre Analysis
ERIC Educational Resources Information Center
Adelnia, Rezvan; Salehi, Hadi
2016-01-01
This study is an attempt to investigate impact of using a technique, namely, genre-based approach on improving reading ability on Iranian EFL learners' achievement. Therefore, an attempt was made to compare genre-based approach to teaching reading with traditional approaches. For achieving this purpose, by administering the Oxford Quick Placement…
ERIC Educational Resources Information Center
Wulf, Kathleen M.; And Others
1980-01-01
An analysis of the massive amount of literature pertaining to the improvement of professional instruction in dental education resulted in the formation of a comprehensive model of 10 categories, including Delphi technique; systems approach; agencies; workshops; multi-media, self-instruction; evaluation paradigms, measurement, courses, and…
ERIC Educational Resources Information Center
Bigham, Gary D.; Riney, Mark R.
2017-01-01
To meet the constantly changing needs of schools and diverse learners, educators must frequently monitor student learning, revise curricula, and improve instruction. Consequently, it is critical that careful analyses of student performance data are ongoing components of curriculum decision-making processes. The primary purpose of this study is to…
Effective Report Preparation: Streamlining the Reporting Process. AIR 1999 Annual Forum Paper.
ERIC Educational Resources Information Center
Dalrymple, Margaret; Wang, Mindy; Frost, Jacquelyn
This paper describes the processes and techniques used to improve and streamline the standard student reports used at Purdue University (Indiana). Various models for analyzing reporting processes are described, especially the model used in the study, the Shewart or Deming Cycle, a method that aids in continuous analysis and improvement through a…
The Virtual Genetics Lab II: Improvements to a Freely Available Software Simulation of Genetics
ERIC Educational Resources Information Center
White, Brian T.
2012-01-01
The Virtual Genetics Lab II (VGLII) is an improved version of the highly successful genetics simulation software, the Virtual Genetics Lab (VGL). The software allows students to use the techniques of genetic analysis to design crosses and interpret data to solve realistic genetics problems involving a hypothetical diploid insect. This is a brief…
Nuevos aspectos en el estudio de la particula D en el experimento FOCUS de Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinones Gonzalez, Jose A.; /Puerto Rico U., Mayaguez
The purpose of this work is to improve the reconstruction techniques for decays of particles that contain charm in their quark composition, using information from the Target Silicon Detector of experiment E831 (FOCUS). The experiment ran from 1997 to 1998 at Fermi National Accelerator Laboratory. The objective of the experiment was to improve the understanding of particles that contain charm. Adding the Target Silicon Detector information to the reconstruction of the primary vertex reduces the position error. This reduction produces an improvement in the mass signal and in the knowledge of the charm particles' properties. It also raises the possibility that other analyses will use the techniques developed in this work.
New Techniques for the Generation and Analysis of Tailored Microbial Systems on Surfaces.
Furst, Ariel L; Smith, Matthew J; Francis, Matthew B
2018-05-17
The interactions between microbes and surfaces provide critically important cues that control the behavior and growth of the cells. As our understanding of complex microbial communities improves, there is a growing need for experimental tools that can establish and control the spatial arrangements of these cells in a range of contexts. Recent improvements in methods to attach bacteria and yeast to nonbiological substrates, combined with an expanding set of techniques available to study these cells, position this field for many new discoveries. Improving methods for controlling the immobilization of bacteria provides powerful experimental tools for testing hypotheses regarding microbiome interactions, studying the transfer of nutrients between bacterial species, and developing microbial communities for green energy production and pollution remediation.
Video-assisted structured teaching to improve aseptic technique during neuraxial block.
Friedman, Z; Siddiqui, N; Mahmoud, S; Davies, S
2013-09-01
Teaching epidural catheter insertion tends to focus on developing manual dexterity rather than improving aseptic technique, which usually remains poor despite increasing experience. The aim of this study was to compare epidural aseptic technique performance by novice operators after a targeted teaching intervention with that of operators taught aseptic technique before the intervention was initiated. Starting July 2008, two groups of second-year anaesthesia residents (pre- and post-teaching intervention) performing their 4-month obstetric anaesthesia rotation in a university-affiliated centre were videotaped three to four times while performing epidural procedures. Trained, blinded, independent examiners reviewed the procedures. The primary outcome was a comparison of aseptic technique performance scores (0-30 points) graded on a task-specific checklist. A total of 86 sessions by 29 residents were included in the study analysis. The intraclass correlation coefficient for inter-rater reliability for the aseptic technique was 0.90. The median aseptic technique scores for the rotation period were significantly higher in the post-intervention group [27.58, inter-quartile range (IQR) 22.33-29.50 vs 16.56, IQR 13.33-22.00]. Similar results were demonstrated when scores were analysed for low, moderate, and high levels of experience throughout the rotation. Procedure-specific aseptic technique teaching, aided by video assessment and video demonstration, helped significantly improve aseptic practice by novice trainees. Future studies should consider looking at retention over longer periods of time in more senior residents.
Exploring relation types for literature-based discovery.
Preiss, Judita; Stevenson, Mark; Gaizauskas, Robert
2015-09-01
Literature-based discovery (LBD) aims to identify "hidden knowledge" in the medical literature by: (1) analyzing documents to identify pairs of explicitly related concepts (terms), then (2) hypothesizing novel relations between pairs of unrelated concepts that are implicitly related via a shared concept to which both are explicitly related. Many LBD approaches use simple techniques to identify semantically weak relations between concepts, for example, document co-occurrence. These generate huge numbers of hypotheses, difficult for humans to assess. More complex techniques rely on linguistic analysis, for example, shallow parsing, to identify semantically stronger relations. Such approaches generate fewer hypotheses, but may miss hidden knowledge. The authors investigate this trade-off in detail, comparing techniques for identifying related concepts to discover which are most suitable for LBD. A generic LBD system that can utilize a range of relation types was developed. Experiments were carried out comparing a number of techniques for identifying relations. Two approaches were used for evaluation: replication of existing discoveries and the "time slicing" approach. Results: Previous LBD discoveries could be replicated using relations based either on document co-occurrence or linguistic analysis. Using relations based on linguistic analysis generated many fewer hypotheses, but a significantly greater proportion of them were candidates for hidden knowledge. The use of linguistic analysis-based relations improves accuracy of LBD without overly damaging coverage. LBD systems often generate huge numbers of hypotheses, which are infeasible to manually review. Improving their accuracy has the potential to make these systems significantly more usable. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
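The co-occurrence flavour of the ABC discovery model can be sketched in a few lines. This is a toy illustration of Swanson-style open discovery; real systems operate on UMLS concepts extracted from millions of abstracts rather than literal term sets:

```python
from itertools import combinations

def lbd_hypotheses(documents):
    """ABC co-occurrence discovery: hypothesize that concepts A and C are
    related if they never co-occur in any document but both co-occur with
    some shared intermediate concept B."""
    explicit = set()
    for doc in documents:
        explicit.update(frozenset(p) for p in combinations(sorted(doc), 2))
    concepts = sorted(set().union(*documents))
    # for each concept, the set of concepts it explicitly co-occurs with
    neighbours = {c: {next(iter(p - {c})) for p in explicit if c in p}
                  for c in concepts}
    hypotheses = set()
    for a, c in combinations(concepts, 2):
        if frozenset((a, c)) not in explicit and neighbours[a] & neighbours[c]:
            hypotheses.add((a, c))
    return hypotheses
```

Even on this toy scale the trade-off the abstract studies is visible: every shared neighbour generates a hypothesis, so weak relations (plain co-occurrence) flood the output, while stronger linguistically derived relations would prune `explicit` and hence the candidate set.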
Kumar, B Santhosh; Sandhyamani, S; Nazeer, Shaiju S; Jayasree, R S
2015-02-01
Autofluorescence exhibited by tissues often interferes with immunofluorescence. Using imaging and spectral analysis, we observed remarkable reduction of autofluorescence of formalin fixed paraffin embedded tissues irradiated with light prior to incubation with immunofluorescent dyes. The technique of photobleaching offers significant improvement in the quality and specificity of immunofluorescence. This has the potential for better techniques for disease diagnosis.
Investigating the effects of PDC cutters geometry on ROP using the Taguchi technique
NASA Astrophysics Data System (ADS)
Jamaludin, A. A.; Mehat, N. M.; Kamaruddin, S.
2017-10-01
At times, the polycrystalline diamond compact (PDC) bit's performance drops and affects the rate of penetration (ROP). The objective of this project is to investigate the effect of PDC cutter geometry and to optimize it. An intensive study of cutter geometry would further enhance ROP performance. A relatively extended analysis was carried out, and four significant geometry factors that directly improve ROP were identified: cutter size, back rake angle, side rake angle and chamfer angle. An appropriate optimization technique that effectively controls all influential geometry factors during cutter manufacturing is introduced and adopted in this project. By adopting an L9 Taguchi orthogonal array, a simulation experiment is conducted using explicit dynamics finite element analysis. Through a structured Taguchi analysis, ANOVA confirms that the most significant geometry factor for improving ROP is cutter size (99.16% percentage contribution). The optimized cutter is expected to drill with high ROP, which can reduce rig time and, in turn, the total drilling cost.
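The ANOVA percentage-contribution step behind a figure like "99.16% for cutter size" can be sketched for an L9 array. This is a generic numpy illustration with made-up responses, not the project's simulation data:

```python
import numpy as np

# L9 orthogonal array: 9 runs, 4 three-level factors (levels coded 0..2).
L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
               [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
               [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])

def percent_contribution(y):
    """ANOVA on an L9 experiment: each factor's sum of squares is computed
    from its level means, then expressed as a percentage of the total."""
    y = np.asarray(y, float)
    grand = y.mean()
    ss = []
    for f in range(L9.shape[1]):
        ss.append(sum(
            np.sum(L9[:, f] == lv) * (y[L9[:, f] == lv].mean() - grand) ** 2
            for lv in range(3)))
    ss = np.array(ss)
    return 100.0 * ss / ss.sum()
```

Because the array is orthogonal, a response driven by one factor alone attributes essentially all of the variation to that factor, which is how a dominant contribution like the cutter-size result emerges.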
NASA Astrophysics Data System (ADS)
Uma Maheswari, R.; Umamaheswari, R.
2017-02-01
Condition monitoring systems (CMS) substantiate potential economic benefits and enable prognostic maintenance in wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive-train CMS, enabling early detection of impending failure/damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient for diagnosing machine faults under time-varying conditions. Current research in CMS for drive trains focuses on developing/improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.
Integrating Kano’s Model into Quality Function Deployment for Product Design: A Comprehensive Review
NASA Astrophysics Data System (ADS)
Ginting, Rosnani; Hidayati, Juliza; Siregar, Ikhsan
2018-03-01
Many methods and techniques are adopted by companies to improve competitiveness through the fulfillment of customer satisfaction, by enhancing and improving product design quality. Over the past few years, several researchers have studied extensively the combination of Quality Function Deployment (QFD) and Kano's model as design techniques, focusing on translating consumer desires into a product design. This paper presents a review and analysis of several literatures associated with the integration of Kano's model into the QFD process. Various international journal articles were selected, collected and analyzed from a number of relevant scientific publications. An in-depth analysis was performed, focused on the results, advantages and drawbacks of the methodology. In addition, this paper provides the analysis acquired in this study related to the development of the methodology. It is hoped that this paper can be a reference for other researchers and manufacturing companies implementing the integrated QFD-Kano method for product design.
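The Kano classification that typically feeds the QFD weighting can be sketched with one common version of the standard evaluation table. This is a simplified illustration (answer categories abbreviated L=like, M=must-be, N=neutral, W=live-with, D=dislike); published variants of the table differ in minor details:

```python
# One common Kano evaluation table (functional answer x dysfunctional answer):
# A=attractive, O=one-dimensional, M=must-be, I=indifferent,
# R=reverse, Q=questionable.
KANO_TABLE = {
    "L": {"L": "Q", "M": "A", "N": "A", "W": "A", "D": "O"},
    "M": {"L": "R", "M": "I", "N": "I", "W": "I", "D": "M"},
    "N": {"L": "R", "M": "I", "N": "I", "W": "I", "D": "M"},
    "W": {"L": "R", "M": "I", "N": "I", "W": "I", "D": "M"},
    "D": {"L": "R", "M": "R", "N": "R", "W": "R", "D": "Q"},
}

def classify(functional, dysfunctional):
    """Map a customer's (functional, dysfunctional) answer pair for one
    product attribute to its Kano category."""
    return KANO_TABLE[functional][dysfunctional]
```

In the integrated QFD-Kano approaches the review covers, the resulting categories (attractive, one-dimensional, must-be) are then used to adjust the importance ratings that enter the house of quality.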
Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory
ERIC Educational Resources Information Center
Fiester, Herbert R.
2010-01-01
The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…
ERIC Educational Resources Information Center
Embrey, Karen K.
2012-01-01
Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…
Technique for ranking potential predictor layers for use in remote sensing analysis
Andrew Lister; Mike Hoppus; Rachel Riemann
2004-01-01
Spatial modeling using GIS-based predictor layers often requires that extraneous predictors be culled before conducting analysis. In some cases, using extraneous predictor layers might improve model accuracy, but at the expense of increased complexity and reduced interpretability. In other cases, using extraneous layers can dilute the relationship between predictors and target...
Economic Analysis of Education: A Conceptual Framework. Theoretical Paper No. 68.
ERIC Educational Resources Information Center
Rossmiller, Richard A.; Geske, Terry G.
This paper discusses several concepts and techniques from the areas of systems theory and economic analysis that can be used as tools in an effort to improve the productivity of the educational enterprise. Several studies investigating productivity in education are reviewed, and the analytical problems in conducting cost-effectiveness studies are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1976-08-05
During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)
NASA Astrophysics Data System (ADS)
Sivasubramaniam, Kiruba
This thesis makes advances in three dimensional finite element analysis of electrical machines and the quantification of their parameters and performance. The principal objectives of the thesis are: (1) the development of a stable and accurate method of nonlinear three-dimensional field computation and application to electrical machinery and devices; and (2) improvement in the accuracy of determination of performance parameters, particularly forces and torque computed from finite elements. Contributions are made in two general areas: a more efficient formulation for three dimensional finite element analysis which saves time and improves accuracy, and new post-processing techniques to calculate flux density values from a given finite element solution. A novel three-dimensional magnetostatic solution based on a modified scalar potential method is implemented. This method has significant advantages over the traditional total scalar, reduced scalar or vector potential methods. The new method is applied to a 3D geometry of an iron core inductor and a permanent magnet motor. The results obtained are compared with those obtained from traditional methods, in terms of accuracy and speed of computation. A technique which has been observed to improve force computation in two dimensional analysis using a local solution of Laplace's equation in the airgap of machines is investigated and a similar method is implemented in the three dimensional analysis of electromagnetic devices. A new integral formulation to improve force calculation from a smoother flux-density profile is also explored and implemented. Comparisons are made and conclusions drawn as to how much improvement is obtained and at what cost. This thesis also demonstrates the use of finite element analysis to analyze torque ripples due to rotor eccentricity in permanent magnet BLDC motors. 
A new method for analyzing torque harmonics based on data obtained from a time stepping finite element analysis of the machine is explored and implemented.
Random safety auditing, root cause analysis, failure mode and effects analysis.
Ursprung, Robert; Gray, James
2010-03-01
Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.
An analysis of pilot error-related aircraft accidents
NASA Technical Reports Server (NTRS)
Kowalsky, N. B.; Masters, R. L.; Stone, R. B.; Babcock, G. L.; Rypka, E. W.
1974-01-01
A multidisciplinary team approach to pilot error-related U.S. air carrier jet aircraft accident investigation records successfully reclaimed hidden human error information not shown in statistical studies. New analytic techniques were developed and applied to the data to discover and identify multiple elements of commonality and shared characteristics within this group of accidents. Three techniques of analysis were used. Critical element analysis demonstrated the importance of a subjective qualitative approach to raw accident data and surfaced information heretofore unavailable. Cluster analysis served as an exploratory research tool that will lead to increased understanding and improved organization of facts, the discovery of new meaning in large data sets, and the generation of explanatory hypotheses. Pattern recognition allows accidents to be categorized by pattern conformity after critical element identification by cluster analysis.
Program Analyzes Radar Altimeter Data
NASA Technical Reports Server (NTRS)
Vandemark, Doug; Hancock, David; Tran, Ngan
2004-01-01
A computer program has been written to perform several analyses of radar altimeter data. The program was designed to improve on previous methods of analysis of altimeter engineering data by (1) facilitating and accelerating the analysis of large amounts of data in a more direct manner and (2) improving the ability to estimate performance of radar-altimeter instrumentation and provide data corrections. The data in question are openly available to the international scientific community and can be downloaded from anonymous file-transfer- protocol (FTP) locations that are accessible via links from altimetry Web sites. The software estimates noise in range measurements, estimates corrections for electromagnetic bias, and performs statistical analyses on various parameters for comparison of different altimeters. Whereas prior techniques used to perform similar analyses of altimeter range noise require comparison of data from repetitions of satellite ground tracks, the present software uses a high-pass filtering technique to obtain similar results from single satellite passes. Elimination of the requirement for repeat-track analysis facilitates the analysis of large amounts of satellite data to assess subtle variations in range noise.
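The single-pass, high-pass-filtering idea can be sketched as follows. This is an illustration of the general approach, not the program's actual code; the function name and synthetic data are assumptions. Subtracting a running mean removes the slowly varying geophysical signal, and the standard deviation of the residual estimates the range noise.

```python
import numpy as np

def range_noise_estimate(ranges, window=21):
    """Estimate altimeter range noise from a single pass: high-pass
    filter by subtracting a running mean (the slowly varying surface
    signal), then take the std of the residual."""
    kernel = np.ones(window) / window
    smooth = np.convolve(ranges, kernel, mode="same")
    residual = ranges - smooth
    trim = window // 2                 # drop edges where the mean is biased
    return float(np.std(residual[trim:-trim]))

# Synthetic single pass: slow surface signal plus instrument noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 2000)
surface = 0.5 * np.sin(2 * np.pi * 3 * x)   # slow geophysical signal
noise = rng.normal(0.0, 0.05, x.size)       # range noise (std = 0.05)
est = range_noise_estimate(surface + noise)
```

The estimate recovers the injected noise level from one pass, with no repeat-track comparison needed.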
Esaka, Fumitaka; Magara, Masaaki; Suzuki, Daisuke; Miyamoto, Yutaka; Lee, Chi-Gyu; Kimura, Takaumi
2010-12-15
Information on plutonium isotope ratios in individual particles is of great importance for nuclear safeguards, nuclear forensics and so on. Although secondary ion mass spectrometry (SIMS) is successfully utilized for the analysis of individual uranium particles, the isobaric interference of americium-241 with plutonium-241 makes it difficult to obtain accurate isotope ratios in individual plutonium particles. In the present work, an analytical technique combining chemical separation and inductively coupled plasma mass spectrometry (ICP-MS) is developed and applied to isotope ratio analysis of individual sub-micrometer plutonium particles. The ICP-MS results for individual plutonium particles prepared from a standard reference material (NBL SRM-947) indicate that the use of a desolvation system for sample introduction improves the precision of isotope ratios. In addition, the accuracy of the (241)Pu/(239)Pu isotope ratio is much improved, owing to the chemical separation of plutonium and americium. In conclusion, the performance of the proposed ICP-MS technique is sufficient for the analysis of individual plutonium particles. Copyright © 2010 Elsevier B.V. All rights reserved.
A quality improvement management model for renal care.
Vlchek, D L; Day, L M
1991-04-01
The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.
The recalibration of the IUE scientific instrument
NASA Technical Reports Server (NTRS)
Imhoff, Catherine L.; Oliversen, Nancy A.; Nichols-Bohlin, Joy; Casatella, Angelo; Lloyd, Christopher
1988-01-01
The IUE instrument was recalibrated because of long time-scale changes in the scientific instrument, a better understanding of the performance of the instrument, improved sets of calibration data, and improved analysis techniques. Calibrations completed or planned include intensity transfer functions (ITF), low-dispersion absolute calibrations, high-dispersion ripple corrections and absolute calibrations, improved geometric mapping of the ITFs to spectral images, studies to improve the signal-to-noise, enhanced absolute calibrations employing corrections for time, temperature, and aperture dependence, and photometric and geometric calibrations for the FES.
van Mourik, Louise M; Leonards, Pim E G; Gaus, Caroline; de Boer, Jacob
2015-10-01
Concerns about the high production volumes, persistency, bioaccumulation potential and toxicity of chlorinated paraffin (CP) mixtures, especially short-chain CPs (SCCPs), are rising. However, information on their levels and fate in the environment is still insufficient, impeding international classifications and regulations. This knowledge gap is mainly due to the difficulties that arise with CP analysis, in particular the chromatographic separation within CPs and between CPs and other compounds. No fully validated routine analytical method is available yet and only semi-quantitative analysis is possible, although the number of studies reporting new and improved methods has rapidly increased since 2010. Better cleanup procedures that remove interfering compounds, and new instrumental techniques, which distinguish between medium-chain CPs (MCCPs) and SCCPs, have been developed. While gas chromatography coupled to electron capture negative ionisation mass spectrometry (GC/ECNI-MS) remains the most commonly applied technique, novel and promising use of high resolution time of flight MS (TOF-MS) has also been reported. We expect that recent developments in high resolution TOF-MS and Orbitrap technologies will further improve the detection of CPs, including long-chain CPs (LCCPs), and the group separation and quantification of CP homologues. Also, new CP quantification methods have emerged, including the use of mathematical algorithms, multiple linear regression and principal component analysis. These quantification advancements are also reflected in considerably improved interlaboratory agreements since 2010. Analysis of lower chlorinated paraffins (
NASA Astrophysics Data System (ADS)
Aida, S.; Matsuno, T.; Hasegawa, T.; Tsuji, K.
2017-07-01
Micro X-ray fluorescence (micro-XRF) analysis is widely used as a means of producing elemental maps. In some cases, however, the XRF images of trace elements that are obtained are not clear due to high background intensity. To solve this problem, we applied principal component analysis (PCA) to XRF spectra, focusing on improving the quality of the resulting XRF images. XRF images of the dried residue of a standard solution on a glass substrate were taken, and the XRF intensities for the dried residue were analyzed before and after PCA. Standard deviations of XRF intensities in the PCA-filtered images were improved, leading to clear contrast in the images. This improvement of the XRF images was effective in cases where the XRF intensity was weak.
Super-resolution mapping using multi-viewing CHRIS/PROBA data
NASA Astrophysics Data System (ADS)
Dwivedi, Manish; Kumar, Vinay
2016-04-01
High-spatial-resolution Remote Sensing (RS) data provide detailed information which ensures high-definition visual image analysis of earth surface features. These data sets also support improved information extraction capabilities at a fine scale. In order to improve the spatial resolution of coarser resolution RS data, the Super Resolution Reconstruction (SRR) technique, which operates on multi-angular image sequences, has become widely acknowledged. In this study, multi-angle CHRIS/PROBA data of the Kutch area are used for SR image reconstruction to enhance the spatial resolution from 18 m to 6 m, in the hope of obtaining a better land cover classification. Various SR approaches, including Projection onto Convex Sets (POCS), Robust, Iterative Back Projection (IBP), Non-Uniform Interpolation and Structure-Adaptive Normalized Convolution (SANC), were chosen for this study. Subjective assessment through visual interpretation shows substantial improvement in land cover details. Quantitative measures, including peak signal-to-noise ratio and structural similarity, are used for the evaluation of image quality. It was observed that the SANC SR technique, using the Vandewalle algorithm for low-resolution image registration, outperformed the other techniques. An SVM-based classifier was then used to classify both the SRR data and data resampled to 6 m spatial resolution using bi-cubic interpolation. A comparative analysis between the classified bicubic-interpolated and SR-derived images of CHRIS/PROBA shows that the SR-derived classified data yield a significant improvement of 10-12% in overall accuracy. The results demonstrate that SR methods are able to improve the spatial detail of multi-angle images as well as the classification accuracy.
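Of the listed approaches, Iterative Back Projection (IBP) is the easiest to sketch. The toy below is an illustration under a deliberately simple block-average imaging model (not the study's implementation): the high-resolution estimate is refined by back-projecting the error between the observed low-resolution image and the one simulated from the current estimate.

```python
import numpy as np

def downsample(img, f):
    """Toy imaging model: average each f-by-f block."""
    h, w = img.shape
    return img[:h - h % f, :w - w % f].reshape(h // f, f, w // f, f).mean(axis=(1, 3))

def upsample(img, f):
    """Nearest-neighbour upsampling: each pixel becomes an f-by-f block."""
    return np.kron(img, np.ones((f, f)))

def iterative_back_projection(lr, f, iters=10, step=1.0):
    hr = upsample(lr, f)                      # initial HR guess
    for _ in range(iters):
        err = lr - downsample(hr, f)          # mismatch in LR space
        hr = hr + step * upsample(err, f)     # back-project the error
    return hr

lr = np.arange(16.0).reshape(4, 4)            # a stand-in LR observation
hr = iterative_back_projection(lr, f=2)
```

On convergence, downsampling the reconstruction reproduces the observed low-resolution image; the real method does this jointly over the registered multi-angle sequence.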
Backscattering analysis of high frequency ultrasonic imaging for ultrasound-guided breast biopsy
NASA Astrophysics Data System (ADS)
Cummins, Thomas; Akiyama, Takahiro; Lee, Changyang; Martin, Sue E.; Shung, K. Kirk
2017-03-01
A new ultrasound-guided breast biopsy technique is proposed. The technique utilizes conventional ultrasound guidance coupled with a high frequency embedded ultrasound array located within the biopsy needle to improve the accuracy of breast cancer diagnosis. The array within the needle is intended to be used to detect microcalcifications indicative of early breast cancers such as ductal carcinoma in situ (DCIS). Backscattering analysis has the potential to characterize tissues to improve localization of lesions. This paper describes initial results of the application of backscattering analysis to breast biopsy tissue specimens and shows the usefulness of high frequency ultrasound for the new biopsy-related technique. Ultrasound echoes of ex-vivo breast biopsy tissue specimens were acquired using a single-element transducer with a bandwidth from 41 MHz to 88 MHz utilizing a UBM methodology, and the backscattering coefficients were calculated. These values, as well as B-mode image data, were mapped in 2D and matched with the pathology image for each plane to identify tissue types. Microcalcifications were significantly distinguished from normal tissue. Adenocarcinoma was also successfully differentiated from adipose tissue. These results indicate that backscattering analysis is able to quantitatively distinguish normal from abnormal tissue, which should help radiologists locate abnormal areas during the proposed ultrasound-guided breast biopsy with high frequency ultrasound.
Hassebo, Yasser Y; Gross, Barry; Oo, Min; Moshary, Fred; Ahmed, Samir
2006-08-01
The impact and potential of a polarization-selection technique to reduce the sky background signal for linearly polarized monostatic elastic backscatter lidar measurements are examined. Taking advantage of naturally occurring polarization properties in scattered skylight, we devised a polarization-discrimination technique in which both the lidar transmitter and the receiver track and minimize detected sky background noise while maintaining maximum lidar signal throughput. Lidar elastic backscatter measurements, carried out continuously during daylight hours at 532 nm, show as much as a factor of square root 10 improvement in the signal-to-noise ratio (SNR) over conventional unpolarized schemes. For vertically pointing lidars, the largest improvements are limited to the early morning and late afternoon hours, while for lidars scanning azimuthally and in elevation at angles other than vertical, significant improvements are achievable over more extended time periods with the specific times and improvement factors depending on the specific angle between the lidar and the solar axes. The resulting diurnal variations in SNR improvement sometimes show an asymmetry with the solar angle that analysis indicates can be attributed to changes in observed relative humidity that modifies the underlying aerosol microphysics and observed optical depth.
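The √10 figure follows directly from background-limited (shot-noise) statistics, where the noise scales as the square root of the background. A quick check, with arbitrary illustrative photon counts:

```python
import math

def background_limited_snr(signal, background):
    """Shot-noise-limited SNR when sky background dominates:
    noise ~ sqrt(background), so SNR = S / sqrt(B)."""
    return signal / math.sqrt(background)

base = background_limited_snr(1000.0, 100.0)
improved = background_limited_snr(1000.0, 10.0)  # background reduced 10x
gain = improved / base                            # = sqrt(10), about 3.16
```

A tenfold reduction in detected sky background therefore yields exactly the factor-of-√10 SNR improvement reported for the polarization-discrimination scheme.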
NASA Astrophysics Data System (ADS)
Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi
2018-04-01
Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Currently, effective speckle de-noising techniques are extremely scarce and should be further developed. In this study, a speckle noise reduction technique is proposed based on independent component analysis (ICA). Since few changes normally happen in the shape of the laser pulse itself, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. To achieve self-adaptability of the algorithm, the local Mean Square Error (MSE) is defined as the criterion for evaluating the iteration results. The experimental results demonstrate that the self-adaptive pulse-matching ICA (PM-ICA) method can effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a 4 dB greater improvement in signal-to-noise ratio (SNR) than a traditional homomorphic wavelet method.
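The matching step can be sketched in simplified form. The code below is a stand-in, not the paper's method: it keeps the reference-pulse idea and the local-MSE criterion but uses plain least-squares template matching, omitting the ICA decomposition; the signal shapes and noise level are assumptions.

```python
import numpy as np

def match_pulse(echo, ref):
    """Slide the reference laser pulse along the noisy echo and return
    the offset with minimum local mean-square error (a simplified
    stand-in for the ICA-based matching, same local-MSE criterion)."""
    n = len(ref)
    errors = [np.mean((echo[i:i + n] - ref) ** 2)
              for i in range(len(echo) - n + 1)]
    return int(np.argmin(errors))

rng = np.random.default_rng(0)
t = np.arange(100)
ref = np.exp(-0.5 * ((t - 50) / 10.0) ** 2)   # reference pulse shape
echo = rng.normal(0.0, 0.05, 1000)             # speckle-like background
echo[300:400] += ref                           # true echo at offset 300

pos = match_pulse(echo, ref)
```

The local-MSE minimum locates the echo despite the noise floor; the full PM-ICA method additionally separates the pulse component from the speckle before matching.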
van Oorsouw, Wietske M W J; Embregts, Petri J C M; Bosman, Anna M T; Jahoda, Andrew
2009-01-01
The last decades have seen increased emphasis on the quality of training for direct-care staff serving people with intellectual disabilities. Nevertheless, it is unclear what the key aspects of effective training are. Therefore, the aim of the present meta-analysis was to establish the ingredients (i.e., goals, format, and techniques) of staff training that are related to improvements in staff behaviour. Our literature search concentrated on studies published over a period of 20 years. Fifty-five studies met the criteria, resulting in 502 single-subject designs and 13 n>1 designs. Results revealed important information relevant to further improvement of clinical practice: (a) the combination of in-service training with coaching-on-the-job is the most powerful format; (b) in in-service formats, one should use multiple techniques, and verbal feedback is particularly recommended; and (c) in coaching-on-the-job formats, verbal feedback should be part of the program, as well as praise and correction. To maximize effectiveness, program developers should carefully prepare training goals, training format, and training techniques, which will benefit clinical practice.
The Animism Controversy Revisited: A Probability Analysis
ERIC Educational Resources Information Center
Smeets, Paul M.
1973-01-01
Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)
Visual cluster analysis and pattern recognition methods
Osbourn, Gordon Cecil; Martinez, Rubel Francisco
2001-01-01
A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques.
Environmental Justice Challenges for Ecosystem Service Valuation
In pursuing improved ecosystem services management, there is also an opportunity to work towards environmental justice. The practice of environmental valuation can assist with both goals, but as typically employed obscures distributional analysis. Furthermore, valuation technique...
Engineering Analysis of Stresses in Railroad Rails.
DOT National Transportation Integrated Search
1981-10-01
One portion of the Federal Railroad Administration's (FRA) Track Performance Improvement Program is the development of engineering and analytic techniques required for the design and maintenance of railroad track of increased integrity and safety. Un...
A New Computational Framework for Atmospheric and Surface Remote Sensing
NASA Technical Reports Server (NTRS)
Timucin, Dogan A.
2004-01-01
A Bayesian data-analysis framework is described for atmospheric and surface retrievals from remotely-sensed hyper-spectral data. Some computational techniques are highlighted for improved accuracy in the forward physics model.
Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan
2015-06-01
Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation experimental results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and the marzipan are used to build the calibration model using partial least squares (PLS) modeling. The results show that the PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
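For reference, the baseline the authors benchmark against, the Savitzky-Golay derivative, can be sketched in a few lines. This is an illustrative implementation of that standard filter, not the paper's SPSE estimator: fit a local polynomial over each centered window and read off its slope, which reduces to a fixed linear filter.

```python
import numpy as np

def savgol_first_derivative(y, window=11, polyorder=3, delta=1.0):
    """Savitzky-Golay first derivative: least-squares fit a polynomial
    over each centered window; the slope at the window center is given
    by row 1 of the design matrix's pseudoinverse, applied as a filter."""
    half = window // 2
    t = np.arange(-half, half + 1) * delta
    A = np.vander(t, polyorder + 1, increasing=True)   # columns 1, t, t^2, ...
    coeffs = np.linalg.pinv(A)[1]          # weights giving the slope at t=0
    return np.convolve(y, coeffs[::-1], mode="same")   # correlation with y

# Noise-free check: the derivative of sin(x) should recover cos(x).
x = np.linspace(0.0, 2 * np.pi, 200)
y = np.sin(x)
dy = savgol_first_derivative(y, delta=x[1] - x[0])
```

Away from the edges (where the window is zero-padded), the filter reproduces the analytic derivative closely; the SPSE estimator above is proposed as a more accurate alternative on noisy spectra.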
Comparison of Two Variants Of a Kata Technique (Unsu): The Neuromechanical Point of View
Camomilla, Valentina; Sbriccoli, Paola; Mario, Alberto Di; Arpante, Alessandro; Felici, Francesco
2009-01-01
The objective of this work was to characterize, from a neuromechanical point of view, a jump performed within the sequence of Kata Unsu by international top level karateka. A modified jumping technique was proposed to improve the already acquired technique. The neuromechanical evaluation, paralleled by a refereeing judgment, was then used to compare the modified and classic techniques to test whether the modification could lead to a better performance capacity, e.g. a higher score during an official competition. To this purpose, four high ranked karateka were recruited and instructed to perform the two jumps. Surface electromyographic signals were recorded in a bipolar mode from the vastus lateralis, rectus femoris, biceps femoris, gluteus maximus, and gastrocnemius muscles of both lower limbs. Mechanical data were collected by means of a stereophotogrammetric system and force platforms. Performance was associated with parameters characterizing the initial conditions of the aerial phase and with the CoM maximal height. The most critical elements having a negative influence on the arbitral evaluation were associated with quantitative error indicators. 3D reconstruction of the movement and videos were used to obtain the referee scores. The Unsu jump was divided into five phases (preparation, take off, ascending flight, descending flight, and landing) and the critical elements were highlighted. When comparing the techniques, no difference was found in the pattern of sEMG activation of the throwing leg muscles, while the push leg showed an earlier activation of the RF and GA muscles at the beginning of the modified technique. The only significant improvement associated with the modified technique was evidenced at the beginning of the aerial phase, while there was no significant improvement of the referee score. 
Nevertheless, the proposed neuromechanical analysis, aimed at correlating technique features with core performance indicators, is new in the field and is a promising tool for further analyses.
Key Points:
- A quantitative phase analysis, highlighting the critical features of the technique, was provided for the jump executed during the Kata Unsu.
- Kinematics and neuromuscular activity can be assessed during the Kata Unsu jump performed by top level karateka.
- Neuromechanical parameters change during different Kata Unsu jump techniques.
- Appropriate performance capacity indicators based on the neuromechanical evaluation can describe changes due to a modification of the technique.
PMID:24474884
Functional results in airflow improvement using a "flip-flap" alar technique: our experience.
Di Stadio, Arianna; Macro, Carlo
A pinched nasal tip can arise as a congenital malformation or as the result of unsuccessful surgery. The nasal valve alteration due to this problem is not only an esthetic issue but also a functional one, because it can modify the nasal airflow. Several surgical techniques have been proposed in the literature; here we propose our own. The purpose of this study is the evaluation of nasal airway flow using our flip-flap technique for the correction of a pinched nasal tip. This is a retrospective study conducted on twelve patients. Tip cartilages were remodeled by means of autologous alar cartilage grafting. The patients underwent rhinomanometry before and after surgery to evaluate the results, and completed a self-survey to rate their degree of satisfaction in terms of perceived airflow improvement. Rhinomanometry showed improved nasal airflow (range 25% to 75%) in all patients. No significant difference was found between unilateral and bilateral alar malformation (p=0.49). Patient satisfaction reached 87.5%. Our analysis of the combined results (rhinomanometry and surveys) showed that this technique improves nasal flow in patients affected by a pinched nasal tip in all cases. Copyright © 2017 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.
Advances in Mid-Infrared Spectroscopy for Chemical Analysis
NASA Astrophysics Data System (ADS)
Haas, Julian; Mizaikoff, Boris
2016-06-01
Infrared spectroscopy in the 3-20 μm spectral window has evolved from a routine laboratory technique into a state-of-the-art spectroscopy and sensing tool by benefitting from recent progress in increasingly sophisticated spectra acquisition techniques and advanced materials for generating, guiding, and detecting mid-infrared (MIR) radiation. Today, MIR spectroscopy provides molecular information with trace to ultratrace sensitivity, fast data acquisition rates, and high spectral resolution catering to demanding applications in bioanalytics, for example, and to improved routine analysis. In addition to advances in miniaturized device technology without sacrificing analytical performance, selected innovative applications for MIR spectroscopy ranging from process analysis to biotechnology and medical diagnostics are highlighted in this review.
Toxic release consequence analysis tool (TORCAT) for inherently safer design plant.
Shariff, Azmi Mohd; Zaini, Dzulkarnain
2010-10-15
Many major accidents due to toxic release in the past have caused many fatalities, such as the tragedy of the MIC release in Bhopal, India (1984). One approach is the inherently safer design technique, which utilizes inherent safety principles to eliminate or minimize accidents rather than to control the hazard. This technique is best implemented in the preliminary design stage, where the consequences of a toxic release can be evaluated and necessary design improvements implemented to reduce accidents to as low as reasonably practicable (ALARP) without resorting to a costly protective system. However, currently no commercial tool with such capability is available. This paper reports preliminary findings on the development of a prototype tool for consequence analysis and design improvement via the inherent safety principle, utilizing a process design simulator integrated with a toxic release consequence analysis model. Consequence analyses based on worst-case scenarios during the process flowsheeting stage were conducted as case studies. The preliminary findings show that the toxic release consequence analysis tool (TORCAT) has the capability to eliminate or minimize potential toxic release accidents by adopting the inherent safety principle early in the preliminary design stage. 2010 Elsevier B.V. All rights reserved.
Statistical Symbolic Execution with Informed Sampling
NASA Technical Reports Server (NTRS)
Filieri, Antonio; Pasareanu, Corina S.; Visser, Willem; Geldenhuys, Jaco
2014-01-01
Symbolic execution techniques have recently been proposed for the probabilistic analysis of programs. These techniques seek to quantify the likelihood of reaching program events of interest, e.g., assert violations. They have many promising applications but suffer from scalability issues due to high computational demand. To address this challenge, we propose a statistical symbolic execution technique that performs Monte Carlo sampling of the symbolic program paths and uses the obtained information for Bayesian estimation and hypothesis testing with respect to the probability of reaching the target events. To speed up the convergence of the statistical analysis, we propose Informed Sampling, an iterative symbolic execution that first explores the paths that have high statistical significance, prunes them from the state space, and guides the execution towards less likely paths. The technique combines Bayesian estimation with a partial exact analysis for the pruned paths, leading to provably improved convergence of the statistical analysis. We have implemented statistical symbolic execution with informed sampling in the Symbolic PathFinder tool. We show experimentally that informed sampling obtains more precise results and converges faster than a purely statistical analysis, and may also be more efficient than an exact symbolic analysis. When the latter does not terminate, symbolic execution with informed sampling can give meaningful results under the same time and memory limits.
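As a rough illustration of the sampling idea described above (not the authors' Symbolic PathFinder implementation), the following sketch treats a toy program as a tree of fair binary branches, draws Monte Carlo path samples, and estimates the probability of one target path using the posterior mean under a uniform Beta(1,1) prior. The program model and all names are hypothetical.

```python
import random

def sample_path(depth=4, target=(True, True, False, False)):
    """Sample one path through a toy program of `depth` fair binary
    branches; return True iff it hits the (hypothetical) target event."""
    path = tuple(random.random() < 0.5 for _ in range(depth))
    return path == target

def estimate_probability(n_samples=10000, seed=1):
    """Bayesian estimate with a uniform Beta(1,1) prior:
    posterior mean = (hits + 1) / (n + 2)."""
    random.seed(seed)
    hits = sum(sample_path() for _ in range(n_samples))
    return (hits + 1) / (n_samples + 2)

p_est = estimate_probability()  # true path probability is 1/16 = 0.0625
```

Informed sampling, as the abstract describes it, would additionally remove already-explored high-probability paths from the sample space and account for them exactly, shrinking the variance of the remaining estimate.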
Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John
2013-05-01
Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability, and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis can provide supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows more complete characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to published data that are not easily comparable.
Protein purification and analysis: next generation Western blotting techniques.
Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V
2017-11-01
Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and its clinical relevance. Expert commentary: Over the last decade significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluid western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.
Use of communication techniques by Maryland dentists.
Maybury, Catherine; Horowitz, Alice M; Wang, Min Qi; Kleinman, Dushanka V
2013-12-01
Health care providers' use of recommended communication techniques can increase patients' adherence to prevention and treatment regimens and improve patient health outcomes. The authors conducted a survey of Maryland dentists to determine the number and type of communication techniques they use on a routine basis. The authors mailed a 30-item questionnaire to a random sample of 1,393 general practice dentists and all 169 members of the Maryland chapter of the American Academy of Pediatric Dentistry. The overall response rate was 38.4 percent. Analysis included descriptive statistics, analysis of variance, and ordinary least squares regression to examine the association of dentists' characteristics with the number of communication techniques used. The significance level was set at P < .05. General dentists reported routinely using a mean of 7.9 of the 18 communication techniques and 3.6 of the seven basic techniques, whereas pediatric dentists reported using a mean of 8.4 and 3.8 of those techniques, respectively. General dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .01) but not the seven basic techniques (P < .05). Pediatric dentists who had taken a communication course outside of dental school were more likely than those who had not to use the 18 techniques (P < .05) and the seven basic techniques (P < .01). The number of communication techniques that dentists used routinely varied across the 18 techniques and was low for most techniques. Practical implications: professional education is needed, both in dental school curricula and in continuing education courses, to increase use of recommended communication techniques. Specifically, dentists and their team members should consider taking communication skills courses and conducting an overall evaluation of their practices for user friendliness.
NASA Astrophysics Data System (ADS)
Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.
2013-03-01
Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Everett, W.R.; Rechnitz, G.A.
1999-01-01
A mini review of enzyme-based electrochemical biosensors for inhibition analysis of organophosphorus and carbamate pesticides is presented. The discussion includes the most recent literature to present advances in detection limits, selectivity, and real sample analysis. Recent reviews on the monitoring of pesticides and their residues suggest that the classical analytical techniques of gas and liquid chromatography are the most widely used methods of detection. These techniques, although very accurate in their determinations, can be quite time consuming and expensive and usually require extensive sample clean-up and pre-concentration. For these and many other reasons, the classical techniques are very difficult to adapt for field use. Numerous researchers, in the past decade, have developed and made improvements on biosensors for use in pesticide analysis. This mini review focuses on recent advances made in enzyme-based electrochemical biosensors for the determination of organophosphorus and carbamate pesticides.
NASA Astrophysics Data System (ADS)
Dontu, S.; Miclos, S.; Savastru, D.; Tautan, M.
2017-09-01
In recent years, many optoelectronic techniques have been developed to improve devices for tissue analysis. Spectral-Domain Optical Coherence Tomography (SD-OCT) is a medical interferometric imaging modality that provides depth-resolved tissue structure information with resolution in the μm range. However, SD-OCT has its own limitations and cannot offer biochemical information about the tissue. These data can be obtained with hyperspectral imaging (HSI), a non-invasive, sensitive, and real-time technique. In the present study, we have combined SD-OCT with HSI for tissue analysis; both methods have demonstrated significant potential in this context. Preliminary results on different tissues have highlighted the capabilities of this combined technique.
Analysis of Hospital Processes with Process Mining Techniques.
Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises
2015-01-01
Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis (PCA) for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides performance improvement over conventional feature extraction methods.
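A minimal sketch of the eigen-image idea: learn a PCA subspace per motion class from synthetic "motion images" and classify a new observation by its reconstruction error in each subspace. The data, dimensions, and class structure below are invented for illustration and are not the radar features used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_basis(X, k=2):
    """Mean and top-k principal directions (eigen images) of row-data X."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

def reconstruction_error(x, mean, basis):
    """Distance from x to its projection onto the class subspace."""
    c = (x - mean) @ basis.T
    return np.linalg.norm((x - mean) - c @ basis)

# Two synthetic motion classes: "falls" vary along one image direction,
# "walks" along another (stand-ins for time-frequency eigen images).
fall_dir, walk_dir = np.zeros(100), np.zeros(100)
fall_dir[:50], walk_dir[50:] = 1.0, 1.0
falls = rng.normal(0, 0.1, (40, 100)) + rng.normal(0, 1, (40, 1)) * fall_dir
walks = rng.normal(0, 0.1, (40, 100)) + rng.normal(0, 1, (40, 1)) * walk_dir

fall_model, walk_model = pca_basis(falls), pca_basis(walks)
test_obs = 2.0 * fall_dir + rng.normal(0, 0.1, 100)
label = ("fall" if reconstruction_error(test_obs, *fall_model)
         < reconstruction_error(test_obs, *walk_model) else "walk")
```

Classifying by subspace reconstruction error is what removes the need for hand-crafted features: the eigen images are learned directly from the observed motion data.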
A combined Bodian-Nissl stain for improved network analysis in neuronal cell culture.
Hightower, M; Gross, G W
1985-11-01
Bodian and Nissl procedures were combined to stain dissociated mouse spinal cord cells cultured on coverslips. The Bodian technique stains fine neuronal processes in great detail as well as an intracellular fibrillar network concentrated around the nucleus and in proximal neurites. The Nissl stain clearly delimits neuronal cytoplasm in somata and in large dendrites. A combination of these techniques allows the simultaneous depiction of neuronal perikarya and all afferent and efferent processes. Costaining with little background staining by either procedure suggests high specificity for neurons. This procedure could be exploited for routine network analysis of cultured neurons.
2014-01-01
Background Aseptic technique and handwashing have been shown to be important factors in perioperative bacterial transmission, however compliance often remains low despite guidelines and educational programs. Infectious complications of neuraxial (epidural and spinal) anesthesia are severe but fortunately rare. We conducted a survey to assess aseptic technique practices for neuraxial anesthesia in Israel before and after publication of international guidelines (which focused on handwashing, jewelry/watch removal and the wearing of a mask and cap). Methods The sampling frame was the general anesthesiology workforce in hospitals selected from each of the four medical faculties in Israel. Data was collected anonymously over one week in each hospital in two periods: April 2006 and September 2009. Most anesthesiologists received the questionnaires at departmental staff meetings and filled them out during these meetings; additionally, a local investigator approached anesthesiologists not present at these staff meetings individually. Primary endpoint questions were: handwashing, removal of wristwatch/jewelry, wearing mask, wearing hat/cap, wearing sterile gown; answering options were: "always", "usually", "rarely" or "never". Primary endpoint for analysis: respondents who both always wash their hands and always wear a mask ("handwash-mask composite") - "always" versus "any other response". We used logistic regression to perform the analysis. Time (2006, 2009) and hospital were included in the analysis as fixed effects. Results 135/160 (in 2006) and 127/164 (in 2009) anesthesiologists responded to the surveys; response rate 84% and 77% respectively. Respondents constituted 23% of the national anesthesiologist workforce. The main outcome "handwash-mask composite" was significantly increased after guideline publication (33% vs 58%; p = 0.0003). 
In addition, significant increases were seen for handwashing (37% vs 63%; p = 0.0004), wearing of mask (61% vs 78%; p < 0.0001), hat/cap (53% vs 76%; p = 0.0011) and wearing sterile gown (32% vs 51%; p < 0.0001). An apparent improvement in aseptic technique from 2006 to 2009 is noted across all hospitals and all physician groups. Conclusion Self-reported aseptic technique by Israeli anesthesiologists improved in the survey conducted after the publication of international guidelines. Although the before-after study design cannot prove a cause-effect relationship, it does show an association between the publication of international guidelines and significant improvement in self-reported aseptic technique. PMID:24661425
Improving Signal Detection using Allan and Theo Variances
NASA Astrophysics Data System (ADS)
Hardy, Andrew; Broering, Mark; Korsch, Wolfgang
2017-09-01
Precision measurements often deal with small signals buried within electronic noise. Extracting these signals can be enhanced through digital signal processing, and improving these techniques yields better signal-to-noise ratios. Studies presently performed at the University of Kentucky are utilizing the electro-optic Kerr effect to understand cell charging effects within ultra-cold neutron storage cells. This work is relevant for the neutron electric dipole moment (nEDM) experiment at Oak Ridge National Laboratory. These investigations, and future investigations in general, will benefit from the improved analysis techniques illustrated here. This project showcases various methods for determining the optimum duration over which data should be gathered. Typically, extending the measuring time of an experimental run reduces the averaged noise; however, experiments also encounter drift due to fluctuations, which mitigates the benefits of extended data gathering. By comparing FFT averaging techniques with Allan and Theo variance measurements, quantifiable differences in signal detection will be presented. This research is supported by DOE Grants DE-FG02-99ER411001 and DE-AC05-00OR22725.
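The non-overlapping Allan variance used above can be sketched as follows. For white noise it decreases as 1/τ, while drift makes it rise again at long τ, which is exactly how it identifies the optimum averaging duration. This is a generic textbook formulation, not the authors' analysis code.

```python
import random

def allan_variance(y, m):
    """Non-overlapping Allan variance at averaging factor m: average the
    data in bins of m samples, then take half the mean squared difference
    of successive bin averages."""
    n_bins = len(y) // m
    means = [sum(y[i * m:(i + 1) * m]) / m for i in range(n_bins)]
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(n_bins - 1)]
    return 0.5 * sum(diffs) / len(diffs)

# For pure white noise the Allan variance keeps falling as 1/m, so longer
# averaging keeps helping; real drift would make it rise again at large m.
random.seed(0)
noise = [random.gauss(0, 1) for _ in range(100000)]
av_short = allan_variance(noise, 10)    # expect roughly 1/10
av_long = allan_variance(noise, 1000)   # expect roughly 1/1000
```

The Theo variance mentioned in the abstract plays the same role but uses a different averaging scheme that improves confidence at the longest averaging times.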
NASA Astrophysics Data System (ADS)
Zink, Frank Edward
The detection and classification of pulmonary nodules is of great interest in chest radiography. Nodules are often indicative of primary cancer, and their detection is particularly important in asymptomatic patients. The ability to classify nodules as calcified or non-calcified is important because calcification is a positive indicator that the nodule is benign. Dual-energy methods offer the potential to improve both the detection and classification of nodules by allowing the formation of material-selective images. Tissue-selective images can improve detection by virtue of the elimination of obscuring rib structure. Bone -selective images are essentially calcium images, allowing classification of the nodule. A dual-energy technique is introduced which uses a computed radiography system to acquire dual-energy chest radiographs in a single-exposure. All aspects of the dual-energy technique are described, with particular emphasis on scatter-correction, beam-hardening correction, and noise-reduction algorithms. The adaptive noise-reduction algorithm employed improves material-selective signal-to-noise ratio by up to a factor of seven with minimal sacrifice in selectivity. A clinical comparison study is described, undertaken to compare the dual-energy technique to conventional chest radiography for the tasks of nodule detection and classification. Observer performance data were collected using the Free Response Observer Characteristic (FROC) method and the bi-normal Alternative FROC (AFROC) performance model. Results of the comparison study, analyzed using two common multiple observer statistical models, showed that the dual-energy technique was superior to conventional chest radiography for detection of nodules at a statistically significant level (p < .05). 
Discussion of the comparison study emphasizes the unique combination of data collection and analysis techniques employed, as well as the limitations of comparison techniques in the larger context of technology assessment.
The new ATLAS Fast Calorimeter Simulation
NASA Astrophysics Data System (ADS)
Schaarschmidt, J.; ATLAS Collaboration
2017-10-01
Current and future needs for large-scale simulated samples motivate the development of reliable fast simulation techniques. The new Fast Calorimeter Simulation is an improved parameterized response of single particles in the ATLAS calorimeter that aims to accurately emulate the key features of the detailed calorimeter response as simulated with Geant4, yet approximately ten times faster. Principal component analysis and machine learning techniques are used to improve the performance and decrease the memory needs compared to the current version of the ATLAS Fast Calorimeter Simulation. A prototype of this new Fast Calorimeter Simulation is in development and its integration into the ATLAS simulation infrastructure is ongoing.
Tao, Dingyin; Zhang, Lihua; Shan, Yichu; Liang, Zhen; Zhang, Yukui
2011-01-01
High-performance liquid chromatography-electrospray ionization tandem mass spectrometry (HPLC-ESI-MS-MS) is regarded as one of the most powerful techniques for separation and identification of proteins. Recently, much effort has been made to improve the separation capacity, detection sensitivity, and analysis throughput of micro- and nano-HPLC, by increasing column length, reducing column internal diameter, and using integrated techniques. Development of HPLC columns has also been rapid, as a result of the use of submicrometer packing materials and monolithic columns. All these innovations result in clearly improved performance of micro- and nano-HPLC for proteome research.
New mechanistic insights in the NH3-SCR reactions at low temperature
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggeri, Maria Pia; Selleri, Tomasso; Nova, Isabella
2016-05-06
The present study is focused on the investigation of the low temperature Standard SCR reaction mechanism over Fe- and Cu-promoted zeolites. Different techniques are employed, including in situ DRIFTS, transient reaction analysis and chemical trapping techniques. The results present strong evidence of nitrite formation in the oxidative activation of NO and of their role in SCR reactions. These elements lead to a deeper understanding of the standard SCR chemistry at low temperature and can potentially improve the consistency of mechanistic mathematical models. Furthermore, comprehension of the mechanism on a fundamental level can contribute to the development of improved SCR catalysts.
NASA Astrophysics Data System (ADS)
Bordovsky, Michal; Catrysse, Peter; Dods, Steven; Freitas, Marcio; Klein, Jackson; Kotacka, Libor; Tzolov, Velko; Uzunov, Ivan M.; Zhang, Jiazong
2004-05-01
We present the state of the art for commercial design and simulation software in the 'front end' of photonic circuit design. One recent advance is to extend the flexibility of the software by using more than one numerical technique on the same optical circuit. There are a number of popular and proven techniques for analysis of photonic devices. Examples of these techniques include the Beam Propagation Method (BPM), the Coupled Mode Theory (CMT), and the Finite Difference Time Domain (FDTD) method. For larger photonic circuits, it may not be practical to analyze the whole circuit by any one of these methods alone, but often some smaller part of the circuit lends itself to at least one of these standard techniques. Later the whole problem can be analyzed on a unified platform. This kind of approach can enable analysis for cases that would otherwise be cumbersome, or even impossible. We demonstrate solutions for more complex structures ranging from the sub-component layout, through the entire device characterization, to the mask layout and its editing. We also present recent advances in the above well established techniques. This includes the analysis of nano-particles, metals, and non-linear materials by FDTD, photonic crystal design and analysis, and improved models for high concentration Er/Yb co-doped glass waveguide amplifiers.
New test techniques and analytical procedures for understanding the behavior of advanced propellers
NASA Technical Reports Server (NTRS)
Stefko, G. L.; Bober, L. J.; Neumann, H. E.
1983-01-01
Analytical procedures and experimental techniques were developed to improve the capability to design advanced high speed propellers. Some results from the propeller lifting line and lifting surface aerodynamic analysis codes are compared with propeller force data, probe data and laser velocimeter data. In general, the code comparisons with data indicate good qualitative agreement. A rotating propeller force balance demonstrated good accuracy and reduced test time by 50 percent. Results from three propeller flow visualization techniques are shown which illustrate some of the physical phenomena occurring on these propellers.
NASA Technical Reports Server (NTRS)
Katzberg, S. J.
1974-01-01
A primary limitation of many solid state photoconductors used in electro-optical systems is their slow response in converting varying light intensities into electrical signals. An optical feedback technique is presented which can extend the frequency response of systems that use these detectors by orders of magnitude without adversely affecting overall signal-to-noise ratio performance. The technique is analyzed to predict the improvement possible and a system is implemented using cadmium sulfide to demonstrate the effectiveness of the technique and the validity of the analysis.
NASA Astrophysics Data System (ADS)
Glass, John O.; Reddick, Wilburn E.; Reeves, Cara; Pui, Ching-Hon
2004-05-01
Reliably quantifying therapy-induced leukoencephalopathy in children treated for cancer is a challenging task due to its varying MR properties and its similarity to normal tissues and imaging artifacts. T1, T2, PD, and FLAIR images were analyzed for a subset of 15 children from an institutional protocol for the treatment of acute lymphoblastic leukemia. Three different analysis techniques were compared to examine improvements in the segmentation accuracy of leukoencephalopathy versus manual tracings by two expert observers. The first technique utilized no a priori information and a white matter mask based on the segmentation of the first serial examination of each patient; MR images were then segmented with a Kohonen Self-Organizing Map. The other two techniques combine a priori maps from the ICBM atlas, spatially normalized to each patient and resliced using SPM99 software. The a priori maps were included as input, and a gradient magnitude threshold calculated on the FLAIR images was also utilized. The second technique used a 2-dimensional threshold, while the third algorithm utilized a 3-dimensional threshold. Kappa values were compared for the three techniques against each observer, and improvements were seen with each addition to the original algorithm (Observer 1: 0.651, 0.653, 0.744; Observer 2: 0.603, 0.615, 0.699).
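The kappa values quoted above measure chance-corrected agreement between the automatic segmentation and the manual tracings. A minimal sketch of Cohen's kappa for two binary (lesion/background) voxel label sequences, on made-up data:

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two equal-length binary label sequences:
    observed agreement corrected for agreement expected by chance."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    pa, pb = sum(a) / n, sum(b) / n
    p_chance = pa * pb + (1 - pa) * (1 - pb)
    return (p_obs - p_chance) / (1 - p_chance)

auto   = [1, 1, 0, 0, 1, 0, 0, 0]  # hypothetical automatic labels
manual = [1, 1, 0, 0, 0, 0, 0, 1]  # hypothetical expert labels
k = cohens_kappa(auto, manual)     # 6/8 raw agreement -> kappa = 7/15
```

Because kappa subtracts chance agreement, it stays meaningful even when the lesion class is rare, which is why it is the usual choice for validating segmentations against expert tracings.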
NASA Technical Reports Server (NTRS)
Oswald, Hayden; Molthan, Andrew L.
2011-01-01
Satellite remote sensing has gained widespread use in the field of operational meteorology. Although raw satellite imagery is useful, several techniques exist which can convey multiple types of data in a more efficient way. One of these techniques is multispectral compositing. The NASA Short-term Prediction Research and Transition (SPoRT) Center has developed two multispectral satellite imagery products which utilize data from the Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA's Terra and Aqua satellites, based upon products currently generated and used by the European Organization for the Exploitation of Meteorological Satellites (EUMETSAT). The nighttime microphysics product allows users to identify clouds occurring at different altitudes, but emphasizes fog and low cloud detection. This product improves upon current spectral difference and single channel infrared techniques. Each of the current products has its own set of advantages for nocturnal fog detection, but each also has limiting drawbacks which can hamper the analysis process. The multispectral product combines each current product with a third channel difference. Since the final image is enhanced with color, it simplifies the fog identification process. Analysis has shown that the nighttime microphysics imagery product represents a substantial improvement to conventional fog detection techniques, as well as provides a preview of future satellite capabilities to forecasters.
NASA Astrophysics Data System (ADS)
McReynolds, Naomi; Auñón Garcia, Juan M.; Guengerich, Zoe; Smith, Terry K.; Dholakia, Kishan
2017-02-01
We present an optical spectroscopic technique, making use of both Raman signals and fluorescence spectroscopy, for the identification of five brands of commercially available extra-virgin olive oil (EVOO). We demonstrate our technique on both a 'bulk-optics' free-space system and a compact device. Using the compact device, which is capable of recording both Raman and fluorescence signals, we achieved an average sensitivity and specificity for discrimination of 98.4% and 99.6%, respectively. Our approach demonstrates that both Raman and fluorescence spectroscopy can be used for portable discrimination of EVOOs, which obviates the need for centralised laboratories and opens up the prospect of in-field testing. This technique may enable detection of EVOO that has undergone counterfeiting or adulteration. One of the main challenges facing Raman spectroscopy for use in quality control of EVOOs is that the oxidation of EVOO, which naturally occurs with aging, causes shifts in Raman spectra over time, implying that regular retraining would be necessary. We present a method of analysis to minimize the effect that aging has on discrimination efficiency; we show that by discarding the first principal component, which contains information on the variations due to oxidation, we can improve discrimination efficiency and thus the robustness of our technique.
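The idea of discarding the first principal component to suppress aging drift can be sketched as follows; the "oxidation" drift direction and the spectra here are synthetic stand-ins, not the measured EVOO data.

```python
import numpy as np

def drop_first_pc(X):
    """Remove the first principal component (assumed here to capture the
    aging/oxidation drift) from row-spectra X before classification."""
    mean = X.mean(axis=0)
    Xc = X - mean
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    pc1 = Vt[0]
    # Project the first principal direction out of every spectrum
    return Xc - np.outer(Xc @ pc1, pc1) + mean

rng = np.random.default_rng(0)
drift = np.linspace(0, 1, 50)  # hypothetical oxidation direction
spectra = rng.normal(0, 0.01, (30, 50)) + rng.normal(0, 1, (30, 1)) * drift
cleaned = drop_first_pc(spectra)
```

This only works when the drift dominates the first component, as the abstract argues it does for oxidation; otherwise discarding PC1 would throw away discriminative information.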
New ORNL Method Could Unleash Solar Power Potential
Simpson, Mary Jane
2018-01-16
Measurement and data analysis techniques developed at the Department of Energy's Oak Ridge National Laboratory could provide new insight into performance-robbing flaws in crystalline structures, ultimately improving the performance of solar cells.
Visual cluster analysis and pattern recognition template and methods
Osbourn, Gordon Cecil; Martinez, Rubel Francisco
1999-01-01
A method of clustering using a novel template to define a region of influence. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable and improve pattern recognition techniques.
Research and Development in Very Long Baseline Interferometry (VLBI)
NASA Technical Reports Server (NTRS)
Himwich, William E.
2004-01-01
Contents include the following: 1.Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.
Improving Efficiency with Work Sampling.
ERIC Educational Resources Information Center
Friedman, Mark; Hertz, Paul
1982-01-01
Work sampling is a managerial accounting technique which provides information about the efficiency of an operation. This analysis determines what tasks are being performed during a period of time to ascertain whether time and effort are being allocated efficiently. (SK)
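Work sampling reduces to estimating a proportion of time from random-moment observations. A minimal sketch with a normal-approximation confidence interval, on invented observation data:

```python
import math

def work_sampling_estimate(observations, task):
    """Fraction of random-moment observations spent on `task`, with a
    95% normal-approximation confidence half-width."""
    n = len(observations)
    p = sum(1 for o in observations if o == task) / n
    half_width = 1.96 * math.sqrt(p * (1 - p) / n)
    return p, half_width

# 100 hypothetical random-moment observations of one clerk's activity
obs = ["filing"] * 30 + ["phone"] * 50 + ["idle"] * 20
p, hw = work_sampling_estimate(obs, "phone")  # p = 0.5, hw ~ 0.098
```

The half-width formula also answers the planning question in reverse: to halve the uncertainty, the number of random observations must be quadrupled.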
Comparison of measured and calculated dynamic loads for the Mod-2 2.5 MW wind turbine system
NASA Technical Reports Server (NTRS)
Zimmerman, D. K.; Shipley, S. A.; Miller, R. D.
1995-01-01
The Boeing Company, under contract to the Electric Power Research Institute (EPRI), has completed a test program on the Mod-2 wind turbines at Goodnoe Hills, Washington. The objectives were to update fatigue load spectra, discern site and machine differences, measure vortex generator effects, and evaluate rotational sampling techniques. This paper shows the test setup and loads instrumentation, loads data comparisons, and test/analysis correlations. Test data are correlated with DYLOSAT predictions using both the NASA interim turbulence model and rotationally sampled winds as inputs; the latter is demonstrated to have the potential to improve the test/analysis correlations. The paper concludes with an assessment of the importance of vortex generators, site dependence, and machine differences on fatigue loads. The adequacy of the prediction techniques used is evaluated and recommendations are made for improvements to the methodology.
Thermal and mechanical behavior of flame retardant epoxy-polyesterurethane blends
NASA Astrophysics Data System (ADS)
Patel, R. H.; Hirani, A. V.; Kachhia, P. H.
2016-05-01
Polyesterurethanes are used in many applications due to their unique combination of properties such as toughness, flexibility, and solvent resistance. Flame retardant properties of polymers are now of commercial interest because of their potential use in high performance applications. In the present study, attempts have been made to improve the flame retardant properties of a conventional epoxy resin by incorporating a phosphorus-based polyesterurethane. The polyesterurethane was synthesized in the laboratory and characterized by chemical and instrumental analysis techniques. Thermal stability and char value of the blends were determined using thermogravimetric analysis. Limiting Oxygen Index (LOI) and UL-94 test methods were used to determine the flame retardant properties of the neat polymer and the blends in film form. Mechanical properties of the blends, such as tensile strength, elongation, and impact resistance, were also measured. The polyblend of epoxy resin with phosphorus-based polyesterurethane shows improved flame retardant properties compared to the neat epoxy resin.
Guided SAR image despeckling with probabilistic non local weights
NASA Astrophysics Data System (ADS)
Gokul, Jithin; Nair, Madhu S.; Rajan, Jeny
2017-12-01
SAR images are generally corrupted by granular disturbances called speckle, which make visual analysis and detail extraction difficult. Non-local despeckling techniques with probabilistic similarity have been a recent trend in SAR despeckling. To achieve effective speckle suppression without compromising detail preservation, we propose an improvement to the existing Generalized Guided Filter with Bayesian Non-Local Means (GGF-BNLM) method. The proposed method (Guided SAR Image Despeckling with Probabilistic Non-Local Weights) replaces the heuristic parametric constants in the GGF-BNLM method with values derived dynamically from image statistics for weight computation. The proposed changes make the GGF-BNLM method adaptive and, as a result, yield significant improvement in performance. Experimental analysis on SAR images shows excellent speckle reduction without compromising feature preservation when compared to the GGF-BNLM method. Results are also compared with other state-of-the-art and classic SAR despeckling techniques to demonstrate the effectiveness of the proposed method.
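A generic non-local weighting step of the kind this family of methods builds on can be sketched as follows; the Gaussian kernel and the fixed parameter h are simplifications, since GGF-BNLM derives its weights from probabilistic, speckle-statistics-based similarity rather than this plain form:

```python
import numpy as np

def nonlocal_weights(patches, ref, h):
    """Exponential weights from patch dissimilarity (generic non-local means form).
    The adaptive method described above would derive the filtering parameter from
    local image statistics; here h is simply supplied by the caller."""
    d2 = np.mean((patches - ref) ** 2, axis=(1, 2))  # per-patch mean squared difference
    w = np.exp(-d2 / h ** 2)
    return w / w.sum()                               # normalized weights

rng = np.random.default_rng(0)
ref = np.ones((5, 5))                                # reference patch
patches = np.stack([ref + 0.1 * rng.standard_normal((5, 5)) for _ in range(20)])
w = nonlocal_weights(patches, ref, h=0.5)
denoised_center = np.sum(w[:, None, None] * patches, axis=0)[2, 2]
```

The denoised value is a convex combination of similar patches, which is what suppresses speckle while similar structures reinforce each other.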
Deep Learning Nuclei Detection in Digitized Histology Images by Superpixels
Sornapudi, Sudhir; Stanley, Ronald Joe; Stoecker, William V.; Almubarak, Haidar; Long, Rodney; Antani, Sameer; Thoma, George; Zuna, Rosemary; Frazier, Shelliane R.
2018-01-01
Background: Advances in image analysis and computational techniques have facilitated automatic detection of critical features in histopathology images. Detection of nuclei is critical for squamous epithelium cervical intraepithelial neoplasia (CIN) classification into normal, CIN1, CIN2, and CIN3 grades. Methods: In this study, a deep learning (DL)-based nuclei segmentation approach is investigated, based on gathering localized information through the generation of superpixels using a simple linear iterative clustering algorithm and training with a convolutional neural network. Results: The proposed approach was evaluated on a dataset of 133 digitized histology images and achieved an overall nuclei detection (object-based) accuracy of 95.97%, with demonstrated improvement over imaging-based and clustering-based benchmark techniques. Conclusions: The proposed DL-based nuclei segmentation method with superpixel analysis has shown improved segmentation results in comparison to state-of-the-art methods. PMID:29619277
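A toy stand-in for the superpixel step can be sketched as k-means clustering over position and intensity; a real pipeline would use a proper SLIC implementation (e.g. from scikit-image), and the grid seeding, compactness weight, and iteration count here are illustrative:

```python
import numpy as np

def slic_like(img, n_seg=16, compactness=0.1, n_iter=5):
    """Simplified SLIC-style superpixels: k-means over (y, x, intensity) features,
    with `compactness` weighting the spatial terms against intensity."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.stack([compactness * ys.ravel(),
                      compactness * xs.ravel(),
                      img.ravel()], axis=1)
    side = int(np.sqrt(n_seg))                      # seed centres on a regular grid
    cy = np.linspace(0, h - 1, side).astype(int)
    cx = np.linspace(0, w - 1, side).astype(int)
    centres = np.array([[compactness * y, compactness * x, img[y, x]]
                        for y in cy for x in cx])
    for _ in range(n_iter):
        d = ((feats[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)                        # assign pixels to nearest centre
        for k in range(len(centres)):
            if np.any(labels == k):
                centres[k] = feats[labels == k].mean(0)
    return labels.reshape(h, w)

img = np.zeros((32, 32)); img[8:24, 8:24] = 1.0     # bright square on dark background
labels = slic_like(img)
```

Each superpixel then yields a localized patch from which a CNN can learn nucleus versus background appearance.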
Enhanced Higgs boson to τ(+)τ(-) search with deep learning.
Baldi, P; Sadowski, P; Whiteson, D
2015-03-20
The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.
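The architecture described, eight nonlinear processing layers feeding a signal/background probability, can be sketched as a forward pass; the feature count, layer width, activation choice, and random weights below are placeholders, not the trained network from the paper:

```python
import numpy as np

def deep_tau_classifier(x, params):
    """Forward pass of an eight-hidden-layer network (tanh units, sigmoid output),
    mirroring the depth reported above; the weights are random stand-ins."""
    h = x
    for W, b in params[:-1]:
        h = np.tanh(h @ W + b)                     # nonlinear processing layers
    W, b = params[-1]
    return 1.0 / (1.0 + np.exp(-(h @ W + b)))      # P(signal | event features)

rng = np.random.default_rng(1)
n_features, width = 25, 64                         # assumed low-level kinematic features
sizes = [n_features] + [width] * 8 + [1]
params = [(0.1 * rng.standard_normal((a, b)), np.zeros(b))
          for a, b in zip(sizes[:-1], sizes[1:])]
events = rng.standard_normal((100, n_features))    # a batch of simulated events
scores = deep_tau_classifier(events, params)
```

In the actual analysis the weights would be trained on labeled simulation, with architecture and optimizer hyperparameters tuned by Bayesian optimization.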
A new perspective on global mean sea level (GMSL) acceleration
NASA Astrophysics Data System (ADS)
Watson, Phil J.
2016-06-01
The vast body of contemporary climate change science is largely underpinned by the premise of a measured acceleration from anthropogenic forcings evident in key climate change proxies: greenhouse gas emissions, temperature, and mean sea level. Consequently, over recent years, the question of whether there is a measurable acceleration in global mean sea level has generated fierce, widespread professional, social, and political debate. Attempts to measure acceleration in global mean sea level (GMSL) have often used comparatively crude analysis techniques that provide little temporal insight into these key questions. This work proposes improved techniques to measure real-time velocity and acceleration based on five GMSL reconstructions spanning the time frame from 1807 to 2014 with substantially improved temporal resolution. While this analysis highlights key differences between the respective reconstructions, there is now more robust, convincing evidence of recent acceleration in the trend of GMSL.
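One simple way to extract velocity and acceleration from a reconstruction is a quadratic fit, shown below on a synthetic series; the paper's techniques are more refined, and the trend and acceleration values here are invented for illustration:

```python
import numpy as np

# Quadratic fit to a sea-level series: GMSL(t) ≈ a t² + b t + c,
# so velocity(t) = 2a t + b and acceleration = 2a (constant).
# The series is synthetic: a 3 mm/yr trend plus 0.01 mm/yr² acceleration.
years = np.arange(1880, 2015)
t = years - years[0]
gmsl = 3.0 * t + 0.5 * 0.01 * t ** 2             # mm relative to 1880
a, b, c = np.polyfit(t, gmsl, 2)
acceleration = 2 * a                             # mm/yr²
velocity_2014 = 2 * a * t[-1] + b                # mm/yr at the end of the series
```

On real reconstructions the fit window and noise model matter greatly, which is precisely why crude whole-record fits give little temporal insight.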
Bilek, Maciej; Namieśnik, Jacek
2016-01-01
For a long time, chromatographic techniques and related techniques have stimulated the development of new procedures in the field of pharmaceutical analysis. The newly developed methods, characterized by improved metrological parameters, allow for more accurate testing of, among others, the composition of raw materials, intermediates and final products. The chromatographic techniques also enable studies on waste generated in research laboratories and factories producing pharmaceuticals and parapharmaceuticals. Based on a review of reports published in Polish pharmaceutical journals, we assessed the impact of chromatographic techniques on the development of pharmaceutical analysis. The first chromatographic technique used in pharmaceutical analysis was so-called capillary analysis. It was applied in the 1930s to control the identity of pharmaceutical formulations. In the 1940s and 1950s, chromatographic techniques were mostly the subject of review publications, while their use in experimental work was rare. Paper chromatography and thin layer chromatography were introduced in the 1960s and 1970s, respectively. These new analytical tools contributed to the intensive development of research in the field of phytochemistry and the analysis of herbal medicines. The development of column chromatography-based techniques, i.e., gas chromatography and high performance liquid chromatography, took place at the end of the 20th century. Both techniques were widely applied in pharmaceutical analysis, for example, to assess the stability of drugs, test for impurities and degradation products, and in pharmacokinetic studies. The first decade of the 21st century was the time of new detection methods in gas and liquid chromatography.
The information sources used to write this article were Polish pharmaceutical journals, both professional and scientific, originating from the interwar and post-war period, i.e., "Kronika Farmaceutyczna", "Farmacja Współczesna", "Wiadomości Farmaceutyczne", "Acta Poloniae Pharmaceutica", "Farmacja Polska", "Dissertationes Pharmaceuticae", "Annales UMCS sectio DDD Phamacia". The number of published works using various chromatography techniques was assessed based on the content description of individual issues of the journal "Acta Poloniae Pharmaceutica".
Function modeling: improved raster analysis through delayed reading and function raster datasets
John S. Hogland; Nathaniel M. Anderson; J .Greg Jones
2013-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space, often limiting the types of analyses that can be performed. To address this issue, we have developed Function Modeling. Function Modeling is a new modeling framework that streamlines the...
Analysis of earth rotation solution from Starlette
NASA Technical Reports Server (NTRS)
Schutz, B. E.; Cheng, M. K.; Shum, C. K.; Eanes, R. J.; Tapley, B. D.
1989-01-01
Earth rotation parameter (ERP) solutions were derived from the Starlette orbit analysis during the Main MERIT Campaign, using a consider-covariance analysis technique to assess the effects of errors on the polar motion solutions. The polar motion solution was then improved through the simultaneous adjustment of dynamical parameters representing identified dominant perturbing sources (such as the geopotential and ocean-tide coefficients) on the polar motion solutions. Finally, an improved ERP solution was derived using the gravity field model PTCF1, described by Tapley et al. (1986). The accuracy of the Starlette ERP solution was assessed by comparison with the LAGEOS-derived ERP solutions.
Application of kaizen methodology to foster departmental engagement in quality improvement.
Knechtges, Paul; Decker, Michael Christopher
2014-12-01
The Toyota Production System, also known as Lean, is a structured approach to continuous quality improvement that has been developed over the past 50 years to transform the automotive manufacturing process. In recent years, these techniques have been successfully applied to quality and safety improvement in the medical field. One of these techniques is kaizen, the Japanese word for "good change." The central tenet of kaizen is the quick analysis of the small, manageable components of a problem and the rapid implementation of a solution with ongoing, real-time reassessment. Kaizen adds an additional "human element": all stakeholders, not just management, must be involved in such change. Because of the small size of the changes involved in a kaizen event and the inherent focus on human factors and change management, a kaizen event can serve as a good introduction to continuous quality improvement for a radiology department. Copyright © 2014. Published by Elsevier Inc.
Off-line, built-in test techniques for VLSI circuits
NASA Technical Reports Server (NTRS)
Buehler, M. G.; Sievers, M. W.
1982-01-01
It is shown that the use of redundant on-chip circuitry improves the testability of an entire VLSI circuit. In the study described here, five techniques applied to a two-bit ripple carry adder are compared. The techniques considered are self-oscillation, self-comparison, partition, scan path, and built-in logic block observer. It is noted that both classical stuck-at faults and nonclassical faults, such as bridging faults (shorts), stuck-on x faults where x may be 0, 1, or vary between the two, and parasitic flip-flop faults occur in IC structures. To simplify the analysis of the testing techniques, however, a stuck-at fault model is assumed.
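The stuck-at fault model the analysis assumes can be illustrated directly on a two-bit ripple carry adder: inject a stuck-at value on an internal net and check whether any input pair exposes it. The gate-level structure and net names below are a minimal sketch, not the circuit from the study:

```python
def ripple_adder_2bit(a, b, faults=None):
    """Gate-level 2-bit ripple carry adder. `faults` maps a net name to a
    stuck-at value (0 or 1), overriding the computed signal on that net."""
    faults = faults or {}
    def net(name, value):
        return faults.get(name, value)
    c = 0
    bits = []
    for i in range(2):
        ai, bi = (a >> i) & 1, (b >> i) & 1
        s = net(f"sum{i}", ai ^ bi ^ c)                     # full-adder sum
        c = net(f"carry{i}", (ai & bi) | (c & (ai ^ bi)))   # full-adder carry
        bits.append(s)
    return bits[0] | (bits[1] << 1) | (c << 2)

# Exhaustive testing detects the fault: some input pair yields a wrong output.
fault = {"carry0": 1}   # carry out of stage 0 stuck at 1
detected = any(ripple_adder_2bit(a, b) != ripple_adder_2bit(a, b, fault)
               for a in range(4) for b in range(4))
```

Built-in test techniques like those compared above aim to achieve this kind of fault exposure with on-chip circuitry instead of exhaustive external vectors.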
NASA Technical Reports Server (NTRS)
Garai, Anirban; Diosady, Laslo T.; Murman, Scott M.; Madavan, Nateri K.
2016-01-01
The perfectly matched layer (PML) technique is developed in the context of a high-order spectral-element Discontinuous-Galerkin (DG) method. The technique is applied to a range of test cases and is shown to be superior compared to other approaches, such as those based on using characteristic boundary conditions and sponge layers, for treating the inflow and outflow boundaries of computational domains. In general, the PML technique improves the quality of the numerical results for simulations of practical flow configurations, but it also exhibits some instabilities for large perturbations. A preliminary analysis that attempts to understand the source of these instabilities is discussed.
The multi-channel infrared sea truth radiometric calibrator (MISTRC)
Suarez, M.J.; Emery, W. J.; Wick, G.A.
1997-01-01
A new multichannel infrared sea truth radiometer has been designed and built to improve validation of satellite-determined sea surface temperature. Horizontal grid polarized filters installed on the shortwave channels are very effective in reducing reflected solar radiation and in improving the noise characteristics. The system uses a continuous (every other cycle) seawater calibration technique. An analysis of the data from its first deployment is presented and recommendations are made for further improving the experimental method.
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) Testing (2) Simulation (3) Model checking (4) Symbolic execution (5) Management reviews (6) Technical reviews (7) Inspections (8) Walk-throughs (9) Audits (10) Analysis (complexity analysis, control flow analysis, algorithmic analysis) (11) Formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) System and software development processes interact with each other at different phases throughout the development life cycle. (2) Reviews are emphasized in both system and software development (Fig. 1.3). For some reviews (e.g. SRR, PDR, CDR), there are both system versions and software versions. (3) Analysis techniques are emphasized (e.g. Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them. (4) Reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Further SEASAT SAR coastal ocean wave analysis
NASA Technical Reports Server (NTRS)
Kasischke, E. S.; Shuchman, R. A.; Meadows, G. A.; Jackson, P. L.; Tseng, Y.
1981-01-01
Analysis techniques used to exploit SEASAT synthetic aperture radar (SAR) data of gravity waves are discussed and the SEASAT SAR's ability to monitor large scale variations in gravity wave fields in both deep and shallow water is evaluated. The SAR analysis techniques investigated included motion compensation adjustments and the semicausal model for spectral analysis of SAR wave data. It was determined that spectra generated from fast Fourier transform analysis (FFT) of SAR wave data were not significantly altered when either range telerotation adjustments or azimuth focus shifts were used during processing of the SAR signal histories, indicating that SEASAT imagery of gravity waves is not significantly improved or degraded by motion compensation adjustments. Evaluation of the semicausal (SC) model using SEASAT SAR data from Rev. 974 indicates that the SC spectral estimates were not significantly better than the FFT results.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
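The Monte Carlo weight-sensitivity phase can be sketched as follows; the criteria values, nominal weights, and Dirichlet concentration are illustrative stand-ins for the paper's AHP-derived inputs:

```python
import numpy as np

# Monte Carlo uncertainty analysis of a weighted-sum susceptibility model:
# perturb the criteria weights (Dirichlet draws around nominal weights) and
# record the spread of the resulting susceptibility score per map cell.
rng = np.random.default_rng(42)
criteria = rng.random((500, 4))                      # 500 map cells x 4 standardized criteria
w_nominal = np.array([0.4, 0.3, 0.2, 0.1])           # AHP-style weights summing to 1
samples = rng.dirichlet(200 * w_nominal, size=1000)  # weight draws concentrated near nominal
scores = criteria @ samples.T                        # susceptibility per cell per draw
mean_map = scores.mean(axis=1)                       # central susceptibility map
uncertainty = scores.std(axis=1)                     # per-cell sensitivity to weight changes
```

Cells whose score varies strongly across weight draws are the ones where the model's criteria weights dominate the uncertainty, which is what the decomposition in the paper attributes.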
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, L.; Lanza, R.C.
1999-12-01
The authors have developed a near-field coded aperture imaging system for use with fast neutron techniques as a tool for the detection of contraband and hidden explosives through nuclear elemental analysis. The technique relies on the prompt gamma rays produced by fast neutron interactions with the object being examined. The position of the nuclear elements is determined by the location of the gamma emitters. Existing fast neutron techniques have limitations: in Pulsed Fast Neutron Analysis (PFNA), neutrons are used with very low efficiency, and in Fast Neutron Analysis (FNA), the sensitivity for detection of the signature gamma rays is very low. For the Coded Aperture Fast Neutron Analysis (CAFNA®) the authors have developed, the efficiency for both using the probing fast neutrons and detecting the prompt gamma rays is high. For a probed volume of n³ volume elements (voxels) in a cube of n resolution elements on a side, the sensitivity can be compared with other neutron probing techniques. As compared to PFNA, the improvement in neutron utilization is n², where the total number of voxels in the object being examined is n³. Compared to FNA, the improvement in gamma-ray imaging is proportional to the total open area of the coded aperture plane; a typical value is n²/2, where n² is the number of total detector resolution elements or the number of pixels in an object layer. It should be noted that the actual signal-to-noise ratio of a system also depends on the nature and distribution of background events, and this comparison may somewhat reduce the effective sensitivity of CAFNA. The authors have performed analysis, Monte Carlo simulations, and preliminary experiments using low- and high-energy gamma-ray sources. The results show that a high-sensitivity 3-D contraband imaging and detection system can be realized using CAFNA.
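The quoted sensitivity gains are simple functions of the resolution n and can be checked directly; n = 64 below is an arbitrary illustrative value, not taken from the abstract:

```python
# Sensitivity gains quoted above for an object of n x n x n voxels:
# CAFNA uses neutrons ~n^2 times more efficiently than PFNA, and images
# gamma rays ~n^2/2 times more efficiently than FNA (half-open coded aperture).
n = 64                       # resolution elements per side (illustrative)
voxels = n ** 3              # total probed voxels
gain_vs_pfna = n ** 2        # neutron-utilization improvement over PFNA
gain_vs_fna = n ** 2 // 2    # open area of the coded aperture plane, in pixels
```

Even at modest resolution the nominal gains are thousands-fold, which is why background distribution, not counting statistics, becomes the practical limit.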
NASA Astrophysics Data System (ADS)
Kuntoro, Hadiyan Yusuf; Hudaya, Akhmad Zidni; Dinaryanto, Okto; Majid, Akmal Irfan; Deendarlianto
2016-06-01
Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed methods and techniques to study two-phase flow phenomena in industrial settings, such as the chemical, petroleum, and nuclear industries. One such developing technique is image processing. This technique is widely used in two-phase flow research due to its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, this technique makes it possible to capture direct visual information about the flow that is difficult to obtain by other methods. The main objective of this paper is to present an improved image processing algorithm, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (hL) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also intended to extend the scanty database of high quality stratified flow data. In the present work, the measurement results showed satisfactory agreement with previous works.
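The core film-thickness measurement can be sketched as a column-wise count of liquid pixels in a binarized side-view frame; the threshold, pixel scale, and synthetic frame below are illustrative, and the paper's algorithm is considerably more elaborate:

```python
import numpy as np

def film_thickness(frame, threshold=0.5, pixel_mm=0.1):
    """Column-wise liquid film thickness h_L from a binarized side-view frame.
    Pixels above `threshold` are treated as liquid; this is a minimal
    thresholding stand-in for the full segmentation described above."""
    liquid = frame > threshold          # boolean liquid mask
    counts = liquid.sum(axis=0)         # liquid pixels per image column
    return counts * pixel_mm            # thickness profile along the pipe, in mm

# Synthetic frame: 40 rows x 100 columns, bottom 12 rows are liquid
frame = np.zeros((40, 100))
frame[-12:, :] = 1.0
h_L = film_thickness(frame)
```

Wave geometry (amplitude, wavelength) then follows from the spatial variation of the h_L profile across columns and frames.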
Zweerink, Alwin; Allaart, Cornelis P; Kuijer, Joost P A; Wu, LiNa; Beek, Aernout M; van de Ven, Peter M; Meine, Mathias; Croisille, Pierre; Clarysse, Patrick; van Rossum, Albert C; Nijveldt, Robin
2017-12-01
Although myocardial strain analysis is a potential tool to improve patient selection for cardiac resynchronization therapy (CRT), there is currently no validated clinical approach to derive segmental strains. We evaluated the novel segment length in cine (SLICE) technique to derive segmental strains from standard cardiovascular MR (CMR) cine images in CRT candidates. Twenty-seven patients with left bundle branch block underwent CMR examination including cine imaging and myocardial tagging (CMR-TAG). SLICE was performed by measuring segment length between anatomical landmarks throughout all phases on short-axis cines. This measure of frame-to-frame segment length change was compared to CMR-TAG circumferential strain measurements. Subsequently, conventional markers of CRT response were calculated. Segmental strains showed good to excellent agreement between SLICE and CMR-TAG (septum strain, intraclass correlation coefficient (ICC) 0.76; lateral wall strain, ICC 0.66). Conventional markers of CRT response also showed close agreement between both methods (ICC 0.61-0.78). Reproducibility of SLICE was excellent for intra-observer testing (all ICC ≥0.76) and good for interobserver testing (all ICC ≥0.61). The novel SLICE post-processing technique on standard CMR cine images offers both accurate and robust segmental strain measures compared to the 'gold standard' CMR-TAG technique, and has the advantage of being widely available. • Myocardial strain analysis could potentially improve patient selection for CRT. • Currently a well validated clinical approach to derive segmental strains is lacking. • The novel SLICE technique derives segmental strains from standard CMR cine images. • SLICE-derived strain markers of CRT response showed close agreement with CMR-TAG. • Future studies will focus on the prognostic value of SLICE in CRT candidates.
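The segment-length principle behind SLICE reduces to a simple formula, strain(t) = (L(t) - L0)/L0, with L0 the initial (end-diastolic) segment length; the sketch below computes it from landmark coordinates, using a deliberately trivial two-point segment as input:

```python
import numpy as np

def slice_strain(landmarks):
    """Segmental strain from frame-to-frame segment length change:
    strain(t) = (L(t) - L0) / L0, where L(t) is the summed distance along
    the anatomical landmarks at cine frame t and L0 the first-frame length."""
    lengths = np.array([np.sum(np.linalg.norm(np.diff(f, axis=0), axis=1))
                        for f in landmarks])
    return (lengths - lengths[0]) / lengths[0]

# A segment that shortens from 40 mm to 30 mm across three frames (invented data)
frames = [np.array([[0.0, 0.0], [40.0, 0.0]]),
          np.array([[0.0, 0.0], [35.0, 0.0]]),
          np.array([[0.0, 0.0], [30.0, 0.0]])]
strain = slice_strain(frames)   # 0 at end-diastole, -0.25 at peak shortening
```

In practice each segment is traced through all short-axis cine phases, giving per-segment strain curves comparable to tagged circumferential strain.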
Neutron beam measurement of industrial polymer materials for composition and bulk integrity
NASA Astrophysics Data System (ADS)
Rogante, M.; Rosta, L.; Heaton, M. E.
2013-10-01
Neutron beam techniques, among other non-destructive diagnostics, are indispensable in the complete analysis of industrial materials and components, supplying fundamental information. In this paper, nanoscale small-angle neutron scattering analysis and prompt gamma activation analysis for the characterization of industrial polymers are considered. The basic theoretical aspects are briefly introduced and some applications are presented. The investigations of the SU-8 polymer in axial airflow microturbines, i.e. microelectromechanical systems, are presented foremost. Also presented are full and feasibility studies on polyurethanes, composites based on cross-linked polymers reinforced by carbon fibres, and polymer cement concrete. The obtained results have provided a substantial contribution to the improvement of the considered materials, and confirmed the industrial applicability of the adopted techniques in the analysis of polymers.
Walsh, Alex J.; Sharick, Joe T.; Skala, Melissa C.; Beier, Hope T.
2016-01-01
Time-correlated single photon counting (TCSPC) enables acquisition of fluorescence lifetime decays with high temporal resolution within the fluorescence decay. However, many thousands of photons per pixel are required for accurate lifetime decay curve representation, instrument response deconvolution, and lifetime estimation, particularly for two-component lifetimes. TCSPC imaging speed is inherently limited due to the single photon per laser pulse nature and low fluorescence event efficiencies (<10%) required to reduce bias towards short lifetimes. Here, simulated fluorescence lifetime decays are analyzed by SPCImage and SLIM Curve software to determine the limiting lifetime parameters and photon requirements of fluorescence lifetime decays that can be accurately fit. Data analysis techniques to improve fitting accuracy for low photon count data were evaluated. Temporal binning of the decays from 256 time bins to 42 time bins significantly (p<0.0001) improved fit accuracy in SPCImage and enabled accurate fits with low photon counts (as low as 700 photons/decay), a 6-fold reduction in required photons and therefore improvement in imaging speed. Additionally, reducing the number of free parameters in the fitting algorithm by fixing the lifetimes to known values significantly reduced the lifetime component error from 27.3% to 3.2% in SPCImage (p<0.0001) and from 50.6% to 4.2% in SLIM Curve (p<0.0001). Analysis of nicotinamide adenine dinucleotide–lactate dehydrogenase (NADH-LDH) solutions confirmed temporal binning of TCSPC data and a reduced number of free parameters improves exponential decay fit accuracy in SPCImage. Altogether, temporal binning (in SPCImage) and reduced free parameters are data analysis techniques that enable accurate lifetime estimation from low photon count data and enable TCSPC imaging speeds up to 6x and 300x faster, respectively, than traditional TCSPC analysis. PMID:27446663
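Two of the reported accuracy aids, temporal binning and fixing the lifetimes, can be sketched with plain least squares: once both lifetimes are fixed to known values, the two-component decay is linear in the amplitudes. The bin counts, lifetimes, and noiseless decay below are illustrative, not the SPCImage or SLIM Curve implementations:

```python
import numpy as np

def rebin(decay, n_out):
    """Temporal binning: sum groups of adjacent time bins
    (e.g. 256 -> 42, as evaluated above)."""
    n = len(decay) // n_out * n_out
    return decay[:n].reshape(n_out, -1).sum(axis=1)

def fit_fixed_lifetimes(decay, t, tau1, tau2):
    """With both lifetimes fixed, a1*exp(-t/tau1) + a2*exp(-t/tau2) is
    linear in (a1, a2): solve by ordinary least squares."""
    basis = np.column_stack([np.exp(-t / tau1), np.exp(-t / tau2)])
    (a1, a2), *_ = np.linalg.lstsq(basis, decay, rcond=None)
    return a1, a2

t = np.linspace(0, 10, 256)                           # ns, 256 time bins
true = 700 * (0.6 * np.exp(-t / 0.4) + 0.4 * np.exp(-t / 2.5))
binned = rebin(true, 42)                              # coarser decay for low-count fits
a1, a2 = fit_fixed_lifetimes(true, t, 0.4, 2.5)       # amplitudes with lifetimes fixed
```

Fixing the lifetimes removes the nonlinear free parameters entirely, which is the mechanism behind the large reduction in component error reported above.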
Subjective analysis of energy-management projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, R.
The most successful energy conservation projects always reflect human effort to fine-tune engineering and technological improvements. Subjective analysis is a technique for predicting and measuring human interaction before a project begins. The examples of a subjective analysis for office buildings incorporate evaluative questions that are structured to produce numeric values for computer scoring. Each project would need to develop its own pertinent questions and determine appropriate values for the answers.
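The idea of structuring evaluative questions to produce numeric values for computer scoring can be sketched as follows; the questions, answer scales, and weights are invented, since, as noted above, each project would define its own:

```python
# Minimal scoring sketch for a subjective analysis: each evaluative question
# maps its answer to a numeric value, and the project score is a weighted sum.
questions = [
    ("Occupants can adjust their own thermostats", {"yes": 2, "partially": 1, "no": 0}, 3),
    ("Facilities staff review energy reports monthly", {"yes": 2, "no": 0}, 2),
    ("Management has endorsed the conservation goal", {"yes": 2, "no": 0}, 5),
]
answers = {"Occupants can adjust their own thermostats": "partially",
           "Facilities staff review energy reports monthly": "yes",
           "Management has endorsed the conservation goal": "yes"}
score = sum(scale[answers[q]] * weight for q, scale, weight in questions)
max_score = sum(max(scale.values()) * weight for _, scale, weight in questions)
```

A project scoring well below its maximum before work begins signals human-interaction risks that engineering improvements alone will not overcome.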
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qi, X. Sharon, E-mail: xqi@mednet.ucla.edu; Ruan, Dan; Lee, Steve P.
2015-03-15
Purpose: To develop a practical workflow for retrospectively analyzing target and normal tissue dose–volume endpoints for various intensity modulated radiation therapy (IMRT) delivery techniques; to develop technique-specific planning goals to improve plan consistency and quality when feasible. Methods and Materials: A total of 165 consecutive head-and-neck patients from our patient registry were selected and retrospectively analyzed. All IMRT plans were generated using the same dose–volume guidelines for TomoTherapy (Tomo, Accuray), TrueBeam (TB, Varian) using fixed-field IMRT (TB-IMRT) or RAPIDARC (TB-RAPIDARC), or Siemens Oncor (Siemens-IMRT, Siemens). A MATLAB-based dose–volume extraction and analysis tool was developed to export dosimetric endpoints for each patient. With a fair stratification of the patient cohort, the variation of achieved dosimetric endpoints was analyzed among different treatment techniques. Upon identification of statistically significant variations, technique-specific planning goals were derived from dynamically accumulated institutional data. Results: Retrospective analysis showed that although all techniques yielded comparable target coverage, the doses to the critical structures differed. The maximum cord doses were 34.1 ± 2.6, 42.7 ± 2.1, 43.3 ± 2.0, and 45.1 ± 1.6 Gy for Tomo, TB-IMRT, TB-RAPIDARC, and Siemens-IMRT plans, respectively. Analyses of variance showed significant differences for the maximum cord doses but no significant differences for other selected structures among the investigated IMRT delivery techniques. Subsequently, a refined technique-specific dose–volume guideline for maximum cord dose was derived at a confidence level of 95%. The dosimetric plans that failed the refined technique-specific planning goals were reoptimized according to the refined constraints.
We observed better cord sparing with minimal variations for the target coverage and other organ at risk sparing for the Tomo cases, and higher parotid doses for C-arm linear accelerator–based IMRT and RAPIDARC plans. Conclusion: Patient registry–based processes allowed easy and systematic dosimetric assessment of treatment plan quality and consistency. Our analysis revealed the dependence of certain dosimetric endpoints on the treatment techniques. Technique-specific refinement of planning goals may lead to improvement in plan consistency and plan quality.
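The derivation of a technique-specific planning goal from accumulated institutional data can be made concrete with a small sketch. This is an assumed, simplified reading of the "95% confidence" refinement (a one-sided normal upper bound on historically achieved maximum cord doses), not the authors' published method; the cohort below is synthetic, generated from the Tomo mean and SD quoted in the abstract.

```python
import numpy as np

# Illustrative sketch: a technique-specific planning goal taken as the
# one-sided 95% upper bound of historically achieved maximum cord doses,
# assuming normality. This is an assumption about the method, not the
# authors' exact statistics.
def planning_goal_95(doses_gy):
    doses = np.asarray(doses_gy, dtype=float)
    return doses.mean() + 1.645 * doses.std(ddof=1)

# Synthetic Tomo-like cohort: mean ~34.1 Gy, SD ~2.6 Gy (values quoted above)
rng = np.random.default_rng(0)
tomo_doses = rng.normal(34.1, 2.6, size=60)
goal = planning_goal_95(tomo_doses)  # an upper planning goal in Gy
```

Plans exceeding such a goal would be flagged for reoptimization, mirroring the workflow described in the abstract.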
Recent Advances in Measurement and Dietary Mitigation of Enteric Methane Emissions in Ruminants
Patra, Amlan K.
2016-01-01
Methane (CH4) emission, mainly produced during normal fermentation of feeds by rumen microorganisms, is a major contributor to greenhouse gas (GHG) emissions. Several enteric CH4 mitigation technologies have been explored recently. A number of new techniques have also been developed, and existing techniques improved, to evaluate CH4 mitigation technologies and to prepare precise inventories of GHG emissions. The aim of this review is to discuss recently developed CH4 measurement and mitigation technologies. The respiration chamber technique is still considered the gold standard owing to its precision and reproducibility in CH4 measurements. With the adoption of recent recommendations for improving the technique, the SF6 method can achieve a level of precision similar to the chamber technique. Short-term CH4 measurement techniques generally involve considerable within- and between-animal variation. Among the short-term techniques, the Greenfeed and methane hood systems are likely more suitable for evaluating CH4 mitigation strategies, provided measurements can be obtained at different times of day relative to the diurnal cycle of CH4 production. The carbon dioxide to CH4 ratio, sniffer, and other short-term breath analysis techniques are more suitable for on-farm screening of large numbers of animals to identify low CH4-producing animals for genetic selection. Various indirect measurement techniques have also been investigated in recent years. Several new dietary CH4 mitigation technologies have been explored, but only a few are practical and cost-effective. Future research should be directed toward medium- and long-term mitigation strategies that can be deployed on farms to accomplish substantial reductions in CH4 emissions and to profitably reduce the carbon footprint of livestock production systems. 
This review presents recent developments and a critical analysis of different measurement techniques and dietary mitigation strategies for enteric CH4 emissions. PMID:27243027
Messaraa, C; Metois, A; Walsh, M; Hurley, S; Doyle, L; Mansfield, A; O'Connor, C; Mavon, A
2018-01-24
Skin topographic measurements are of paramount importance in the field of dermo-cosmetic evaluation. The aim of this study was to investigate how the Antera 3D, a multi-purpose handheld camera, correlates with other topographic techniques and with changes in skin topography following the use of a cosmetic product. Skin topographic measurements were collected from 26 female volunteers aged 45-70 years with the Antera 3D, the DermaTOP, and image analysis of parallel-polarized pictures. Different Antera 3D analysis filters were investigated for repeatability, correlation with the other imaging techniques, and ability to detect improvements in skin topography following application of a serum. Most Antera 3D parameters were found to be strongly correlated with the DermaTOP parameters. No association was found between the Antera 3D parameters and measurements on parallel-polarized photographs. Measurement repeatability was comparable among the different analysis filters, with the exception of wrinkle max depth and roughness Rt. Following a single application of a tightening serum, both the Antera 3D wrinkle and texture parameters recorded significant improvements, with the best improvements observed with the large filter. The Antera 3D demonstrated its relevance for cosmetic product evaluation. We also provide recommendations for analysis based on our findings.
Bladed wheels damage detection through Non-Harmonic Fourier Analysis improved algorithm
NASA Astrophysics Data System (ADS)
Neri, P.
2017-05-01
Recent papers introduced Non-Harmonic Fourier Analysis for bladed wheel damage detection. This technique showed its potential in estimating the frequency of sinusoidal signals even when the acquisition time is short with respect to the vibration period, provided that certain hypotheses are fulfilled. However, previously proposed algorithms showed severe limitations in detecting cracks at an early stage. The present paper proposes an improved algorithm that can detect a blade vibration frequency shift caused by a crack whose size is very small compared to the blade width. Such a technique could be implemented for condition-based maintenance, allowing non-contact methods to be used for vibration measurements. A stator-fixed laser sensor could monitor all the blades as they pass in front of the spot, giving valuable information about the wheel's health. In this configuration, the acquisition time for each blade becomes shorter as the machine rotational speed increases. In this situation, traditional Discrete Fourier Transform analysis yields poor frequency resolution and is not suitable for detecting small frequency shifts. Non-Harmonic Fourier Analysis, by contrast, showed high reliability in vibration frequency estimation even with data samples collected over a short time range. A description of the improved algorithm is provided in the paper, along with a comparison with the previous one. Finally, a validation of the method is presented, based on finite element simulation results.
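The core idea, estimating a sinusoid's frequency on a grid finer than the DFT bin spacing 1/T, can be sketched as follows. This is a minimal illustration of the principle only, not the paper's improved algorithm; the sample rate and signal are invented.

```python
import numpy as np

# Minimal sketch of the idea behind Non-Harmonic Fourier Analysis:
# evaluate the Fourier coefficient on a fine grid of non-integer
# frequencies and pick the maximizer, instead of being limited to the
# DFT bin spacing 1/T. Illustrative only; the paper's algorithm
# includes further refinements for crack detection.
def nhfa_frequency(signal, fs, f_grid):
    t = np.arange(len(signal)) / fs
    # |sum_n x(t_n) exp(-i 2 pi f t_n)| for each candidate frequency f
    spectrum = np.abs(np.exp(-2j * np.pi * np.outer(f_grid, t)) @ signal)
    return f_grid[np.argmax(spectrum)]

fs = 1000.0
t = np.arange(64) / fs            # only 64 ms of data -> DFT spacing ~15.6 Hz
x = np.sin(2 * np.pi * 123.4 * t)
f_grid = np.arange(50.0, 200.0, 0.1)
f_hat = nhfa_frequency(x, fs, f_grid)
```

Despite the short record, the estimate lands far closer to 123.4 Hz than the ~15.6 Hz DFT bin spacing would allow, which is the property the paper exploits for small frequency-shift detection.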
Molecular diagnosis of bloodstream infections: planning to (physically) reach the bedside.
Leggieri, N; Rida, A; François, P; Schrenzel, Jacques
2010-08-01
Faster identification of infecting microorganisms and treatment options is a first-ranking priority in the infectious disease area, in order to prevent inappropriate treatment and overuse of broad-spectrum antibiotics. Standard bacterial identification is intrinsically time-consuming, and very recently there has been a burst in the number of commercially available nonphenotype-based techniques and in the documentation of their possible clinical impact. Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) is now a standard diagnostic procedure on cultures and holds promise on spiked blood. Meanwhile, commercial PCR-based techniques have improved with the use of bacterial DNA enrichment methods and the diversity of amplicon analysis techniques (melting curve analysis, microarrays, gel electrophoresis, sequencing, and analysis by mass spectrometry), enabling them to challenge bacterial culture as the gold standard by providing earlier diagnosis with better 'clinical' sensitivity and additional prognostic information. Laboratory practice has already changed with MALDI-TOF MS, but a change in clinical practice, driven by emerging nucleic acid-based techniques, will require the demonstration of real-life applicability as well as robust clinical-impact-oriented studies.
Peleato, Nicolas M; Legge, Raymond L; Andrews, Robert C
2018-06-01
The use of fluorescence data coupled with neural networks for improved predictability of drinking water disinfection by-products (DBPs) was investigated. Novel application of autoencoders to process high-dimensional fluorescence data was related to common dimensionality reduction techniques of parallel factors analysis (PARAFAC) and principal component analysis (PCA). The proposed method was assessed based on component interpretability as well as for prediction of organic matter reactivity to formation of DBPs. Optimal prediction accuracies on a validation dataset were observed with an autoencoder-neural network approach or by utilizing the full spectrum without pre-processing. Latent representation by an autoencoder appeared to mitigate overfitting when compared to other methods. Although DBP prediction error was minimized by other pre-processing techniques, PARAFAC yielded interpretable components which resemble fluorescence expected from individual organic fluorophores. Through analysis of the network weights, fluorescence regions associated with DBP formation can be identified, representing a potential method to distinguish reactivity between fluorophore groupings. However, distinct results due to the applied dimensionality reduction approaches were observed, dictating a need for considering the role of data pre-processing in the interpretability of the results. In comparison to common organic measures currently used for DBP formation prediction, fluorescence was shown to improve prediction accuracies, with improvements to DBP prediction best realized when appropriate pre-processing and regression techniques were applied. The results of this study show promise for the potential application of neural networks to best utilize fluorescence EEM data for prediction of organic matter reactivity.
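The pre-processing-plus-regression pipeline can be made concrete with a minimal PCA-based sketch; the paper's autoencoder and PARAFAC variants are not reproduced here, and all "EEM" data below are synthetic stand-ins.

```python
import numpy as np

# Hedged sketch of dimensionality reduction followed by regression:
# reduce simulated "EEM" vectors with PCA (via SVD) and regress a
# DBP-like response on the scores. All data are synthetic; the latent
# "fluorophore concentrations" and loadings are invented.
rng = np.random.default_rng(1)
n_samples, n_features, n_components = 200, 50, 3

latent = rng.normal(size=(n_samples, n_components))   # hidden fluorophore signals
loadings = rng.normal(size=(n_components, n_features))
eem = latent @ loadings + 0.01 * rng.normal(size=(n_samples, n_features))
dbp = latent @ np.array([2.0, -1.0, 0.5])             # reactivity tied to latent factors

# PCA: center the data, take the SVD, keep the leading component scores
X = eem - eem.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = X @ Vt[:n_components].T

# Ordinary least squares regression on the reduced representation
design = np.column_stack([np.ones(n_samples), scores])
coef, *_ = np.linalg.lstsq(design, dbp, rcond=None)
pred = design @ coef
r2 = 1 - np.sum((dbp - pred) ** 2) / np.sum((dbp - dbp.mean()) ** 2)
```

In this idealized setting the three principal components span the latent fluorophore space, so the regression recovers the response almost exactly; real EEM data would show the overfitting and interpretability trade-offs the paper discusses.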
Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I
2016-10-18
There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass, and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
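The particle/dissolved separation that dilution is meant to aid can be sketched with the iterative mean-plus-3-sigma threshold commonly used in SP-ICPMS data processing. This is an assumed, generic illustration rather than the authors' exact procedure, and the dwell-time intensities below are synthetic.

```python
import numpy as np

# Illustrative sketch (generic SP-ICPMS practice, not the authors' code):
# separate particle events from the dissolved baseline by iteratively
# trimming intensities above mean + k*sigma of the remaining background.
def split_particle_events(intensities, k=3.0, max_iter=20):
    x = np.asarray(intensities, dtype=float)
    background = x.copy()
    for _ in range(max_iter):
        thresh = background.mean() + k * background.std()
        keep = background <= thresh
        if keep.all():
            break
        background = background[keep]
    thresh = background.mean() + k * background.std()
    return x[x > thresh], x[x <= thresh]

rng = np.random.default_rng(2)
dissolved = rng.poisson(20, size=5000).astype(float)   # steady dissolved signal
particles = 500 + rng.normal(0, 20, size=25)           # sparse particle spikes
signal = np.concatenate([dissolved, particles])
events, baseline = split_particle_events(signal)
```

High dissolved levels raise the baseline and its variance, pushing the threshold up and burying small particle events; dilution lowers the dissolved term, which is exactly the resolution gain the abstract describes.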
Kittell, David E; Mares, Jesus O; Son, Steven F
2015-04-01
Two time-frequency analysis methods based on the short-time Fourier transform (STFT) and continuous wavelet transform (CWT) were used to determine time-resolved detonation velocities with microwave interferometry (MI). The results were directly compared to well-established analysis techniques consisting of a peak-picking routine as well as a phase unwrapping method (i.e., quadrature analysis). The comparison is conducted on experimental data consisting of transient detonation phenomena observed in triaminotrinitrobenzene and ammonium nitrate-urea explosives, representing high and low quality MI signals, respectively. Time-frequency analysis proved much more capable of extracting useful and highly resolved velocity information from low quality signals than the phase unwrapping and peak-picking methods. Additionally, control of the time-frequency methods is mainly constrained to a single parameter, which allows velocity information to be extracted in a largely unbiased manner. In contrast, the phase unwrapping technique introduces user-based variability, while the peak-picking technique does not achieve a highly resolved velocity result. Both STFT and CWT methods are proposed as improved additions to the analysis methods applied to MI detonation experiments, and may be useful in similar applications.
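The STFT route from a raw MI signal to a time-resolved velocity can be sketched as follows. The sample rate, microwave wavelength, and steady 6 km/s toy signal are assumptions for illustration, not the paper's experimental values; the key relation is that the interferometer beat frequency f_b maps to velocity through v = f_b * lambda / 2.

```python
import numpy as np

# Minimal STFT sketch of velocity extraction from a microwave
# interferometry signal. All numbers below are assumed for the example.
fs = 1.0e7                     # 10 MS/s digitizer (assumed)
lam = 0.03                     # 3 cm microwave wavelength in the explosive (assumed)
t = np.arange(40960) / fs
v_true = 6000.0                # steady 6 km/s detonation for this toy signal
x = np.cos(2 * np.pi * (2 * v_true / lam) * t)   # beat frequency f_b = 2 v / lam

win, hop = 1024, 512
freqs = np.fft.rfftfreq(win, d=1 / fs)
velocities = []
for start in range(0, len(x) - win, hop):
    seg = x[start:start + win] * np.hanning(win)   # windowed short-time segment
    peak = freqs[np.argmax(np.abs(np.fft.rfft(seg)))]
    velocities.append(peak * lam / 2)              # beat frequency -> velocity
velocities = np.array(velocities)
```

The single tunable quantity here is the window length, mirroring the abstract's point that the time-frequency methods are controlled by essentially one parameter trading time against frequency resolution.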
NASA Astrophysics Data System (ADS)
Kim, J.; Kim, J. H.; Jee, G.; Lee, C.; Kim, Y.
2017-12-01
The Spectral Airglow Temperature Imager (SATI) installed at King Sejong Station (62.22S, 58.78W), Antarctica, has continuously measured airglow emissions from the OH (6-2) Meinel and O2 (0-1) atmospheric bands since 2002, in order to investigate the dynamics of the polar MLT region. The measurements allow us to derive rotational temperatures at the peak emission heights, approximately 87 km and 94 km for the OH and O2 airglows, respectively. In this study, we briefly introduce an improved analysis technique that modifies the original analysis code. The major change relative to the original program is an improved routine for finding the exact center position in the observed image. In addition to this brief introduction of the improved technique, we also present statistical results on the periodic variations of the temperatures in the two layers during the period 2002 through 2011 and compare our results with temperatures measured by satellite.
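The abstract's key change is a better routine for locating the image center. As a generic, hypothetical illustration (the improved SATI code itself is not shown in the abstract), an intensity-weighted centroid recovers the center of a symmetric fringe pattern:

```python
import numpy as np

# Hypothetical illustration of a center-finding step: estimate the
# center of a symmetric fringe image as its intensity-weighted centroid.
# This is the generic idea only, not the actual improved SATI routine.
def image_center(img):
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    return (ys * img).sum() / total, (xs * img).sum() / total

# Synthetic symmetric ring (Gaussian annulus) centered at (60, 40)
y, x = np.indices((128, 128))
r = np.hypot(y - 60, x - 40)
ring = np.exp(-0.5 * ((r - 20) / 3.0) ** 2)
cy, cx = image_center(ring)
```

An accurate center matters because the rotational temperature is derived from the radial positions of the interference fringes, so a center offset biases the retrieved temperature.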
Qualitative biomechanical principles for application in coaching.
Knudson, Duane
2007-01-01
Many aspects of human movement in sport can be readily understood through Newtonian rigid-body mechanics. Many of these laws and biomechanical principles, however, are counterintuitive to many people. There are also several problems in applying biomechanics to sports, so its use in the qualitative analysis of sport skills by many coaches has been limited. Biomechanics scholars have long been interested in developing principles that facilitate the qualitative application of biomechanics to improve movement performance and reduce the risk of injury. This paper summarizes the major North American efforts to establish a set of general biomechanical principles of movement, and illustrates how such principles can be used to improve the application of biomechanics in the qualitative analysis of sport technique. A coach helping a player with a tennis serve is presented as an example. The standardization of terminology for biomechanical principles is proposed as an important first step in improving the application of biomechanics in sport. There is also a need for international cooperation and for research on the effectiveness of applying biomechanical principles in the coaching of sport techniques.
NASA Astrophysics Data System (ADS)
Nahar, Jannatun; Johnson, Fiona; Sharma, Ashish
2018-02-01
Conventional bias correction is usually applied on a grid-by-grid basis, meaning that the resulting corrections cannot address biases in the spatial distribution of climate variables. To solve this problem, a two-step bias correction method is proposed here to correct time series at multiple locations conjointly. The first step transforms the data to a set of statistically independent univariate time series, using a technique known as independent component analysis (ICA). The mutually independent signals can then be bias corrected as univariate time series and back-transformed to improve the representation of spatial dependence in the data. The spatially corrected data are then bias corrected at the grid scale in the second step. The method has been applied to two CMIP5 General Circulation Model simulations for six different climate regions of Australia for two climate variables—temperature and precipitation. The results demonstrate that the ICA-based technique leads to considerable improvements in temperature simulations with more modest improvements in precipitation. Overall, the method results in current climate simulations that have greater equivalency in space and time with observational data.
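The two-step structure can be demonstrated with a deliberately simplified stand-in: statistical whitening (eigendecomposition of the covariance) replaces ICA, and the univariate correction is simple mean/variance rescaling rather than full quantile mapping. The "model" and "observed" series below are synthetic; this shows only how correcting in a decorrelated space restores the observed spatial dependence.

```python
import numpy as np

# Simplified sketch of the two-step idea. Whitening stands in for ICA and
# mean/variance rescaling for quantile mapping; data are synthetic.
rng = np.random.default_rng(3)
A = np.array([[1.0, 0.8], [0.8, 1.0]])                # observed spatial correlation
obs = rng.normal(size=(1000, 2)) @ np.linalg.cholesky(A).T
mod = 1.5 * rng.normal(size=(1000, 2)) + 2.0          # biased, spatially uncorrelated model

def whiten(x):
    """Center and decorrelate x using its sample covariance."""
    c = np.cov(x, rowvar=False)
    vals, vecs = np.linalg.eigh(c)
    w = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return (x - x.mean(axis=0)) @ w

# Step 1: transform to independent-like components, correct, and
# back-transform by re-imposing the observed covariance and mean.
# Step 2 (not shown) would repeat a univariate correction per grid cell.
corrected = whiten(mod) @ np.linalg.cholesky(np.cov(obs, rowvar=False)).T + obs.mean(axis=0)
```

After the transform, the corrected series matches the observations' mean and cross-site covariance, which grid-by-grid correction alone cannot guarantee.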
NASA Technical Reports Server (NTRS)
Zavodsky, Bradley; Chou, Shih-Hung; Jedlovec, Gary
2012-01-01
Improvements to global and regional numerical weather prediction (NWP) have been demonstrated through assimilation of data from NASA's Atmospheric Infrared Sounder (AIRS). Current operational data assimilation systems use AIRS radiances, but the impact on regional forecasts has been much smaller than for global forecasts. Retrieved profiles from AIRS contain much of the information that is contained in the radiances and may be able to reveal reasons for this reduced impact. Assimilating AIRS retrieved profiles in an identical analysis configuration to the radiances, tracking the quantity and quality of the assimilated data in each technique, and examining analysis increments and forecast impact from each data type can yield clues as to the reasons for the reduced impact. By doing this with regional scale models, individual synoptic features (and the impact of AIRS on these features) can be more easily tracked. This project examines the assimilation of hyperspectral sounder data used in operational numerical weather prediction by comparing operational techniques used for AIRS radiances and research techniques used for AIRS retrieved profiles. Parallel versions of a configuration of the Weather Research and Forecasting (WRF) model with Gridpoint Statistical Interpolation (GSI) that mimics the analysis methodology, domain, and observational datasets of the regional North American Mesoscale (NAM) model run at the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC) are run to examine the impact of each type of AIRS data set. The first configuration will assimilate the AIRS radiance data along with other conventional and satellite data using techniques implemented within the operational system; the second configuration will assimilate AIRS retrieved profiles instead of AIRS radiances in the same manner. Preliminary results of this study will be presented, focusing on the analysis impact of the radiances and profiles for selected cases.
Improved motors for utility applications: Volume 6, Squirrel-cage rotor analysis: Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, J.W.; McCoy, R.M.
1986-11-01
An analysis of squirrel cage induction motor rotors was undertaken in response to an Industry Assessment Study finding that 10% of motor failures are rotor related. The analysis focuses on evaluating rotor design life. The evaluation combines state-of-the-art electromagnetic, thermal, and structural solution techniques into an integrated analysis and presents a simple summary. Finite element techniques are central tools in the analysis. The analysis is applied to a specific forced-draft fan drive design. Fans as a category of application have a higher failure rate than other categories of power station auxiliary motor applications. Forced-draft fan drives are one of the major fan drives that accelerate a relatively high rotor load inertia. Various starting and operating conditions are studied for this forced-draft fan drive motor, including a representative application duty cycle.
Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians
NASA Astrophysics Data System (ADS)
Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von
2008-03-01
Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments, thereby addressing typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions, allowing the comparison and integration of different diagnostics' results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA's capabilities for nonlinear error propagation, the inclusion of systematic effects, and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and diagnostic design. In order to cope with next-step fusion device requirements, appropriate techniques are being explored for fast analysis applications.
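The core Bayesian step of combining several diagnostics' results can be illustrated with the simplest case: two independent Gaussian measurements of the same quantity, whose product posterior has precision equal to the sum of the individual precisions. The diagnostics and numbers below are invented for the example; real IDA handles nonlinear forward models and non-Gaussian densities.

```python
import numpy as np

# Toy illustration of the IDA principle: combine independent Gaussian
# measurement densities of the same quantity into a single posterior.
# Posterior precision = sum of precisions; posterior mean is the
# precision-weighted average. Numbers are invented for the example.
def combine_gaussians(means, sigmas):
    means = np.asarray(means, dtype=float)
    prec = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    post_var = 1.0 / prec.sum()
    post_mean = post_var * (prec * means).sum()
    return post_mean, np.sqrt(post_var)

# e.g. two hypothetical diagnostics estimating the same density (a.u.)
mean, sigma = combine_gaussians([3.0, 3.4], [0.2, 0.4])
```

The combined uncertainty is smaller than either individual one, and a large discrepancy between the two inputs relative to their widths would flag a faulty measurement, which is one of the IDA uses named above.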
Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.
Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A
2018-01-01
Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.
Hernandez, Stephen C; Sibley, Haley; Fink, Daniel S; Kunduk, Melda; Schexnaildre, Mell; Kakade, Anagha; McWhorter, Andrew J
2016-05-01
Micronized acellular dermis has been used for nearly 15 years to correct glottic insufficiency. With previous demonstration of safety and efficacy, this study aims to evaluate intermediate and long-term voice outcomes in those who underwent injection laryngoplasty for unilateral vocal fold paralysis. Technique and timing of injection were also reviewed to assess their impact on outcomes. Case series with chart review. Tertiary care center. Patients undergoing injection laryngoplasty from May 2007 to September 2012 were reviewed for possible inclusion. Pre- and postoperative Voice Handicap Index (VHI) scores, as well as senior speech-language pathologists' blinded assessment of voice, were collected for analysis. The final sample included patients who underwent injection laryngoplasty for unilateral vocal fold paralysis, 33 of whom had VHI results and 37 of whom had voice recordings. Additional data were obtained, including technique and timing of injection. Analysis was performed on those patients above with VHI and perceptual voice grades before and at least 6 months following injection. Mean VHI improved by 28.7 points at 6 to 12 months and 22.8 points at >12 months (P = .001). Mean perceptual voice grades improved by 17.6 points at 6 to 12 months and 16.3 points at >12 months (P < .001). No statistically significant difference was found with technique or time to injection. Micronized acellular dermis is a safe injectable that improved both patient-completed voice ratings and blinded reviewer voice gradings at intermediate and long-term follow-up. Further investigation may be warranted regarding technique and timing of injection.
van Genugten, Lenneke; Dusseldorp, Elise; Massey, Emma K; van Empelen, Pepijn
2017-03-01
Mental wellbeing is influenced by self-regulation processes. However, little is known about the efficacy of change techniques based on self-regulation for promoting mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary prevention interventions on mental wellbeing in adolescents. Forty interventions were included in the analyses. Techniques were coded into nine categories of SRTs. Meta-analyses were conducted to identify the effectiveness of SRTs, examining three different outcomes: internalising behaviour, externalising behaviour, and self-esteem. Primary interventions had a small-to-medium effect ([Formula: see text] = 0.16-0.29) on self-esteem and internalising behaviour. Secondary interventions had a medium-to-large short-term effect (average [Formula: see text] = 0.56) on internalising behaviour and self-esteem. In secondary interventions, interventions including asking for social support ([Formula: see text], 95% confidence interval, CI = 1.11-1.98) had a large effect on internalising behaviour. Interventions including monitoring and evaluation had a greater effect on self-esteem ([Formula: see text], 95% CI = 0.21-0.57). For primary interventions, no single SRT was associated with a greater intervention effect on internalising behaviour or self-esteem. No effects were found for externalising behaviours. Self-regulation interventions are moderately effective at improving mental wellbeing among adolescents. Secondary interventions promoting 'asking for social support' and 'monitoring and evaluation' were associated with improved outcomes. More research is needed to identify other SRTs, or combinations of SRTs, that could improve understanding or optimise mental wellbeing interventions.
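The inverse-variance pooling underlying such meta-analytic effect estimates can be sketched generically. The effect sizes and standard errors below are invented for illustration, not taken from the 40 reviewed interventions, and a fixed-effect model is used for simplicity where a real synthesis might prefer a random-effects model.

```python
import numpy as np

# Generic fixed-effect inverse-variance pooling of standardized effect
# sizes. Study values below are invented, not from the reviewed studies.
def pooled_effect(d, se):
    d = np.asarray(d, dtype=float)
    w = 1.0 / np.asarray(se, dtype=float) ** 2   # inverse-variance weights
    pooled = (w * d).sum() / w.sum()
    return pooled, np.sqrt(1.0 / w.sum())

d_hat, se_hat = pooled_effect([0.16, 0.29, 0.25], [0.08, 0.10, 0.12])
ci_low, ci_high = d_hat - 1.96 * se_hat, d_hat + 1.96 * se_hat
```

The pooled estimate sits inside the range of the study effects, and its standard error is smaller than any single study's, which is what allows subgroup comparisons such as the SRT-category contrasts reported above.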
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agrawal, Rakesh
2014-09-28
The development of renewable, affordable, and environmentally conscious means of generating energy on a global scale represents a grand challenge of our time. Due to the “permanence” of radiation from the sun, solar energy promises to remain a viable and sustainable power source far into the future. Established single-junction photovoltaic technologies achieve high power conversion efficiencies (pce) near 20% but require complicated manufacturing processes that prohibit the marriage of large-scale throughput (e.g. on the GW scale), profitability, and quality control. Our approach to this problem begins with the synthesis of nanocrystals of semiconductor materials comprising earth abundant elements and characterized by material and optoelectronic properties ideal for photovoltaic applications, namely Cu2ZnSn(S,Se)4 (CZTSSe). Once synthesized, such nanocrystals are formulated into an ink, coated onto substrates, and processed into completed solar cells in such a way that enables scale-up to high throughput, roll-to-roll manufacturing processes. This project aimed to address the major limitation to CZTSSe solar cell pce’s – the low open-circuit voltage (Voc) reported throughout literature for devices comprised of this material. Throughout the project significant advancements have been made in fundamental understanding of the CZTSSe material and device limitations associated with this material system. Additionally, notable improvements have been made to our nanocrystal based processing technique to alleviate performance limitations due to the identified device limitations. 
Notably, (1) significant improvements have been made in reducing intra- and inter-nanoparticle heterogeneity, (2) improvements in device performance have been realized with novel cation substitution in Ge-alloyed CZTGeSSe absorbers, (3) systematic analysis of absorber sintering has been conducted to optimize the selenization process for large grain CZTSSe absorbers, (4) novel electrical characterization analysis techniques have been developed to identify significant limitations to traditional electrical characterization of CZTSSe devices, and (5) the developed electrical analysis techniques have been used to identify the role that band gap and electrostatic potential fluctuations have in limiting device performance for this material system. The device modeling and characterization of CZTSSe undertaken with this project have significant implications for the CZTSSe research community, as the identified limitations due to potential fluctuations are expected to be a performance limitation to high-efficiency CZTSSe devices fabricated from all current processing techniques. Additionally, improvements realized through enhanced absorber processing conditions to minimize nanoparticle and large-grain absorber heterogeneity are suggested to be beneficial processing improvements which should be applied to CZTSSe devices fabricated from all processing techniques. Ultimately, our research has indicated that improved performance for CZTSSe will be achieved through novel absorber processing which minimizes defect formation, elemental losses, secondary phase formation, and compositional uniformity in CZTSSe absorbers; we believe this novel absorber processing can be achieved through nanocrystal based processing of CZTSSe which is an active area of research at the conclusion of this award. 
While this project achieved significant fundamental understanding of CZTSSe and its performance limitations, as well as notable improvements in the processing of nanocrystal-based CZTSSe absorbers, the two-year funding period prevented further advancements, measured directly as PCE improvements, beyond those reported herein. Because the characterization and modeling subtask has been the main driving force for understanding device limitations, its conclusions have only recently been applied to the processing of nanocrystal-based CZTSSe absorbers, with notable success. We expect the fundamental understanding of device limitations and absorber sintering achieved under this project to lead to significant near-term improvements in the performance of CZTSSe devices fabricated by a variety of processing techniques.
PSH Transient Simulation Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muljadi, Eduard
PSH Transient Simulation Modeling presentation from the WPTO FY14 - FY16 Peer Review. Transient effects are an important consideration when designing a PSH system, yet numerical techniques for hydraulic transient analysis still need improvements for adjustable-speed (AS) reversible pump-turbine applications.
In-situ Polymerization of Polyaniline/Polypyrrole Copolymer using Different Techniques
NASA Astrophysics Data System (ADS)
Hammad, A. S.; Noby, H.; Elkady, M. F.; El-Shazly, A. H.
2018-01-01
The morphology and surface area of the poly(aniline-co-pyrrole) copolymer (PANPY) are important properties that improve the efficiency of the copolymer in various applications. In this investigation, different techniques were employed to produce PANPY in different morphologies. Aniline and pyrrole were used as monomers, and ammonium peroxydisulfate (APS) was used as an oxidizer at a uniform molar ratio. Rapid mixing, drop-wise mixing, and supercritical carbon dioxide (ScCO2) polymerization techniques were applied. The chemical structure, crystallinity, porosity, and morphology of the composite were characterized by Fourier transform infrared spectroscopy (FT-IR), X-ray diffraction (XRD), Brunauer-Emmett-Teller (BET) analysis, and transmission electron microscopy (TEM), respectively. The characterization tests indicated that the polyaniline/polypyrrole copolymer was successfully prepared with different morphologies. TEM showed that the rapid-mixing technique with acetic acid formed hollow nanospheres approximately 75 nm in diameter with a wall thickness of about 26 nm. According to XRD, the produced structures are semicrystalline. The copolymer synthesized by the ScCO2-assisted polymerization technique showed an improved surface area (38.1 m2/g) with HCl as the dopant.
NASA Astrophysics Data System (ADS)
Busthanul, N.; Lumoindong, Y.; Syafiuddin, M.; Heliawaty; Lanuhu, N.; Ibrahim, T.; Ambrosius, R. R.
2018-05-01
Farmers’ attitudes and perceptions may be the cause of ineffective implementation of conservation farming for agricultural sustainability, given the wide variation in how conservation techniques are applied. The purpose of this research is to determine farmers' attitudes and perceptions toward the application of conservation techniques, and the correlation between those attitudes and perceptions. The research was carried out in Kanreapia Village, Tombolo Pao District, Gowa Regency, South Sulawesi Province, Indonesia. Thirty farmers were sampled at random, and the data were analyzed with non-parametric statistics in a quantitative and qualitative descriptive approach, using a Likert scale. The results showed that the conservation practice rated highest (appropriate) in farmers' attitudes and perceptions is seasonal crop rotation, while the lowest (less appropriate) is tillage along the contour and the corresponding placement of crops. There is a very strong relationship between farmer attitude and perception. The implication of these findings is that improvements to the implementation of conservation farming techniques should be pursued through improved perceptions.
Improving microstructural quantification in FIB/SEM nanotomography.
Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G
2018-01-01
FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of the three-dimensional microstructure in subsurface features. Often times, the microstructure of a sample is the ultimate determiner of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of a resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hofer, Thomas James
2014-12-01
The CDMS-II phase of the Cryogenic Dark Matter Search, a dark matter direct-detection experiment, was operated at the Soudan Underground Laboratory from 2003 to 2008. The full payload consisted of 30 ZIP detectors, totaling approximately 1.1 kg of Si and 4.8 kg of Ge, operated at temperatures of 50 mK. The ZIP detectors read out both ionization and phonon pulses from scatters within the crystals; channel segmentation and analysis of pulse timing parameters allowed effective fiducialization of the crystal volumes and background rejection sufficient to set world-leading limits at the times of their publications. A full re-analysis of the CDMS-II data was motivated by an improvement in the event reconstruction algorithms which improved the resolution of ionization energy and timing information. The Ge data were re-analyzed using three distinct background-rejection techniques; the Si data from runs 125-128 were analyzed for the first time using the most successful of the techniques from the Ge re-analysis. The results of these analyses prompted a novel "mid-threshold" analysis, wherein energy thresholds were lowered but background rejection using phonon timing information was still maintained. This technique proved to have significant discrimination power, maintaining adequate signal acceptance and minimizing background leakage. The primary background for CDMS-II analyses comes from surface events, whose poor ionization collection makes them difficult to distinguish from true nuclear recoil events. The novel detector technology of SuperCDMS, the successor to CDMS-II, uses interleaved electrodes to achieve full ionization collection for events occurring at the top and bottom detector surfaces. This, along with dual-sided ionization and phonon instrumentation, allows for excellent fiducialization and relegates the surface-event rejection techniques of CDMS-II to a secondary level of background discrimination.
Current and future SuperCDMS results hold great promise for mid- to low-mass WIMP-search results.
NASA Technical Reports Server (NTRS)
1990-01-01
Various papers on remote sensing (RS) for the nineties are presented. The general topics addressed include: subsurface methods, radar scattering, oceanography, microwave models, atmospheric correction, passive microwave systems, RS in tropical forests, moderate resolution land analysis, SAR geometry and SNR improvement, image analysis, inversion and signal processing for geoscience, surface scattering, rain measurements, sensor calibration, wind measurements, terrestrial ecology, agriculture, geometric registration, subsurface sediment geology, radar modulation mechanisms, radar ocean scattering, SAR calibration, airborne radar systems, water vapor retrieval, forest ecosystem dynamics, land analysis, multisensor data fusion. Also considered are: geologic RS, RS sensor optical measurements, RS of snow, temperature retrieval, vegetation structure, global change, artificial intelligence, SAR processing techniques, geologic RS field experiment, stochastic modeling, topography and Digital Elevation model, SAR ocean waves, spaceborne lidar and optical, sea ice field measurements, millimeter waves, advanced spectroscopy, spatial analysis and data compression, SAR polarimetry techniques. Also discussed are: plant canopy modeling, optical RS techniques, optical and IR oceanography, soil moisture, sea ice back scattering, lightning cloud measurements, spatial textural analysis, SAR systems and techniques, active microwave sensing, lidar and optical, radar scatterometry, RS of estuaries, vegetation modeling, RS systems, EOS/SAR Alaska, applications for developing countries, SAR speckle and texture.
Recent development of electrochemiluminescence sensors for food analysis.
Hao, Nan; Wang, Kun
2016-10-01
Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection for various food contaminants in complex matrices. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.
McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra; ...
2017-05-23
Hyperspectral image analysis has benefited from an array of methods that take advantage of the increased spectral depth compared to multispectral sensors; however, the focus of these developments has been on supervised classification methods. Lack of a priori knowledge regarding land cover characteristics can make unsupervised classification methods preferable under certain circumstances. An unsupervised classification technique is presented in this paper that utilizes physically relevant basis functions to model the reflectance spectra. The fit parameters used to generate the basis functions allow clustering based on spectral characteristics rather than spectral channels and provide both noise and data reduction. Histogram splitting of the fit parameters is then used as a means of producing an unsupervised classification. Unlike current unsupervised classification techniques that rely primarily on Euclidean distance measures to determine similarity, the technique presented here uses the natural splitting of the fit parameters associated with the basis functions, creating clusters that are similar in terms of physical parameters. The data set used in this work is the publicly available data collected at Indian Pines, Indiana. This data set provides reference data allowing for comparisons of the efficacy of different unsupervised data analyses. The unsupervised histogram splitting technique presented in this paper is shown to be better than the standard unsupervised ISODATA clustering technique, with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging. This improvement is also seen in kappa before/after merging of 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA.
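The overall accuracy and kappa figures quoted above come from a standard confusion-matrix evaluation against the reference data. As a minimal illustrative sketch (not the authors' code, and using a hypothetical two-class confusion matrix), both metrics can be computed as:

```python
import numpy as np

def overall_accuracy(cm):
    """Fraction of samples on the confusion-matrix diagonal."""
    return np.trace(cm) / cm.sum()

def cohens_kappa(cm):
    """Cohen's kappa: agreement beyond chance, from a confusion matrix."""
    n = cm.sum()
    po = np.trace(cm) / n                                 # observed agreement
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
    return (po - pe) / (1.0 - pe)

# Hypothetical 2-class confusion matrix (rows: reference, cols: predicted).
cm = np.array([[40, 10],
               [5, 45]])
acc = overall_accuracy(cm)
kappa = cohens_kappa(cm)
```

Kappa discounts the agreement expected by chance from the class marginals, which is why it is the preferred figure when class sizes are unbalanced.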
NASA Astrophysics Data System (ADS)
Wang, L.-P.; Ochoa-Rodríguez, S.; Onof, C.; Willems, P.
2015-09-01
Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited as they were mostly developed based upon Gaussian approximations and therefore tend to smooth off so-called “singularities” (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique and the performance of the resulting singularity-sensitive method is compared with that of the original (non-singularity-sensitive) Bayesian technique and the commonly used mean field bias adjustment. This test is conducted using as a case study four storm events observed in the Portobello catchment (53 km2) (Edinburgh, UK) during 2011, for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates.
Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban drainage system's dynamics, particularly of peak runoff flows.
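Of the adjustment techniques compared above, mean field bias (MFB) adjustment is the simplest baseline: the whole radar field is rescaled by the ratio of mean gauge rainfall to mean radar rainfall at the gauge locations. A minimal sketch with hypothetical values (the singularity-sensitive Bayesian merging itself is considerably more involved):

```python
import numpy as np

def mean_field_bias_adjust(radar_field, gauge_obs, radar_at_gauges):
    """Mean field bias (MFB) adjustment: rescale the whole radar field by the
    ratio of mean gauge rainfall to mean radar rainfall at the gauge sites."""
    bias = gauge_obs.mean() / radar_at_gauges.mean()
    return bias * radar_field

# Hypothetical 2x2 radar field and two collocated gauge/radar pairs (mm/h).
radar_field = np.array([[2.0, 4.0], [6.0, 8.0]])
gauges = np.array([3.0, 6.0])      # gauge observations
radar_pts = np.array([2.0, 4.0])   # radar estimates at the gauge pixels
adjusted = mean_field_bias_adjust(radar_field, gauges, radar_pts)
```

Because MFB applies one multiplicative factor everywhere, it corrects the overall bias but, as the abstract notes, cannot preserve fine-scale singularity structure.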
Perinatal legislative policies and health outcomes.
Lorch, Scott A
2017-10-01
Perinatal epidemiology examines the variation and determinants of pregnancy outcomes from a maternal and neonatal perspective. However, improving public and population health also requires the translation of this evidence base into substantive public policies. Assessing the impact of such public policies requires sufficient data to include potential confounding factors in the analysis, such as coexisting medical conditions and socioeconomic status, and appropriate statistical and epidemiological techniques. This review will explore policies addressing three areas of perinatal medicine-elective deliveries prior to 39 weeks' gestation; perinatal regionalization; and mandatory paid maternity leave policies-to illustrate the challenges when assessing the impact of specific policies at the patient and population level. Data support the use of these policies to improve perinatal health, but with weaker and less certain effect sizes when compared to the initial patient-level studies. Improved data collection and epidemiological techniques will allow for improved assessment of these policies and the identification of potential areas of improvement when translating patient-level studies into public policies. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) System at the Nemak Windsor Aluminum Plant (WAP). The project used Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of in process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al Alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al Alloy(s) using IRC techniques. The impact of low melting point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
Shannahoff-Khalsa, D S; Ray, L E; Levine, S; Gallen, C C; Schwartz, B J; Sidorowich, J J
1999-12-01
The objective of this study was to compare efficacy of two meditation protocols for treating patients with obsessive-compulsive disorder (OCD). Patients were randomized to two groups-matched for sex, age, and medication status-and blinded to the comparison protocol. They were told the trial would last for 12 months, unless one protocol proved to be more efficacious. If so, groups would merge, and the group that received the less efficacious treatment would also be afforded 12 months of the more effective one. The study was conducted at Children's Hospital, San Diego, Calif. Patients were selected according to Diagnostic and Statistical Manual of Mental Disorders, Third Edition-Revised (DSM-III-R) criteria and recruited by advertisements and referral. At baseline, Group 1 included 11 adults and 1 adolescent, and Group 2 included 10 adults. Group 1 employed a kundalini yoga meditation protocol and Group 2 employed the Relaxation Response plus Mindfulness Meditation technique. Baseline and 3-month interval testing was conducted using the Yale-Brown Obsessive Compulsive Scale (Y-BOCS), Symptoms Checklist-90-Revised Obsessive Compulsive (SCL-90-R OC) and Global Severity Index (SCL-90-R GSI) scales, Profile of Moods scale (POMS), Perceived Stress Scale (PSS), and Purpose in Life (PIL) test. Seven adults in each group completed 3 months of therapy. At 3 months, Group 1 demonstrated greater improvements (Student's independent groups t-test) on the Y-BOCS, SCL-90-R OC and GSI scales, and POMS, and greater but nonsignificant improvements on the PSS and PIL test. An intent-to-treat analysis (Y-BOCS) for the baseline and 3-month tests showed that only Group 1 improved. Within-group statistics (Student's paired t-tests) showed that Group 1 significantly improved on all six scales, but Group 2 had no improvements. Groups were merged for an additional year using Group 1 techniques. 
At 15 months, the final group (N=11) improved 71%, 62%, 66%, 74%, 39%, and 23%, respectively, on the Y-BOCS, SCL-90-R OC, SCL-90-R GSI, POMS, PSS, and PIL; P<0.003 (analysis of variance). This study demonstrates that kundalini yoga techniques are effective in the treatment of OCD.
The Role of Efficient XML Interchange (EXI) in Navy Wide-Area Network (WAN) Optimization
2015-03-01
compress, and re-encrypt data to continue providing optimization through compression; however, that capability requires careful consideration of... optimization of encrypted data requires a careful analysis and comparison of performance improvements and IA vulnerabilities. It is important... Contained EXI capitalizes on multiple techniques to improve compression, and they vary depending on a set of EXI options passed to the codec
Automated Dental Epidemiology System. II. Systems Analysis and Functional Design,
1983-08-01
reduction of time and expense required for dental treatment and a minimization of patient time lost from military duties. Navy dentistry can thus be...regard, dental epidemiology can be especially valuable for evaluating and improving the Navy preventive dentistry program. It has been recommended that...processing applications to dentistry and dental epidemiology was performed. Alternative means to improve military dental epidemiology techniques and
Variance analysis refines overhead cost control.
Cooper, J C; Suver, J D
1992-02-01
Many healthcare organizations may not fully realize the benefits of standard cost accounting techniques because they fail to routinely report volume variances in their internal reports. If overhead allocation is routinely reported on internal reports, managers can determine whether billing remains current or lost charges occur. Healthcare organizations' use of standard costing techniques can lead to more realistic performance measurements and information system improvements that alert management to losses from unrecovered overhead in time for corrective action.
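The volume variance the authors recommend reporting falls out of a standard overhead rate. A minimal sketch with hypothetical figures (not drawn from the article):

```python
def overhead_volume_variance(budgeted_overhead, budgeted_volume, actual_volume):
    """Volume variance: overhead absorbed at actual volume minus budgeted
    overhead.  A negative value flags unrecovered (under-absorbed) overhead."""
    standard_rate = budgeted_overhead / budgeted_volume
    absorbed = standard_rate * actual_volume
    return absorbed - budgeted_overhead

# Hypothetical department: $120,000 budgeted overhead over 10,000 patient
# days, but only 9,000 actual patient days.
variance = overhead_volume_variance(120_000, 10_000, 9_000)
```

Here the department under-absorbs $12,000 of overhead; routinely reporting this figure is what alerts management in time for corrective action.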
Computer Simulation For Design Of TWT's
NASA Technical Reports Server (NTRS)
Bartos, Karen F.; Fite, E. Brian; Shalkhauser, Kurt A.; Sharp, G. Richard
1992-01-01
A three-dimensional finite-element analytical technique facilitates design and fabrication of traveling-wave-tube (TWT) slow-wave structures. Used to perform thermal and mechanical analyses of TWT designed with variety of configurations, geometries, and materials. Using three-dimensional computer analysis, designer able to simulate building and testing of TWT, with consequent substantial saving of time and money. Technique enables detailed look into operation of traveling-wave tubes to help improve performance for future communications systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brady, D.N.; Church, B.W.; White, M.G.
Soil sampling activities during 1974 were concentrated in Area 5 of the Nevada Test Site (NTS). Area 5 has been assigned the highest priority because of the number of atmospheric test events held and a wide distribution of contaminants. Improved sampling techniques are described. Preliminary data analysis aided in designing a program to infer ²³⁹⁻²⁴⁰Pu results by Ge(Li) scanning techniques.
Woo, Jason R; Shikanov, Sergey; Zorn, Kevin C; Shalhav, Arieh L; Zagaja, Gregory P
2009-12-01
Posterior rhabdosphincter (PR) reconstruction during robot-assisted radical prostatectomy (RARP) was introduced in an attempt to improve postoperative continence. In the present study, we evaluate time to achieve continence in patients who are undergoing RARP with and without PR reconstruction. A prospective RARP database was searched for most recent cases that were accomplished with PR reconstruction (group 1, n = 69) or with standard technique (group 2, n = 63). We performed the analysis applying two definitions of continence: 0 pads per day or 0-1 security pad per day. Patients were evaluated by telephone interview. Statistical analysis was carried out using the Kaplan-Meier method and log-rank test. With PR reconstruction, continence was improved when defined as 0-1 security pad per day (median time of 90 vs 150 days; P = 0.01). This difference did not achieve statistical significance when continence was defined as 0 pads per day (P = 0.12). A statistically significant improvement in continence rate and time to achieve continence is seen in patients who are undergoing PR reconstruction during RARP, with continence defined as 0-1 security/safety pad per day. A larger, prospective and randomized study is needed to better understand the impact of this technique on postoperative continence.
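Time-to-continence comparisons of this kind rest on the Kaplan-Meier estimator, which correctly handles censored patients (those not yet continent at last follow-up). A minimal sketch with hypothetical follow-up data, not the study's data (the log-rank test for comparing two such curves is omitted):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.
    times  : time to continence or censoring (days)
    events : 1 if continence achieved at that time, 0 if censored
    Returns (distinct event times, survival probabilities)."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    s, out_t, out_s = 1.0, [], []
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)                # patients still at risk
        d = np.sum((times == t) & (events == 1))    # events at this time
        s *= 1.0 - d / at_risk
        out_t.append(t)
        out_s.append(s)
    return out_t, out_s

# Hypothetical follow-up: days to continence, 0 = censored at that day.
t = [30, 60, 60, 90, 120, 150]
e = [1, 1, 0, 1, 0, 1]
times_out, surv = kaplan_meier(t, e)
```

The survival curve here gives the estimated fraction of patients still incontinent at each time; the median time to continence reported in the study is the time at which this curve crosses 0.5.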
Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert
2015-05-28
System codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the code and thereby improves its support for decision-making. The work reported here implements several improvements on classic calibration techniques afforded by modern analysis techniques. The key innovation has come from development of code surrogate model (or code emulator) construction and prediction algorithms. Use of a fast emulator makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. This study uses Gaussian Process (GP) based emulators, which have been used previously to emulate computer codes in the nuclear field. The present work describes the formulation of an emulator that incorporates GPs into a factor analysis-type or pattern recognition-type model. This “function factorization” Gaussian Process (FFGP) model allows overcoming limitations present in standard GP emulators, thereby improving both accuracy and speed of the emulator-based calibration process. Calibration of a friction-factor example using a Method of Manufactured Solution is performed to illustrate key properties of the FFGP based process.
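The emulator-based calibration loop described above can be sketched in miniature: fit a Gaussian-process surrogate to a handful of runs of an "expensive" code, then run Metropolis-Hastings MCMC against the cheap surrogate instead of the code. Everything below (the stand-in code, prior, and observation) is a hypothetical illustration of the generic GP-plus-MCMC idea, not the report's FFGP model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for an "expensive" system code: friction factor vs. one parameter.
def code(theta):
    return 0.02 + 0.05 * np.exp(-theta)

# --- Gaussian-process emulator fitted to a handful of code runs ---
def rbf(a, b, ell=0.5, sf=0.05):
    """Squared-exponential covariance between point sets a and b."""
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

X = np.linspace(0.0, 3.0, 8)             # design points (training runs)
y = code(X)
K = rbf(X, X) + 1e-8 * np.eye(len(X))    # jitter for numerical stability
alpha = np.linalg.solve(K, y)

def emulator(theta):
    """GP posterior mean: a cheap surrogate for code(theta)."""
    return (rbf(np.atleast_1d(theta), X) @ alpha)[0]

# --- Metropolis-Hastings calibration against one noisy observation ---
obs, sigma = code(1.3) + 0.001, 0.002    # synthetic measurement, noise std

def log_post(theta):
    if not 0.0 <= theta <= 3.0:          # uniform prior on [0, 3]
        return -np.inf
    return -0.5 * ((obs - emulator(theta)) / sigma)**2

theta = 1.0
lp = log_post(theta)
chain = []
for _ in range(4000):
    prop = theta + 0.2 * rng.standard_normal()   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:      # accept/reject
        theta, lp = prop, lp_prop
    chain.append(theta)
post_mean = np.mean(chain[1000:])                # posterior mean, post burn-in
```

Each MCMC step costs only a small matrix-vector product rather than a full code run, which is exactly the speed-up that makes MCMC calibration feasible.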
NASA Technical Reports Server (NTRS)
Iliff, K. W.; Maine, R. E.
1976-01-01
A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of large amounts of flight data are described. Techniques are presented that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple-maneuver analysis also proved useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for overall analysis are also discussed.
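As a much-reduced illustration of extracting a stability derivative from flight data (not the authors' full output-error method), maximum likelihood with Gaussian measurement noise reduces to least squares for a scalar decay model; the derivative value and noise level below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated pitch-damping response q' = Mq * q, sampled with sensor noise.
Mq_true, dt = -2.0, 0.01
t = np.arange(0.0, 2.0, dt)
q = np.exp(Mq_true * t) + 0.002 * rng.standard_normal(t.size)

# For Gaussian noise, maximum likelihood reduces to least squares on the
# discretized model q[k+1] = exp(Mq*dt) * q[k].
phi = np.sum(q[:-1] * q[1:]) / np.sum(q[:-1] ** 2)
Mq_hat = np.log(phi) / dt
```

The full flight-data problem replaces this scalar model with coupled state equations and iterates the likelihood maximization numerically, but the principle of matching the model response to the measured response is the same.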
Nonlinear truncation error analysis of finite difference schemes for the Euler equations
NASA Technical Reports Server (NTRS)
Klopfer, G. H.; Mcrae, D. S.
1983-01-01
It is pointed out that, in general, dissipative finite difference integration schemes have been found to be quite robust when applied to the Euler equations of gas dynamics. The present investigation considers a modified equation analysis of both implicit and explicit finite difference techniques as applied to the Euler equations. The analysis is used to identify those error terms which contribute most to the observed solution errors. A technique for analytically removing the dominant error terms is demonstrated, resulting in a greatly improved solution for the explicit Lax-Wendroff schemes. It is shown that the nonlinear truncation errors are quite large and distributed quite differently for each of the three conservation equations as applied to a one-dimensional shock tube problem.
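A concrete instance of the schemes being analyzed is the explicit Lax-Wendroff step for the linear advection equation u_t + a u_x = 0, whose modified-equation analysis exposes a leading dispersive (third-derivative) truncation error. A minimal periodic-domain sketch (illustrative only; the paper treats the full Euler equations):

```python
import numpy as np

def lax_wendroff_step(u, c):
    """One explicit Lax-Wendroff step for u_t + a*u_x = 0 on a periodic grid,
    with Courant number c = a*dt/dx.  The scheme is second-order accurate;
    its leading truncation error is dispersive."""
    up = np.roll(u, -1)   # u_{j+1}
    um = np.roll(u, 1)    # u_{j-1}
    return u - 0.5 * c * (up - um) + 0.5 * c**2 * (up - 2.0 * u + um)

# Advect a sine wave once around the periodic domain [0, 1).
n, c = 100, 0.5
x = np.arange(n) / n
u0 = np.sin(2.0 * np.pi * x)
u = u0.copy()
for _ in range(int(n / c)):   # n/c steps cover one full domain traversal
    u = lax_wendroff_step(u, c)
err = np.max(np.abs(u - u0))  # small dispersive phase error remains
```

After one full traversal the wave should return to its initial profile; the residual error is dominated by the dispersive term that modified-equation analysis identifies and that the paper's correction technique removes analytically.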
Translating Current Bioanalytical Techniques for Studying Corona Activity.
Wang, Chunming; Wang, Zhenzhen; Dong, Lei
2018-07-01
The recent discovery of the biological corona is revolutionising our understanding of the in vivo behaviour of nanomaterials. Accurate analysis of corona bioactivity is essential for predicting the fate of nanomaterials and thereby improving nanomedicine design. Nevertheless, current biotechniques for protein analysis are not readily adaptable for analysing corona proteins, given that their conformation, activity, and interaction may largely differ from those of the native proteins. Here, we introduce and propose tailor-made modifications to five types of mainstream bioanalytical methodologies. We specifically illustrate how these modifications can translate existing techniques for protein analysis into competent tools for dissecting the composition, bioactivity, and interaction (with both nanomaterials and the tissue) of corona formed on specific nanomaterial surfaces. Copyright © 2018 Elsevier Ltd. All rights reserved.
2014-01-01
This review aims to highlight the recent advances and methodological improvements in instrumental techniques applied for the analysis of different brominated flame retardants (BFRs). The literature search strategy was based on the recent analytical reviews published on BFRs. The main selection criteria involved the successful development and application of analytical methods for determination of the target compounds in various environmental matrices. Different factors affecting chromatographic separation and mass spectrometric detection of brominated analytes were evaluated and discussed. Techniques using advanced instrumentation to achieve outstanding results in quantification of different BFRs and their metabolites/degradation products were highlighted. Finally, research gaps in the field of BFR analysis were identified and recommendations for future research were proposed. PMID:27433482
Comparison of analysis and flight test data for a drone aircraft with active flutter suppression
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Pototzky, A. S.
1981-01-01
A drone aircraft equipped with an active flutter suppression system is considered with emphasis on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are given for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. The mathematical models are included and existing analytical techniques are described as well as an alternative analytical technique for obtaining closed-loop results.
Polarimetric Thomson scattering for high Te fusion plasmas
NASA Astrophysics Data System (ADS)
Giudicotti, L.
2017-11-01
Polarimetric Thomson scattering (TS) is a technique for the analysis of TS spectra in which the electron temperature Te is determined from the depolarization of the scattered radiation, a relativistic effect noticeable only in very hot (Te >= 10 keV) fusion plasmas. It has been proposed as a complementary technique to supplement the conventional spectral analysis in the ITER CPTS (Core Plasma Thomson Scattering) system for measurements in high Te, low ne plasma conditions. In this paper we review the characteristics of the depolarized TS radiation with special emphasis on the conditions of the ITER CPTS system, and we describe a possible implementation of this diagnostic method suited to significantly improving the performance of the conventional TS spectral analysis in the high Te range.
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
BATMAN: Bayesian Technique for Multi-image Analysis
NASA Astrophysics Data System (ADS)
Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.
2017-04-01
This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
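The merging rule this abstract describes, fusing neighbouring spatial elements while their measurements remain statistically consistent within the errors, can be sketched in one dimension. Python, the threshold `k`, and the inverse-variance weighting below are illustrative assumptions, not BaTMAn's actual implementation:

```python
import numpy as np

def merge_consistent(values, errors, k=1.0):
    """1-D sketch of BaTMAn-style merging: fuse adjacent bins while the
    difference of their (inverse-variance weighted) means stays below k
    combined sigmas.  The real code works on 2-D tessellations."""
    segs = [[float(v), float(e) ** -2] for v, e in zip(values, errors)]  # (mean, weight)
    merged = True
    while merged and len(segs) > 1:
        merged = False
        for i in range(len(segs) - 1):
            (m1, w1), (m2, w2) = segs[i], segs[i + 1]
            if abs(m1 - m2) < k * np.sqrt(1 / w1 + 1 / w2):
                w = w1 + w2
                segs[i] = [(m1 * w1 + m2 * w2) / w, w]  # weighted mean of the pair
                del segs[i + 1]
                merged = True
                break
    return [m for m, _ in segs]
```

With well-separated signal levels the bins collapse to one segment per level, while the gap between levels survives the consistency test.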
Cluster-based analysis improves predictive validity of spike-triggered receptive field estimates
Malone, Brian J.
2017-01-01
Spectrotemporal receptive field (STRF) characterization is a central goal of auditory physiology. STRFs are often approximated by the spike-triggered average (STA), which reflects the average stimulus preceding a spike. In many cases, the raw STA is subjected to a threshold defined by gain values expected by chance. However, such correction methods have not been universally adopted, and the consequences of specific gain-thresholding approaches have not been investigated systematically. Here, we evaluate two classes of statistical correction techniques, using the resulting STRF estimates to predict responses to a novel validation stimulus. The first, more traditional technique eliminated STRF pixels (time-frequency bins) with gain values expected by chance. This correction method yielded significant increases in prediction accuracy, including when the threshold setting was optimized for each unit. The second technique was a two-step thresholding procedure wherein clusters of contiguous pixels surviving an initial gain threshold were then subjected to a cluster mass threshold based on summed pixel values. This approach significantly improved upon even the best gain-thresholding techniques. Additional analyses suggested that allowing threshold settings to vary independently for excitatory and inhibitory subfields of the STRF resulted in only marginal additional gains, at best. In summary, augmenting reverse correlation techniques with principled statistical correction choices increased prediction accuracy by over 80% for multi-unit STRFs and by over 40% for single-unit STRFs, furthering the interpretational relevance of the recovered spectrotemporal filters for auditory systems analysis. PMID:28877194
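The two-step correction the study found most effective, a gain threshold on the spike-triggered average followed by a cluster-mass threshold on contiguous surviving pixels, can be sketched as follows. Python, the z-score threshold, and the 4-connectivity choice are illustrative assumptions, not the authors' code:

```python
import numpy as np

def gain_threshold(sta, z=2.0):
    """Step 1: zero out STA pixels whose |gain| is below z standard deviations."""
    thr = z * sta.std()
    return np.where(np.abs(sta) >= thr, sta, 0.0)

def cluster_mass_threshold(masked, min_mass):
    """Step 2: keep only contiguous (4-connected) clusters of surviving
    pixels whose summed |value| exceeds min_mass."""
    visited = np.zeros(masked.shape, dtype=bool)
    out = np.zeros_like(masked)
    rows, cols = masked.shape
    for i in range(rows):
        for j in range(cols):
            if masked[i, j] != 0 and not visited[i, j]:
                stack, members = [(i, j)], []   # flood-fill this cluster
                visited[i, j] = True
                while stack:
                    r, c = stack.pop()
                    members.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols
                                and masked[rr, cc] != 0 and not visited[rr, cc]):
                            visited[rr, cc] = True
                            stack.append((rr, cc))
                mass = sum(abs(masked[r, c]) for r, c in members)
                if mass >= min_mass:
                    for r, c in members:
                        out[r, c] = masked[r, c]
    return out
```

An isolated high-gain pixel survives step 1 but, lacking contiguous support, is removed by step 2, which is exactly the false-positive class the cluster criterion targets.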
Unsupervised classification of remote multispectral sensing data
NASA Technical Reports Server (NTRS)
Su, M. Y.
1972-01-01
The new unsupervised classification technique for classifying multispectral remote sensing data, which can come either from a multispectral scanner or from digitized color-separation aerial photographs, consists of two parts: (a) a sequential statistical clustering, which is a one-pass sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. Applications of the technique using an IBM-7094 computer on multispectral data sets over Purdue's Flight Line C-1 and the Yellowstone National Park test site have been accomplished. Comparisons between the classification maps produced by the unsupervised technique and the supervised maximum likelihood technique indicate that the classification accuracies are in agreement.
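The two-part composite clustering can be sketched as a one-pass sequential stage that seeds a generalized K-means refinement. The distance-radius criterion below stands in for the paper's sequential variance analysis and is an assumption, as is the Python rendering:

```python
import numpy as np

def sequential_pass(X, radius):
    """Part (a) sketch: one-pass clustering that starts a new cluster
    whenever a sample lies farther than `radius` from every existing
    centre, otherwise folds it into the nearest centre's running mean."""
    centres, counts = [X[0].copy()], [1]
    for x in X[1:]:
        d = [np.linalg.norm(x - c) for c in centres]
        k = int(np.argmin(d))
        if d[k] <= radius:
            counts[k] += 1
            centres[k] += (x - centres[k]) / counts[k]  # incremental mean update
        else:
            centres.append(x.copy())
            counts.append(1)
    return np.array(centres)

def kmeans_refine(X, centres, iters=10):
    """Part (b): generalized K-means, iteratively reassign and re-average."""
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centres[None]) ** 2).sum(-1), axis=1)
        for k in range(len(centres)):
            if np.any(labels == k):
                centres[k] = X[labels == k].mean(axis=0)
    return centres, labels
```

On well-separated data the sequential pass already discovers the right number of clusters, so the K-means stage only polishes the centre locations.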
The Outlier Detection for Ordinal Data Using Scaling Technique of Regression Coefficients
NASA Astrophysics Data System (ADS)
Adnan, Arisman; Sugiarto, Sigit
2017-06-01
The aim of this study is to detect outliers using coefficients of Ordinal Logistic Regression (OLR) for the case of k category responses, where scores range from 1 (the best) to 8 (the worst). We detect them using the sum of moduli of the ordinal regression coefficients calculated by the jackknife technique. This technique is improved by scaling the regression coefficients to their means. The R language was used on a set of ordinal data from a reference distribution. Furthermore, we compare this approach using studentised residual plots of the jackknife technique for ANOVA (Analysis of Variance) and OLR. This study shows that the jackknifing technique, along with proper scaling, can reveal outliers in ordinal regression reasonably well.
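The jackknife idea, refitting with each observation deleted and tracking the scaled sum of moduli of the coefficients, can be sketched as below. Ordinary least squares stands in for ordinal logistic regression (which requires an iterative fit), and the 3-sigma flagging rule is an illustrative assumption; the abstract does not specify its cutoff:

```python
import numpy as np

def jackknife_coef_sums(X, y):
    """Leave-one-out refits: for each deleted observation, record the sum
    of moduli of the fitted coefficients, then scale by the mean, the
    scaling step described in the abstract.  OLS is a stand-in for OLR."""
    n = len(y)
    sums = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        sums[i] = np.abs(beta).sum()
    return sums / sums.mean()

def flag_outliers(scaled_sums, z=3.0):
    """Flag observations whose deletion shifts the coefficient sum by
    more than z standard deviations (illustrative rule)."""
    dev = np.abs(scaled_sums - scaled_sums.mean())
    return np.where(dev > z * scaled_sums.std())[0]
```

Deleting a gross outlier is the only refit that recovers the clean coefficients, so its coefficient sum stands apart from all the others and is flagged.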
Wang, J B; Jiang, W; Ji, Z; Cao, J Z; Liu, L P; Men, Y; Xu, C; Wang, X Z; Hui, Z G; Liang, J; Lyu, J M; Zhou, Z M; Xiao, Z F; Feng, Q F; Chen, D F; Zhang, H X; Yin, W B; Wang, L H
2016-08-01
This study aimed to evaluate the impact of technical advancement of radiation therapy in patients with LA-NSCLC receiving definitive radiotherapy (RT). Patients treated with definitive RT (≥50 Gy) between 2000 and 2010 were retrospectively reviewed. Overall survival (OS), cancer specific survival (CSS), locoregional progression-free survival (LRPFS), distant metastasis-free survival (DMFS) and progression-free survival (PFS) were calculated and compared among patients irradiated with different techniques. Radiation-induced lung injury (RILI) and esophageal injury (RIEI) were assessed according to the National Cancer Institute Common Terminology Criteria for Adverse Events 3.0 (NCI-CTCAE 3.0). A total of 946 patients were eligible for analysis, including 288 treated with two-dimensional radiotherapy (2D-RT), 209 with three-dimensional conformal radiation therapy (3D-CRT) and 449 with intensity-modulated radiation therapy (IMRT). The median follow-up time for the whole population was 84.1 months. The median OS of the 2D-RT, 3D-CRT and IMRT groups was 15.8, 19.7 and 23.3 months, respectively, with corresponding 5-year survival rates of 8.7%, 13.0% and 18.8% (P<0.001). The univariate analysis demonstrated significantly inferior OS, LRPFS, DMFS and PFS for 2D-RT compared with 3D-CRT or IMRT. The univariate analysis also revealed that the IMRT group had significantly longer LRPFS and a trend toward better OS and DMFS compared with 3D-CRT. Multivariate analysis showed that TNM stage, RT technique and KPS were independent factors correlated with all survival indexes. Compared with 2D-RT, the utilization of IMRT was associated with significantly improved OS, LRPFS, DMFS as well as PFS. Compared with 3D-CRT, IMRT provided superior DMFS (P=0.035), a trend approaching significance with regard to LRPFS (P=0.073), but no statistically significant improvement in OS, CSS and PFS in multivariate analysis.
The incidence rate of RILI was significantly decreased in the IMRT group (29.3% vs. 26.6% vs. 14.0%, P<0.001), whereas the RIEI rates were similar (34.7% vs. 29.7% vs. 35.3%, P=0.342) among the three groups. Radiation therapy technique is a factor affecting prognosis of LA-NSCLC patients. Advanced radiation therapy technique is associated with improved tumor control and survival, and decreased radiation-induced lung toxicity.
An Analysis of College Students' Attitudes towards Error Correction in EFL Context
ERIC Educational Resources Information Center
Zhu, Honglin
2010-01-01
This article is based on a survey of college students' attitudes towards error correction by their teachers in the process of teaching and learning, and it is intended to improve language teachers' understanding of the nature of error correction. Based on the analysis, the article expounds some principles and techniques that can be applied in the process…
SOLVE II: A Technique to Improve Efficiency and Solve Problems in Hardwood Sawmills
Edward L. Adams; Daniel E. Dunmire
1977-01-01
The squeeze between rising costs and product values is getting tighter for sawmill managers. So, they are taking a closer look at the efficiency of their sawmills by making a complete analysis of their milling situation. Such an analysis requires considerable time and expense. To aid the manager with this task, the USDA Forest Service's Northeastern Forest...
Severe storms and local weather research
NASA Technical Reports Server (NTRS)
1981-01-01
Developments in the use of space related techniques to understand storms and local weather are summarized. The observation of lightning, storm development, cloud development, mesoscale phenomena, and ageostrophic circulation are discussed. Data acquisition, analysis, and the development of improved sensor and computer systems capability are described. Signal processing and analysis and application of Doppler lidar data are discussed. Progress in numerous experiments is summarized.
Lucky Imaging: Improved Localization Accuracy for Single Molecule Imaging
Cronin, Bríd; de Wet, Ben; Wallace, Mark I.
2009-01-01
We apply the astronomical data-analysis technique, Lucky imaging, to improve resolution in single molecule fluorescence microscopy. We show that by selectively discarding data points from individual single-molecule trajectories, imaging resolution can be improved by a factor of 1.6 for individual fluorophores and up to 5.6 for more complex images. The method is illustrated using images of fluorescent dye molecules and quantum dots, and the in vivo imaging of fluorescently labeled linker for activation of T cells. PMID:19348772
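The core move of Lucky imaging, selectively discarding poor frames before averaging, can be sketched as below. The sharpness metric (fraction of flux in the brightest pixel) and the keep fraction are illustrative assumptions, not the authors' exact selection criteria, and Python is not from the source:

```python
import numpy as np

def frame_quality(frame):
    """Sharpness proxy: fraction of total flux landing in the brightest pixel."""
    return frame.max() / frame.sum()

def lucky_average(frames, keep_fraction=0.5):
    """Rank frames by sharpness, average only the best `keep_fraction`,
    and discard the rest, the selective-rejection idea of Lucky imaging."""
    q = np.array([frame_quality(f) for f in frames])
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = np.argsort(q)[::-1][:n_keep]        # indices of the sharpest frames
    return np.mean([frames[i] for i in best], axis=0), sorted(best.tolist())
```

Averaging only the sharp frames preserves the peak that a naive average of all frames would smear out, which is the resolution gain the paper quantifies.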
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthew R. Kumjian; Giangrande, Scott E.; Mishra, Subashree
Polarimetric radar observations are increasingly used to understand cloud microphysical processes, which is critical for improving their representation in cloud and climate models. In particular, there has been recent focus on improving representations of ice collection processes (e.g., aggregation, riming), as these influence precipitation rate, heating profiles, and ultimately cloud life cycles. However, distinguishing these processes using conventional polarimetric radar observations is difficult, as they produce similar fingerprints. This necessitates improved analysis techniques and integration of complementary data sources. The Midlatitude Continental Convective Clouds Experiment (MC3E) provided such an opportunity.
An analysis of aerodynamic requirements for coordinated bank-to-turn autopilots
NASA Technical Reports Server (NTRS)
Arrow, A.
1982-01-01
Two planar missile airframes were compared having the potential for improved bank-to-turn control but having different aerodynamic properties. The comparison was made with advanced level autopilots using both linear and nonlinear 3-D aerodynamic models to obtain realistic missile body angular rates and control surface incidence. Critical cross-coupling effects are identified and desirable aerodynamics are recommended for improved coordinated bank-to-turn (CBTT) performance. In addition, recommendations are made for autopilot control law analyses and design techniques for improving CBTT performance.
A review of whole cell wall NMR by the direct-dissolution of biomass
Foston, Marcus B.; Samuel, Reichel; He, Jian; ...
2016-01-19
To fully realize the potential of lignocellulosic biomass as a renewable resource for the production of fuels, chemicals, and materials, an improved understanding of the chemical and molecular structures within biomass and how those structures are formed during biosynthesis and transformed during (thermochemical and biological) conversion must be developed. This effort will require analytical techniques which are not only in-depth, rapid, and cost-effective, but also leave native cell wall features intact. Whole plant cell wall nuclear magnetic resonance (NMR) analysis facilitates unparalleled structural characterization of lignocellulosic biomass without causing (or with minimal) structural modification. The objective of this review is to summarize research pertaining to solution- or gel-state whole plant cell wall NMR analysis of biomass, demonstrating the capability of NMR to delineate the structural features and transformations of biomass. In particular, this review will focus on the application of a two-dimensional solution-state NMR technique and perdeuterated ionic liquid based organic electrolyte solvents for the direct dissolution and analysis of biomass. Furthermore, we believe this type of analysis will be critical to advancing biofuel research, improving bioprocessing methodology, and enhancing plant bioengineering efforts.
Application of copulas to improve covariance estimation for partial least squares.
D'Angelo, Gina M; Weissfeld, Lisa A
2013-02-20
Dimension reduction techniques, such as partial least squares, are useful for computing summary measures and examining relationships in complex settings. Partial least squares requires an estimate of the covariance matrix as a first step in the analysis, making this estimate critical to the results. In addition, the covariance matrix also forms the basis for other techniques in multivariate analysis, such as principal component analysis and independent component analysis. This paper has been motivated by an example from an imaging study in Alzheimer's disease where there is complete separation between Alzheimer's and control subjects for one of the imaging modalities. This separation occurs in one block of variables and does not occur with the second block of variables resulting in inaccurate estimates of the covariance. We propose the use of a copula to obtain estimates of the covariance in this setting, where one set of variables comes from a mixture distribution. Simulation studies show that the proposed estimator is an improvement over the standard estimators of covariance. We illustrate the methods from the motivating example from a study in the area of Alzheimer's disease. Copyright © 2012 John Wiley & Sons, Ltd.
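One common way to build a copula-based correlation estimate, shown here as an illustration rather than as the authors' exact estimator, is to compute a rank correlation (which is unaffected by the kind of marginal distortion a mixture distribution introduces) and map it to a Pearson correlation under a Gaussian copula via rho = sin(pi * tau / 2):

```python
import numpy as np

def kendall_tau(x, y):
    """O(n^2) Kendall's tau: (concordant - discordant) pairs / total pairs."""
    n = len(x)
    s = 0.0
    for i in range(n):
        s += np.sum(np.sign(x[i + 1:] - x[i]) * np.sign(y[i + 1:] - y[i]))
    return 2.0 * s / (n * (n - 1))

def copula_correlation(X):
    """Gaussian-copula correlation matrix: rank-based Kendall tau per
    variable pair, mapped to Pearson correlation by rho = sin(pi*tau/2)."""
    p = X.shape[1]
    R = np.eye(p)
    for i in range(p):
        for j in range(i + 1, p):
            R[i, j] = R[j, i] = np.sin(np.pi * kendall_tau(X[:, i], X[:, j]) / 2)
    return R
```

Because only ranks enter the estimate, a monotone distortion of either variable's marginal leaves the recovered correlation unchanged, the robustness property motivating the copula approach.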
Recent progress and future directions in protein-protein docking.
Ritchie, David W
2008-02-01
This article gives an overview of recent progress in protein-protein docking and it identifies several directions for future research. Recent results from the CAPRI blind docking experiments show that docking algorithms are steadily improving in both reliability and accuracy. Current docking algorithms employ a range of efficient search and scoring strategies, including e.g. fast Fourier transform correlations, geometric hashing, and Monte Carlo techniques. These approaches can often produce a relatively small list of up to a few thousand orientations, amongst which a near-native binding mode is often observed. However, despite the use of improved scoring functions which typically include models of desolvation, hydrophobicity, and electrostatics, current algorithms still have difficulty in identifying the correct solution from the list of false positives, or decoys. Nonetheless, significant progress is being made through better use of bioinformatics, biochemical, and biophysical information such as e.g. sequence conservation analysis, protein interaction databases, alanine scanning, and NMR residual dipolar coupling restraints to help identify key binding residues. Promising new approaches to incorporate models of protein flexibility during docking are being developed, including the use of molecular dynamics snapshots, rotameric and off-rotamer searches, internal coordinate mechanics, and principal component analysis based techniques. Some investigators now use explicit solvent models in their docking protocols. Many of these approaches can be computationally intensive, although new silicon chip technologies such as programmable graphics processor units are beginning to offer competitive alternatives to conventional high performance computer systems. As cryo-EM techniques improve apace, docking NMR and X-ray protein structures into low resolution EM density maps is helping to bridge the resolution gap between these complementary techniques. 
The use of symmetry and fragment assembly constraints are also helping to make possible docking-based predictions of large multimeric protein complexes. In the near future, the closer integration of docking algorithms with protein interface prediction software, structural databases, and sequence analysis techniques should help produce better predictions of protein interaction networks and more accurate structural models of the fundamental molecular interactions within the cell.
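The "fast Fourier transform correlations" the review mentions refer to grid-based rigid docking in the style of Katchalski-Katzir, where every relative translation of a ligand grid against a receptor grid is scored at once. A minimal sketch (real codes add surface/core weighting and rotational sampling, both omitted here):

```python
import numpy as np

def fft_translational_scan(receptor, ligand):
    """Score all cyclic translations of `ligand` against `receptor` in one
    shot: the cross-correlation theorem turns the O(N^2) shift scan into
    three FFTs.  scores[s] = sum_x receptor[x] * ligand[x - s]."""
    F_r = np.fft.fftn(receptor)
    F_l = np.fft.fftn(ligand)
    return np.real(np.fft.ifftn(F_r * np.conj(F_l)))
```

The argmax of the returned grid is the translation with maximal overlap; in a real docking score, complementary surface shells contribute positively and core clashes negatively.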
A Comparative Study between Universal Eclectic Septoplasty Technique and Cottle
Amaral Neto, Odim Ferreira do; Mizoguchi, Flavio Massao; Freitas, Renato da Silva; Maniglia, João Jairney; Maniglia, Fábio Fabrício; Maniglia, Ricardo Fabrício
2017-01-01
Introduction Since the last century surgical correction of nasal septum deviation has been improved. The Universal Eclectic Technique was recently reported and there are still few studies dedicated to this surgical approach. Objective The objective of this study is to compare the results of septal deviation correction achieved using the Universal Eclectic Technique (UET) with those obtained through Cottle's Technique. Methods This is a prospective study with two consecutive case series totaling 90 patients (40 women and 50 men), aged between 18 and 55 years. We divided patients into two groups according to the surgical approach. Fifty-three patients underwent septoplasty through the Universal Eclectic Technique (UET) and thirty-seven patients were submitted to the classical Cottle's septoplasty technique. All patients answered the Nasal Obstruction Symptom Evaluation Scale (NOSE) questionnaire to assess pre- and postoperative nasal obstruction. Results Statistical analysis showed a significantly shorter operating time for the UET group. Nasal edema assessment performed seven days after the surgery showed a prevalence of mild edema in the UET group and moderate edema in the Cottle's technique group. In regard to complication rates, UET presented a single case of septal hematoma, while in the Cottle's technique group we observed two cases of severe edema, one case of incapacitating headache, and one complaint of nasal pain. Conclusion The Universal Eclectic Technique (UET) has proven to be a safe and effective surgical technique with faster symptomatic improvement, low complication rates, and reduced surgical time when compared with the classical Cottle's technique. PMID:28680499
NASA Astrophysics Data System (ADS)
Costa, Justin A.
The translocation of nucleic acid polymers across cell membranes is a fundamental requirement for complex life and has greatly contributed to genomic molecular evolution. The diversity of pathways that have evolved to transport DNA and RNA across membranes include protein receptors, active and passive transporters, endocytic and pinocytic processes, and various types of nucleic acid conducting channels known as nanopores. We have developed a series of experimental techniques, collectively known as "
Improved analysis of ground vibrations produced by man-made sources.
Ainalis, Daniel; Ducarne, Loïc; Kaufmann, Olivier; Tshibangu, Jean-Pierre; Verlinden, Olivier; Kouroussis, Georges
2018-03-01
Man-made sources of ground vibration must be carefully monitored in urban areas in order to ensure that structural damage and discomfort to residents are prevented or minimised. The research presented in this paper provides a comparative evaluation of various methods used to analyse a series of tri-axial ground vibration measurements generated by rail, road, and explosive blasting. The first part of the study is focused on comparing various techniques to estimate the dominant frequency, including time-frequency analysis. The comparative evaluation of the various methods to estimate the dominant frequency revealed that, depending on the method used, there can be significant variation in the estimates obtained. A new and improved analysis approach using the continuous wavelet transform is also presented, using the time-frequency distribution to estimate the localised dominant frequency and peak particle velocity. The technique can be used to accurately identify the level and frequency content of a ground vibration signal as it varies with time, and to identify the number of times the threshold limits of damage are exceeded. Copyright © 2017 Elsevier B.V. All rights reserved.
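The localised dominant-frequency idea can be sketched with a minimal Morlet continuous wavelet transform. This is a generic CWT in Python, not the authors' processing chain; the wavelet parameter `w`, the kernel span, and the normalisation are assumptions:

```python
import numpy as np

def morlet(t, f, w=6.0):
    """Complex Morlet wavelet centred at frequency f (Hz)."""
    s = w / (2 * np.pi * f)                      # envelope width in seconds
    return np.exp(2j * np.pi * f * t) * np.exp(-t ** 2 / (2 * s ** 2))

def dominant_frequency(signal, fs, freqs):
    """Convolve the signal with a Morlet wavelet at each trial frequency
    and report, per sample, the frequency with the largest |coefficient|:
    the localised dominant frequency from the time-frequency plane."""
    n = len(signal)
    power = np.empty((len(freqs), n))
    for k, f in enumerate(freqs):
        half = int(4 * fs / f)                   # kernel spans a few cycles
        t = np.arange(-half, half + 1) / fs
        kern = morlet(t, f)
        kern /= np.abs(kern).sum()               # make bands comparable
        power[k] = np.abs(np.convolve(signal, kern, mode="same"))
    return freqs[np.argmax(power, axis=0)]
```

On a signal that switches frequency mid-record, the per-sample argmax tracks the switch, which is exactly what a single whole-record spectrum cannot do.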
Daxini, S D; Prajapati, J M
2014-01-01
Meshfree methods are viewed as next-generation computational techniques. With the evident limitations of conventional grid-based methods, like FEM, in dealing with problems of fracture mechanics, large deformation, and simulation of manufacturing processes, meshfree methods have gained much attention from researchers. A number of meshfree methods have been proposed to date for analyzing complex problems in various fields of engineering. The present work reviews recent developments and some earlier applications of well-known meshfree methods like EFG and MLPG to various types of structural mechanics and fracture mechanics applications like bending, buckling, free vibration analysis, sensitivity analysis and topology optimization, single and mixed mode crack problems, fatigue crack growth, and dynamic crack analysis, and some typical applications like vibration of cracked structures, thermoelastic crack problems, and failure transition in impact problems. Due to the complex nature of meshfree shape functions and the evaluation of integrals in the domain, meshless methods are computationally expensive as compared to conventional mesh-based methods. Some improved versions of original meshfree methods and other techniques suggested by researchers to improve the computational efficiency of meshfree methods are also reviewed here.
Visual cluster analysis and pattern recognition template and methods
Osbourn, G.C.; Martinez, R.F.
1999-05-04
A method of clustering using a novel template to define a region of influence is disclosed. Using neighboring approximation methods, computation times can be significantly reduced. The template and method are applicable to, and improve, pattern recognition techniques. 30 figs.
NASA Technical Reports Server (NTRS)
Rana, D. S.
1980-01-01
The data reduction capabilities of the current programs were assessed, and a search was made for a more comprehensive system with greater data-analytic capability. Results of the investigation are presented.
Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer
NASA Astrophysics Data System (ADS)
Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad
2017-04-01
Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were the optimal predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, the Synthetic Minority Over-sampling Technique (SMOTE) was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
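Two of the ingredients the study found effective, PCA for feature reduction and SMOTE-style over-sampling for unbalanced data, can be sketched in plain NumPy. The nearest-neighbour interpolation below is a simplification of the full SMOTE algorithm (which samples among k neighbours), and none of this is the authors' code:

```python
import numpy as np

def pca_reduce(X, n_components):
    """Project mean-centred features onto the top principal components
    (via SVD): the feature-reduction step the study found optimal."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T, Vt[:n_components]

def smote_like(X_min, n_new, rng):
    """SMOTE-style over-sampling sketch: each synthetic sample is a random
    interpolation between a minority sample and its nearest minority
    neighbour, rebalancing the classes before model fitting."""
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = ((X_min - X_min[i]) ** 2).sum(axis=1)
        d[i] = np.inf                      # exclude the point itself
        j = int(np.argmin(d))
        lam = rng.random()                 # interpolation fraction in [0, 1)
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)
```

The reduced features and the rebalanced training set would then feed whatever classifier is being evaluated, a Random Forest in the study's optimal configuration.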
Image Processing for Binarization Enhancement via Fuzzy Reasoning
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A. (Inventor)
2009-01-01
A technique for enhancing a gray-scale image to improve conversions of the image to binary employs fuzzy reasoning. In the technique, pixels in the image are analyzed by comparing the pixel's gray scale value, which is indicative of its relative brightness, to the values of pixels immediately surrounding the selected pixel. The degree to which each pixel in the image differs in value from the values of surrounding pixels is employed as the variable in a fuzzy reasoning-based analysis that determines an appropriate amount by which the selected pixel's value should be adjusted to reduce vagueness and ambiguity in the image and improve retention of information during binarization of the enhanced gray-scale image.
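The patent's idea, adjusting each pixel by a fuzzy "degree of difference" from its neighbours before binarization, might look roughly like this. The membership function, its 64-level saturation point, and the gain are illustrative choices, not the patented rule base:

```python
import numpy as np

def fuzzy_enhance(img, gain=0.5):
    """Compare each pixel with the mean of its 8 neighbours, map the
    difference through a simple saturating membership function, and push
    the pixel away from the neighbourhood mean in proportion to that
    membership, sharpening ambiguous pixels ahead of binarization."""
    img = img.astype(float)
    padded = np.pad(img, 1, mode="edge")
    h, w = img.shape
    neigh = sum(padded[1 + dr:1 + dr + h, 1 + dc:1 + dc + w]
                for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)) / 8.0
    diff = img - neigh
    membership = np.minimum(np.abs(diff) / 64.0, 1.0)   # "degree of difference"
    return np.clip(img + gain * 64.0 * membership * np.sign(diff), 0.0, 255.0)
```

Pixels along an intensity edge are driven further from the local mean (brighter on the bright side, darker on the dark side), while uniform regions are left untouched, so a subsequent global threshold misclassifies fewer boundary pixels.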
Berke, Ian M.; Miola, Joseph P.; David, Michael A.; Smith, Melanie K.; Price, Christopher
2016-01-01
In situ, cells of the musculoskeletal system reside within complex and often interconnected 3-D environments. Key to better understanding how 3-D tissue and cellular environments regulate musculoskeletal physiology, homeostasis, and health is the use of robust methodologies for directly visualizing cell-cell and cell-matrix architecture in situ. However, the use of standard optical imaging techniques is often of limited utility in deep imaging of intact musculoskeletal tissues due to the highly scattering nature of biological tissues. Drawing inspiration from recent developments in the deep-tissue imaging field, we describe the application of immersion based optical clearing techniques, which utilize the principle of refractive index (RI) matching between the clearing/mounting media and tissue under observation, to improve the deep, in situ imaging of musculoskeletal tissues. To date, few optical clearing techniques have been applied specifically to musculoskeletal tissues, and a systematic comparison of the clearing ability of optical clearing agents in musculoskeletal tissues has yet to be fully demonstrated. In this study we tested the ability of eight different aqueous and non-aqueous clearing agents, with RIs ranging from 1.45 to 1.56, to optically clear murine knee joints and cortical bone. We demonstrated and quantified the ability of these optical clearing agents to clear musculoskeletal tissues and improve both macro- and micro-scale imaging of musculoskeletal tissue across several imaging modalities (stereomicroscopy, spectroscopy, and one-, and two-photon confocal microscopy) and investigational techniques (dynamic bone labeling and en bloc tissue staining). Based upon these findings we believe that optical clearing, in combination with advanced imaging techniques, has the potential to complement classical musculoskeletal analysis techniques; opening the door for improved in situ investigation and quantification of musculoskeletal tissues. PMID:26930293
Correlating the EMC analysis and testing methods for space systems in MIL-STD-1541A
NASA Technical Reports Server (NTRS)
Perez, Reinaldo J.
1990-01-01
A study was conducted to improve the correlation between the electromagnetic compatibility (EMC) analysis models stated in MIL-STD-1541A and the testing methods suggested for space systems. The test and analysis methods outlined in MIL-STD-1541A are described, and a comparative assessment of testing and analysis techniques as they relate to several EMC areas is presented. Changes to present analysis and test methods are suggested to bring the analysis and testing tools in MIL-STD-1541A into closer agreement. In particular, the test procedures in MIL-STD-1541A should be improved by providing alternatives to the present use of shielded enclosures as the primary site for such tests; the alternate use of anechoic chambers and open-field test sites should also be considered.
Shuttle TPS thermal performance and analysis methodology
NASA Technical Reports Server (NTRS)
Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.
1983-01-01
Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.
Improving high resolution retinal image quality using speckle illumination HiLo imaging
Zhou, Xiaolin; Bedggood, Phillip; Metha, Andrew
2014-01-01
Retinal image quality from flood illumination adaptive optics (AO) ophthalmoscopes is adversely affected by out-of-focus light scatter due to the lack of confocality. This effect is more pronounced in small eyes, such as those of rodents, because the requisite high optical power confers a large dioptric thickness to the retina. A recently-developed structured illumination microscopy (SIM) technique called HiLo imaging has been shown to reduce the effect of out-of-focus light scatter in flood illumination microscopes and produce pseudo-confocal images with significantly improved image quality. In this work, we adapted the HiLo technique to a flood AO ophthalmoscope and performed AO imaging in both (physical) model and live rat eyes. The improvement in image quality from HiLo imaging is shown both qualitatively and quantitatively by using spatial spectral analysis. PMID:25136486
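The HiLo fusion step described above can be sketched as follows. This is a minimal illustration of the general idea only: local speckle contrast selects the in-focus low spatial frequencies, which are combined with the high-pass of the uniform-illumination image. The filter width, weighting, and normalization are arbitrary choices, not taken from the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hilo_fuse(uniform, speckle, sigma=4.0, eta=1.0):
    """Fuse a uniform-illumination image and a speckle-illumination
    image into a pseudo-confocal HiLo image (simplified sketch)."""
    uniform = uniform.astype(float)
    speckle = speckle.astype(float)
    # Local contrast of the speckle image: in-focus regions retain
    # speckle contrast, out-of-focus regions wash it out.
    diff = speckle - uniform
    mean_sq = gaussian_filter(diff ** 2, sigma)
    contrast = np.sqrt(mean_sq) / (gaussian_filter(uniform, sigma) + 1e-9)
    # Lo: low-pass of the contrast-weighted uniform image
    # (keeps only in-focus low-frequency content).
    lo = gaussian_filter(contrast * uniform, sigma)
    # Hi: high-pass of the uniform image (high spatial frequencies
    # are inherently optically sectioned).
    hi = uniform - gaussian_filter(uniform, sigma)
    return eta * lo + hi
```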
Defective Reduction in Frozen Pie Manufacturing Process
NASA Astrophysics Data System (ADS)
Nooted, Oranuch; Tangjitsitcharoen, Somkiat
2017-06-01
Frozen pie production has many defects, resulting in high production cost. The failure mode and effect analysis (FMEA) technique has been applied to improve the frozen pie process, and a Pareto chart is used to determine the major defects of frozen pie. Three main processes cause the defects: the 1st freezing to glazing process, the forming process, and the folding process. The Risk Priority Number (RPN) obtained from FMEA is analyzed to reduce the defects: if the RPN of a cause exceeds 45, the corresponding process is selected for corrective and preventive action. The results showed that RPN values decreased after the correction. Therefore, implementation of the FMEA technique can help improve the performance of the frozen pie process and reduce defects by approximately 51.9%.
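The RPN screening described above can be sketched as follows. The causes and the severity/occurrence/detection ratings below are hypothetical; only the threshold of 45 comes from the abstract.

```python
# Hypothetical defect causes with severity (S), occurrence (O) and
# detection (D) ratings; only the threshold of 45 is from the study.
causes = [
    {"cause": "glazing temperature too low", "S": 7, "O": 4, "D": 3},
    {"cause": "dough overfilled in forming", "S": 5, "O": 3, "D": 2},
    {"cause": "misaligned folding guide",    "S": 6, "O": 2, "D": 2},
]

def rpn(c):
    # Risk Priority Number = severity * occurrence * detection
    return c["S"] * c["O"] * c["D"]

# Causes whose RPN exceeds 45 are selected for corrective and
# preventive action, ranked Pareto-style from highest risk down.
to_improve = sorted((c for c in causes if rpn(c) > 45),
                    key=rpn, reverse=True)
```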
An image analysis system for near-infrared (NIR) fluorescence lymph imaging
NASA Astrophysics Data System (ADS)
Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.
2011-03-01
Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove the motion artifacts, an image representation named flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.
Wright, Adam; Ricciardi, Thomas N.; Zwick, Martin
2005-01-01
The Medical Quality Improvement Consortium (MQIC) data warehouse contains de-identified data on more than 3.6 million patients including their problem lists, test results, procedures and medication lists. This study uses reconstructability analysis (RA), an information-theoretic data mining technique, on the MQIC data warehouse to empirically identify risk factors for various complications of diabetes including myocardial infarction and microalbuminuria. The risk factors identified match those identified in the literature, demonstrating the utility of the MQIC data warehouse for outcomes research and of RA as a technique for mining clinical data warehouses. PMID:16779156
Progress in multidisciplinary design optimization at NASA Langley
NASA Technical Reports Server (NTRS)
Padula, Sharon L.
1993-01-01
Multidisciplinary Design Optimization refers to some combination of disciplinary analyses, sensitivity analysis, and optimization techniques used to design complex engineering systems. The ultimate objective of this research at NASA Langley Research Center is to help the US industry reduce the costs associated with development, manufacturing, and maintenance of aerospace vehicles while improving system performance. This report reviews progress towards this objective and highlights topics for future research. Aerospace design problems selected from the author's research illustrate strengths and weaknesses in existing multidisciplinary optimization techniques. The techniques discussed include multiobjective optimization, global sensitivity equations and sequential linear programming.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
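The first phase above, computing criteria weights with AHP, can be sketched as follows. The weights are the principal eigenvector of a pairwise comparison matrix on Saaty's 1-9 scale; the 3-criterion matrix here is hypothetical, not from the study.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria;
# entry [i, j] says how much more important criterion i is than j.
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])

# AHP weights: principal eigenvector of A, normalised to sum to 1.
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency check: CR = CI / RI, with RI = 0.58 for n = 3;
# CR below 0.1 is conventionally acceptable.
lam_max = np.real(vals).max()
ci = (lam_max - 3) / (3 - 1)
cr = ci / 0.58
```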
REX2000 Version 2.5: Improved DATA Handling and Enhanced User-Interface
NASA Astrophysics Data System (ADS)
Taguchi, Takeyoshi
2007-02-01
XAFS analysis is applied in various fields such as materials science, environmental study, and biological science, and is widely used for characterization in those fields. In the early days of the XAFS technique, scientists wrote their own code for XAFS data analysis. As the technique became popular and the XAFS community grew, several analysis codes and packages were developed and released for general use. REX2000, which is commercially available, is one of those XAFS analysis packages. Counting from its predecessor "REX", REX2000 has been used for more than 15 years in the XAFS community. Following the previous modification in 2003, a major change was made in 2006. For dynamical studies of advanced materials, many XAFS data sets are measured (quick XAFS and in-situ XAFS), and hundreds of data sets need to be processed. REX2000's data handling has been improved to cope with such large volumes of data at once and to report the fitting results as a CSV file. The well-established user interface has been enhanced so that users can customize initial values for data analysis and specify options through a graphical interface. The many small changes made are also described in this paper.
NASA Astrophysics Data System (ADS)
Maity, Debotyam
This study is aimed at an improved understanding of unconventional reservoirs, which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from the North Brawley geothermal field and the Geysers geothermal field, in addition to synthetic datasets which were used to test new algorithms before actual application on the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis, including an improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for the best possible processing results. The proposed workflow makes use of novel integration methods as a means of making the best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates and the overall characterization efficacy. The basic elements of the proposed characterization workflow involve using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties, which are combined with other properties evaluated from seismic data and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization.
It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand reservoir behavior. As part of this study, we have developed the following elements, which are discussed in the subsequent chapters: 1. An integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation and analysis. 2. A novel autopicking workflow for noisy passive seismic data, used for improved accuracy in event picking as well as for improved velocity model building. 3. An improved passive seismic survey design optimization framework for better data collection and improved property estimation. 4. Extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings. 5. Uncertainty quantification and analysis to better quantify property estimates over and above the qualitative interpretations made, and to validate observations independently with quantified uncertainties to prevent erroneous interpretations. 6. Property mapping from microseismic data, including stress and anisotropic weakness estimates, for integrated reservoir characterization and analysis. 7. Integration of results (seismic, microseismic and well logs) from analysis of individual data sets for integrated interpretation using a predefined integration framework and soft computing tools.
NASA Astrophysics Data System (ADS)
Shi, J. T.; Han, X. T.; Xie, J. F.; Yao, L.; Huang, L. T.; Li, L.
2013-03-01
A Pulsed High Magnetic Field Facility (PHMFF) has been established at the Wuhan National High Magnetic Field Center (WHMFC), and various protection measures are applied in its control system. In order to improve the reliability and robustness of the control system, a safety analysis of the PHMFF was carried out based on the Fault Tree Analysis (FTA) technique. The function and realization of five protection systems are given: a sequence experiment operation system, a safety assistant system, an emergency stop system, a fault detecting and processing system, and an accident isolating protection system. Tests and operation indicate that these measures improve the safety of the facility and ensure the safety of personnel.
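The basic arithmetic behind evaluating a fault tree can be sketched as follows: AND and OR gates combine basic-event probabilities, assumed independent here. The tree structure and all probabilities below are hypothetical, not taken from the PHMFF analysis.

```python
from math import prod

def p_and(*ps):
    # AND gate: the output event occurs only if every input occurs.
    return prod(ps)

def p_or(*ps):
    # OR gate: the output event occurs if any input occurs.
    return 1 - prod(1 - p for p in ps)

# Hypothetical top event: the facility is damaged if the interlock
# fails AND (the fault detector misses the fault OR the operator
# misses the alarm).
p_top = p_and(1e-3, p_or(1e-2, 5e-2))
```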
Laser-induced fluorescence spectroscopy for improved chemical analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelbwachs, J.A.
1983-09-01
This report summarizes the progress achieved over the past five years in the laser-induced fluorescence spectroscopy (LIFS) for improved chemical analysis program. Our initial efforts yielded significantly lower detection limits for trace elemental analysis by the use of both cw and pulsed laser excitation. New methods of LIFS were developed that were shown to overcome many of the traditional limitations of LIFS techniques. LIFS methods have been applied to yield fundamental scientific data that further the understanding of forces between atoms and other atoms and molecules. In recent work, two-photon ionization was combined with LIFS and applied, for the first time, to the study of energy transfer in ions.
Using real options analysis to support strategic management decisions
NASA Astrophysics Data System (ADS)
Kabaivanov, Stanimir; Markovska, Veneta; Milev, Mariyan
2013-12-01
Decision making is a complex process that requires taking into consideration multiple heterogeneous sources of uncertainty. Standard valuation and financial analysis techniques often fail to properly account for all these sources of risk, as well as for all sources of additional flexibility. In this paper we explore applications of a modified binomial tree method for real options analysis (ROA) in an effort to improve the decision making process. Typical use cases of real options are analyzed, with an elaborate study of the applications and the advantages that company management can derive from them. Numeric results based on extending the simple binomial tree approach to multiple sources of uncertainty are provided to demonstrate the improvement effects on management decisions.
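A minimal single-uncertainty version of the binomial tree valuation underlying ROA can be sketched as follows; the paper's modified method handles multiple sources of uncertainty, which this sketch does not. The option modeled is the classic option to invest: pay K to receive a project worth S, valued on a Cox-Ross-Rubinstein lattice with illustrative parameters.

```python
from math import exp, sqrt

def binomial_option(S0, K, r, sigma, T, n=200, american=True):
    """CRR binomial valuation of an option to invest (a call on
    the project value). Illustrative sketch only."""
    dt = T / n
    u = exp(sigma * sqrt(dt))
    d = 1 / u
    q = (exp(r * dt) - d) / (u - d)        # risk-neutral probability
    disc = exp(-r * dt)
    # Option payoffs at the final step of the lattice.
    vals = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # Backward induction; in the American case the value of investing
    # now is compared with the value of waiting at every node.
    for step in range(n - 1, -1, -1):
        vals = [disc * (q * vals[j + 1] + (1 - q) * vals[j])
                for j in range(step + 1)]
        if american:
            vals = [max(v, S0 * u**j * d**(step - j) - K)
                    for j, v in enumerate(vals)]
    return vals[0]
```

With S0 = K = 100, r = 5%, sigma = 20%, T = 1 year, the value converges to the Black-Scholes price of about 10.45 as n grows.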
NASA Technical Reports Server (NTRS)
Drake, R. L.; Duvoisin, P. F.; Asthana, A.; Mather, T. W.
1971-01-01
High speed automated identification and design of dynamic systems, both linear and nonlinear, are discussed. Special emphasis is placed on developing hardware and techniques which are applicable to practical problems. The basic modeling experiment and new results are described. Using the improvements developed, successful identification of several systems, including a physical example as well as simulated systems, was obtained. The advantages of parameter signature analysis over signal signature analysis in go/no-go testing of operational systems were demonstrated. The feasibility of using these ideas in failure mode prediction in operating systems was also investigated. An improved digitally controlled nonlinear function generator was developed, debugged, and completely documented.
Why Is Rainfall Error Analysis Requisite for Data Assimilation and Climate Modeling?
NASA Technical Reports Server (NTRS)
Hou, Arthur Y.; Zhang, Sara Q.
2004-01-01
Given the large temporal and spatial variability of precipitation processes, errors in rainfall observations are difficult to quantify yet crucial to making effective use of rainfall data for improving atmospheric analysis, weather forecasting, and climate modeling. We highlight the need for developing a quantitative understanding of systematic and random errors in precipitation observations by examining explicit examples of how each type of error can affect forecasts and analyses in global data assimilation. We characterize the error information needed from the precipitation measurement community and how it may be used to improve data usage within the general framework of analysis techniques, as well as accuracy requirements from the perspective of climate modeling and global data assimilation.
Land border monitoring with remote sensing technologies
NASA Astrophysics Data System (ADS)
Malinowski, Radoslaw
2010-09-01
Remote sensing technology has many practical applications in different fields of science and industry, and there is a need to examine its usefulness for the purpose of land border surveillance. This research started with an analysis of the potential direct use of Earth Observation (EO) technology for monitoring migrations of people and preventing smuggling. The research, however, showed that there are still many areas in which EO technology needs to be improved. From that point the analysis focused on improving the Border Permeability Index (BPI), which utilizes EO techniques as a source of information. The result of the BPI analysis using high resolution data provides a new kind of information which can support the work of security authorities and make it more effective.
Comparison of analysis and flight test data for a drone aircraft with active flutter suppression
NASA Technical Reports Server (NTRS)
Newsom, J. R.; Pototzky, A. S.
1981-01-01
This paper presents a comparison of analysis and flight test data for a drone aircraft equipped with an active flutter suppression system. Emphasis is placed on the comparison of modal dampings and frequencies as a function of Mach number. Results are presented for both symmetric and antisymmetric motion with flutter suppression off. Only symmetric results are presented for flutter suppression on. Frequency response functions of the vehicle are presented from both flight test data and analysis. The analysis correlation is improved by using an empirical aerodynamic correction factor which is proportional to the ratio of experimental to analytical steady-state lift curve slope. In addition to presenting the mathematical models and a brief description of existing analytical techniques, an alternative analytical technique for obtaining closed-loop results is presented.
Benchtop Antigen Detection Technique using Nanofiltration and Fluorescent Dyes
NASA Technical Reports Server (NTRS)
Scardelletti, Maximilian C.; Varaljay, Vanessa
2009-01-01
The designed benchtop technique is primed to detect bacteria and viruses from antigenic surface marker proteins in solutions, initially water. This inclusive bio-immunoassay uniquely combines nanofiltration and near infrared (NIR) dyes conjugated to antibodies to isolate and distinguish microbial antigens, using laser excitation and spectrometric analysis. The project goals include detecting microorganisms aboard the International Space Station, space shuttle, Crew Exploration Vehicle (CEV), and human habitats on future Moon and Mars missions, ensuring astronaut safety. The technique is also intended to improve and advance water contamination testing, both commercially and environmentally. Lastly, this streamlined technique promises to greatly simplify and expedite testing of pathogens in complex matrices, such as blood, in hospital and laboratory clinics.
Multiscale wavelet representations for mammographic feature analysis
NASA Astrophysics Data System (ADS)
Laine, Andrew F.; Song, Shuwu
1992-12-01
This paper introduces a novel approach for accomplishing mammographic feature analysis through multiresolution representations. We show that efficient (nonredundant) representations may be identified from digital mammography and used to enhance specific mammographic features within a continuum of scale space. The multiresolution decomposition of wavelet transforms provides a natural hierarchy in which to embed an interactive paradigm for accomplishing scale space feature analysis. Choosing wavelets (or analyzing functions) that are simultaneously localized in both space and frequency results in a powerful methodology for image analysis. Multiresolution and orientation selectivity, known biological mechanisms in primate vision, are ingrained in wavelet representations and inspire the techniques presented in this paper. Our approach includes local analysis of complete multiscale representations. Mammograms are reconstructed from wavelet coefficients, enhanced by linear, exponential and constant weight functions localized in scale space. By improving the visualization of breast pathology we can improve the chances of early detection of breast cancers (improve quality) while requiring less time to evaluate mammograms for most patients (lower costs).
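The enhancement idea, reconstructing from wavelet coefficients after weighting the detail (feature) bands, can be sketched with a one-level 1-D Haar transform and a constant weight; the mammographic pipeline uses richer 2-D multiscale decompositions and linear/exponential weightings as well.

```python
import numpy as np

def haar_decompose(x):
    """One level of the orthonormal 1-D Haar wavelet transform."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_reconstruct(approx, detail, gain=1.0):
    """Inverse transform with a constant weight on the detail
    coefficients: gain = 1 recovers the signal exactly, gain > 1
    enhances fine-scale features."""
    d = gain * np.asarray(detail, float)
    a = np.asarray(approx, float)
    y = np.empty(2 * a.size)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y
```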
Studies on spectral analysis of randomly sampled signals: Application to laser velocimetry data
NASA Technical Reports Server (NTRS)
Sree, David
1992-01-01
Spectral analysis is very useful in determining the frequency characteristics of many turbulent flows, for example, vortex flows, tail buffeting, and other pulsating flows. It is also used for obtaining turbulence spectra, from which the time and length scales associated with the turbulence structure can be estimated. These estimates, in turn, can be helpful for validation of theoretical/numerical flow turbulence models. Laser velocimetry (LV) is being extensively used in the experimental investigation of different types of flows because of its inherent advantages: nonintrusive probing, high frequency response, no calibration requirements, etc. Typically, the output of an individual realization laser velocimeter is a set of randomly sampled velocity data. Spectral analysis of such data requires special techniques to obtain reliable estimates of the correlation and power spectral density functions that describe the flow characteristics. FORTRAN codes for obtaining the autocorrelation and power spectral density estimates using the correlation-based slotting technique were developed. Extensive studies have been conducted on simulated first-order spectrum and sine signals to improve the spectral estimates. A first-order spectrum was chosen because it represents the characteristics of a typical one-dimensional turbulence spectrum. Digital prefiltering techniques were applied to improve the spectral estimates from randomly sampled data. Studies show that reliable spectral estimates can be obtained up to about five times the mean sampling rate.
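The correlation-based slotting technique for randomly sampled data can be sketched as follows: every pair of samples is assigned to a lag slot by its time difference, and the products are averaged per slot to estimate the autocorrelation. The slot width, lag range, and normalization below are illustrative choices, not those of the original FORTRAN codes.

```python
import numpy as np

def slotted_autocorr(t, x, max_lag, slot_width):
    """Slotted autocorrelation estimate for a signal x sampled at
    random times t (basic version; O(N^2) pairwise products)."""
    x = np.asarray(x, float) - np.mean(x)
    t = np.asarray(t, float)
    lags = np.abs(t[:, None] - t[None, :])   # all pairwise lags
    prods = x[:, None] * x[None, :]          # all pairwise products
    nbins = int(max_lag / slot_width)
    r = np.zeros(nbins)
    for k in range(nbins):
        mask = (lags >= k * slot_width) & (lags < (k + 1) * slot_width)
        if mask.any():
            r[k] = prods[mask].mean()        # average within the slot
    return r / r[0]                          # normalise by the zero-lag slot
```

For a 1 Hz sine sampled at random times, the estimate follows cos(2*pi*tau): near +1 at zero lag and near -1 at half a period.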
Spacecraft Multiple Array Communication System Performance Analysis
NASA Technical Reports Server (NTRS)
Hwu, Shian U.; Desilva, Kanishka; Sham, Catherine C.
2010-01-01
The Communication Systems Simulation Laboratory (CSSL) at the NASA Johnson Space Center is tasked to perform spacecraft and ground network communication system simulations, design validation, and performance verification. The CSSL has developed simulation tools that model spacecraft communication systems and the space and ground environment in which the tools operate. In this paper, a spacecraft communication system with multiple arrays is simulated. A multiple-array combining technique is used to increase the radio frequency coverage and data rate performance. The technique achieves phase coherence among the phased arrays so that the signals combine constructively at the target receiver. There are many technical challenges in spacecraft integration with a high transmit power communication system. The array combining technique can improve the communication system data rate and coverage performance without increasing the system transmit power requirements. Example simulation results indicate that significant performance improvement can be achieved with phase coherence implementation.
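The gain from coherent combining can be illustrated numerically: phase-aligned arrays add in voltage while independent receiver noise adds in power, so SNR grows roughly with the number of arrays. The signal, noise level, and array count below are arbitrary illustration values.

```python
import numpy as np

rng = np.random.default_rng(1)
N, samples = 4, 100_000
signal = np.sin(2 * np.pi * 0.01 * np.arange(samples))
noise = rng.standard_normal((N, samples)) * 0.5   # independent per array

single = signal + noise[0]
# Phase-coherent sum: signal adds in voltage (factor N), noise in power.
combined = (N * signal + noise.sum(axis=0)) / N

def snr(received, reference):
    err = received - reference
    return (reference ** 2).mean() / (err ** 2).mean()

gain = snr(combined, signal) / snr(single, signal)  # expect about N
```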
NASA Astrophysics Data System (ADS)
Alves de Mesquita, Jayme; Lopes de Melo, Pedro
2004-03-01
Thermally sensitive devices—thermistors—have usually been used to monitor sleep-breathing disorders. However, because of their long time constant, these devices are not able to provide a good characterization of fast events, like hypopneas. The nasal pressure recording technique (NPR) has recently been suggested to quantify airflow during sleep. It is claimed that the short time constants of the devices used to implement this technique would allow an accurate analysis of fast abnormal respiratory events. However, these devices present errors associated with nonlinearities and acoustic resonance that could reduce the diagnostic value of the NPR. Moreover, in spite of the high scientific and clinical potential, there is no detailed description of a complete instrumentation system to implement this promising technique in sleep studies. In this context, the purpose of this work was twofold: (1) describe the development of a flexible NPR device and (2) evaluate the performance of this device when compared to pneumotachographs (PNTs) and thermistors. After the design details are described, the system static accuracy is evaluated by a comparative analysis with a PNT. This analysis revealed a significant reduction (p<0.001) of the static error when system nonlinearities were reduced. The dynamic performance of the NPR system was investigated by frequency response analysis and time constant evaluations, and the results showed that the developed device's response was as good as that of the PNT and around 100 times faster (τ = 5.3 ms) than thermistors (τ = 512 ms). Experimental results obtained in simulated clinical conditions and in a patient are presented as examples, and confirmed the good features achieved in engineering tests.
These results are in close agreement with physiological fundamentals, supplying substantial evidence that the improved dynamic and static characteristics of this device can contribute to a more accurate implementation of medical research projects and to improve the diagnoses of sleep-breathing disorders.
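The practical consequence of the quoted time constants can be illustrated with a first-order response model: a sensor with time constant τ reaches 95% of a step input after about 3τ, so the NPR device settles in milliseconds while a thermistor needs over a second and smears fast events such as hypopneas.

```python
from math import log

def settle_time_ms(tau_ms, fraction=0.95):
    """Time (ms) for a first-order sensor, y(t) = 1 - exp(-t / tau),
    to reach `fraction` of a step input; 95% corresponds to ~3 * tau."""
    return -tau_ms * log(1 - fraction)

npr = settle_time_ms(5.3)         # NPR device: roughly 16 ms
thermistor = settle_time_ms(512)  # thermistor: roughly 1.5 s
```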
Application of Six Sigma towards improving surgical outcomes.
Shukla, P J; Barreto, S G; Nadkarni, M S
2008-01-01
Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in the outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) towards improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
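The reported Sigma score can be reproduced from the quoted yields using the conventional 1.5-sigma long-term shift: sigma level = Phi^{-1}(yield) + 1.5. This is a standard Six Sigma convention, assumed (not stated) to be the one used in the study; with a 73% success rate it gives about 2.1, matching the reported score.

```python
from statistics import NormalDist

def sigma_level(yield_fraction, shift=1.5):
    """Short-term sigma level for a given process yield, using the
    conventional 1.5-sigma long-term shift."""
    return NormalDist().inv_cdf(yield_fraction) + shift

before = sigma_level(0.54)  # pre-DST sphincter preservation rate
after = sigma_level(0.73)   # post-DST rate: about 2.1, as reported
```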
Analytical cytology applied to detection of induced cytogenetic abnormalities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gray, J.W.; Lucas, J.; Straume, T.
1987-08-06
Radiation-induced biological damage results in the formation of a broad spectrum of cytogenetic changes such as translocations, dicentrics, ring chromosomes, and acentric fragments. A battery of analytical cytologic techniques is now emerging that promises to significantly improve the precision and ease with which these radiation-induced cytogenetic changes can be quantified. This report summarizes techniques to facilitate analysis of the frequency of occurrence of structural and numerical aberrations in control and irradiated human cells. 14 refs., 2 figs.
The use of the modified Cholesky decomposition in divergence and classification calculations
NASA Technical Reports Server (NTRS)
Van Rooy, D. L.; Lynn, M. S.; Snyder, C. H.
1973-01-01
This report analyzes the use of the modified Cholesky decomposition technique as applied to the feature selection and classification algorithms used in the analysis of remote sensing data (e.g., as in LARSYS). This technique is approximately 30% faster in classification and a factor of 2-3 faster in divergence, as compared with LARSYS. Also numerical stability and accuracy are slightly improved. Other methods necessary to deal with numerical stability problems are briefly discussed.
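The role of a Cholesky factorization in such classification calculations can be sketched as follows: the per-class covariance is factored once, and each sample then needs only a triangular solve instead of an explicit matrix inverse, which is both faster and numerically more stable. This is a generic Gaussian discriminant sketch, not the LARSYS code or its modified decomposition.

```python
import numpy as np
from scipy.linalg import solve_triangular

def gaussian_log_likelihood(x, mean, cov):
    """Log-likelihood of sample x under N(mean, cov), computed via
    the Cholesky factor cov = L @ L.T rather than inv(cov)."""
    L = np.linalg.cholesky(cov)               # lower-triangular factor
    y = solve_triangular(L, x - mean, lower=True)
    maha = y @ y                              # squared Mahalanobis distance
    logdet = 2 * np.log(np.diag(L)).sum()     # log det(cov) from diag(L)
    k = x.size
    return -0.5 * (maha + logdet + k * np.log(2 * np.pi))
```

In a maximum-likelihood classifier, each pixel is assigned to the class whose discriminant value is largest; the factorization cost is paid once per class, not once per pixel.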
Productivity improvement and quality enhancement at NASA
NASA Technical Reports Server (NTRS)
Braunstein, D. R.
1985-01-01
NASA's Productivity Improvement and Quality Enhancement (PIQE) effort has as its objectives the encouragement of greater employee participation in management decision-making and the identification of impediments as well as opportunities for high productivity. Attempts are also made to try out novel management practices, and to evolve productivity trend analysis techniques. Every effort is made to note, reward, and diffuse successfully instituted PIQE approaches throughout the NASA-contractor organization.
US Army Institute of Dental Research Annual Progress Report FY80.
1980-10-01
indicates that use of the laser technique does result in increased connective tissue regeneration and improved resolution of the periodontal defects...connective tissue regeneration and improved resolution of the periodontal defects. Final analysis of histologic data on the use of the neodymium laser for in...and 21 Bone Regeneration in Traumatic Wounds (Pathology) DA OH 6038 Development of Endodontic Procedures for 22 Military Dentistry (Oral Biology) DA OK
On the Power of Abstract Interpretation
NASA Technical Reports Server (NTRS)
Reddy, Uday S.; Kamin, Samuel N.
1991-01-01
Increasingly sophisticated applications of static analysis place an increased burden on the reliability of the analysis techniques. Often, the failure of the analysis technique to detect some information may mean that the time or space complexity of the generated code is altered. Thus, it is important to precisely characterize the power of static analysis techniques. We follow the approach of Sekar et al., who studied the power of strictness analysis techniques. Their result can be summarized by saying 'strictness analysis is perfect up to variations in constants.' In other words, strictness analysis is as good as it could be, short of actually distinguishing between concrete values. We use this approach to characterize a broad class of analysis techniques based on abstract interpretation including, but not limited to, strictness analysis. For the first-order case, we consider abstract interpretations where the abstract domain for data values is totally ordered. This condition is satisfied by Mycroft's strictness analysis, that of Sekar et al., and Wadler's analysis of list-strictness. For such abstract interpretations, we show that the analysis is complete in the sense that, short of actually distinguishing between concrete values with the same abstraction, it gives the best possible information. We further generalize these results to typed lambda calculus with pairs and higher-order functions. Note that products and function spaces over totally ordered domains are not totally ordered. In fact, the notion of completeness used in the first-order case fails if product domains or function spaces are added. We formulate a weaker notion of completeness based on observability of values. Two values (including pairs and functions) are considered indistinguishable if their observable components are indistinguishable. We show that abstract interpretation of typed lambda calculus programs is complete up to this notion of indistinguishability.
We use denotationally oriented arguments instead of the detailed operational arguments used by Sekar et al.; hence, our proofs are much simpler and should be useful for future improvements.
NASA Astrophysics Data System (ADS)
Mushlihuddin, R.; Nurafifah; Irvan
2018-01-01
Students' low ability in mathematical problem solving pointed to a less-than-effective learning process in the classroom. Effective learning is learning that improves students' mathematical skills, one of which is problem-solving ability. Problem-solving ability consists of several stages: understanding the problem, planning the solution, solving the problem as planned, and re-examining the procedure and the outcome. The purpose of this research was to determine: (1) was there any influence of the PBL model in improving students' mathematical problem-solving ability in a vector analysis course?; (2) was the PBL model effective in improving students' mathematical problem-solving skills in vector analysis courses? This research was a quasi-experiment. The data analysis proceeded through the stages of data description, a prerequisite normality test, and hypothesis testing using the ANCOVA and Gain tests. The results showed that: (1) there was an influence of the PBL model in improving students' mathematical problem-solving abilities in vector analysis courses; (2) the PBL model was effective in improving students' problem-solving skills in vector analysis courses, with a medium category.
Combining results of multiple search engines in proteomics.
Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W
2013-09-01
A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.
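The simplest way to see the benefit of combining engines is a consensus vote across their peptide-spectrum matches: identifications confirmed by several engines are retained with higher confidence. The sketch below is a deliberately naive Python illustration, not one of the probabilistic tools reviewed here; the engine names, spectrum IDs, and peptide sequences are made up.

```python
from collections import defaultdict

def combine_engine_results(engine_psms):
    """Naive consensus across search engines.

    engine_psms: dict engine_name -> {spectrum_id: peptide}
    For each spectrum, tally how many engines agree on each candidate
    peptide and keep the majority call with its vote count.
    Returns {spectrum_id: (peptide, n_agreeing_engines)}.
    """
    votes = defaultdict(lambda: defaultdict(int))
    for psms in engine_psms.values():
        for spectrum, peptide in psms.items():
            votes[spectrum][peptide] += 1
    consensus = {}
    for spectrum, counts in votes.items():
        peptide, n = max(counts.items(), key=lambda kv: kv[1])
        consensus[spectrum] = (peptide, n)
    return consensus

# Hypothetical results from three engines
results = combine_engine_results({
    "engineA": {"s1": "PEPTIDEK", "s2": "LSSPATLNSR"},
    "engineB": {"s1": "PEPTIDEK", "s2": "LSSPATINSR"},
    "engineC": {"s1": "PEPTIDEK"},
})
```

Real combiners weight each engine's score and model error rates rather than counting votes, but the core idea of cross-engine agreement is the same.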
A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde
NASA Astrophysics Data System (ADS)
Cofer, Wesley R.; Edahl, Robert A.
This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water soluble trace gases (e.g. CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (~0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography (HPLC) with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.
A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde
NASA Technical Reports Server (NTRS)
Cofer, W. R., III; Edahl, R. A., Jr.
1986-01-01
This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water soluble trace gases (e.g., CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (about 0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.
Using Meta Analysis Techniques to Assess the Safety Effect of Red Light Running Cameras
DOT National Transportation Integrated Search
2002-02-01
Automated enforcement programs, including automated systems that are used to enforce red light running violations, have recently come under scrutiny regarding their value in terms of improving safety, their primary purpose. One of the major hurdles t...
Summit Station Skiway Cost Analysis
2016-07-01
Laboratory (CRREL), U.S. Army Engineer Research and Development Center (ERDC), 72 Lyme Road, Hanover, NH 03755-1290. Final Report. Approved for...cargo loads. To explore further skiway improvement and cost saving techniques, this report reviews alternative maintenance and construction options...
Energy Efficiency in Libraries.
ERIC Educational Resources Information Center
Lewis, Eleanor J.; And Others
1993-01-01
Shows how libraries can save money and energy with energy-efficient technologies, improving maintenance, and encouraging staff efforts to conserve energy. Specific techniques such as life-cycle cost analysis and energy audits focusing on lighting, heating, ventilation, air conditioning, and water efficiency are described. Funding options and…
Deep Lake Explorer: Using citizen science to analyze underwater video from the Great Lakes
While underwater video collection technology continues to improve, advancements in underwater video analysis techniques have lagged. Crowdsourcing image interpretation using the Zooniverse platform has proven successful for many projects, but few projects to date have included vi...
Designing Interactive Online Nursing Courses
ERIC Educational Resources Information Center
Jain, Smita; Jain, Pawan
2015-01-01
This study empirically tests the relation between the instructional design elements and the overall meaningful interactions among online students. Eighteen online graduate nursing courses are analyzed using bivariate and multivariate analysis techniques. Findings suggest that the quantity of meaningful interaction among learners can be improved by…
Conceptual Model Evaluation using Advanced Parameter Estimation Techniques with Heat as a Tracer
NASA Astrophysics Data System (ADS)
Naranjo, R. C.; Morway, E. D.; Healy, R. W.
2016-12-01
Temperature measurements made at multiple depths beneath the sediment-water interface have proven useful for estimating seepage rates from surface-water channels and the corresponding subsurface flow direction. Commonly, parsimonious zonal representations of the subsurface structure are defined a priori by interpretation of temperature envelopes, slug tests, or analysis of soil cores. However, combining multiple observations into a single zone may limit the inverse model solution and does not take full advantage of the information content within the measured data. Further, simulating the correct thermal gradient, flow paths, and transient behavior of solutes may be biased by inadequacies in the spatial description of subsurface hydraulic properties. The use of pilot points in PEST offers a more sophisticated approach to estimating the structure of subsurface heterogeneity. This presentation evaluates seepage estimation in a cross-sectional model of a trapezoidal canal with intermittent flow representing four typical sedimentary environments. Recent improvements in heat-as-a-tracer measurement techniques (i.e., multi-depth temperature probes), along with the use of modern calibration techniques (i.e., pilot points), provide opportunities for improved calibration of flow models and, subsequently, improved model predictions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Credille, Jennifer; Owens, Elizabeth
This capstone offers the introduction of Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements as applied to an industrial atmosphere. However, this paper will demonstrate that implementing Lean concepts within an office activity can result in significant process improvements. Lean first emerged with the conception of the Toyota Production System. This innovative concept was designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates: system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, which ensures the process is thoroughly dissected and can be achieved for any process in any work environment.
Gershengorn, Hayley B; Kocher, Robert; Factor, Phillip
2014-02-01
The business community has developed strategies to ensure the quality of the goods or services they produce and to improve the management of multidisciplinary work teams. With modification, many of these techniques can be imported into intensive care units (ICUs) to improve clinical operations and patient safety. In Part I of a three-part ATS Seminar series, we argue for adopting business management strategies in ICUs and set forth strategies for targeting selected quality improvement initiatives. These tools are relevant to health care today as focus is placed on limiting low-value care and measuring, reporting, and improving quality. In the ICU, the complexity of illness and the need to standardize processes make these tools even more appealing. Herein, we highlight four techniques to help prioritize initiatives. First, the "80/20 rule" mandates focus on the few (20%) interventions likely to drive the majority (80%) of improvement. Second, benchmarking--a process of comparison with peer units or institutions--is essential to identifying areas of strength and weakness. Third, root cause analyses, in which structured retrospective reviews of negative events are performed, can be used to identify and fix systems errors. Finally, failure mode and effects analysis--a process aimed at prospectively identifying potential sources of error--allows for systems fixes to be instituted in advance to prevent negative outcomes. These techniques originated in fields other than health care, yet adoption has and can help ICU managers prioritize issues for quality improvement.
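The "80/20 rule" described above can be made concrete as a small Pareto selection routine: rank candidate initiatives by estimated benefit and keep the few that cover most of the total. The sketch below is illustrative; the initiative names and benefit numbers are hypothetical, not from any ICU study.

```python
def pareto_select(initiatives, target=0.8):
    """Return the smallest set of initiatives whose estimated benefit
    covers `target` (e.g. 80%) of the total estimated benefit.

    initiatives: dict name -> estimated benefit (any consistent unit).
    """
    total = sum(initiatives.values())
    chosen, covered = [], 0.0
    # Greedily take initiatives in decreasing order of benefit
    for name, benefit in sorted(initiatives.items(),
                                key=lambda kv: kv[1], reverse=True):
        chosen.append(name)
        covered += benefit
        if covered / total >= target:
            break
    return chosen

# Hypothetical quality-improvement candidates with estimated benefit scores
picks = pareto_select({
    "ventilator bundle": 50, "handoff checklist": 30,
    "sedation protocol": 12, "supply restock": 5, "signage": 3,
})
```

Here two of five candidates cover 80% of the estimated benefit, which is exactly the prioritization behavior the 80/20 screen is meant to produce.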
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, D.W.; Selvin, S.; Close, E.R.
In studying geographic disease distributions, one normally compares rates of arbitrarily defined geographic subareas (e.g. census tracts), thereby sacrificing the geographic detail of the original data. The sparser the data, the larger the subareas must be in order to calculate stable rates. This dilemma is avoided with the technique of Density Equalizing Map Projections (DEMP). Boundaries of geographic subregions are adjusted to equalize population density over the entire study area. Case locations plotted on the transformed map should have a uniform distribution if the underlying disease rates are constant. On the transformed map, the statistical analysis of the observed distribution is greatly simplified. Even for sparse distributions, the statistical significance of a supposed disease cluster can be reliably calculated. The present report describes the first successful application of the DEMP technique to a sizeable "real-world" data set of epidemiologic interest. An improved DEMP algorithm [GUSE93, CLOS94] was applied to a data set previously analyzed with conventional techniques [SATA90, REYN91]. The results from the DEMP analysis and a conventional analysis are compared.
LEAN SIX SIGMA TECHNIQUES TO IMPROVE OPHTHALMOLOGY CLINIC EFFICIENCY.
Ciulla, Thomas A; Tatikonda, Mohan V; ElMaraghi, Yehya A; Hussain, Rehan M; Hill, Amanda L; Clary, Julie M; Hattab, Eyas
2017-07-18
Ophthalmologists serve an increasing volume of a growing elderly population undergoing increasingly complex outpatient medical care, including extensive diagnostic testing and treatment. The resulting prolonged patient visit times ("patient flow times") limit quality and patient and employee satisfaction, and represent waste. Lean Six Sigma process improvement was used in a vitreoretinal practice to decrease patient flow time, demonstrating that this approach can yield significant improvement in health care. Process flow maps were created to determine the most common care pathways within the clinic. Three months of visit data were collected from the electronic medical record system, which tracks patient task times at each process step in the office. The care tasks and care pathways consuming the greatest time and showing the greatest variation were identified and modified. A follow-up analysis of six weeks of visits was conducted to assess improvement. Nearly all patients took one of five paths through the office. Patient flow was redesigned to reduce waiting room time by having staff members immediately start patients on one of those five paths; staffing was adjusted to address high-demand tasks, and scheduling was optimized around derived predictors of patient flow times. Follow-up analysis revealed a statistically significant decline of 18% in mean patient flow time and of 4.6% in patient flow time standard deviation. Patient and employee satisfaction scores improved. Manufacturing industry techniques, such as Lean and Six Sigma, can be used to improve patient care, minimize waste, and enhance patient and staff satisfaction in outpatient clinics.
ERIC Educational Resources Information Center
Mungin, Michael
2017-01-01
In the five years following implementation of a chat reference service at James Madison University (JMU), the service proved very popular but was not closely assessed for quality of service. Using grounded theory and qualitative data analysis techniques, a comprehensive assessment effort was begun in earnest and is in progress. Preliminary results…
Dark matter constraints from a joint analysis of dwarf Spheroidal galaxy observations with VERITAS
Archambault, S.; Archer, A.; Benbow, W.; ...
2017-04-05
We present constraints on the annihilation cross section of weakly interacting massive particle (WIMP) dark matter based on the joint statistical analysis of four dwarf galaxies with VERITAS. These results are derived from an optimized photon-weighting statistical technique that improves on standard imaging atmospheric Cherenkov telescope (IACT) analyses by utilizing the spectral and spatial properties of individual photon events.
Turning up the heat on aircraft structures. [design and analysis for high-temperature conditions
NASA Technical Reports Server (NTRS)
Dobyns, Alan; Saff, Charles; Johns, Robert
1992-01-01
An overview is presented of the current effort in design and development of aircraft structures to achieve the lowest cost for best performance. Enhancements in this area are focused on integrated design, improved design analysis tools, low-cost fabrication techniques, and more sophisticated test methods. 3D CAD/CAM data are becoming the method through which design, manufacturing, and engineering communicate.
Life Cycle Costing: A Working Level Approach
1981-06-01
Effects Analysis (FMEA)...Logistics Performance Factors (LPFs)...Planning the Use of Life Cycle Cost in the Demonstration...form. Failure Mode and Effects Analysis (FMEA). Description. FMEA is a technique that attempts to improve the design of any particular unit. The FMEA...failure modes and also eliminate extra parts or ones that are used to achieve more performance than is necessary [16:5-14]. Advantages. FMEA forces
Infrared spectroscopy as a screening technique for colitis
NASA Astrophysics Data System (ADS)
Titus, Jitto; Ghimire, Hemendra; Viennois, Emilie; Merlin, Didier; Perera, A. G. Unil
2017-05-01
There remains a great need for improved diagnosis of inflammatory bowel disease (IBD), for which the current technique, colonoscopy, is not cost-effective and presents a non-negligible risk of complications. Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy is a new screening technique to evaluate colitis. Comparing infrared spectra of sera to study the differences between them can prove challenging due to the complexity of their biological constituents, which gives rise to a plethora of vibrational modes. Overcoming these inherent difficulties of infrared spectral analysis, which involve highly overlapping absorbance peaks, by curve fitting the data to improve the resolution is discussed. The proposed technique uses dried serum from colitic and normal wild-type mice to obtain ATR-FTIR spectra that effectively differentiate colitic mice from normal mice. Using this method, the Amide I group frequency (specifically, the alpha-helix to beta-sheet ratio of the protein secondary structure) was identified as a disease-associated spectral signature, in addition to the previously reported glucose and mannose signatures in sera of chronic and acute mouse models of colitis. Hence, this technique will be able to identify changes in sera due to various diseases.
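When the band positions are known, one hedged way to resolve overlapping absorbance peaks is linear least-squares unmixing onto fixed component shapes, from which ratios such as alpha-helix/beta-sheet follow. The Gaussian band centers and widths below are illustrative assumptions, not the study's fitted values, and the noise-free spectrum is synthetic.

```python
import numpy as np

def gaussian(x, center, width):
    """Unit-height Gaussian band shape."""
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Two overlapping 'absorbance' bands in the Amide I region
# (alpha-helix ~1655 cm^-1, beta-sheet ~1630 cm^-1; positions assumed)
x = np.linspace(1600, 1700, 500)
basis = np.column_stack([gaussian(x, 1655, 10.0),
                         gaussian(x, 1630, 10.0)])
true_amps = np.array([1.0, 0.4])
spectrum = basis @ true_amps          # synthetic composite spectrum

# Linear least-squares unmixing recovers the component amplitudes
amps, *_ = np.linalg.lstsq(basis, spectrum, rcond=None)
ratio = amps[0] / amps[1]             # alpha-helix / beta-sheet ratio
```

With noise-free data the unmixing is exact; in practice band positions and widths would themselves be fitted, which is the harder part of the curve-fitting problem the abstract alludes to.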
Enhanced automatic artifact detection based on independent component analysis and Renyi's entropy.
Mammone, Nadia; Morabito, Francesco Carlo
2008-09-01
Artifacts are disturbances that may occur during signal acquisition and may affect their processing. The aim of this paper is to propose a technique for automatically detecting artifacts from the electroencephalographic (EEG) recordings. In particular, a technique based on both Independent Component Analysis (ICA) to extract artifactual signals and on Renyi's entropy to automatically detect them is presented. This technique is compared to the widely known approach based on ICA and the joint use of kurtosis and Shannon's entropy. The novel processing technique is shown to detect on average 92.6% of the artifactual signals against the average 68.7% of the previous technique on the studied available database. Moreover, Renyi's entropy is shown to be able to detect muscle and very low frequency activity as well as to discriminate them from other kinds of artifacts. In order to achieve an efficient rejection of the artifacts while minimizing the information loss, future efforts will be devoted to the improvement of blind artifact separation from EEG in order to ensure a very efficient isolation of the artifactual activity from any signals deriving from other brain tasks.
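A minimal sketch of the entropy side of this approach: Renyi's entropy of order 2, computed from a normalized histogram, is low for the "peaky" amplitude distributions typical of artifactual components and higher for more Gaussian, brain-like activity. The bin count and synthetic signals below are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def renyi_entropy(signal, order=2, bins=50):
    """Renyi entropy of order alpha from a normalized histogram.

    H_alpha = log(sum p_i^alpha) / (1 - alpha); low values flag
    concentrated ('peaky') distributions typical of artifacts.
    """
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(np.log(np.sum(p ** order)) / (1 - order))

rng = np.random.default_rng(0)
gaussian_like = rng.normal(size=5000)          # brain-like activity
spiky = np.zeros(5000)
spiky[::250] = 5.0                             # artifact-like spikes
```

An ICA-plus-entropy detector would compute this value for each independent component and flag components whose entropy falls below (or beyond) a data-driven threshold.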
Quantitative CT: technique dependence of volume estimation on pulmonary nodules
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan
2012-03-01
Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
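The accuracy (bias) and precision (spread) quantities compared across protocols can be expressed as percent-error statistics over repeated volume measurements, as in this sketch. The nodule volumes below are hypothetical illustration values, not data from the phantom study.

```python
import numpy as np

def bias_and_precision(measured, truth):
    """Accuracy (mean percent error) and precision (its sample
    standard deviation) of repeated volume estimates."""
    err = (np.asarray(measured, dtype=float) - truth) / truth * 100.0
    return float(err.mean()), float(err.std(ddof=1))

# Hypothetical repeated estimates (mm^3) of a 500 mm^3 synthetic nodule
bias, precision = bias_and_precision([510, 495, 505, 520, 500], 500.0)
```

Computing these per acquisition/reconstruction protocol is what allows the slice-thickness dependence reported in the abstract to be quantified.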
The impact of varicocelectomy on sperm parameters: a meta-analysis.
Schauer, Ingrid; Madersbacher, Stephan; Jost, Romy; Hübner, Wilhelm Alexander; Imhof, Martin
2012-05-01
We determined the impact of 3 surgical techniques (high ligation, inguinal varicocelectomy and the subinguinal approach) for varicocelectomy on sperm parameters (count and motility) and pregnancy rates. By searching the literature using MEDLINE and the Cochrane Library with the last search performed in February 2011, focusing on the last 20 years, a total of 94 articles published between 1975 and 2011 reporting on sperm parameters before and after varicocelectomy were identified. Inclusion criteria for this meta-analysis were at least 2 semen analyses (before and 3 or more months after the procedure), patient age older than 19 years, clinical subfertility and/or abnormal semen parameters, and a clinically palpable varicocele. To rule out skewing factors a bias analysis was performed, and statistical analysis was done with RevMan5(®) and SPSS 15.0(®). A total of 14 articles were included in the statistical analysis. All 3 surgical approaches led to significant or highly significant postoperative improvement of both parameters with only slight numeric differences among the techniques. This difference did not reach statistical significance for sperm count (p = 0.973) or sperm motility (p = 0.372). After high ligation surgery sperm count increased by 10.85 million per ml (p = 0.006) and motility by 6.80% (p <0.00001) on the average. Inguinal varicocelectomy led to an improvement in sperm count of 7.17 million per ml (p <0.0001) while motility changed by 9.44% (p = 0.001). Subinguinal varicocelectomy provided an increase in sperm count of 9.75 million per ml (p = 0.002) and sperm motility by 12.25% (p = 0.001). Inguinal varicocelectomy showed the highest pregnancy rate of 41.48% compared to 26.90% and 26.56% after high ligation and subinguinal varicocelectomy, respectively, and the difference was statistically significant (p = 0.035). 
This meta-analysis suggests that varicocelectomy leads to significant improvements in sperm count and motility regardless of surgical technique, with the inguinal approach offering the highest pregnancy rate. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
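The pooling step behind such a meta-analysis can be sketched as inverse-variance (fixed-effect) weighting of per-study mean differences. Only the point estimates below echo the text; the standard errors are invented for illustration, and real pooling (e.g. in RevMan) would also consider random-effects models and heterogeneity.

```python
def fixed_effect_pool(estimates):
    """Inverse-variance (fixed-effect) pooling of mean differences.

    estimates: list of (mean_difference, standard_error) per study.
    Returns (pooled_estimate, pooled_standard_error).
    """
    weights = [1.0 / se ** 2 for _, se in estimates]
    pooled = sum(w * md for (md, _), w in zip(estimates, weights)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Sperm-count improvements (million/ml) per technique, with assumed SEs
pooled, se = fixed_effect_pool([(10.85, 3.0), (7.17, 2.0), (9.75, 2.5)])
```

The pooled estimate lands between the per-study values, weighted toward the most precise study, and the pooled standard error is smaller than any single study's.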
NASA Astrophysics Data System (ADS)
Leonard, Kevin Raymond
This dissertation concentrates on the development of two new tomographic techniques that enable wide-area inspection of pipe-like structures. By envisioning a pipe as a plate wrapped around upon itself, the previous Lamb Wave Tomography (LWT) techniques are adapted to cylindrical structures. Helical Ultrasound Tomography (HUT) uses Lamb-like guided wave modes transmitted and received by two circumferential arrays in a single crosshole geometry. Meridional Ultrasound Tomography (MUT) creates the same crosshole geometry with a linear array of transducers along the axis of the cylinder. However, even though these new scanning geometries are similar to plates, additional complexities arise because they are cylindrical structures. First, because it is a single crosshole geometry, the wave vector coverage is poorer than in the full LWT system. Second, since waves can travel in both directions around the circumference of the pipe, modes can also constructively and destructively interfere with each other. These complexities necessitate improved signal processing algorithms to produce accurate and unambiguous tomographic reconstructions. Consequently, this work also describes a new algorithm for improving the extraction of multi-mode arrivals from guided wave signals. Previous work has relied solely on the first arriving mode for the time-of-flight measurements. In order to improve the LWT, HUT and MUT systems reconstructions, improved signal processing methods are needed to extract information about the arrival times of the later arriving modes. Because each mode has different through-thickness displacement values, they are sensitive to different types of flaws, and the information gained from the multi-mode analysis improves understanding of the structural integrity of the inspected material. Both tomographic frequency compounding and mode sorting algorithms are introduced. 
It is also shown that each of these methods improve the reconstructed images both qualitatively and quantitatively.
NASA Astrophysics Data System (ADS)
Cecinati, F.; Wani, O.; Rico-Ramirez, M. A.
2017-11-01
Merging radar and rain gauge rainfall data is a technique used to improve the quality of spatial rainfall estimates, and in particular Kriging with External Drift (KED) is a very effective radar-rain gauge merging technique. However, kriging interpolations assume Gaussianity of the process. Rainfall has a strongly skewed, positive probability distribution, characterized by a discontinuity due to intermittency. In KED, rainfall residuals are used, implicitly calculated as the difference between rain gauge data and a linear function of the radar estimates. These rainfall residuals are non-Gaussian as well. The aim of this work is to evaluate the impact of applying KED to non-Gaussian rainfall residuals, and to assess the best techniques to improve Gaussianity. We compare Box-Cox transformations with λ parameters equal to 0.5, 0.25, and 0.1, Box-Cox with time-variant optimization of λ, normal score transformation, and a singularity analysis technique. The results suggest that Box-Cox with λ = 0.1 and the singularity analysis are not suitable for KED. Normal score transformation and Box-Cox with optimized λ, or λ = 0.25, produce satisfactory results in terms of Gaussianity of the residuals, probability distribution of the merged rainfall products, and rainfall estimate quality, when validated through cross-validation. However, it is observed that Box-Cox transformations are strongly dependent on the temporal and spatial variability of rainfall and on the units used for the rainfall intensity. Overall, applying transformations results in a quantitative improvement of the rainfall estimates only if the correct transformations for the specific data set are used.
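A sketch of the transformation step: a Box-Cox transform with a small λ pulls a skewed, rainfall-like distribution toward Gaussianity, which can be checked with a moment-based skewness estimate. Synthetic lognormal data stand in for rainfall here; λ = 0.25 follows the text, and the seed and sample size are arbitrary.

```python
import numpy as np

def box_cox(x, lam):
    """Box-Cox transform of positive data; lam -> 0 reduces to log."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def skewness(x):
    """Moment-based sample skewness."""
    x = np.asarray(x, dtype=float)
    m, s = x.mean(), x.std()
    return float(((x - m) ** 3).mean() / s ** 3)

rng = np.random.default_rng(1)
rain = rng.lognormal(mean=0.0, sigma=1.0, size=10000)  # skewed 'rainfall'
raw_skew = skewness(rain)
bc_skew = skewness(box_cox(rain, 0.25))
```

The transform shrinks the skewness substantially without forcing exact normality, which is why the paper also evaluates normal score transforms and λ optimization.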
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is improvement of qualitative and quantitative analysis of scanning electron microscope micrographs by development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the filters Laplacian 1 and Laplacian 2, Otsu and reverse binarization. Several modifications of the known image processing techniques and combinations of the selected image processing techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program of digital image processing with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
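Of the listed tools, Otsu thresholding is straightforward to reconstruct: it picks the gray level that maximizes the between-class variance of the binarized image. The histogram-based sketch below, applied to a synthetic "crack" image, is an illustrative reimplementation, not the paper's program.

```python
import numpy as np

def otsu_threshold(image):
    """Otsu's method: choose the gray level that maximizes
    between-class variance of the foreground/background split."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist / hist.sum()
    cum_p = np.cumsum(p)
    cum_mean = np.cumsum(p * np.arange(256))
    global_mean = cum_mean[-1]
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0 = cum_p[t - 1]          # background weight
        w1 = 1.0 - w0              # foreground weight
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cum_mean[t - 1] / w0
        mu1 = (global_mean - cum_mean[t - 1]) / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal 'micrograph': dark background with one bright crack row
img = np.full((64, 64), 30, dtype=np.uint8)
img[32, :] = 220
t = otsu_threshold(img)
crack_pixels = int((img > t).sum())
```

Binarizing at the returned threshold isolates the crack pixels, after which stereological parameters such as total crack length per unit area can be tallied.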
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Rice, Mark J.
Contingency analysis studies are necessary to assess the impact of possible power system component failures. The results of contingency analysis are used to ensure grid reliability and, in power market operation, for the feasibility test of market solutions. Currently, these studies are performed in real time based on the current operating conditions of the grid with a set of pre-selected contingencies, which might result in overlooking some critical contingencies caused by variable system status. To have a complete picture of a power grid, more contingencies need to be studied to improve grid reliability. High-performance computing techniques hold the promise of being able to perform the analysis for more contingency cases within a much shorter time frame. This paper evaluates the performance of counter-based dynamic load balancing schemes for a massive contingency analysis program on 10,000+ cores. One million N-2 contingency analysis cases with a Western Electricity Coordinating Council power grid model have been used to demonstrate the performance. Speedups of 3964 with 4096 cores and 7877 with 10,240 cores were obtained. This paper reports the performance of the load balancing scheme with a single counter and with two counters, describes disk I/O issues, and discusses other potential techniques for further improving the performance.
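The single-counter dynamic load balancing idea can be sketched compactly. The paper's scheme runs across thousands of cores; here threads stand in for compute ranks, a lock-protected counter hands out the next unprocessed case, and `analyze` is a placeholder for one contingency solve. All names are illustrative.

```python
import threading

def run_contingency_cases(n_cases, n_workers, analyze):
    # shared task counter: an idle worker grabs the next unprocessed case,
    # so fast workers naturally take more cases (dynamic load balancing)
    counter = {"next": 0}
    lock = threading.Lock()
    results = [None] * n_cases

    def worker():
        while True:
            with lock:
                i = counter["next"]
                if i >= n_cases:
                    return
                counter["next"] = i + 1
            results[i] = analyze(i)  # stand-in for one contingency solve

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

results = run_contingency_cases(100, 8, lambda i: i * i)
```

Because workers pull cases on demand rather than receiving a fixed slice up front, no worker sits idle while long-running contingencies finish elsewhere.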
NASA Astrophysics Data System (ADS)
Cherry, M.; Dierken, J.; Boehnlein, T.; Pilchak, A.; Sathish, S.; Grandhi, R.
2018-01-01
A new technique for performing quantitative scanning acoustic microscopy imaging of Rayleigh surface wave (RSW) velocity was developed based on b-scan processing. In this technique, the focused acoustic beam is moved through many defocus distances over the sample and excited with an impulse excitation, and advanced algorithms based on frequency filtering and the Hilbert transform are used to post-process the b-scans to estimate the Rayleigh surface wave velocity. The new method was used to estimate the RSW velocity on an optically flat E6 glass sample; the velocity was measured to within ±2 m/s and the scanning time per point was on the order of 1.0 s, both improvements over the previous two-point defocus method. The new method was also applied to the analysis of two titanium samples, and the velocity was estimated with very low standard deviation in certain large grains on the sample. A new behavior was observed with the b-scan analysis technique, in which the amplitude of the surface wave decayed dramatically for certain crystallographic orientations. The new technique was also compared with previous results and was found to be much more reliable and to have higher contrast than previously possible with impulse excitation.
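The Hilbert-transform step mentioned above extracts the amplitude envelope of each b-scan trace via the analytic signal. A self-contained sketch follows, using an O(N²) DFT to avoid external dependencies; the toy cosine trace is invented and the envelope peak would, in practice, mark the surface-wave arrival used in the velocity estimate.

```python
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def envelope(signal):
    # amplitude envelope via the analytic signal (discrete Hilbert transform)
    N = len(signal)
    X = dft(signal)
    H = [0j] * N
    H[0] = X[0]
    for k in range(1, (N + 1) // 2):
        H[k] = 2 * X[k]          # double the positive frequencies
    if N % 2 == 0:
        H[N // 2] = X[N // 2]    # keep the Nyquist bin as-is
    analytic = idft(H)
    return [abs(a) for a in analytic]

sig = [math.cos(2 * math.pi * 5 * n / 64) for n in range(64)]  # toy trace
env = envelope(sig)
```

For a pure cosine the envelope is flat at the wave amplitude, which is the expected sanity check before applying the same processing to real b-scans.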
Pattern recognition of satellite cloud imagery for improved weather prediction
NASA Technical Reports Server (NTRS)
Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.
1986-01-01
The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
A VLF-based technique in applications to digital control of nonlinear hybrid multirate systems
NASA Astrophysics Data System (ADS)
Vassilyev, Stanislav; Ulyanov, Sergey; Maksimkin, Nikolay
2017-01-01
In this paper, a technique for rigorous analysis and design of nonlinear multirate digital control systems on the basis of the reduction method and sublinear vector Lyapunov functions is proposed. The control system model under consideration incorporates continuous-time dynamics of the plant and discrete-time dynamics of the controller, and takes into account uncertainties of the plant, bounded disturbances, and nonlinear characteristics of sensors and actuators. We consider a class of multirate systems in which the control update rate is slower than the measurement sampling rates and periodic non-uniform sampling is admitted. The proposed technique does not use a preliminary discretization of the system and hence eliminates the errors associated with discretization and improves the accuracy of analysis. The technique is applied to the synthesis of a digital controller for a flexible spacecraft in the fine stabilization mode and of a decentralized controller for a formation of autonomous underwater vehicles. Simulation results are provided to validate the good performance of the designed controllers.
Grossman, R A
1995-09-01
The purpose of this study was to determine whether women can discriminate between more and less effective paracervical block techniques applied to opposite sides of the cervix. If this discrimination could be made, it would be possible to compare different techniques and thus improve the quality of paracervical anesthesia. Two milliliters of local anesthetic was applied to one side and 6 ml to the other side of volunteers' cervices before cervical dilation. Statistical examination was by sequential analysis. The study was stopped after 47 subjects had entered, when sequential analysis found that there was no significant difference in women's perception of pain. Nine women reported more pain on the side with more anesthesia and eight reported more pain on the side with less anesthesia. Because the amount of anesthesia did not make a difference, the null hypothesis (that women cannot discriminate between different anesthetic techniques) was accepted. Women are not able to discriminate between different doses of local anesthetic applied to opposite sides of the cervix.
Updating Landsat-derived land-cover maps using change detection and masking techniques
NASA Technical Reports Server (NTRS)
Likens, W.; Maw, K.
1982-01-01
The California Integrated Remote Sensing System's San Bernardino County Project was devised to study the utilization of a data base at a number of jurisdictional levels. The present paper discusses the implementation of change-detection and masking techniques in the updating of Landsat-derived land-cover maps. A baseline land-cover classification was first created from a 1976 image, then the adjusted 1976 image was compared with a 1979 scene by the techniques of (1) multidate image classification, (2) difference image-distribution tails thresholding, (3) difference image classification, and (4) multi-dimensional chi-square analysis of a difference image. The union of the results of methods 1, 3 and 4 was used to create a mask of possible change areas between 1976 and 1979, which served to limit analysis of the update image and reduce comparison errors in unchanged areas. Spatial smoothing of change-detection products and combining the results of different change-detection algorithms are also shown to improve Landsat change-detection accuracies.
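Technique (2), thresholding the tails of the difference-image distribution, is simple to sketch: pixels whose 1976-to-1979 change lies far out in the tails of the difference histogram are flagged as possible change. The flattened images and the 2σ cutoff below are illustrative assumptions, not the project's parameters.

```python
def change_mask(img_a, img_b, k=2.0):
    # difference-image "distribution tails" thresholding:
    # flag pixels whose change is more than k standard deviations from the mean
    diff = [b - a for a, b in zip(img_a, img_b)]
    n = len(diff)
    mean = sum(diff) / n
    std = (sum((d - mean) ** 2 for d in diff) / n) ** 0.5
    return [abs(d - mean) > k * std for d in diff]

img76 = [100] * 100            # flattened baseline image (invented values)
img79 = list(img76)
img79[3] = 180                 # one genuinely changed pixel
mask = change_mask(img76, img79)
```

Masks from several such detectors can then be combined by union, as methods 1, 3 and 4 were in the study, to restrict where the update image is reclassified.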
Accelerometer-based on-body sensor localization for health and medical monitoring applications
Vahdatpour, Alireza; Amini, Navid; Xu, Wenyao; Sarrafzadeh, Majid
2011-01-01
In this paper, we present a technique to recognize the position of sensors on the human body. Automatic on-body device localization ensures correctness and accuracy of measurements in health and medical monitoring systems. In addition, it provides opportunities to improve the performance and usability of ubiquitous devices. Our technique uses accelerometers to capture motion data and estimate the location of the device on the user's body, using mixed supervised and unsupervised time series analysis methods. We have evaluated our technique with extensive experiments on 25 subjects. On average, our technique achieves 89% accuracy in estimating the location of devices on the body. To study the feasibility of distinguishing left limbs from right limbs (e.g., left arm vs. right arm), we performed an analysis, based on which no meaningful classification was observed. Personalized ultraviolet monitoring and wireless transmission power control comprise two immediate applications of our on-body device localization approach. Such applications, along with their corresponding feasibility studies, are discussed. PMID:22347840
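The paper's mixed supervised/unsupervised pipeline is more elaborate, but the core idea of mapping accelerometer-window features to a body location can be sketched with a nearest-centroid classifier. The feature set, centroid values and labels below are invented for illustration.

```python
import math

def features(window):
    # simple per-window accelerometer features: mean, variance, and
    # mean absolute successive difference (a crude motion-intensity measure)
    n = len(window)
    mean = sum(window) / n
    var = sum((a - mean) ** 2 for a in window) / n
    msd = sum(abs(window[i + 1] - window[i]) for i in range(n - 1)) / (n - 1)
    return (mean, var, msd)

def classify_location(feat, centroids):
    # nearest-centroid labeling; centroids map body location -> feature tuple
    return min(centroids, key=lambda label: math.dist(feat, centroids[label]))

centroids = {"wrist": (0.0, 4.0, 2.0), "waist": (0.0, 0.5, 0.2)}  # invented
label = classify_location((0.0, 3.8, 1.9), centroids)
```

In practice the centroids would be learned from labeled training windows per body location, which is where the supervised part of the paper's method comes in.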
Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models
NASA Astrophysics Data System (ADS)
Altuntas, Alper; Baugh, John
2017-07-01
Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.
Multivariate Bias Correction Procedures for Improving Water Quality Predictions from the SWAT Model
NASA Astrophysics Data System (ADS)
Arumugam, S.; Libera, D.
2017-12-01
Water quality observations are usually not available on a continuous basis for longer than 1-2 years at a time over a decadal period given the labor requirements making calibrating and validating mechanistic models difficult. Further, any physical model predictions inherently have bias (i.e., under/over estimation) and require post-simulation techniques to preserve the long-term mean monthly attributes. This study suggests a multivariate bias-correction technique and compares to a common technique in improving the performance of the SWAT model in predicting daily streamflow and TN loads across the southeast based on split-sample validation. The approach is a dimension reduction technique, canonical correlation analysis (CCA) that regresses the observed multivariate attributes with the SWAT model simulated values. The common approach is a regression based technique that uses an ordinary least squares regression to adjust model values. The observed cross-correlation between loadings and streamflow is better preserved when using canonical correlation while simultaneously reducing individual biases. Additionally, canonical correlation analysis does a better job in preserving the observed joint likelihood of observed streamflow and loadings. These procedures were applied to 3 watersheds chosen from the Water Quality Network in the Southeast Region; specifically, watersheds with sufficiently large drainage areas and number of observed data points. The performance of these two approaches are compared for the observed period and over a multi-decadal period using loading estimates from the USGS LOADEST model. Lastly, the CCA technique is applied in a forecasting sense by using 1-month ahead forecasts of P & T from ECHAM4.5 as forcings in the SWAT model. Skill in using the SWAT model for forecasting loadings and streamflow at the monthly and seasonal timescale is also discussed.
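The regression-based "common approach" this abstract compares against is easy to sketch: fit observed loads to SWAT output by ordinary least squares and use the fitted line to correct new simulations. The data below are synthetic and the function names are illustrative; the CCA variant additionally couples streamflow and loadings, which this univariate sketch does not capture.

```python
def ols_bias_correction(simulated, observed):
    # fit observed ~ a * simulated + b by ordinary least squares,
    # then return a function that corrects new model output
    n = len(simulated)
    mx = sum(simulated) / n
    my = sum(observed) / n
    sxx = sum((x - mx) ** 2 for x in simulated)
    sxy = sum((x - mx) * (y - my) for x, y in zip(simulated, observed))
    a = sxy / sxx
    b = my - a * mx
    return lambda x: a * x + b

sim = [float(i) for i in range(1, 11)]          # synthetic SWAT output
obs = [2.0 * x + 1.0 for x in sim]              # synthetic observations
correct = ols_bias_correction(sim, obs)
```

A multivariate correction such as CCA would replace the scalar slope with a mapping between the joint (streamflow, load) spaces, which is what preserves their cross-correlation.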
Automated image analysis of alpha-particle autoradiographs of human bone
NASA Astrophysics Data System (ADS)
Hatzialekou, Urania; Henshaw, Denis L.; Fews, A. Peter
1988-01-01
Further techniques [4,5] for the analysis of CR-39 α-particle autoradiographs have been developed for application to α-autoradiography of autopsy bone at natural levels of exposure. The most significant new approach is the use of fully automated image analysis using a system developed in this laboratory. A 5 cm × 5 cm autoradiograph of tissue in which the activity is below 1 Bq kg⁻¹ is scanned to both locate and measure the recorded α-particle tracks at a rate of 5 cm²/h. Improved methods of calibration have also been developed. The techniques are described and, to illustrate their application, a bone sample contaminated with 239Pu is analysed. Results from natural levels are the subject of a separate publication.
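Locating individual tracks in a binarized autoradiograph frame amounts to connected-component labeling. A minimal sketch follows, using 8-connectivity on an invented toy frame; the real system also measures each track, which this counting-only sketch omits.

```python
def count_tracks(binary):
    # count 8-connected clusters of 1-pixels (candidate alpha-particle tracks)
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                count += 1
                stack = [(y, x)]
                seen[y][x] = True
                while stack:           # flood-fill the whole cluster
                    cy, cx = stack.pop()
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
    return count

frame = [
    [1, 1, 0, 0, 0],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 1, 1],
]  # two separate track-like blobs
n_tracks = count_tracks(frame)
```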
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
The effect of two mobilization techniques on dorsiflexion in people with chronic ankle instability.
Marrón-Gómez, David; Rodríguez-Fernández, Ángel L; Martín-Urrialde, José A
2015-02-01
To compare the effect of two manual therapy techniques, mobilization with movement (WB-MWM) and talocrural manipulation (HVLA), on the improvement of ankle dorsiflexion in people with chronic ankle instability (CAI) over 48 h. Randomized controlled clinical trial. University research laboratory. Fifty-two participants (mean ± SD age, 20.7 ± 3.4 years) with CAI were randomized to WB-MWM (n = 18), HVLA (n = 19) or placebo (n = 15) groups. Weight-bearing ankle dorsiflexion was measured with the weight-bearing lunge test. Measurements were obtained prior to intervention, immediately after intervention, and 10 min, 24 h and 48 h post-intervention. There was a significant effect of time (F4,192 = 20.65; P < 0.001) and a significant time × group interaction (F8,192 = 6.34; P < 0.001). Post hoc analysis showed a significant increase of ankle dorsiflexion in both the WB-MWM and HVLA groups with respect to the placebo group, with no differences between the two active treatment groups. A single application of the WB-MWM or HVLA manual technique improves ankle dorsiflexion in people with CAI, and the effects persist for at least two days. Both techniques have similar effectiveness for improving ankle dorsiflexion, although WB-MWM demonstrated greater effect sizes. Copyright © 2014 Elsevier Ltd. All rights reserved.
Kinematic and kinetic analysis of overhand, sidearm and underhand lacrosse shot techniques.
Macaulay, Charles A J; Katz, Larry; Stergiou, Pro; Stefanyshyn, Darren; Tomaghelli, Luciano
2017-12-01
Lacrosse requires the coordinated performance of many complex skills. One of these skills is shooting on the opponents' net using one of three techniques: overhand, sidearm or underhand. The purpose of this study was to (i) determine which technique generated the highest ball velocity and greatest shot accuracy and (ii) identify kinematic and kinetic variables that contribute to a high velocity and high accuracy shot. Twelve elite male lacrosse players participated in this study. Kinematic data were sampled at 250 Hz, while two-dimensional force plates collected ground reaction force data (1000 Hz). Statistical analysis showed significantly greater ball velocity for the sidearm technique than overhand (P < 0.001) and underhand (P < 0.001) techniques. No statistical difference was found for shot accuracy (P > 0.05). Kinematic and kinetic variables were not significantly correlated to shot accuracy or velocity across all shot types; however, when analysed independently, the lead foot horizontal impulse showed a negative correlation with underhand ball velocity (P = 0.042). This study identifies the technique with the highest ball velocity, defines kinematic and kinetic predictors related to ball velocity and provides information to coaches and athletes concerned with improving lacrosse shot performance.
NASA Technical Reports Server (NTRS)
Appleby, M. H.; Golightly, M. J.; Hardy, A. C.
1993-01-01
Major improvements have been completed in the approach to analysis and simulation of spacecraft radiation shielding and exposure. A computer-aided design (CAD)-based system has been developed for determining the amount of shielding provided by a spacecraft and simulating transmission of an incident radiation environment to any point within or external to the vehicle. Shielding analysis is performed using a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design programs such as a Mars transfer habitat, a pressurized lunar rover, and the redesigned International Space Station. Results of analysis performed for the Space Station astronaut exposure assessment are provided to demonstrate the applicability and versatility of the system.
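The geometric core of such a ray-tracing shielding analysis is accumulating material thickness along a ray from the environment to a dose point. A sketch follows for axis-aligned box shields using the standard slab method; the geometry, densities and function names are invented, and a real CAD system traces arbitrary solids.

```python
def slab_path_length(origin, direction, box_min, box_max):
    # chord length of a ray through an axis-aligned box (slab method);
    # direction is assumed to be a unit vector, so t is geometric distance
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:
            if o < lo or o > hi:   # ray parallel to slab and outside it
                return 0.0
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            if t1 > t2:
                t1, t2 = t2, t1
            t_near, t_far = max(t_near, t1), min(t_far, t2)
    return max(0.0, t_far - t_near)

def areal_density(origin, direction, shields):
    # sum of density * chord length over all shields (g/cm^2 for cm and g/cm^3)
    return sum(rho * slab_path_length(origin, direction, bmin, bmax)
               for bmin, bmax, rho in shields)

shields = [
    ((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 2.7),  # 1 cm aluminium-like wall
    ((2.0, 0.0, 0.0), (4.0, 1.0, 1.0), 1.0),  # 2 cm water-equivalent layer
]
g = areal_density((-1.0, 0.5, 0.5), (1.0, 0.0, 0.0), shields)
```

Repeating this over many ray directions around a dose point yields the shielding distribution that the transport calculation then converts to exposure.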
On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
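Of the metamodeling techniques reviewed, a second-order response surface is the simplest to sketch: fit a cheap polynomial to a handful of runs of an expensive analysis code, then query the surrogate instead. The quadratic "analysis code" and sample points below are toy stand-ins.

```python
def fit_quadratic(xs, ys):
    # least-squares fit of y ~ c0 + c1*x + c2*x^2 via the 3x3 normal equations
    S = [sum(x ** k for x in xs) for k in range(5)]          # power sums
    A = [[S[i + j] for j in range(3)] for i in range(3)]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    # Gaussian elimination with partial pivoting
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for c in range(col, 3):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):   # back substitution
        coeffs[i] = (b[i] - sum(A[i][j] * coeffs[j]
                                for j in range(i + 1, 3))) / A[i][i]
    return coeffs

xs = [0.0, 1.0, 2.0, 3.0, 4.0]                  # design points
ys = [1.0 + 2.0 * x + 3.0 * x * x for x in xs]  # "expensive code" outputs
c0, c1, c2 = fit_quadratic(xs, ys)
surrogate = lambda x: c0 + c1 * x + c2 * x * x
```

The surrogate is orders of magnitude cheaper than the code it approximates, which is exactly what makes linking it to an optimizer attractive; kriging and neural networks play the same role with more flexible basis functions.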
Home visit program improves technique survival in peritoneal dialysis.
Martino, Francesca; Adıbelli, Z; Mason, G; Nayak, A; Ariyanon, W; Rettore, E; Crepaldi, Carlo; Rodighiero, Mariapia; Ronco, Claudio
2014-01-01
Peritoneal dialysis (PD) is a home therapy, and technique survival is related to the adherence to PD prescription at home. The presence of a home visit program could improve PD outcomes. We evaluated its effects on clinical outcome during 1 year of follow-up. This was a case-control study. The case group included all 96 patients who performed PD in our center on January 1, 2013, and who attended a home visit program; the control group included all 92 patients who performed PD on January 1, 2008. The home visit program consisted of several additional visits to reinforce patients' confidence in PD management in their own environment. Outcomes were defined as technique failure, peritonitis episode, and hospitalization. Clinical and dialysis features were evaluated for each patient. The case group was significantly older (p = 0.048), with a lower grade of autonomy (p = 0.033), but a better hemoglobin level (p = 0.02) than the control group. During the observational period, we had 11 episodes of technique failure. We found a significant reduction in the rate of technique failure in the case group (p = 0.004). Furthermore, survival analysis showed a significant extension of PD treatment in the patients supported by the home visit program (52 vs. 48.8 weeks, p = 0.018). We did not find any difference between the two groups in terms of peritonitis and hospitalization rate; however, trends toward a reduction of Gram-positive peritonitis rates as well as prevalence and duration of hospitalization related to PD problems were identified in the case group. The retrospective nature of the analysis was a limitation of this study. The home visit program improves the survival of PD patients and could reduce the rate of Gram-positive peritonitis and hospitalization. Video Journal Club "Cappuccino with Claudio Ronco" at http://www.karger.com/?doi=365168.
Taiwo, Oluwadamilola O.; Finegan, Donal P.; Eastwood, David S.; Fife, Julie L.; Brown, Leon D.; Darr, Jawwad A.; Lee, Peter D.; Brett, Daniel J. L.; Shearing, Paul R.
2016-01-01
Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. PMID:26999804
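The stereology-versus-3-D comparison can be illustrated with porosity, the simplest such parameter. On a voxelized electrode, direct 3-D measurement and the section-averaged areal fraction (the Delesse principle) agree for volume fractions; the ambiguity the study reports arises for shape-dependent quantities like tortuosity, which this sketch does not cover. The tiny voxel array is synthetic.

```python
def porosity_3d(volume):
    # direct 3-D porosity: pore-voxel fraction of a [z][y][x] array (0 = pore)
    total = pore = 0
    for plane in volume:
        for row in plane:
            for v in row:
                total += 1
                pore += (v == 0)
    return pore / total

def porosity_from_sections(volume):
    # stereological estimate: mean areal pore fraction over 2-D z-sections
    # (Delesse principle: areal fraction estimates volume fraction)
    fracs = []
    for plane in volume:
        cells = [v for row in plane for v in row]
        fracs.append(sum(1 for v in cells if v == 0) / len(cells))
    return sum(fracs) / len(fracs)

vol = [
    [[0, 1], [1, 1]],   # section 1: areal porosity 0.25
    [[0, 0], [1, 1]],   # section 2: areal porosity 0.50
]
p3 = porosity_3d(vol)
p2 = porosity_from_sections(vol)
```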
Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li
2014-12-01
Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample pre-preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples; the content includes LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although there have been many research works focusing on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.
Transcriptome analysis of stimulated PBMC from Mycobacterium bovis infected cattle
USDA-ARS?s Scientific Manuscript database
Immunological responses of cattle to Mycobacterium bovis (M. bovis) infection are of interest in terms of understanding the biology of M. bovis infection and for the development of improved diagnostic techniques. Although considerable time and resources have been invested in understanding immune re...
Wrestling with Philosophy: Improving Scholarship in Higher Education
ERIC Educational Resources Information Center
Kezar, Adrianna
2004-01-01
Method is usually viewed as completely separate from philosophy or theory, focusing instead on techniques and procedures of interviewing, focus groups, observation, or statistical analysis. Several texts on methodology published recently have added significant sections on philosophy, such as Creswell's (1998) Qualitative inquiry and research…