Sample records for effects analysis technique

  1. Approaches to answering critical CER questions.

    PubMed

    Kinnier, Christine V; Chung, Jeanette W; Bilimoria, Karl Y

    2015-01-01

    While randomized controlled trials (RCTs) are the gold standard for research, many research questions cannot be ethically and practically answered using an RCT. Comparative effectiveness research (CER) techniques are often better suited than RCTs to address the effects of an intervention under routine care conditions, an outcome otherwise known as effectiveness. CER research techniques covered in this section include: effectiveness-oriented experimental studies such as pragmatic trials and cluster randomized trials, treatment response heterogeneity, observational and database studies including adjustment techniques such as sensitivity analysis and propensity score analysis, systematic reviews and meta-analysis, decision analysis, and cost effectiveness analysis. Each section describes the technique and covers the strengths and weaknesses of the approach.
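
    As a concrete illustration of one adjustment technique named above, the sketch below estimates propensity scores by logistic regression and performs 1:1 nearest-neighbor matching. All data, coefficients, and the true effect of 0.5 are simulated assumptions, not results from this record.

      # Hedged sketch of propensity score analysis: scores estimated by
      # logistic regression on synthetic covariates, then used for 1:1
      # nearest-neighbor matching. All data are simulated for illustration.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(42)
      n = 1000
      x = rng.normal(size=(n, 3))                       # baseline covariates
      p_treat = 1 / (1 + np.exp(-(0.8 * x[:, 0] - 0.5 * x[:, 1])))
      treated = rng.random(n) < p_treat
      outcome = 0.5 * treated + x[:, 0] + rng.normal(size=n)  # true effect = 0.5

      ps = LogisticRegression().fit(x, treated).predict_proba(x)[:, 1]

      # 1:1 nearest-neighbor matching on the propensity score (with replacement)
      controls = np.where(~treated)[0]
      effects = []
      for i in np.where(treated)[0]:
          j = controls[np.argmin(np.abs(ps[controls] - ps[i]))]
          effects.append(outcome[i] - outcome[j])
      print(f"matched estimate of treatment effect: {np.mean(effects):.2f}")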

  2. A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…

  3. Directed Incremental Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Yang, Guowei; Rungta, Neha; Khurshid, Sarfraz

    2011-01-01

The last few years have seen a resurgence of interest in the use of symbolic execution -- a program analysis technique developed more than three decades ago to analyze program execution paths. Scaling symbolic execution and other path-sensitive analysis techniques to large systems remains challenging despite recent algorithmic and technological advances. An alternative to solving the problem of scalability is to reduce the scope of the analysis. One approach that is widely studied in the context of regression analysis is to analyze the differences between two related program versions. While such an approach is intuitive in theory, finding efficient and precise ways to identify program differences and characterize their effects on how the program executes has proved challenging in practice. In this paper, we present Directed Incremental Symbolic Execution (DiSE), a novel technique for detecting and characterizing the effects of program changes. The novelty of DiSE is to combine the efficiencies of static analysis techniques to compute program difference information with the precision of symbolic execution to explore program execution paths and generate path conditions affected by the differences. DiSE is a complementary technique to other reduction or bounding techniques developed to improve symbolic execution. Furthermore, DiSE does not require analysis results to be carried forward as the software evolves -- only the source code for two related program versions is required. A case study of our implementation of DiSE illustrates its effectiveness at detecting and characterizing the effects of program changes.

  4. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in the attempt to investigate the issue of inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, while seven of these additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.
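
    A minimal sketch of the kind of subgrouping analysis the review surveys: k-means clustering of post/pre response ratios into responder and non-responder groups. The data, the choice of two clusters, and the response measure are illustrative assumptions.

      # k-means subgrouping of responders vs non-responders to a stimulation
      # protocol. Data and k=2 are illustrative assumptions, not the review's
      # prescribed procedure.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # hypothetical post/pre MEP amplitude ratios at four time points per subject
      responses = np.vstack([
          rng.normal(1.3, 0.15, size=(20, 4)),   # responders: facilitation
          rng.normal(0.95, 0.15, size=(20, 4)),  # non-responders: ~no change
      ])

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(responses)
      for label in (0, 1):
          mean_ratio = responses[km.labels_ == label].mean()
          print(f"cluster {label}: n={np.sum(km.labels_ == label)}, "
                f"mean post/pre ratio={mean_ratio:.2f}")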

  5. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yonggang

In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, and digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or only loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  6. General methodology: Costing, budgeting, and techniques for benefit-cost and cost-effectiveness analysis

    NASA Technical Reports Server (NTRS)

    Stretchberry, D. M.; Hein, G. F.

    1972-01-01

The general concepts of costing, budgeting, and benefit-cost ratio and cost-effectiveness analysis are discussed. The three common methods of costing are presented. Budgeting distributions are discussed. The use of discounting procedures is outlined. Benefit-cost ratio and cost-effectiveness analysis are defined, and their current application to NASA planning is pointed out. Specific practices and techniques are discussed, and actual costing and budgeting procedures are outlined. The recommended method of calculating benefit-cost ratios is described. A standardized method of cost-effectiveness analysis and long-range planning are also discussed.
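
    The discounting and benefit-cost arithmetic described above can be illustrated in a few lines; the cash-flow streams and the 10% discount rate below are hypothetical.

      # Discounted benefit-cost ratio: a toy illustration of the arithmetic.
      # Cash-flow streams and discount rate are hypothetical.
      def present_value(flows, rate):
          return sum(f / (1.0 + rate) ** t for t, f in enumerate(flows))

      benefits = [0, 40, 60, 80, 80]    # benefits per year
      costs    = [100, 20, 10, 10, 10]  # costs per year
      rate = 0.10

      bcr = present_value(benefits, rate) / present_value(costs, rate)
      print(f"benefit-cost ratio = {bcr:.2f}")  # > 1: benefits outweigh costs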

  7. Cost considerations in using simulations for medical training.

    PubMed

    Fletcher, J D; Wind, Alexander P

    2013-10-01

    This article reviews simulation used for medical training, techniques for assessing simulation-based training, and cost analyses that can be included in such assessments. Simulation in medical training appears to take four general forms: human actors who are taught to simulate illnesses and ailments in standardized ways; virtual patients who are generally presented via computer-controlled, multimedia displays; full-body manikins that simulate patients using electronic sensors, responders, and controls; and part-task anatomical simulations of various body parts and systems. Techniques for assessing costs include benefit-cost analysis, return on investment, and cost-effectiveness analysis. Techniques for assessing the effectiveness of simulation-based medical training include the use of transfer effectiveness ratios and incremental transfer effectiveness ratios to measure transfer of knowledge and skill provided by simulation to the performance of medical procedures. Assessment of costs and simulation effectiveness can be combined with measures of transfer using techniques such as isoperformance analysis to identify ways of minimizing costs without reducing performance effectiveness or maximizing performance without increasing costs. In sum, economic analysis must be considered in training assessments if training budgets are to compete successfully with other requirements for funding. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
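
    A minimal sketch of the transfer effectiveness ratio mentioned above, as it is commonly defined in the training literature: training saved per unit of simulator practice invested. The trial counts are hypothetical.

      # Transfer effectiveness ratio (TER): savings in live training divided
      # by the prior simulator time invested. Numbers are hypothetical.
      def transfer_effectiveness_ratio(control_trials, transfer_trials, sim_trials):
          """(trials needed without simulator - trials needed after simulator
          training) per simulator trial invested."""
          return (control_trials - transfer_trials) / sim_trials

      ter = transfer_effectiveness_ratio(control_trials=20, transfer_trials=12,
                                         sim_trials=10)
      print(f"TER = {ter:.2f}")  # 0.8 live trials saved per simulator trial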

  8. The integrated manual and automatic control of complex flight systems

    NASA Technical Reports Server (NTRS)

    Schmidt, D. K.

    1985-01-01

    Pilot/vehicle analysis techniques for optimizing aircraft handling qualities are presented. The analysis approach considered is based on the optimal control frequency domain techniques. These techniques stem from an optimal control approach of a Neal-Smith like analysis on aircraft attitude dynamics extended to analyze the flared landing task. Some modifications to the technique are suggested and discussed. An in depth analysis of the effect of the experimental variables, such as prefilter, is conducted to gain further insight into the flared land task for this class of vehicle dynamics.

  9. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    PubMed

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

    Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76 %) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50 %) and uniquely accounting for 15 (44 %). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.
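
    A minimal crisp-set sketch of the QCA consistency arithmetic behind necessity and sufficiency claims like those above; the study table is fabricated, and real QCA additionally involves truth-table minimization.

      # Crisp-set QCA sketch: consistency of a behavior change technique as
      # necessary / sufficient for effectiveness. Study table is fabricated.
      studies = [
          # (uses_knowledge, uses_self_efficacy, effective)
          (1, 1, 1), (1, 1, 1), (1, 0, 1), (1, 0, 0),
          (0, 1, 0), (1, 1, 1), (0, 0, 0), (1, 1, 0),
      ]

      def consistency_necessary(condition, outcome):
          # share of effective studies that include the condition
          y = [s for s in studies if s[outcome]]
          return sum(s[condition] for s in y) / len(y)

      def consistency_sufficient(condition, outcome):
          # share of studies with the condition that are effective
          x = [s for s in studies if s[condition]]
          return sum(s[outcome] for s in x) / len(x)

      print(f"knowledge necessary:  {consistency_necessary(0, 2):.2f}")   # 1.00
      print(f"knowledge sufficient: {consistency_sufficient(0, 2):.2f}")  # < 1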

  10. Research to Develop Effective Teaching and Management Techniques for Severely Disturbed and Retarded Children. Final Report.

    ERIC Educational Resources Information Center

    Kauffman, James M.; Birnbrauer, Jay S.

    The final report of a project on teaching and management techniques with severely disturbed and/or retarded children presents analysis of single subject research using contingent imitation of the child as an intervention technique. The effects of this technique were examined on the following behaviors: toyplay and reciprocal imitation, self…

  11. K-Fold Crossvalidation in Canonical Analysis.

    ERIC Educational Resources Information Center

    Liang, Kun-Hsia; And Others

    1995-01-01

    A computer-assisted, K-fold cross-validation technique is discussed in the framework of canonical correlation analysis of randomly generated data sets. Analysis results suggest that this technique can effectively reduce the contamination of canonical variates and canonical correlations by sample-specific variance components. (Author/SLD)
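
    A minimal sketch of the K-fold cross-validation idea applied to canonical correlation, assuming randomly generated data in the spirit of the simulation described; sklearn's CCA stands in for whatever implementation the authors used.

      # K-fold cross-validated canonical correlation: fit CCA on training
      # folds, score the first canonical correlation on the held-out fold.
      import numpy as np
      from sklearn.cross_decomposition import CCA
      from sklearn.model_selection import KFold

      rng = np.random.default_rng(1)
      n = 200
      latent = rng.normal(size=(n, 1))   # shared factor linking the two sets
      X = np.hstack([latent + rng.normal(size=(n, 1)) for _ in range(4)])
      Y = np.hstack([latent + rng.normal(size=(n, 1)) for _ in range(3)])

      held_out_r = []
      for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
          cca = CCA(n_components=1).fit(X[train], Y[train])
          U, V = cca.transform(X[test], Y[test])
          held_out_r.append(np.corrcoef(U[:, 0], V[:, 0])[0, 1])

      print(f"mean cross-validated canonical r = {np.mean(held_out_r):.2f}")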

  12. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used for the validation of the applied imaging and image analysis protocols. An SR μ-CT image analysis approach is reported here that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.
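
    A minimal sketch of the kind of quantitative image analysis described: threshold a reconstructed slice, label connected void regions, and summarize pore sizes. The synthetic 2D slice and the global threshold are assumptions; the paper's protocol for real μ-CT volumes is more involved.

      # Threshold, label connected void regions, and summarize the pore-size
      # distribution. The synthetic "slice" stands in for a real reconstruction.
      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(2)
      slice_img = rng.normal(loc=1.0, scale=0.3, size=(256, 256))
      # carve a few rectangular low-attenuation "pores" into the material
      for _ in range(30):
          r, c = rng.integers(0, 236, size=2)
          slice_img[r:r + 20, c:c + 20] = rng.normal(0.1, 0.05)

      pores = slice_img < 0.5                   # simple global threshold
      labels, n_pores = ndimage.label(pores)    # connected-component labeling
      sizes = ndimage.sum(pores, labels, index=range(1, n_pores + 1))
      print(f"{n_pores} pores, median size {np.median(sizes):.0f} px, "
            f"porosity {pores.mean():.1%}")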

  13. Occupational Analysis Technology: Expanded Role in Development of Cost-Effective Maintenance Systems. Final Report.

    ERIC Educational Resources Information Center

    Foley, John P., Jr.

    A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…

  14. Phased-mission system analysis using Boolean algebraic methods

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.; Trivedi, Kishor S.

    1993-01-01

Most reliability analysis techniques and tools assume that a system is used for a mission consisting of a single phase. However, multiple phases are natural in many missions. The failure rates of components, system configuration, and success criteria may vary from phase to phase. In addition, the duration of a phase may be deterministic or random. Recently, several researchers have addressed the problem of reliability analysis of such systems using a variety of methods. A new technique for phased-mission system reliability analysis based on Boolean algebraic methods is described. Our technique is computationally efficient and is applicable to a large class of systems for which the failure criterion in each phase can be expressed as a fault tree (or an equivalent representation). Our technique avoids the state-space explosion that commonly plagues Markov chain-based analysis. A phase algebra to account for the effects of variable configurations and success criteria from phase to phase was developed. Our technique yields exact (as opposed to approximate) results. The use of our technique is demonstrated by means of an example, and numerical results are presented to show the effects of mission phases on the system reliability.
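
    The paper's Boolean-algebraic method is exact; the sketch below merely illustrates the phased-mission setting it addresses, estimating mission reliability by Monte Carlo for a made-up two-phase, three-component mission whose failure criterion differs by phase.

      # Monte Carlo estimate of phased-mission reliability with permanent
      # (non-repairable) failures. Rates, durations, and per-phase fault
      # trees are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(3)
      rates = np.array([[1e-3, 2e-3, 1e-3],    # phase 1 rates for A, B, C
                        [5e-3, 1e-3, 4e-3]])   # phase 2 rates
      durations = np.array([10.0, 5.0])

      def system_failed(failed, phase):
          a, b, c = failed
          if phase == 0:
              return a and b        # phase 1 tolerates loss of A or B alone
          return a or (b and c)     # phase 2: A is critical

      n, survived = 100_000, 0
      for _ in range(n):
          failed, ok = [False, False, False], True
          for ph, (lam, t) in enumerate(zip(rates, durations)):
              for i in range(3):   # failures persist across phases
                  if not failed[i] and rng.random() < 1 - np.exp(-lam[i] * t):
                      failed[i] = True
              if system_failed(failed, ph):
                  ok = False
                  break
          survived += ok
      print(f"estimated mission reliability = {survived / n:.4f}")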

  15. Predicting Effective Course Conduction Strategy Using Datamining Techniques

    ERIC Educational Resources Information Center

    Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.

    2017-01-01

Data analysis techniques can be used to analyze the pattern of data in different fields. Based on the analysis results, suggestions can be provided to decision-making authorities. Data mining techniques can be used in the educational domain to improve the outcomes of educational sectors. The authors carried out this research…

  16. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
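
    A minimal sketch of the multivariate calibration workflow named above, using synthetic spectra: PLS to regress a composition value on spectra, PCA for exploratory structure. Everything here (spectra, component counts, concentrations) is an illustrative assumption.

      # PLS calibration and PCA scores for synthetic "spectra"; real LIBS
      # work would use measured emission-line intensities.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(4)
      n_samples, n_channels = 18, 500
      concentration = rng.uniform(0, 10, size=n_samples)   # e.g. an oxide wt%
      line = np.exp(-0.5 * ((np.arange(n_channels) - 250) / 5.0) ** 2)
      spectra = (concentration[:, None] * line
                 + rng.normal(scale=0.05, size=(n_samples, n_channels)))

      pls = PLSRegression(n_components=3).fit(spectra, concentration)
      print("PLS R^2 on training set:", round(pls.score(spectra, concentration), 3))

      scores = PCA(n_components=2).fit_transform(spectra)
      print("first two PCA scores of sample 0:", np.round(scores[0], 2))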

  17. Computer-delivered interventions for reducing alcohol consumption: meta-analysis and meta-regression using behaviour change techniques and theory.

    PubMed

    Black, Nicola; Mullan, Barbara; Sharpe, Louise

    2016-09-01

The current aim was to examine the effectiveness of behaviour change techniques (BCTs), theory and other characteristics in increasing the effectiveness of computer-delivered interventions (CDIs) to reduce alcohol consumption. Included were randomised studies with a primary aim of reducing alcohol consumption, which compared self-directed CDIs to assessment-only control groups. CDIs were coded for the use of 42 BCTs from an alcohol-specific taxonomy, the use of theory according to a theory coding scheme and general characteristics such as length of the CDI. Effectiveness of CDIs was assessed using random-effects meta-analysis, and the association between the moderators and effect size was assessed using univariate and multivariate meta-regression. Ninety-three CDIs were included in at least one analysis and produced small, significant effects on five outcomes (d+ = 0.07-0.15). Larger effects occurred with some personal contact, provision of normative information or feedback on performance, prompting commitment or goal review, the social norms approach and in samples with more women. Smaller effects occurred when information on the consequences of alcohol consumption was provided. These findings can be used to inform both intervention and theory development. Intervention developers should focus on including specific, effective techniques rather than many techniques or more elaborate approaches.
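
    A minimal sketch of the random-effects pooling step behind results like the d+ values above, using the standard DerSimonian-Laird estimator; the per-study effect sizes and variances are invented.

      # DerSimonian-Laird random-effects pooling. Effect sizes and
      # variances below are invented for illustration.
      import numpy as np

      d = np.array([0.05, 0.12, 0.22, 0.08, 0.15])   # per-study effect sizes
      v = np.array([0.01, 0.02, 0.015, 0.03, 0.01])  # per-study variances

      w = 1 / v                                   # fixed-effect weights
      q = np.sum(w * (d - np.sum(w * d) / np.sum(w)) ** 2)   # Cochran's Q
      c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
      tau2 = max(0.0, (q - (len(d) - 1)) / c)     # between-study variance

      w_star = 1 / (v + tau2)                     # random-effects weights
      pooled = np.sum(w_star * d) / np.sum(w_star)
      se = np.sqrt(1 / np.sum(w_star))
      print(f"pooled d+ = {pooled:.3f} "
            f"(95% CI {pooled - 1.96 * se:.3f} to {pooled + 1.96 * se:.3f}), "
            f"tau^2 = {tau2:.4f}")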

  18. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of components of the technique. We describe what meta-analysis is; what heterogeneity is and how it affects meta-analysis; effect size; the modeling techniques of meta-analysis; and the strengths and weaknesses of meta-analysis. Also included are common components such as forest plot interpretation, software that may be used, special cases of meta-analysis (subgroup analysis, individual patient data, and meta-regression), and a discussion of criticisms.

  19. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  20. Technologies for Clinical Diagnosis Using Expired Human Breath Analysis

    PubMed Central

    Mathew, Thalakkotur Lazar; Pownraj, Prabhahari; Abdulla, Sukhananazerin; Pullithadathil, Biji

    2015-01-01

This review elucidates the technologies in the field of exhaled breath analysis. Exhaled breath gas analysis offers an inexpensive, noninvasive and rapid method for detecting a large number of compounds under various conditions for health and disease states. There are various techniques to analyze some exhaled breath gases, including spectrometry, gas chromatography and spectroscopy. This review places emphasis on some of the critical biomarkers present in exhaled human breath, and their related effects. Additionally, various medical monitoring techniques used for breath analysis have been discussed. It also includes the current scenario of breath analysis with nanotechnology-oriented techniques. PMID:26854142

  1. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-Polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to undisturbed product. The applicability and cost effectiveness of both methods were considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.

    PubMed

    Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao

    2018-01-01

In this study, cases with suspected urethral condylomata acuminata were examined by dermoscopy, in order to explore an effective method for clinical diagnosis. To study the application of the dermoscopy image analysis technique in the clinical diagnosis of urethral condylomata acuminata, a total of 220 suspected urethral condylomata acuminata were first diagnosed clinically with the naked eye, and then by using the dermoscopy image analysis technique. Afterwards, a comparative analysis was made of the two diagnostic methods. Among the 220 suspected urethral condylomata acuminata, there was a higher positive rate by dermoscopy examination than by visual observation. The dermoscopy examination technique is still restricted by its inapplicability in the deep urethral orifice and in skin wrinkles, and concordance between different clinicians may also vary. The dermoscopy image analysis technique features high sensitivity and quick, accurate diagnosis, and is non-invasive; we recommend its use.

  3. Blue dye for identification of sentinel nodes in breast cancer and malignant melanoma: a systematic review and meta-analysis.

    PubMed

    Peek, Mirjam Cl; Charalampoudis, Petros; Anninga, Bauke; Baker, Rose; Douek, Michael

    2017-02-01

    The combined technique (radioisotope and blue dye) is the gold standard for sentinel lymph node biopsy (SLNB) and there is wide variation in techniques and blue dyes used. We performed a systematic review and meta-analysis to assess the need for radioisotope and the optimal blue dye for SLNB. A total of 21 studies were included. The SLNB identification rates are high with all the commonly used blue dyes. Furthermore, methylene blue is superior to iso-sulfan blue and Patent Blue V with respect to false-negative rates. The combined technique remains the most accurate and effective technique for SLNB. In order to standardize the SLNB technique, comparative trials to determine the most effective blue dye and national guidelines are required.

  4. External tissue expansion for difficult wounds using a simple cost effective technique.

    PubMed

    Nandhagopal, Vijayaraghavan; Chittoria, Ravi Kumar; Mohapatra, Devi Prasad; Thiruvoth, Friji Meethale; Sivakumar, Dinesh Kumar; Ashokan, Arjun

    2015-01-01

To study and discuss the role of the external tissue expansion and wound closure (ETEWC) technique using hooks and rubber bands. The present study is a retrospective analysis of nine cases of wounds of different aetiology where the ETEWC technique was applied using hooks and rubber bands. All the wounds in the study healed completely without split thickness skin graft (SSG) or flap. The ETEWC technique using hooks and rubber bands is cost-effective and can be used for wound closure without SSG or flap.

  5. [Development of performance evaluation and management system on advanced schistosomiasis medical treatment].

    PubMed

    Zhou, Xiao-Rong; Huang, Shui-Sheng; Gong, Xin-Guo; Cen, Li-Ping; Zhang, Cong; Zhu, Hong; Yang, Jun-Jing; Chen, Li

    2012-04-01

To construct a performance evaluation and management system for advanced schistosomiasis medical treatment, and to analyze and evaluate the work of advanced schistosomiasis medical treatment over the years. By applying database management and C++ programming techniques, we entered the information of the advanced schistosomiasis cases into the system, and comprehensively evaluated the work of advanced schistosomiasis medical treatment through cost-effect analysis, cost-effectiveness analysis, and cost-benefit analysis. We built a set of software formulas for cost-effect analysis, cost-effectiveness analysis, and cost-benefit analysis. The system has a clear structure, is easy to operate, offers a user-friendly interface, and supports convenient information input and search. It can benefit the performance evaluation of the province's advanced schistosomiasis medical treatment work. This system satisfies the current needs of advanced schistosomiasis medical treatment work and can easily be widely used.

  6. Sensor failure and multivariable control for airbreathing propulsion systems. Ph.D. Thesis - Dec. 1979 Final Report

    NASA Technical Reports Server (NTRS)

    Behbehani, K.

    1980-01-01

A new sensor/actuator failure analysis technique for turbofan jet engines was developed. Three phases of failure analysis, namely detection, isolation, and accommodation, are considered. Failure detection and isolation techniques are developed by utilizing the concept of Generalized Likelihood Ratio (GLR) tests. These techniques are applicable to both time varying and time invariant systems. Three GLR detectors are developed for: (1) hard-over sensor failure; (2) hard-over actuator failure; and (3) brief disturbances in the actuators. The probability distribution of the GLR detectors and the detectability of sensor/actuator failures are established. Failure type is determined by the maximum of the GLR detectors. Failure accommodation is accomplished by extending the Multivariable Nyquist Array (MNA) control design techniques to nonsquare system designs. The performance and effectiveness of the failure analysis technique are studied by applying the technique to a turbofan jet engine, namely the Quiet Clean Short Haul Experimental Engine (QCSEE). Single and multiple sensor/actuator failures in the QCSEE are simulated and analyzed, and the effects of model degradation are studied.

  7. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible to (i) determine natural groups or clusters of control strategies with a similar behaviour, (ii) find and interpret hidden, complex and causal relation features in the data set and (iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.

  8. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.

  9. Ground Vibration Test Planning and Pre-Test Analysis for the X-33 Vehicle

    NASA Technical Reports Server (NTRS)

    Bedrossian, Herand; Tinker, Michael L.; Hidalgo, Homero

    2000-01-01

This paper describes the results of the modal test planning and the pre-test analysis for the X-33 vehicle. The pre-test analysis included the selection of the target modes, selection of the sensor and shaker locations, and the development of an accurate Test Analysis Model (TAM). For target mode selection, four techniques were considered: one based on the Modal Cost technique, one based on the Balanced Singular Value technique, a technique known as the Root Sum Squared (RSS) method, and a Modal Kinetic Energy (MKE) approach. For selecting sensor locations, four techniques were also considered: one based on the Weighted Average Kinetic Energy (WAKE), one based on Guyan Reduction (GR), one emphasizing engineering judgment, and one based on an optimum sensor selection technique using a Genetic Algorithm (GA) search technique combined with a criterion based on Hankel Singular Values (HSVs). For selecting shaker locations, four techniques were also considered: one based on the Weighted Average Driving Point Residue (WADPR), one based on engineering judgment and accessibility considerations, a frequency response method, and an optimum shaker location selection based on a GA search technique combined with a criterion based on HSVs. To evaluate the effectiveness of the proposed sensor and shaker locations for exciting the target modes, extensive numerical simulations were performed. The Multivariate Mode Indicator Function (MMIF) was used to evaluate the effectiveness of each sensor & shaker set with respect to modal parameter identification. Several TAM reduction techniques were considered, including Guyan, IRS, Modal, and Hybrid. Based on pre-test cross-orthogonality checks using various reduction techniques, a Hybrid TAM reduction technique was selected and was used for all three vehicle fuel level configurations.

  10. Evaluating the application of failure mode and effects analysis technique in hospital wards: a systematic review

    PubMed Central

    Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat

    2017-01-01

Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their root causes. The Failure Mode and Effects Analysis (FMEA) technique is a prospective risk analysis method. This study is a review of risk analysis using the FMEA technique in different hospital wards and departments. Methods: This paper systematically investigated the available databases. After selecting inclusion and exclusion criteria, the related studies were found. This selection was made in two steps. First, the abstracts and titles were investigated by the researchers; after omitting papers which did not meet the inclusion criteria, 22 papers were finally selected and their full texts were thoroughly examined. At the end, the results were obtained. Results: The examined papers had focused mostly on processes, had been conducted in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to express almost all the steps of model implementation; and after implementing the strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique's effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The study revealed that a small number of studies had failed to show the effects of the FMEA technique. In general, however, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality. PMID:28039688
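
    The RPN arithmetic these studies rely on is simple to state: severity, occurrence and detectability are each scored (commonly 1-10) and multiplied, and failure modes are ranked by the product. A sketch with invented ward failure modes:

      # FMEA Risk Priority Number: RPN = severity * occurrence * detection.
      # The failure modes and scores below are invented for illustration.
      failure_modes = [
          # (description, severity, occurrence, detection)
          ("wrong drug dose transcribed",   8, 4, 6),
          ("patient ID mix-up at imaging",  9, 2, 3),
          ("delayed lab result reporting",  5, 6, 4),
      ]

      for name, s, o, det in sorted(failure_modes,
                                    key=lambda m: m[1] * m[2] * m[3],
                                    reverse=True):
          print(f"RPN {s * o * det:3d}  {name}")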

  11. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
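
    A minimal sketch of the frequency-domain idea described: estimate the response of a thrust-like output to a throttle frequency sweep as H(f) = Pxy(f)/Pxx(f). The first-order-lag engine model and noise level are assumptions standing in for flight data.

      # Frequency response estimate from a throttle frequency sweep.
      import numpy as np
      from scipy import signal

      fs = 100.0
      t = np.arange(0, 60, 1 / fs)
      throttle = signal.chirp(t, f0=0.05, f1=2.0, t1=60)   # frequency sweep
      # engine modeled (hypothetically) as a first-order lag, tau = 0.5 s
      b, a = signal.bilinear([1.0], [0.5, 1.0], fs)
      thrust = signal.lfilter(b, a, throttle) \
          + np.random.default_rng(5).normal(scale=0.01, size=t.size)

      f, pxy = signal.csd(throttle, thrust, fs=fs, nperseg=1024)
      _, pxx = signal.welch(throttle, fs=fs, nperseg=1024)
      h = pxy / pxx                       # H1-style transfer estimate
      band = (f > 0.05) & (f < 2.0)
      gain_1hz = abs(h[band][np.argmin(abs(f[band] - 1.0))])
      print(f"gain at ~1 Hz: {gain_1hz:.3f}")   # ~0.30 for tau = 0.5 s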

  12. Methodology for assessing the effectiveness of access management techniques : executive summary.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...

  13. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  14. Analytical Modelling of the Effects of Different Gas Turbine Cooling Techniques on Engine Performance

    NASA Astrophysics Data System (ADS)

    Uysal, Selcuk Can

In this research, MATLAB Simulink® was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of an uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate the engine performance parameters by using operating conditions, polytropic efficiencies, material information and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the cooling technique applied. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness and maximum allowable blade temperature on the main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing Anti-Vortex Film Cooling holes (AVH) by means of data obtained for these holes by Detached Eddy Simulation-CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other different external cooling techniques were used in a performance comparison study. (Abstract shortened by ProQuest.)

  15. A Structural and Content-Based Analysis for Web Filtering.

    ERIC Educational Resources Information Center

    Lee, P. Y.; Hui, S. C.; Fong, A. C. M.

    2003-01-01

    Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)

  16. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  17. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. The hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  18. Random safety auditing, root cause analysis, failure mode and effects analysis.

    PubMed

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.

  19. The Relationship between Visual Analysis and Five Statistical Analyses in a Simple AB Single-Case Research Design

    ERIC Educational Resources Information Center

    Brossart, Daniel F.; Parker, Richard I.; Olson, Elizabeth A.; Mahadevan, Lakshmi

    2006-01-01

    This study explored some practical issues for single-case researchers who rely on visual analysis of graphed data, but who also may consider supplemental use of promising statistical analysis techniques. The study sought to answer three major questions: (a) What is a typical range of effect sizes from these analytic techniques for data from…

  20. Analytical transmissibility based transfer path analysis for multi-energy-domain systems using four-pole parameter theory

    NASA Astrophysics Data System (ADS)

    Mashayekhi, Mohammad Jalali; Behdinan, Kamran

    2017-10-01

The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most existing vibration transfer path analysis techniques are empirical, which makes them suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to model multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique. An algorithm to obtain the equivalent four-pole representation of a dynamical system based on the corresponding bond graph model is also presented.
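
    A minimal sketch of the four-pole bookkeeping the paper builds on: each element maps (force, velocity) at its input to its output via a 2x2 frequency-dependent matrix, and chained elements multiply. The mass and spring values are arbitrary illustrative numbers.

      # Four-pole (transfer-matrix) chaining for a mass-spring path.
      import numpy as np

      def four_pole_mass(m, w):
          return np.array([[1.0, 1j * w * m], [0.0, 1.0]])

      def four_pole_spring(k, w):
          return np.array([[1.0, 0.0], [1j * w / k, 1.0]])

      m, k = 2.0, 5.0e4                  # kg, N/m (illustrative)
      freqs = np.linspace(1, 200, 1000)  # Hz
      trans = []
      for f in freqs:
          w = 2 * np.pi * f
          A = four_pole_mass(m, w) @ four_pole_spring(k, w)  # chained path
          trans.append(abs(1.0 / A[0, 0]))   # blocked-output force ratio
      print(f"peak force ratio {max(trans):.1f} near "
            f"{freqs[int(np.argmax(trans))]:.0f} Hz")  # resonance ~25 Hz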

  1. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  2. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

A signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is proposed in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective method for automatic bearing fault detection and gives a good basis for an integrated induction machine condition monitor.
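
    A simplified kurtogram-style scan, assuming a simulated vibration signal with periodic impact bursts: band-pass over a grid of bands and pick the band whose envelope has maximum kurtosis. The real kurtogram uses a dyadic filter bank; this shows only the core idea.

      # Band-wise envelope kurtosis scan on a simulated bearing signal.
      import numpy as np
      from scipy import signal, stats

      fs = 20_000
      t = np.arange(0, 1.0, 1 / fs)
      x = np.random.default_rng(6).normal(size=t.size)
      for ti in np.arange(0, 1.0, 1 / 97.0):        # ~97 Hz fault rate
          n0 = int(ti * fs)                          # decaying 4 kHz bursts
          n = np.arange(min(400, t.size - n0))
          x[n0:n0 + n.size] += 3 * np.exp(-n / 60) * np.sin(2 * np.pi * 4000 * n / fs)

      best = None
      for lo in range(500, 9000, 500):               # scan 1-kHz-wide bands
          sos = signal.butter(4, [lo, lo + 1000], btype="bandpass",
                              fs=fs, output="sos")
          env = np.abs(signal.hilbert(signal.sosfilt(sos, x)))
          k = stats.kurtosis(env)
          if best is None or k > best[0]:
              best = (k, lo)
      print(f"max envelope kurtosis {best[0]:.1f} in band "
            f"{best[1]}-{best[1] + 1000} Hz")        # expect ~4 kHz band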

  3. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
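
    A minimal sketch of the PCG idea described, with a diagonal stand-in for the orthotropic part of the stiffness matrix as preconditioner; the small SPD test matrix is fabricated.

      # Preconditioned conjugate gradient: solve K u = f with a
      # preconditioner built from an "easy" part of K (here its diagonal,
      # standing in for the paper's orthotropic part).
      import numpy as np
      from scipy.sparse import random as sprandom, diags
      from scipy.sparse.linalg import cg, LinearOperator, splu

      rng = np.random.default_rng(7)
      n = 500
      B = sprandom(n, n, density=0.01, random_state=7).tocsr()
      K = (B @ B.T + diags(np.full(n, 5.0))).tocsc()  # SPD "stiffness" matrix
      f = rng.normal(size=n)

      M_easy = diags(K.diagonal()).tocsc()            # easy-to-invert part
      lu = splu(M_easy)
      M = LinearOperator((n, n), matvec=lu.solve)     # applies M^{-1}

      u, info = cg(K, f, M=M)
      print("converged" if info == 0 else f"info={info}",
            "| residual:", np.linalg.norm(K @ u - f))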

  4. Methodology for assessing the effectiveness of access management techniques : final report, September 14, 1998.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...

  5. The incidence of secondary vertebral fracture of vertebral augmentation techniques versus conservative treatment for painful osteoporotic vertebral fractures: a systematic review and meta-analysis.

    PubMed

    Song, Dawei; Meng, Bin; Gan, Minfeng; Niu, Junjie; Li, Shiyan; Chen, Hao; Yuan, Chenxi; Yang, Huilin

    2015-08-01

    Percutaneous vertebroplasty (PVP) and balloon kyphoplasty (BKP) are minimally invasive and effective vertebral augmentation techniques for managing osteoporotic vertebral compression fractures (OVCFs). Recent meta-analyses have compared the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques or conservative treatment; however, the inclusions were not thorough and rigorous enough, and the effects of each technique on the incidence of secondary vertebral fractures remain unclear. To perform an updated systematic review and meta-analysis of the studies with more rigorous inclusion criteria on the effects of vertebral augmentation techniques and conservative treatment for OVCF on the incidence of secondary vertebral fractures. PubMed, MEDLINE, EMBASE, SpringerLink, Web of Science, and the Cochrane Library database were searched for relevant original articles comparing the incidence of secondary vertebral fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. Randomized controlled trials (RCTs) and prospective non-randomized controlled trials (NRCTs) were identified. The methodological qualities of the studies were evaluated, relevant data were extracted and recorded, and an appropriate meta-analysis was conducted. A total of 13 articles were included. The pooled results from included studies showed no statistically significant differences in the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques and conservative treatment. Subgroup analysis comparing different study designs, durations of symptoms, follow-up times, races of patients, and techniques were conducted, and no significant differences in the incidence of secondary fractures were identified (P > 0.05). No obvious publication bias was detected by either Begg's test (P = 0.360 > 0.05) or Egger's test (P = 0.373 > 0.05). Despite current thinking in the field that vertebral augmentation procedures may increase the incidence of secondary fractures, we found no differences in the incidence of secondary fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. © The Foundation Acta Radiologica 2014.

  6. Portable X-ray fluorescence spectroscopy as a rapid screening technique for analysis of TiO2 and ZnO in sunscreens.

    PubMed

    Bairi, Venu Gopal; Lim, Jin-Hee; Quevedo, Ivan R; Mudalige, Thilak K; Linder, Sean W

    2016-02-01

This investigation reports a rapid and simple screening technique for the quantification of titanium and zinc in commercial sunscreens using portable X-ray fluorescence spectroscopy (pXRF). A highly evolved technique, inductively coupled plasma-mass spectrometry (ICP-MS), was chosen as a comparative technique to pXRF, and a good correlation (r² > 0.995) with acceptable variations (≤25%) in results between both techniques was observed. Analytical figures of merit such as detection limit, quantitation limit, and linear range of the method are reported for the pXRF technique. This method has good linearity (r² > 0.995) for the analysis of titanium (Ti) in the range of 0.4-14.23 wt%, and zinc (Zn) in the range of 1.0-23.90 wt%. However, most commercial sunscreens contain organic ingredients, and these ingredients are known to cause matrix effects. The development of appropriate matrix-matched working standards to obtain the calibration curve was found to be a major challenge for the pXRF measurements. In this study, we have overcome the matrix effect by using metal-free commercial sunscreens as a dispersing medium for the preparation of working standards. An easy extension of this unique methodology for preparing working standards in different matrices is also reported. This method is simple, rapid, and cost-effective and, in comparison to conventional techniques (e.g., ICP-MS), does not generate toxic wastes during sample analysis.

  7. Portable X-ray fluorescence spectroscopy as a rapid screening technique for analysis of TiO2 and ZnO in sunscreens

    NASA Astrophysics Data System (ADS)

    Bairi, Venu Gopal; Lim, Jin-Hee; Quevedo, Ivan R.; Mudalige, Thilak K.; Linder, Sean W.

    2016-02-01

This investigation reports a rapid and simple screening technique for the quantification of titanium and zinc in commercial sunscreens using portable X-ray fluorescence spectroscopy (pXRF). A highly evolved technique, inductively coupled plasma-mass spectrometry (ICP-MS), was chosen as a comparative technique to pXRF, and a good correlation (r² > 0.995) with acceptable variations (≤ 25%) in results between both techniques was observed. Analytical figures of merit such as detection limit, quantitation limit, and linear range of the method are reported for the pXRF technique. This method has good linearity (r² > 0.995) for the analysis of titanium (Ti) in the range of 0.4-14.23 wt%, and zinc (Zn) in the range of 1.0-23.90 wt%. However, most commercial sunscreens contain organic ingredients, and these ingredients are known to cause matrix effects. The development of appropriate matrix-matched working standards to obtain the calibration curve was found to be a major challenge for the pXRF measurements. In this study, we have overcome the matrix effect by using metal-free commercial sunscreens as a dispersing medium for the preparation of working standards. An easy extension of this unique methodology for preparing working standards in different matrices is also reported. This method is simple, rapid, and cost-effective and, in comparison to conventional techniques (e.g., ICP-MS), does not generate toxic wastes during sample analysis.
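
    The calibration arithmetic behind the figures of merit in the two pXRF records above (linearity, detection limit) can be sketched directly; the standard concentrations, counts and blank standard deviation below are made up.

      # Linear calibration curve, r^2, and a 3-sigma detection limit.
      # All standard values and counts are fabricated for illustration.
      import numpy as np

      conc = np.array([0.4, 1.0, 2.0, 5.0, 10.0, 14.2])          # Ti wt% standards
      counts = np.array([820, 2010, 4100, 10150, 20300, 28800])  # instrument response

      slope, intercept = np.polyfit(conc, counts, 1)
      pred = slope * conc + intercept
      r2 = 1 - np.sum((counts - pred) ** 2) / np.sum((counts - counts.mean()) ** 2)

      blank_sd = 40.0                    # sd of repeated blank readings (assumed)
      lod = 3 * blank_sd / slope         # 3-sigma limit of detection
      print(f"r^2 = {r2:.4f}, LOD = {lod:.2f} wt%")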

  8. Limit Cycle Analysis Applied to the Oscillations of Decelerating Blunt-Body Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Schoenenberger, Mark; Queen, Eric M.

    2008-01-01

    Many blunt-body entry vehicles have nonlinear dynamic stability characteristics that produce self-limiting oscillations in flight. Several different test techniques can be used to extract dynamic aerodynamic coefficients to predict this oscillatory behavior for planetary entry mission design and analysis. Most of these test techniques impose boundary conditions that alter the oscillatory behavior from that seen in flight. Three sets of test conditions, representing three commonly used test techniques, are presented to highlight these effects. Analytical solutions to the constant-coefficient planar equations-of-motion for each case are developed to show how the same blunt body behaves differently depending on the imposed test conditions. The energy equation is applied to further illustrate the governing dynamics. Then, the mean value theorem is applied to the energy rate equation to find the effective damping for an example blunt body with nonlinear, self-limiting dynamic characteristics. This approach is used to predict constant-energy oscillatory behavior and the equilibrium oscillation amplitudes for the various test conditions. These predictions are verified with planar simulations. The analysis presented provides an overview of dynamic stability test techniques and illustrates the effects of dynamic stability, static aerodynamics and test conditions on observed dynamic motions. It is proposed that these effects may be leveraged to develop new test techniques and refine test matrices in future tests to better define the nonlinear functional forms of blunt body dynamic stability curves.
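
    The energy argument can be illustrated with a generic planar pitch model (a textbook-style sketch, not the authors' exact equations): with amplitude-dependent damping, the oscillation settles where the damping work vanishes over a cycle.

    ```latex
    % Generic planar pitch oscillation with nonlinear damping (illustrative)
    \ddot{\alpha} + \mu(\alpha)\,\dot{\alpha} + \omega_n^{2}\,\alpha = 0,
    \qquad
    E = \tfrac{1}{2}\dot{\alpha}^{2} + \tfrac{1}{2}\omega_n^{2}\alpha^{2},
    \qquad
    \dot{E} = -\,\mu(\alpha)\,\dot{\alpha}^{2}.
    % A self-limiting oscillation settles at the amplitude \alpha_{L} for which
    % the damping work vanishes over one cycle (constant-energy oscillation):
    \oint \mu(\alpha)\,\dot{\alpha}^{2}\,dt = 0 .
    ```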

  9. Data Unfolding with Wiener-SVD Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tang, W.; Li, X.; Qian, X.

    Here, data unfolding is a common technique in high-energy physics (HEP) data analysis. Inspired by the deconvolution technique in digital signal processing, a new unfolding technique based on the SVD technique and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal-to-noise ratio in the effective frequency domain, given expectations of the signal and noise, and is free of a regularization parameter. Through a couple of examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.
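
    A minimal sketch of the flavor of the method (an SVD inversion with Wiener-style filter factors; the toy response and noise level are invented, and the authors' construction of the effective frequency domain differs in detail):

    ```python
    import numpy as np

    def wiener_svd_unfold(R, d, s_expected, sigma_noise):
        # SVD of the response (smearing) matrix
        U, S, Vt = np.linalg.svd(R, full_matrices=False)
        d_rot = U.T @ d                               # data in the SVD basis
        sig_power = (S * (Vt @ s_expected)) ** 2      # expected signal power per mode
        w = sig_power / (sig_power + sigma_noise**2)  # Wiener weights, no reg. parameter
        return Vt.T @ (w * d_rot / S)                 # filtered inversion of d = R s + n

    # toy usage with an invented 3-bin response
    R = np.array([[0.8, 0.2, 0.0],
                  [0.1, 0.8, 0.1],
                  [0.0, 0.2, 0.8]])
    truth = np.array([10.0, 20.0, 15.0])
    data = R @ truth + np.random.default_rng(1).normal(0, 0.5, 3)
    print(wiener_svd_unfold(R, data, s_expected=truth, sigma_noise=0.5))
    ```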

  10. Data Unfolding with Wiener-SVD Method

    DOE PAGES

    Tang, W.; Li, X.; Qian, X.; ...

    2017-10-04

    Here, data unfolding is a common technique in high-energy physics (HEP) data analysis. Inspired by the deconvolution technique in digital signal processing, a new unfolding technique based on the SVD technique and the well-known Wiener filter is introduced. The Wiener-SVD unfolding approach achieves the unfolding by maximizing the signal-to-noise ratio in the effective frequency domain, given expectations of the signal and noise, and is free of a regularization parameter. Through a couple of examples, the pros and cons of the Wiener-SVD approach as well as the nature of the unfolded results are discussed.

  11. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.

  12. Sensitivity analysis for direct and indirect effects in the presence of exposure-induced mediator-outcome confounders

    PubMed Central

    Chiba, Yasutaka

    2014-01-01

    Questions of mediation are often of interest in reasoning about mechanisms, and methods have been developed to address these questions. However, these methods make strong assumptions about the absence of confounding. Even if exposure is randomized, there may be mediator-outcome confounding variables. Inference about direct and indirect effects is particularly challenging if these mediator-outcome confounders are affected by the exposure, because in this case these effects are not identified irrespective of whether data are available on these exposure-induced mediator-outcome confounders. In this paper, we provide a sensitivity analysis technique for natural direct and indirect effects that is applicable even if there are mediator-outcome confounders affected by the exposure. We give techniques for both the difference and risk ratio scales and compare the technique to other possible approaches. PMID:25580387

  13. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis-of-variance and regression techniques. Four related subproblems were defined, and solving each of these yielded a solution to the general research problem. The capabilities of the resulting statistical model are considered. A least-squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased to remove the quadratic effect from the residuals. Analysis of variance is then used to determine the magnitude and effect of the error factor associated with the experimental data.
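
    A small sketch of the order-selection idea (synthetic data; all numbers invented): raise the polynomial order until the residuals lose structure, and read the residual mean square as the ANOVA-style estimate of the error variance.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    x = np.linspace(-1.0, 1.0, 40)                       # e.g. angle of attack
    y = 2.0 + 1.5*x + 0.8*x**2 + rng.normal(0, 0.1, 40)  # synthetic "force" data

    # Raise the fit order until the residuals lose structure; the residual
    # mean square then estimates the experimental error variance.
    for order in (1, 2, 3):
        resid = y - np.polyval(np.polyfit(x, y, order), x)
        sse = float(np.sum(resid**2))
        dof = x.size - (order + 1)
        print(f"order {order}: SSE = {sse:.3f}, MSE = {sse/dof:.4f}")
    ```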

  14. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in the application's three main phases: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.

  15. Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.

    PubMed

    Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam

    2018-01-01

    During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and to streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and support implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase awareness of the prior investigative process and insights, and improve final investigative outcomes.

  16. Method for improving accuracy in full evaporation headspace analysis.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is incorporated with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on the measurement accuracy of conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis.
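
    For context, standard multiple headspace extraction theory (general background, not this paper's derivation) sums the geometrically decaying peak areas of successive extractions:

    ```latex
    % Peak area of the i-th extraction decays exponentially:
    A_i = A_1\,e^{-k(i-1)}, \qquad
    \sum_{i=1}^{\infty} A_i = \frac{A_1}{1 - e^{-k}},
    % with k estimated from the slope of \ln A_i versus (i-1),
    % so only a few extractions are needed to recover the total analyte amount.
    ```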

  17. Using cognitive task analysis to develop simulation-based training for medical tasks.

    PubMed

    Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette

    2013-10-01

    Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost-effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks: cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design.

  18. Rasch Analysis: A Primer for School Psychology Researchers and Practitioners

    ERIC Educational Resources Information Center

    Boone, William J.; Noltemeyer, Amity

    2017-01-01

    In order to progress as a field, school psychology research must be informed by effective measurement techniques. One approach to address the need for careful measurement is Rasch analysis. This technique can (a) facilitate the development of instruments that provide useful data, (b) provide data that can be used confidently for both descriptive…

  19. Is Quality/Effectiveness An Empirically Demonstrable School Attribute? Statistical Aids for Determining Appropriate Levels of Analysis.

    ERIC Educational Resources Information Center

    Griffith, James

    2002-01-01

    Describes and demonstrates analytical techniques used in organizational psychology and contemporary multilevel analysis. Using these analytic techniques, examines the relationship between educational outcomes and the school environment. Finds that at least some indicators might be represented as school-level phenomena. Results imply that the…

  20. Simplified Phased-Mission System Analysis for Systems with Independent Component Repairs

    NASA Technical Reports Server (NTRS)

    Somani, Arun K.

    1996-01-01

    Accurate analysis of system reliability requires accounting for all major variations in the system's operation. Most reliability analyses assume that the system configuration, success criteria, and component behavior remain the same throughout a mission; however, multiple phases are natural in many missions. We present a new computationally efficient technique for the analysis of phased-mission systems where the operational states of a system can be described by combinations of component states (such as fault trees or assertions). Moreover, individual components may be repaired, if failed, as part of system operation, but repairs are independent of the system state. For repairable systems, Markov analysis techniques are used, but they suffer from state-space explosion, which limits the size of system that can be analyzed and makes the computation expensive. We avoid the state-space explosion. A phase algebra is used to account for the effects of variable configurations, repairs, and success criteria from phase to phase. Our technique yields exact (as opposed to approximate) results. We demonstrate the technique by means of several examples and present numerical results to show the effects of phases and repairs on system reliability/availability.

  1. Analysis and correction of ground reflection effects in measured narrowband sound spectra using cepstral techniques

    NASA Technical Reports Server (NTRS)

    Miles, J. H.; Stevens, G. H.; Leininger, G. G.

    1975-01-01

    Ground reflections generate undesirable effects on acoustic measurements such as those conducted outdoors for jet noise research, aircraft certification, and motor vehicle regulation. Cepstral techniques developed in speech processing are adapted to identify echo delay time and to correct for ground reflection effects. A sample result is presented using an actual narrowband sound pressure level spectrum. The technique can readily be adapted to existing fast-Fourier-transform spectrum measurement instrumentation to provide field measurements of echo time delays.
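
    A minimal sketch of the echo-detection idea (synthetic signal; the sample rate, delay, and echo strength are invented for the example): an echo shows up in the real cepstrum as a peak at the quefrency equal to the delay time.

    ```python
    import numpy as np

    fs = 10_000                               # sample rate, Hz (assumed)
    t = np.arange(0, 1.0, 1/fs)
    rng = np.random.default_rng(0)
    direct = rng.normal(size=t.size)          # broadband source noise
    delay_s, alpha = 0.012, 0.6               # echo delay and strength (illustrative)
    k = int(delay_s * fs)
    sig = direct.copy()
    sig[k:] += alpha * direct[:-k]            # add a ground-reflection echo

    # Power cepstrum: inverse FFT of the log power spectrum; the echo
    # produces a peak at the quefrency equal to the delay time.
    spectrum = np.fft.rfft(sig)
    cepstrum = np.fft.irfft(np.log(np.abs(spectrum)**2))
    peak = np.argmax(cepstrum[50:fs//2]) + 50  # skip the low-quefrency region
    print("estimated echo delay: %.4f s" % (peak / fs))
    ```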

  2. Language Sample Analysis and Elicitation Technique Effects in Bilingual Children With and Without Language Impairment.

    PubMed

    Kapantzoglou, Maria; Fergadiotis, Gerasimos; Restrepo, M Adelaida

    2017-10-17

    This study examined whether the language sample elicitation technique (i.e., storytelling and story-retelling tasks with pictorial support) affects lexical diversity (D), grammaticality (grammatical errors per communication unit [GE/CU]), sentence length (mean length of utterance in words [MLUw]), and sentence complexity (subordination index [SI]), which are commonly used indices for diagnosing primary language impairment in Spanish-English-speaking children in the United States. Twenty bilingual Spanish-English-speaking children with typical language development and 20 with primary language impairment participated in the study. Four analyses of variance were conducted to evaluate the effect of language elicitation technique and group on D, GE/CU, MLUw, and SI. Also, 2 discriminant analyses were conducted to assess which indices were more effective for story retelling and storytelling and their classification accuracy across elicitation techniques. D, MLUw, and SI were influenced by the type of elicitation technique, but GE/CU was not. The classification accuracy of language sample analysis was greater in story retelling than in storytelling, with GE/CU and D being useful indicators of language abilities in story retelling and GE/CU and SI in storytelling. Two indices in language sample analysis may be sufficient for diagnosis in 4- to 5-year-old bilingual Spanish-English-speaking children.

  3. "Soft" or "hard" ionisation? Investigation of metastable gas temperature effect on direct analysis in real-time analysis of Voriconazole.

    PubMed

    Lapthorn, Cris; Pullen, Frank

    2009-01-01

    The performance of the direct analysis in real-time (DART) technique was evaluated across a range of metastable gas temperatures for a pharmaceutical compound, Voriconazole, in order to investigate the effect of metastable gas temperature on molecular ion intensity and fragmentation. The DART source has been used to analyse a range of analytes from a range of matrices, including drugs in solid tablet form and preparations, active ingredients in ointment, naturally occurring plant alkaloids, and flavours and fragrances, sampled from thin layer chromatography (TLC) plates, melting point tubes, and biological matrices including hair, urine and blood. The advantages of this technique include rapid analysis time (as little as 5 s), a reduction in sample preparation requirements, elimination of the mobile phase requirement, and analysis of samples not typically amenable to atmospheric pressure ionisation (API) techniques. This technology has therefore been proposed as an everyday tool for identification of components in crude organic reaction mixtures.

  4. Meta-Analysis of Interactive Video Instruction: A 10 Year Review of Achievement Effects.

    ERIC Educational Resources Information Center

    McNeil, Barbara J.; Nelson, Karyn R.

    Sixty-three studies which investigated cognitive achievement effects following interactive video (IV) instruction were integrated through a meta-analysis technique. Overall, mean achievement effect for IV was .530 (corrected for outliers), indicating that IV is an effective form of instruction. The effect is similar to that of computer-assisted…

  5. Ozone data and mission sampling analysis

    NASA Technical Reports Server (NTRS)

    Robbins, J. L.

    1980-01-01

    A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
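
    A compact sketch of the variance-decomposition step (toy data standing in for the gridded ozone field; the empirical orthogonal functions fall out of an SVD of the anomaly matrix):

    ```python
    import numpy as np

    # Toy gridded ozone anomalies: rows = observation times, cols = grid points
    rng = np.random.default_rng(0)
    data = rng.normal(size=(100, 500))
    anom = data - data.mean(axis=0)

    # EOFs via SVD: right singular vectors are spatial patterns; squared
    # singular values give the variance captured by each mode.
    U, S, Vt = np.linalg.svd(anom, full_matrices=False)
    var_frac = S**2 / np.sum(S**2)
    print("variance in first 3 EOFs:", np.round(var_frac[:3], 3))
    ```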

  6. Comparison of composite rotor blade models: A coupled-beam analysis and an MSC/NASTRAN finite-element model

    NASA Technical Reports Server (NTRS)

    Hodges, Robert V.; Nixon, Mark W.; Rehfield, Lawrence W.

    1987-01-01

    A methodology was developed for the structural analysis of composite rotor blades. This coupled-beam analysis is relatively simple to use compared with alternative analysis techniques. The beam analysis was developed for thin-wall single-cell rotor structures and includes the effects of elastic coupling. This paper demonstrates the effectiveness of the new composite-beam analysis method through comparison of its results with those of an established baseline analysis technique. The baseline analysis is an MSC/NASTRAN finite-element model built up from anisotropic shell elements. Deformations are compared for three linear static load cases of centrifugal force at design rotor speed, applied torque, and lift for an ideal rotor in hover. A D-spar designed to twist under axial loading is the subject of the analysis. Results indicate the coupled-beam analysis is well within engineering accuracy.

  7. NUMERICAL ANALYSIS TECHNIQUE USING THE STATISTICAL ENERGY ANALYSIS METHOD CONCERNING THE BLASTING NOISE REDUCTION BY THE SOUND INSULATION DOOR USED IN TUNNEL CONSTRUCTIONS

    NASA Astrophysics Data System (ADS)

    Ishida, Shigeki; Mori, Atsuo; Shinji, Masato

    The main method of reducing the blasting noise that occurs in a tunnel under construction is to install a sound insulation door in the tunnel. However, a numerical analysis technique to accurately predict the transmission loss of the sound insulation door has not been established. In this study, we measured the blasting noise and the vibration of the sound insulation door in a tunnel during blasting, performed an analysis, and modified the acoustic model. In addition, we reproduced the noise reduction effect of the sound insulation door with the statistical energy analysis method and confirmed that numerical simulation is possible by this procedure.

  8. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    PubMed Central

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto

    2013-01-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793

  9. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments.

    PubMed

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana M F; Silva, Rosângela; Souza, Sheila Mendonça de; Araujo, Adauto

    2013-04-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  10. Meta-analyses are no substitute for registered replications: a skeptical perspective on religious priming

    PubMed Central

    van Elk, Michiel; Matzke, Dora; Gronau, Quentin F.; Guan, Maime; Vandekerckhove, Joachim; Wagenmakers, Eric-Jan

    2015-01-01

    According to a recent meta-analysis, religious priming has a positive effect on prosocial behavior (Shariff et al., 2015). We first argue that this meta-analysis suffers from a number of methodological shortcomings that limit the conclusions that can be drawn about the potential benefits of religious priming. Next we present a re-analysis of the religious priming data using two different meta-analytic techniques. A Precision-Effect Testing–Precision-Effect-Estimate with Standard Error (PET-PEESE) meta-analysis suggests that the effect of religious priming is driven solely by publication bias. In contrast, an analysis using Bayesian bias correction suggests the presence of a religious priming effect, even after controlling for publication bias. These contradictory statistical results demonstrate that meta-analytic techniques alone may not be sufficiently robust to firmly establish the presence or absence of an effect. We argue that a conclusive resolution of the debate about the effect of religious priming on prosocial behavior – and about theoretically disputed effects more generally – requires a large-scale, preregistered replication project, which we consider to be the sole remedy for the adverse effects of experimenter bias and publication bias. PMID:26441741

  11. Randomised controlled trial of Alexander technique lessons, exercise, and massage (ATEAM) for chronic and recurrent back pain: economic evaluation.

    PubMed

    Hollinghurst, Sandra; Sharp, Debbie; Ballard, Kathleen; Barnett, Jane; Beattie, Angela; Evans, Maggie; Lewith, George; Middleton, Karen; Oxford, Frances; Webley, Fran; Little, Paul

    2008-12-11

    An economic evaluation of therapeutic massage, exercise, and lessons in the Alexander technique for treating persistent back pain. Cost consequences study and cost-effectiveness analysis at 12-month follow-up of a factorial randomised controlled trial. 579 patients with chronic or recurrent low back pain recruited from primary care. Normal care (control), massage, and six or 24 lessons in the Alexander technique. Half of each group were randomised to a prescription for exercise from a doctor plus behavioural counselling from a nurse. Costs to the NHS and to participants. Comparison of costs with Roland-Morris disability score (number of activities impaired by pain), days in pain, and quality adjusted life years (QALYs). Comparison of NHS costs with QALY gain, using incremental cost effectiveness ratios and cost effectiveness acceptability curves. Intervention costs ranged from £30 for exercise prescription to £596 for 24 lessons in Alexander technique plus exercise. Cost of health services ranged from £50 for 24 lessons in Alexander technique to £124 for exercise. Incremental cost effectiveness analysis of single therapies showed that exercise offered best value (£61 per point on disability score, £9 per additional pain-free day, £2847 per QALY gain). For two-stage therapy, six lessons in Alexander technique combined with exercise was the best value (additional £64 per point on disability score, £43 per additional pain-free day, £5332 per QALY gain). An exercise prescription and six lessons in Alexander technique alone were both more than 85% likely to be cost effective at values above £20 000 per QALY, but the Alexander technique performed better than exercise on the full range of outcomes. A combination of six lessons in Alexander technique followed by exercise was the most effective and cost effective option.
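
    For reference, the incremental cost-effectiveness ratios quoted above follow the standard textbook form (a general definition, not specific to this trial):

    ```latex
    \mathrm{ICER} = \frac{\bar{C}_{1} - \bar{C}_{0}}{\bar{E}_{1} - \bar{E}_{0}},
    % where subscripts 1 and 0 denote intervention and comparator; an option is
    % deemed cost effective at willingness-to-pay \lambda if \mathrm{ICER} < \lambda,
    % and the acceptability curve plots this probability as \lambda varies.
    ```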

  12. Re-Analysis of Data on the Space Radiation Environment above South-East Asia

    DTIC Science & Technology

    1989-11-01

    The results show the variation of the cosmic ray flux with geomagnetic latitude, and also show expected increases due to the South Atlantic Anomaly (SAA) and outer belt electrons. The techniques used include orbital analysis, analysis of cosmic ray effects, analysis of trapped particle effects, and analysis of geomagnetic and magnetospheric activity; uncertainties and sources of error are discussed for each of these analyses.

  13. Something old, something new, something borrowed, something blue: a framework for the marriage of health econometrics and cost-effectiveness analysis.

    PubMed

    Hoch, Jeffrey S; Briggs, Andrew H; Willan, Andrew R

    2002-07-01

    Economic evaluation is often seen as a branch of health economics divorced from mainstream econometric techniques. Instead, it is perceived as relying on statistical methods for clinical trials. Furthermore, the statistic of interest in cost-effectiveness analysis, the incremental cost-effectiveness ratio, is not amenable to regression-based methods, hence the traditional reliance on comparing aggregate measures across the arms of a clinical trial. In this paper, we explore the potential for health economists undertaking cost-effectiveness analysis to exploit the plethora of established econometric techniques through the use of the net-benefit framework - a recently suggested reformulation of the cost-effectiveness problem that avoids the reliance on cost-effectiveness ratios and their associated statistical problems. This allows the formulation of the cost-effectiveness problem within a standard regression-type framework. We provide an example with empirical data to illustrate how a regression-type framework can enhance the net-benefit method. We go on to suggest that practical advantages of the net-benefit regression approach include being able to use established econometric techniques, adjust for imperfect randomisation, and identify important subgroups in order to estimate the marginal cost-effectiveness of an intervention.
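
    A minimal sketch of net-benefit regression (simulated trial data; the willingness-to-pay, costs, and effects are invented): each patient's net monetary benefit, lambda times effect minus cost, is regressed on a treatment indicator, and the slope estimates the incremental net benefit.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    t = rng.integers(0, 2, n)                        # 1 = new treatment, 0 = control
    cost = 1000 + 500*t + rng.normal(0, 200, n)      # individual costs (invented)
    effect = 0.6 + 0.1*t + rng.normal(0, 0.2, n)     # individual QALYs (invented)

    lam = 20_000                                     # willingness to pay per QALY
    nb = lam*effect - cost                           # individual net monetary benefit

    X = np.column_stack([np.ones(n), t])             # OLS: nb_i = b0 + b1*t_i + e_i
    beta, *_ = np.linalg.lstsq(X, nb, rcond=None)
    print(f"incremental net benefit at lambda={lam}: {beta[1]:.0f}")
    # beta[1] > 0 => cost effective at this lambda; covariates can be added
    # to X to adjust for imperfect randomisation or to explore subgroups.
    ```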

  14. Aromatherapy hand massage for older adults with chronic pain living in long-term care.

    PubMed

    Cino, Kathleen

    2014-12-01

    Older adults living in long-term care experience high rates of chronic pain. Concerns with pharmacologic management have spurred alternative approaches. The purpose of this study was to examine a nursing intervention for older adults with chronic pain. This prospective, randomized controlled trial compared the effects of aromatherapy M technique hand massage, M technique without aromatherapy, and nurse presence on chronic pain. Chronic pain was measured with the Geriatric Multidimensional Pain and Illness Inventory factors (pain and suffering, life interference, and emotional distress) and the Iowa Pain Thermometer, a pain intensity scale. Three groups of 39 to 40 participants recruited from seven long-term care facilities participated twice weekly for 4 weeks. Analysis included multivariate analysis of variance and analysis of variance. Participants experienced decreased levels of chronic pain intensity. Group membership had a significant effect on the Geriatric Multidimensional Pain Inventory Pain and Suffering scores; Iowa Pain Thermometer scores differed significantly within groups. M technique hand massage with or without aromatherapy significantly decreased chronic pain intensity compared to nurse presence visits. M technique hand massage is a safe, simple, but effective intervention. Caregivers using it could improve chronic pain management in this population.

  15. A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.

    PubMed

    Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian

    2018-01-19

    This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. Air leakage caused by the high pressure in the headspace vial during sampling has a great impact on measurement precision in conventional headspace analysis (i.e., the single sealing technique). The results (using ethanol solution as the model sample) show that the present technique effectively minimizes this problem. The double sealing technique has excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that of earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds.

  16. Processing infrared images of aircraft lapjoints

    NASA Technical Reports Server (NTRS)

    Syed, Hazari; Winfree, William P.; Cramer, K. E.

    1992-01-01

    Techniques for processing IR images of aging aircraft lapjoint data are discussed. Attention is given to a technique for detecting disbonds in aircraft lapjoints which clearly delineates the disbonded region from the bonded regions. The technique performs poorly on unpainted aircraft skin surfaces, but this limitation can be overcome by using a self-adhering contact sheet. Neural network analysis of raw temperature data has been shown to be an effective tool for visualization of images. Numerical simulation results show the above processing technique to be effective in delineating disbonds.

  17. Monitoring Air Quality with Leaf Yeasts.

    ERIC Educational Resources Information Center

    Richardson, D. H. S.; And Others

    1985-01-01

    Proposes that leaf yeast serve as quick, inexpensive, and effective techniques for monitoring air quality. Outlines procedures and provides suggestions for data analysis. Includes results from sample school groups who employed this technique. (ML)

  18. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  19. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence; a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  20. A Meta-Analysis of the Effectiveness of Alternative Assessment Techniques

    ERIC Educational Resources Information Center

    Gozuyesil, Eda; Tanriseven, Isil

    2017-01-01

    Purpose: Recent trends have encouraged the use of alternative assessment tools in class in line with the recommendations made by the updated curricula. It is of great importance to understand how alternative assessment affects students' academic outcomes and which techniques are most effective in which contexts. This study aims to examine the…

  1. A combination of selected mapping and clipping to increase energy efficiency of OFDM systems

    PubMed Central

    Lee, Byung Moo; Rim, You Seung

    2017-01-01

    We propose an energy-efficient combination design for OFDM systems based on the selected mapping (SLM) and clipping peak-to-average power ratio (PAPR) reduction techniques, and present the related energy efficiency (EE) performance analysis. Combining two different PAPR reduction techniques can provide a significant benefit in increasing EE, because it takes advantage of both techniques. For the combination, we choose the clipping and SLM techniques, since the former is quite simple and effective, and the latter does not cause any signal distortion. We provide the structure and the systematic operating method, and present various analyses deriving the EE gain of the combined technique. Our analysis shows that the combined technique increases the EE by 69% compared to no PAPR reduction, and by 19.34% compared to using the SLM technique alone. PMID:29023591
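
    A compact sketch of the combined scheme on one OFDM symbol (all parameters invented; the paper's system model and EE accounting are more detailed): SLM first selects the lowest-PAPR phase-rotated candidate, then clipping limits the remaining envelope peaks.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, U = 64, 8                            # subcarriers, SLM candidates (assumed)

    def papr_db(x):
        return 10*np.log10(np.max(np.abs(x)**2) / np.mean(np.abs(x)**2))

    # Random QPSK symbols on N subcarriers
    X = (rng.choice([1, -1], N) + 1j*rng.choice([1, -1], N)) / np.sqrt(2)

    # SLM: multiply by U random phase sequences, keep the lowest-PAPR candidate
    phases = np.exp(1j * rng.uniform(0, 2*np.pi, (U, N)))
    candidates = np.fft.ifft(phases * X, axis=1)
    best = candidates[np.argmin([papr_db(c) for c in candidates])]

    # Clipping: limit the envelope of the selected signal (simple but distorting)
    clip_level = 1.4 * np.sqrt(np.mean(np.abs(best)**2))
    clipped = np.where(np.abs(best) > clip_level,
                       clip_level * best / np.abs(best), best)
    print(papr_db(np.fft.ifft(X)), papr_db(best), papr_db(clipped))
    ```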

  2. GLO-STIX: Graph-Level Operations for Specifying Techniques and Interactive eXploration

    PubMed Central

    Stolper, Charles D.; Kahng, Minsuk; Lin, Zhiyuan; Foerster, Florian; Goel, Aakash; Stasko, John; Chau, Duen Horng

    2015-01-01

    The field of graph visualization has produced a wealth of visualization techniques for accomplishing a variety of analysis tasks. Therefore analysts often rely on a suite of different techniques, and visual graph analysis application builders strive to provide this breadth of techniques. To provide a holistic model for specifying network visualization techniques (as opposed to considering each technique in isolation) we present the Graph-Level Operations (GLO) model. We describe a method for identifying GLOs and apply it to identify five classes of GLOs, which can be flexibly combined to re-create six canonical graph visualization techniques. We discuss advantages of the GLO model, including potentially discovering new, effective network visualization techniques and easing the engineering challenges of building multi-technique graph visualization applications. Finally, we implement the GLOs that we identified into the GLO-STIX prototype system that enables an analyst to interactively explore a graph by applying GLOs. PMID:26005315

  3. Analysis of the Impact of Creative Technique on the Motivation of Physical Education Students in Dance Content: Gender Differences

    ERIC Educational Resources Information Center

    Amado, Diana; Del Villar, Fernando; Sánchez-Miguel, Pedro Antonio; Leo, Francisco Miguel; García-Calvo, Tomás

    2016-01-01

    The aim of this study was to learn about the effectiveness of two dance teaching techniques, the creative examination technique and the direct instruction technique, on the satisfaction of basic psychological needs, the level of self-determination, the perception of usefulness, enjoyment and effort of physical education students. Likewise, it…

  4. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  5. Comparative evaluation of diode laser ablation and surgical stripping technique for gingival depigmentation: A clinical and immunohistochemical study.

    PubMed

    Bakutra, Gaurav; Shankarapillai, Rajesh; Mathur, Lalit; Manohar, Balaji

    2017-01-01

    There are various treatment modalities to remove the black patches of melanin pigmentation. The aim of the study was to clinically compare diode laser ablation and the surgical stripping technique for gingival depigmentation and to evaluate their effects on histological markers of melanocyte activity. A total of 40 sites in 20 patients with bilateral melanin hyperpigmentation were treated with the surgical stripping and diode laser ablation techniques. Change in Hedin index score, change in area of pigmentation (using image analysis software), pain perception, and patient preference of treatment were recorded. All 40 sites were selected for immunohistochemical analysis using the HMB-45 immunohistochemical marker. At the 12-month post-operative visit, repigmentation was observed at all sites, with different grades of the Hedin index. Paired t-tests, analysis of variance, and Chi-square tests were used for statistical analysis. Repigmentation after surgical stripping was significantly less than after laser ablation, and fewer melanocytes were found on immunohistochemical examination at 12 months postoperatively. Comparisons of patient preference and pain indices gave statistically significant values for the diode laser technique. Gingival hyperpigmentation is effectively managed by both the diode laser ablation technique and the surgical stripping method; in this study, the surgical stripping technique was found to be better than diode laser ablation.

  6. Comparative evaluation of diode laser ablation and surgical stripping technique for gingival depigmentation: A clinical and immunohistochemical study

    PubMed Central

    Bakutra, Gaurav; Shankarapillai, Rajesh; Mathur, Lalit; Manohar, Balaji

    2017-01-01

    Introduction: There are various treatment modalities to remove the black patches of melanin pigmentation. The aim of the study was to clinically compare diode laser ablation and the surgical stripping technique for gingival depigmentation and to evaluate their effects on histological markers of melanocyte activity. Materials and Methods: A total of 40 sites in 20 patients with bilateral melanin hyperpigmentation were treated with the surgical stripping and diode laser ablation techniques. Change in Hedin index score, change in area of pigmentation (using image analysis software), pain perception, and patient preference of treatment were recorded. All 40 sites were selected for immunohistochemical analysis using the HMB-45 immunohistochemical marker. Results: At the 12-month post-operative visit, repigmentation was observed at all sites, with different grades of the Hedin index. Paired t-tests, analysis of variance, and Chi-square tests were used for statistical analysis. Repigmentation after surgical stripping was significantly less than after laser ablation, and fewer melanocytes were found on immunohistochemical examination at 12 months postoperatively. Comparisons of patient preference and pain indices gave statistically significant values for the diode laser technique. Conclusion: Gingival hyperpigmentation is effectively managed by both the diode laser ablation technique and the surgical stripping method; in this study, the surgical stripping technique was found to be better than diode laser ablation. PMID:28539864

  7. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. The analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system to variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems; application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining which system elements have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, relative-stability criteria based on the concept of singular values were explored.
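
    The first-order sensitivity result underlying such an analysis can be stated compactly (a standard matrix-perturbation identity, not quoted from the report):

    ```latex
    % For A(p) with SVD A = U \Sigma V^{T} and a distinct singular value
    % \sigma_i with left/right singular vectors u_i, v_i:
    \frac{\partial \sigma_i}{\partial p} = u_i^{T}\,\frac{\partial A(p)}{\partial p}\,v_i ,
    % so the sensitivity of a relative-stability measure based on the minimum
    % singular value follows directly from the singular vectors.
    ```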

  8. Practical semen analysis: from A to Z

    PubMed Central

    Brazil, Charlene

    2010-01-01

    Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076

  9. TH-EF-BRC-03: Fault Tree Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomadsen, B.

    2016-06-15

    This hands-on workshop is focused on providing participants with experience with the principal tools of TG 100 and hence starting to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in two different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss their experience and any challenges encountered with each other and with the faculty. Learning Objectives: To review the principles of process mapping, failure modes and effects analysis, and fault tree analysis. To gain familiarity with these three techniques in a small-group setting. To share and discuss experiences with the three techniques with faculty and participants.

  10. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. Applications of dispersive techniques to the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  11. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  12. Noise analysis of nucleate boiling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. D.; Ram, K. S.

    1971-01-01

    The techniques of noise analysis have been utilized to investigate nucleate pool boiling. A simple experimental setup has been developed for obtaining the power spectrum of a nucleate boiling system. These techniques were first used to study single bubbles, and a method of relating the two-dimensional projected size and the local velocity of the bubbles to the auto-correlation functions is presented. This method is much less time consuming than conventional methods of measurement and has no probes to disturb the system. These techniques can be used to determine the contribution of evaporation to total heat flux in nucleate boiling. Also, these techniques can be used to investigate the effect of various parameters upon the frequency response of nucleate boiling. The predominant frequencies of the power spectrum correspond to the frequencies of bubble generation. The effects of heat input, degree of subcooling, and liquid surface tension upon the power spectra of a boiling system are presented. It was found that the degree of subcooling has a more pronounced effect upon bubble size than does heat flux. Also the effect of lowering surface tension can be sufficient to reduce the effect of the degree of subcooling upon the size of the bubbles.
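
    A brief sketch of the spectral step (toy record; the sample rate and bubble frequency are invented): the dominant peak of the power spectrum tracks the bubble-generation frequency, and the autocorrelation follows from the PSD.

    ```python
    import numpy as np
    from scipy import signal

    fs = 2000                                  # sampling rate, Hz (assumed)
    t = np.arange(0, 10, 1/fs)
    rng = np.random.default_rng(0)
    # Toy boiling-noise record: periodic bubble departures near 35 Hz
    # riding on broadband noise (values invented for the sketch).
    x = np.sin(2*np.pi*35*t) + 0.5*rng.normal(size=t.size)

    # Power spectrum: the dominant peak tracks the bubble generation frequency
    f, pxx = signal.welch(x, fs=fs, nperseg=4096)
    print("dominant frequency: %.1f Hz" % f[np.argmax(pxx)])

    # Wiener-Khinchin: the autocorrelation is the inverse transform of the PSD,
    # the route used to relate projected bubble size and velocity to the record
    acf = np.fft.irfft(pxx)
    ```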

  13. Combinations of techniques that effectively change health behavior: evidence from Meta-CART analysis.

    PubMed

    Dusseldorp, Elise; van Genugten, Lenneke; van Buuren, Stef; Verheijden, Marieke W; van Empelen, Pepijn

    2014-12-01

    Many health-promoting interventions combine multiple behavior change techniques (BCTs) to maximize effectiveness. Although, in theory, BCTs can amplify each other, the available meta-analyses have not been able to identify specific combinations of techniques that provide synergistic effects. This study overcomes some of the shortcomings in the current methodology by applying classification and regression trees (CART) to meta-analytic data in a special way, referred to as Meta-CART. The aim was to identify particular combinations of BCTs that explain intervention success. A reanalysis of data from Michie, Abraham, Whittington, McAteer, and Gupta (2009) was performed. These data included effect sizes from 122 interventions targeted at physical activity and healthy eating, and the coding of the interventions into 26 BCTs. A CART analysis was performed using the BCTs as predictors and treatment success (i.e., effect size) as outcome. A subgroup meta-analysis using a mixed effects model was performed to compare the treatment effect in the subgroups found by CART. Meta-CART identified the following most effective combinations: Provide information about behavior-health link with Prompt intention formation (mean effect size ḡ = 0.46), and Provide information about behavior-health link with Provide information on consequences and Use of follow-up prompts (ḡ = 0.44). Least effective interventions were those using Provide feedback on performance without using Provide instruction (ḡ = 0.05). Specific combinations of BCTs increase the likelihood of achieving change in health behavior, whereas other combinations decrease this likelihood. Meta-CART successfully identified these combinations and thus provides a viable methodology in the context of meta-analysis.
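
    A rough sketch of the tree step (using sklearn's DecisionTreeRegressor on hypothetical BCT codings; real Meta-CART weights studies by precision and embeds the tree in a meta-analytic model):

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor, export_text

    rng = np.random.default_rng(0)
    n_studies, n_bcts = 122, 26
    # Hypothetical coding: rows = interventions, cols = BCT present (1) or absent (0)
    X = rng.integers(0, 2, size=(n_studies, n_bcts))
    # Invented effect sizes in which BCT_1 and BCT_2 act synergistically
    g = 0.1 + 0.3*(X[:, 0] & X[:, 1]) + rng.normal(0, 0.1, n_studies)

    tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=10)
    tree.fit(X, g)
    print(export_text(tree, feature_names=[f"BCT_{i+1}" for i in range(n_bcts)]))
    # Leaves define subgroups of interventions (BCT combinations); a follow-up
    # subgroup meta-analysis compares mean effect sizes across these leaves.
    ```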

  14. The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre

    NASA Astrophysics Data System (ADS)

    Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.

    2005-09-01

    To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technologies and deliver high-yielding products with the best cycle times and at competitive pricing. As technology complexity increases, the need for physical characterization support increases as well; however, many of the existing techniques are no longer adequate to effectively support the 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both technical and economic in nature. TEM microscopy is needed for high-quality, high-volume analytical support, but several physical and practical hurdles have to be overcome. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other, more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in the fields of, e.g., SIMS and ToF-SIMS. Techniques that previously were used only sporadically, like EBSD and XRD, have become a 'must' to properly support backend process development. On the bright side, thanks to major technical advances, techniques that previously were practiced only at the laboratory level can now be used effectively for at-line fab metrology: voltage-contrast-based defectivity control, XPS-based gate dielectric metrology, and XRD-based control of copper metallization processes are practical examples. In this paper, capabilities and shortcomings of several techniques and the corresponding equipment are presented, with practical illustrations of use in our Crolles facilities.

  15. Application of modern tools and techniques to maximize engineering productivity in the development of orbital operations plans for the space station program

    NASA Technical Reports Server (NTRS)

    Manford, J. S.; Bennett, G. R.

    1985-01-01

    The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: an approach for rigorous analysis of operations functions, use of the resources of a large computer network, and provision for efficient research and access to information.

  16. Assessing the Effectiveness of Statistical Classification Techniques in Predicting Future Employment of Participants in the Temporary Assistance for Needy Families Program

    ERIC Educational Resources Information Center

    Montoya, Isaac D.

    2008-01-01

    Three classification techniques (Chi-square Automatic Interaction Detection [CHAID], Classification and Regression Tree [CART], and discriminant analysis) were tested to determine their accuracy in predicting Temporary Assistance for Needy Families program recipients' future employment. Technique evaluation was based on proportion of correctly…

  17. Signal-to-noise ratio analysis and evaluation of the Hadamard imaging technique

    NASA Technical Reports Server (NTRS)

    Jobson, D. J.; Katzberg, S. J.; Spiers, R. B., Jr.

    1977-01-01

    The signal-to-noise ratio performance of the Hadamard imaging technique is analyzed and an experimental evaluation of a laboratory Hadamard imager is presented. A comparison between the performances of the Hadamard and conventional imaging techniques shows that the Hadamard technique is superior only when the imaging objective lens is required to have an effective F-number of about 2 or slower.
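
    For background, the classical multiplex advantage for Hadamard S-matrix encoding (a standard result for detector-noise-limited systems, not a figure from this paper) is:

    ```latex
    % SNR gain of S-matrix multiplexing with n mask elements,
    % detector-noise-limited case (classical result):
    \mathrm{SNR\ gain} = \frac{n+1}{2\sqrt{n}} \approx \frac{\sqrt{n}}{2},
    % which is why the technique wins only when detector noise, rather than
    % photon noise or fast optics, limits the conventional imager.
    ```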

  18. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special unresolved-object processing techniques were disappointing, due at least in part to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  19. Energy resolution improvement of CdTe detectors by using the principal component analysis technique

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2018-02-01

    In this paper, we report on the application of the Principal Component Analysis (PCA) technique for the improvement of the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge-trapping effect which is reflected in the shape of each detector pulse, thereby correcting for the charge-trapping effect. The details of the method are described and the results obtained with a CdTe detector are shown. We have achieved an energy resolution of 1.8 % (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector which gives an energy resolution of 4.5 % (FWHM) by using the standard pulse processing method.
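
    As a rough illustration of how such a pulse-shape correction can be set up, the sketch below applies PCA to amplitude-normalized pulses and removes the linear dependence of pulse height on the leading shape component. This is a minimal sketch of the general idea only; the array layout, the single-component model, and the linear correction are illustrative assumptions, not the author's actual pipeline.

    ```python
    # Minimal sketch of a PCA-based charge-trapping correction (assumed
    # model, not the paper's exact procedure).
    import numpy as np
    from sklearn.decomposition import PCA

    def pca_trapping_correction(pulses, amplitudes):
        """pulses: (n_events, n_samples) digitized detector pulses;
        amplitudes: (n_events,) uncorrected pulse heights."""
        shapes = pulses / amplitudes[:, None]   # amplitude-normalized shapes
        score = PCA(n_components=1).fit_transform(shapes)[:, 0]
        # Assume the leading shape component tracks the trapped-charge
        # fraction; remove its linear effect on the measured amplitude.
        slope = np.polyfit(score, amplitudes, 1)[0]
        return amplitudes - slope * (score - score.mean())
    ```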

  20. Increasing Effectiveness in Teaching Ethics to Undergraduate Business Students.

    ERIC Educational Resources Information Center

    Lampe, Marc

    1997-01-01

    Traditional approaches to teaching business ethics (philosophical analysis, moral quandaries, executive cases) may not be effective in persuading undergraduates of the importance of ethical behavior. Better techniques include values education, ethical decision-making models, analysis of ethical conflicts, and role modeling. (SK)

  1. Simulation of the visual effects of power plant plumes

    Treesearch

    Evelyn F. Treiman; David B. Champion; Mona J. Wecksung; Glenn H. Moore; Andrew Ford; Michael D. Williams

    1979-01-01

    The Los Alamos Scientific Laboratory has developed a computer-assisted technique that can predict the visibility effects of potential energy sources in advance of their construction. This technique has been employed in an economic and environmental analysis comparing a single 3000 MW coal-fired power plant with six 500 MW coal-fired power plants located at hypothetical...

  2. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to a problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  3. Spectroscopic vector analysis for fast pattern quality monitoring

    NASA Astrophysics Data System (ADS)

    Sohn, Younghoon; Ryu, Sungyoon; Lee, Chihoon; Yang, Yusin

    2018-03-01

    In the semiconductor industry, fast and effective measurement of pattern variation has been a key challenge for assuring mass-production quality. Pattern measurement techniques such as conventional CD-SEM or optical CD have been used extensively, but these techniques are increasingly limited in terms of measurement throughput and the time spent in modeling. In this paper, we propose a time-effective pattern-monitoring method based on a direct spectrum-based approach. In this technique, a wavelength band sensitive to a specific pattern change is selected from the spectroscopic ellipsometry signal scattered by the pattern to be measured, and the amplitude and phase variation in that band are analyzed as a measurement index of the pattern change. This pattern-change measurement technique was applied to several process steps and its applicability verified. Thanks to its fast and simple analysis, the method can be adopted for massive process-variation monitoring, maximizing measurement throughput.
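
    A minimal sketch of the band-selection step described above might look as follows; the sensitivity metric (mean spectral difference between reference and perturbed measurements) and all variable names are assumptions for illustration, not the authors' algorithm.

    ```python
    # Sketch: pick the wavelength window most sensitive to a known pattern
    # change, then monitor a scalar index in that window.
    import numpy as np

    def select_sensitive_band(spectra_ref, spectra_perturbed, width=20):
        """spectra_*: (n_measurements, n_wavelengths) ellipsometry amplitudes."""
        sensitivity = np.abs(spectra_perturbed.mean(0) - spectra_ref.mean(0))
        score = np.convolve(sensitivity, np.ones(width), mode="valid")
        start = int(np.argmax(score))        # best window start index
        return slice(start, start + width)

    def pattern_index(spectrum, band):
        return float(spectrum[band].mean())  # monitoring index for this pattern
    ```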

  4. Histogram analysis for smartphone-based rapid hematocrit determination

    PubMed Central

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid histogram-based analysis technique has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based “Histogram” app for the detection of hematocrits has been developed by integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the method is advantageous for the quantification of blood hematocrit under both constant and varying optical conditions. The rapid determination of blood hematocrits provides a wealth of information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  5. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract related information to compositions from all spectral data, are widely established methods and have been applied to various fields including in-situ applications in air and for planetary explorations. Artificial neural networks (ANNs), where non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that the accuracy should be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square errors (NRMSEs), when comparing the accuracy obtained from different setups and analytical methods.
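
    The recommended figure of merit is straightforward to compute; the sketch below normalizes the RMSE by the reference range, which is one common convention (the abstract does not fix a single normalization, so treat this choice as an assumption).

    ```python
    import numpy as np

    def nrmse(reference, predicted):
        """Normalized root mean square error (range normalization assumed)."""
        reference = np.asarray(reference, float)
        predicted = np.asarray(predicted, float)
        rmse = np.sqrt(np.mean((predicted - reference) ** 2))
        return float(rmse / (reference.max() - reference.min()))
    ```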

  6. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.
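
    As a minimal illustration of the control-charting component, the sketch below derives Shewhart-style limits from an in-control reference period and flags readings outside them; the three-sigma rule and the data layout are assumptions for illustration, not NDMAS internals.

    ```python
    import numpy as np

    def control_limits(reference_readings, k=3.0):
        """Shewhart-style limits estimated from an in-control reference period."""
        mu = np.mean(reference_readings)
        sigma = np.std(reference_readings, ddof=1)
        return mu - k * sigma, mu + k * sigma

    def out_of_control(readings, lo, hi):
        """Indices of readings outside the control limits (possible failures)."""
        readings = np.asarray(readings)
        return np.where((readings < lo) | (readings > hi))[0]
    ```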

  7. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. An analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method for the statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
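
    A two-factor ANOVA with interaction, of the kind the study applied, can be sketched as below; the column names are hypothetical placeholders for the factors the abstract lists.

    ```python
    # Sketch: main effects and interaction of two factors on microbial count.
    import statsmodels.api as sm
    from statsmodels.formula.api import ols

    def microbial_anova(df):
        """df: pandas DataFrame with assumed columns 'count', 'organism',
        'product'."""
        model = ols("count ~ C(organism) * C(product)", data=df).fit()
        return sm.stats.anova_lm(model, typ=2)  # table includes interaction term
    ```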

  8. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    PubMed

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
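
    The headline figure follows directly from the effect decomposition: on a scale where the direct and indirect effects combine additively (an assumption of this sketch, since time-to-event effects are often decomposed on other scales), the proportion mediated is simply the indirect effect over the total.

    ```python
    def proportion_mediated(indirect_effect, direct_effect):
        """Share of the total treatment effect transmitted via the mediator;
        with the trial's figures this comes out to about 0.042 (4.2%).
        Assumes an additive decomposition scale."""
        return indirect_effect / (indirect_effect + direct_effect)
    ```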

  9. LSI/VLSI design for testability analysis and general approach

    NASA Technical Reports Server (NTRS)

    Lam, A. Y.

    1982-01-01

    The incorporation of testability characteristics into large-scale digital design is not only necessary for, but also pertinent to, effective device testing and the enhancement of device reliability. There are at least three major DFT techniques, namely the self-checking, the LSSD, and the partitioning techniques, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. A detailed analysis of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques is also presented.

  10. Is There a Cosmetic Advantage to Single-Incision Laparoscopic Surgical Techniques Over Standard Laparoscopic Surgery? A Systematic Review and Meta-analysis.

    PubMed

    Evans, Luke; Manley, Kate

    2016-06-01

    Single-incision laparoscopic surgery represents an evolution of minimally invasive techniques, but has been a controversial development. A cosmetic advantage is stated by many authors, but has not been found to be universally present or even of considerable importance by patients. This systematic review and meta-analysis demonstrates that there is a cosmetic advantage of the technique regardless of the operation type. The treatment effect in terms of cosmetic improvement is of the order of 0.63.

  11. Reliable screening of various foodstuffs with respect to their irradiation status: A comparative study of different analytical techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Kwak, Ji-Young; Jeong, Mi-Seon; Kwon, Joong-Ho

    2013-10-01

    Cost-effective and time-efficient analytical techniques are required to screen large food lots in accordance to their irradiation status. Gamma-irradiated (0-10 kGy) cinnamon, red pepper, black pepper, and fresh paprika were investigated using photostimulated luminescence (PSL), direct epifluorescent filter technique/the aerobic plate count (DEFT/APC), and electronic-nose (e-nose) analyses. The screening results were also confirmed with thermoluminescence analysis. PSL analysis discriminated between irradiated (positive, >5000 PCs) and non-irradiated (negative, <700 PCs) cinnamon and red peppers. Black pepper had intermediate results (700-5000 PCs), while paprika had low sensitivity (negative results) upon irradiation. The DEFT/APC technique also showed clear screening results through the changes in microbial profiles, where the best results were found in paprika, followed by red pepper and cinnamon. E-nose analysis showed a dose-dependent discrimination in volatile profiles upon irradiation through principal component analysis. These methods can be used considering their potential applications for the screening analysis of irradiated foods.
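
    The PSL decision rule reported above maps directly onto a three-way screening function; the threshold values are the photon-count cutoffs given in the abstract.

    ```python
    def psl_screening(photon_count):
        """Classify a PSL reading using the 700/5000 photon-count thresholds
        reported in the study."""
        if photon_count > 5000:
            return "positive (irradiated)"
        if photon_count < 700:
            return "negative (non-irradiated)"
        return "intermediate (confirmatory analysis needed)"
    ```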

  12. A comparative study of neutron activation analysis and proton-induced X-ray emission analysis for the determination of heavy metals in estuarine sediments

    NASA Astrophysics Data System (ADS)

    Randle, K.; Al-Jundi, J.; Mamas, C. J. V.; Sokhi, R. S.; Earwaker, L. G.

    1993-06-01

    Our work on heavy metals in the estuarine environment has involved the use of two multielement techniques: neutron activation analysis (NAA) and proton-induced X-ray emission (PIXE) analysis. As PIXE is essentially a surface analytical technique, problems may arise due to sample inhomogeneity and surface roughness. In order to assess the contribution of these effects, we have compared the results from PIXE analysis with those from a technique which analyzes a larger bulk sample rather than just the surface. An obvious method was NAA. A series of sediment samples containing particles of variable diameter were compared. Pellets containing a few mg of sediment were prepared from each sample and analyzed by the PIXE technique using both an absolute and a comparative method. For instrumental NAA (INAA), the rest of the sample was then irradiated with thermal neutrons and element concentrations were determined from analyses of the subsequent gamma-ray spectrum. Results from the two methods are discussed.

  13. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-06-01

    In this research, we examine the Naval Sea Logistics Command's Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method's effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  14. Real-Time Condition Monitoring and Fault Diagnosis of Gear Train Systems Using Instantaneous Angular Speed (IAS) Analysis

    NASA Astrophysics Data System (ADS)

    Sait, Abdulrahman S.

    This dissertation presents a reliable technique for monitoring the condition of rotating machinery by applying instantaneous angular speed (IAS) analysis. A new analysis of the effects of changes in the orientation of the line of action and the pressure angle of the resultant force acting on the gear tooth profile of a spur gear under different levels of tooth damage is utilized. The analysis and experimental work discussed in this dissertation provide a clear understanding of the effects of damage on the IAS by analyzing the digital signal output of a rotary incremental optical encoder. A comprehensive literature review of the state of the knowledge in condition monitoring and fault diagnostics of rotating machinery, including gearbox systems, is presented. Progress and new developments over the past 30 years in failure detection techniques for rotating machinery including engines, bearings and gearboxes are thoroughly reviewed. This work is limited to the analysis of a gear train system with gear tooth surface faults utilizing the angular motion analysis technique. Angular motion data were acquired using an incremental optical encoder. Results are compared to a vibration-based technique. The vibration data were acquired using an accelerometer. The signals were obtained and analyzed in the phase domains using signal averaging to determine the existence and position of faults on the gear train system. Forces between the mating teeth surfaces are analyzed and simulated to validate the influence of the presence of damage on the pressure angle and the IAS. National Instruments hardware is used and NI LabVIEW software code is developed for real-time, online condition monitoring systems and fault detection techniques. The sensitivity of optical encoders to gear fault detection techniques is experimentally investigated by applying IAS analysis under different gear damage levels and different operating conditions. A reliable methodology is developed for selecting appropriate testing/operating conditions of a rotating system to generate an alarm system for damage detection.
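
    The core IAS computation from an incremental encoder is compact: the fixed angular step between pulse edges divided by the measured time between them. The sketch below assumes timestamped edges from a counter/timer; the names are illustrative, not the dissertation's implementation.

    ```python
    import numpy as np

    def ias_from_encoder(edge_times, pulses_per_rev):
        """edge_times: (n,) seconds at successive encoder pulse edges.
        Returns instantaneous angular speed in rad/s (length n-1)."""
        dtheta = 2.0 * np.pi / pulses_per_rev  # fixed angle between edges
        return dtheta / np.diff(edge_times)
    ```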

  15. TH-EF-BRC-04: Quality Management Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yorke, E.

    2016-06-15

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis: Process mapping, Failure-Modes and Effects Analysis and fault-tree analysis will be introduced with a 5 minute refresher presentation and each presentation will be followed by a 30 minute small group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  16. TH-EF-BRC-00: TG-100 Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2016-06-15

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis: Process mapping, Failure-Modes and Effects Analysis and fault-tree analysis will be introduced with a 5 minute refresher presentation and each presentation will be followed by a 30 minute small group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  17. TH-EF-BRC-02: FMEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M.

    2016-06-15

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis: Process mapping, Failure-Modes and Effects Analysis and fault-tree analysis will be introduced with a 5 minute refresher presentation and each presentation will be followed by a 30 minute small group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunscombe, P.

    This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100’s risk analysis: Process mapping, Failure-Modes and Effects Analysis and fault-tree analysis will be introduced with a 5 minute refresher presentation and each presentation will be followed by a 30 minute small group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.

  19. Techniques for detecting effects of urban and rural land-use practices on stream-water chemistry in selected watersheds in Texas, Minnesota, and Illinois

    USGS Publications Warehouse

    Walker, J.F.

    1993-01-01

    Selected statistical techniques were applied to three urban watersheds in Texas and Minnesota and three rural watersheds in Illinois. For the urban watersheds, single- and paired-site data-collection strategies were considered. The paired-site strategy was much more effective than the single-site strategy for detecting changes. Analysis of storm load regression residuals demonstrated the potential utility of regressions for variability reduction. For the rural watersheds, none of the selected techniques were effective at identifying changes, primarily due to a small degree of management-practice implementation, potential errors introduced through the estimation of storm load, and small sample sizes. A Monte Carlo sensitivity analysis was used to determine the percent change in water chemistry that could be detected for each watershed. In most instances, the use of regressions improved the ability to detect changes.
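
    A Monte Carlo power check of the kind described can be sketched as follows: simulate before/after concentration samples with a step change of a given percentage and count how often a two-sample test detects it. The normal noise model and the t-test are assumptions for illustration, not the study's exact procedure.

    ```python
    import numpy as np
    from scipy import stats

    def detection_power(mean, sd, n, pct_change, n_sims=2000, alpha=0.05):
        rng = np.random.default_rng(0)
        hits = 0
        for _ in range(n_sims):
            before = rng.normal(mean, sd, n)
            after = rng.normal(mean * (1 + pct_change / 100.0), sd, n)
            if stats.ttest_ind(before, after).pvalue < alpha:
                hits += 1
        return hits / n_sims  # fraction of simulations detecting the change
    ```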

  20. Interference by the activated sludge matrix on the analysis of soluble microbial products in wastewater.

    PubMed

    Potvin, Christopher M; Zhou, Hongde

    2011-11-01

    The objective of this study was to demonstrate the effects of complex matrix effects caused by chemical materials on the analysis of key soluble microbial products (SMP) including proteins, humics, carbohydrates, and polysaccharides in activated sludge samples. Emphasis was placed on comparison of the commonly used standard curve technique with standard addition (SA), a technique that differs in that the analytical responses are measured for sample solutions spiked with known quantities of analytes. The results showed that using SA provided a great improvement in compensating for SMP recovery and thus improving measurement accuracy by correcting for matrix effects. Analyte recovery was found to be highly dependent on sample dilution, and changed due to extraction techniques, storage conditions and sample composition. Storage of sample extracts by freezing changed SMP concentrations dramatically, as did storage at 4°C for as little as 1d. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Effective behaviour change techniques in smoking cessation interventions for people with chronic obstructive pulmonary disease: A meta-analysis

    PubMed Central

    Bartlett, Yvonne K; Sheeran, Paschal; Hawley, Mark S

    2014-01-01

    Purpose The purpose of this study was to identify the behaviour change techniques (BCTs) that are associated with greater effectiveness in smoking cessation interventions for people with chronic obstructive pulmonary disease (COPD). Methods A systematic review and meta-analysis was conducted. Web of Knowledge, CINAHL, EMBASE, PsycINFO, and MEDLINE were searched from the earliest date available to December 2012. Data were extracted and weighted average effect sizes calculated; BCTs used were coded according to an existing smoking cessation-specific BCT taxonomy. Results Seventeen randomized controlled trials (RCTs) were identified that involved a total sample of 7446 people with COPD. The sample-weighted mean quit rate for all RCTs was 13.19%, and the overall sample-weighted effect size was d+ = 0.33. Thirty-seven BCTs were each used in at least three interventions. Four techniques were associated with significantly larger effect sizes: Facilitate action planning/develop treatment plan, Prompt self-recording, Advise on methods of weight control, and Advise on/facilitate use of social support. Three new COPD-specific BCTs were identified, and Linking COPD and smoking was found to result in significantly larger effect sizes. Conclusions Smoking cessation interventions aimed at people with COPD appear to benefit from using techniques focussed on forming detailed plans and self-monitoring. Additional RCTs that use standardized reporting of intervention components and BCTs would be valuable to corroborate findings from the present meta-analysis. Statement of contribution What is already known on this subject? Chronic obstructive pulmonary disease (COPD) is responsible for considerable health and economic burden worldwide, and smoking cessation (SC) is the only known treatment that can slow the decline in lung function experienced. Previous reviews of smoking cessation interventions for this population have established that a combination of pharmacological support and behavioural counselling is most effective. While pharmacological support has been detailed, and effectiveness ranked, the content of behavioural counselling varies between interventions, and it is not clear what the most effective components are. What does this study add? Detailed description of ‘behavioural counselling’ component of SC interventions for people with COPD. Meta-analysis to identify effective behaviour change techniques tailored for this population. Discussion of these findings in the context of designing tailored SC interventions. PMID:24397814
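
    The sample-weighted summary statistic used above can be reproduced in a few lines; weighting by total sample size is a simplification of the inverse-variance weighting typically used in meta-analysis, so treat this as an assumed approximation rather than the authors' exact computation.

    ```python
    import numpy as np

    def weighted_mean_effect(d, n):
        """Sample-weighted average effect size (a d+-style statistic).
        d: per-trial effect sizes; n: per-trial total sample sizes."""
        d, n = np.asarray(d, float), np.asarray(n, float)
        return float(np.sum(d * n) / np.sum(n))
    ```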

  2. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which then are searched to generate either a graphical fault tree analysis or failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  3. Choosing the Most Effective Pattern Classification Model under Learning-Time Constraint.

    PubMed

    Saito, Priscila T M; Nakamura, Rodrigo Y M; Amorim, Willian P; Papa, João P; de Rezende, Pedro J; Falcão, Alexandre X

    2015-01-01

    Nowadays, large datasets are common and demand faster and more effective pattern analysis techniques. However, methodologies to compare classifiers usually do not take into account the learning-time constraints required by applications. This work presents a methodology to compare classifiers with respect to their ability to learn from classification errors on a large learning set, within a given time limit. Faster techniques may acquire more training samples, but only when they are more effective will they achieve higher performance on unseen testing sets. We demonstrate this result using several techniques, multiple datasets, and typical learning-time limits required by applications.

  4. Automatic Target Recognition Classification System Evaluation Methodology

    DTIC Science & Technology

    2002-09-01

    ROC curve meta-analysis, which is the estimation of the true ROC curve of a given diagnostic system through ROC analysis across many studies or... technique can be very effective in sensitivity analysis; trying to determine which data points have the most effect on the solution, and in

  5. Resolution of matrix effects on analysis of total and methyl mercury in aqueous samples from the Florida Everglades

    USGS Publications Warehouse

    Olson, M.L.; Cleckner, L.B.; Hurley, J.P.; Krabbenhoft, D.P.; Heelan, T.W.

    1997-01-01

    Aqueous samples from the Florida Everglades present several problems for the analysis of total mercury (HgT) and methyl mercury (MeHg). Constituents such as dissolved organic carbon (DOC) and sulfide at selected sites present particular challenges due to interferences with standard analytical techniques. This is manifested by 1) the inability to discern when bromine monochloride (BrCl) addition is sufficient for sample oxidation for HgT analysis; and 2) incomplete spike recoveries using the distillation/ethylation technique for MeHg analysis. Here, we suggest ultra-violet (UV) oxidation prior to addition of BrCl to ensure total oxidation of DOC prior to HgT analysis and copper sulfate (CuSO4) addition to aid in distillation in the presence of sulfide for MeHg analysis. Despite high chloride (Cl-) levels, we observed no effects on MeHg distillation/ethylation analyses. © Springer-Verlag 1997.

  6. Meta-analysis of studies assessing the efficacy of projective techniques in discriminating child sexual abuse.

    PubMed

    West, M M

    1998-11-01

    This meta-analysis of 12 studies assesses the efficacy of projective techniques to discriminate between sexually abused children and nonsexually abused children. A literature search was conducted to identify published studies that used projective instruments with sexually abused children. Those studies that reported statistics that allowed for an effect size to be calculated, were then included in the meta-analysis. There were 12 studies that fit the criteria. The projectives reviewed include The Rorschach, The Hand Test, The Thematic Apperception Test (TAT), the Kinetic Family Drawings, Human Figure Drawings, Draw Your Favorite Kind of Day, The Rosebush: A Visualization Strategy, and The House-Tree-Person. The results of this analysis gave an over-all effect size of d = .81, which is a large effect. Six studies included only a norm group of nondistressed, nonabused children with the sexual abuse group. The average effect size was d = .87, which is impressive. Six studies did include a clinical group of distressed nonsexually abused subjects and the effect size lowered to d = .76, which is a medium to large effect. This indicates that projective instruments can discriminate distressed children from nondistressed subjects, quite well. In the studies that included a clinical group of distressed children who were not sexually abused, the lower effect size indicates that the instruments were less able to discriminate the type of distress. This meta-analysis gives evidence that projective techniques have the ability to discriminate between children who have been sexually abused and those who were not abused sexually. However, further research that is designed to include clinical groups of distressed children is needed in order to determine how well the projectives can discriminate the type of distress.

  7. Microbiologically influenced corrosion: looking to the future.

    PubMed

    Videla, Héctor A; Herrera, Liz K

    2005-09-01

    This review discusses the state-of-the-art of research into biocorrosion and the biofouling of metals and alloys of industrial usage. The key concepts needed to understand the main effects of microorganisms on metal decay, and current trends in monitoring and control strategies to mitigate the deleterious effects of biocorrosion and biofouling are also described. Several relevant cases of biocorrosion studied by our research group are provided as examples: (i) biocorrosion of aluminum and its alloys by fungal contaminants of jet fuels; (ii) sulfate-reducing bacteria (SRB)-induced corrosion of steel; (iii) biocorrosion and biofouling interactions in the marine environment; (iv) monitoring strategies for assessing biocorrosion in industrial water systems; (v) microbial inhibition of corrosion; (vi) use and limitations of electrochemical techniques for evaluating biocorrosion effects. Future prospects in the field are described with respect to the potential of innovative techniques in microscopy (environmental scanning electron microscopy, confocal scanning laser microscopy, atomic force microscopy), new spectroscopic techniques for the study of corrosion products and biofilms (energy dispersion X-ray analysis, X-ray photoelectron spectroscopy, electron microprobe analysis) and electrochemistry (electrochemical impedance spectroscopy, electrochemical noise analysis).

  8. Image processing developments and applications for water quality monitoring and trophic state determination

    NASA Technical Reports Server (NTRS)

    Blackwell, R. J.

    1982-01-01

    The analysis of remote sensing data for water quality monitoring is evaluated. Data analysis and image processing techniques are applied to LANDSAT remote sensing data to produce an effective operational tool for lake water quality surveying and monitoring. Digital image processing and analysis techniques were designed, developed, tested, and applied to LANDSAT multispectral scanner (MSS) data and conventional surface acquired data. Utilization of these techniques facilitates the surveying and monitoring of large numbers of lakes in an operational manner. Supervised multispectral classification, when used in conjunction with surface acquired water quality indicators, is used to characterize water body trophic status. Unsupervised multispectral classification, when interpreted by lake scientists familiar with a specific water body, yields classifications of equal validity with supervised methods and in a more cost effective manner. Image data base technology is used to great advantage in characterizing other contributing effects to water quality. These effects include drainage basin configuration, terrain slope, soil, precipitation and land cover characteristics.

  9. Analysis techniques for momentum transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, S.D.

    1991-08-01

    This report discusses the following topics on momentum analysis in tokamaks and stellarators: the momentum balance equation; deposition of torque by neutral beams; effects of toroidal rotation; and experimental observations. (LSP)

  10. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    PubMed

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effect of single factors and the interaction effect of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high concentration formulations; (ii) an investigation of interaction effects of combined factors as well as main effects of single factors is effective for improving conformational stability of proteins; (iii) with the exception of pH, the results of stress testing with regard to aggregation factors can be used to identify a suitable formulation instead of performing time-consuming accelerated testing; (iv) a suitable pH condition should not be determined in stress testing but in accelerated testing, because of inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze the aggregation factors and perform a rapid screening for suitable conditions of protein formulation. Copyright © 2011 Elsevier B.V. All rights reserved.
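
    For a balanced two-level factorial of the kind DOE packages generate, main effects and two-factor interactions reduce to signed averages of the response; the sketch below assumes -1/+1 coded factor levels and is a generic illustration, not the study's software.

    ```python
    import numpy as np

    def factorial_effects(levels, response):
        """levels: (n_runs, n_factors) coded -1/+1; response: (n_runs,)."""
        levels = np.asarray(levels, float)
        y = np.asarray(response, float)
        half = len(y) / 2.0
        main = levels.T @ y / half  # one main effect per factor
        interactions = {}
        for i in range(levels.shape[1]):
            for j in range(i + 1, levels.shape[1]):
                # Two-factor interaction: effect of the i*j product column.
                interactions[(i, j)] = float((levels[:, i] * levels[:, j]) @ y / half)
        return main, interactions
    ```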

  11. Optical Design And Analysis Of Carbon Dioxide Laser Fusion Systems Using Interferometry And Fast Fourier Transform Techniques

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. K.

    1980-11-01

    The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques that are quite different from the methods currently used for conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.
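
    The propagation step the abstract describes, a phase screen built from Zernike terms followed by FFT-based (angular-spectrum) propagation, can be sketched in a few lines. The grid size, wavelength, distance, and the single defocus term are placeholder assumptions, and evanescent components are simply clamped; this is a generic illustration, not the LASL codes.

    ```python
    import numpy as np

    n, wavelength, dx, z = 256, 10.6e-6, 1e-4, 1.0    # CO2 laser wavelength (m)
    x = (np.arange(n) - n / 2) * dx
    X, Y = np.meshgrid(x, x)
    r = np.hypot(X, Y) / np.abs(x).max()              # normalized pupil radius

    defocus = 2 * r**2 - 1                            # one Zernike term (defocus)
    field = np.exp(1j * 2 * np.pi * 0.1 * defocus) * (r <= 1.0)

    fx = np.fft.fftfreq(n, dx)                        # angular-spectrum grid
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0)
    H = np.exp(1j * 2 * np.pi / wavelength * z * np.sqrt(arg))
    propagated = np.fft.ifft2(np.fft.fft2(field) * H) # field after distance z
    intensity = np.abs(propagated) ** 2
    ```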

  12. Biostatistics Series Module 10: Brief Overview of Multivariate Methods.

    PubMed

    Hazra, Avijit; Gogtay, Nithya

    2017-01-01

    Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, that make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count type of data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with wider availability, and increasing sophistication of statistical software and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
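
    As a compact illustration of the module's two families, the sketch below runs one dependence technique (multiple linear regression) and one interdependence technique (principal components) on random placeholder data.

    ```python
    import numpy as np
    import statsmodels.api as sm
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    X = rng.normal(size=(100, 3))                  # three numerical predictors
    y = X @ np.array([1.5, -0.7, 0.2]) + rng.normal(size=100)

    ols_fit = sm.OLS(y, sm.add_constant(X)).fit()  # dependence: predict y from X
    pca_fit = PCA(n_components=2).fit(X)           # interdependence: structure in X
    print(ols_fit.params, pca_fit.explained_variance_ratio_)
    ```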

  13. Who Should Bear the Cost of Convenience? A Cost-effectiveness Analysis Comparing External Beam and Brachytherapy Radiotherapy Techniques for Early Stage Breast Cancer.

    PubMed

    McGuffin, M; Merino, T; Keller, B; Pignol, J-P

    2017-03-01

    Standard treatment for early breast cancer includes whole breast irradiation (WBI) after breast-conserving surgery. Recently, accelerated partial breast irradiation (APBI) has been proposed for well-selected patients. A cost and cost-effectiveness analysis was carried out comparing WBI with two APBI techniques. An activity-based costing method was used to determine the treatment cost from a societal perspective of WBI, high dose rate brachytherapy (HDR) and permanent breast seed implants (PBSI). A Markov model comparing the three techniques was developed with downstream costs, utilities and probabilities adapted from the literature. Sensitivity analyses were carried out for a wide range of variables, including treatment costs, patient costs, utilities and probability of developing recurrences. Overall, HDR was the most expensive ($14 400), followed by PBSI ($8700), with WBI proving the least expensive ($6200). The least costly method to the health care system was WBI, whereas PBSI and HDR were less costly for the patient. Under cost-effectiveness analyses, downstream costs added about $10 000 to the total societal cost of the treatment. As the outcomes are very similar between techniques, WBI dominated under cost-effectiveness analyses. WBI was found to be the most cost-effective radiotherapy technique for early breast cancer. However, both APBI techniques were less costly to the patient. Although innovation may increase costs for the health care system it can provide cost savings for the patient in addition to convenience. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
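
    A toy version of the underlying Markov cohort model is sketched below; the states, transition probabilities, costs, and utilities are made-up placeholders, not the calibrated values adapted from the literature.

    ```python
    import numpy as np

    P = np.array([[0.97, 0.02, 0.01],   # disease-free -> {DF, recurrence, dead}
                  [0.00, 0.90, 0.10],   # recurrence   -> {DF, recurrence, dead}
                  [0.00, 0.00, 1.00]])  # dead is absorbing
    cost = np.array([100.0, 5000.0, 0.0])     # annual cost per state (placeholder)
    utility = np.array([0.90, 0.60, 0.0])     # annual QALY weight per state

    def run_cohort(p0, years=20, discount=0.03):
        """p0: initial state distribution. Returns discounted cost and QALYs."""
        dist = np.asarray(p0, float)
        total_cost = total_qaly = 0.0
        for t in range(years):
            w = 1.0 / (1.0 + discount) ** t   # discount factor for year t
            total_cost += w * dist @ cost
            total_qaly += w * dist @ utility
            dist = dist @ P                   # advance the cohort one cycle
        return total_cost, total_qaly
    ```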

  14. A Meta-Analysis of the Effects of Desegregation on Academic Achievement.

    ERIC Educational Resources Information Center

    Krol, Ronald A.

    Using meta-analysis techniques from a 1977 article by G.V. Glass, this study sought to determine the effects of desegregation on academic achievement when students were grouped by academic subject, grade level, and length of desegregation. Data were obtained from 71 studies (conducted between 1955 and 1977) concerned with the effects of…

  15. Methodologies for Evaluating the Impact of Contraceptive Social Marketing Programs.

    ERIC Educational Resources Information Center

    Bertrand, Jane T.; And Others

    1989-01-01

    An overview of the evaluation issues associated with contraceptive social marketing programs is provided. Methodologies covered include survey techniques, cost-effectiveness analyses, retail audits of sales data, time series analysis, nested logit analysis, and discriminant analysis. (TJH)

  16. Multidisciplinary aeroelastic analysis of a generic hypersonic vehicle

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.; Petersen, K. L.

    1993-01-01

    This paper presents details of a flutter and stability analysis of aerospace structures such as hypersonic vehicles. Both structural and aerodynamic domains are discretized by the common finite element technique. A vibration analysis is first performed by the STARS code employing a block Lanczos solution scheme. This is followed by the generation of a linear aerodynamic grid for subsequent linear flutter analysis within subsonic and supersonic regimes of the flight envelope; the doublet lattice and constant pressure techniques are employed to generate the unsteady aerodynamic forces. Flutter analysis is then performed for several representative flight points. The nonlinear flutter solution is effected by first implementing a CFD solution of the entire vehicle. Thus, a 3-D unstructured grid for the entire flow domain is generated by a moving front technique. A finite element Euler solution is then implemented employing a quasi-implicit as well as an explicit solution scheme. A novel multidisciplinary analysis is next effected that employs modal and aerodynamic data to yield aerodynamic damping characteristics. Such analyses are performed for a number of flight points to yield a large set of pertinent data that define flight flutter characteristics of the vehicle. This paper outlines the finite-element-based integrated analysis procedures in detail, which is followed by the results of numerical analyses of flight flutter simulation.

  17. [Authentication of Trace Material Evidence in Forensic Science Field with Infrared Microscopic Technique].

    PubMed

    Jiang, Zhi-quan; Hu, Ke-liang

    2016-03-01

    In the field of forensic science, conventional infrared spectral analysis is usually unable to meet detection requirements, because often only minute amounts of trace material evidence, with diverse shapes and complex compositions, can be extracted from a crime scene. The infrared microscopic technique was developed by combining Fourier-transform infrared spectroscopy with microscopy. It has many advantages over conventional infrared spectroscopy, such as high detection sensitivity, micro-area analysis, and nondestructive examination, and it has effectively solved the problem of authenticating trace material evidence in forensic science. Additionally, almost no external interference is introduced during measurements, so the technique satisfies the special requirement that trace material evidence be preserved for presentation in court. Real-case analyses carried out in this experimental center illustrate in detail the advantages of the infrared microscopic technique for the authentication of trace material evidence. In this paper, the vibrational features in the infrared spectra of material evidence, including paints, plastics, rubbers, fibers, drugs, and toxicants, are comparatively analyzed by means of the infrared microscopic technique, in an attempt to provide powerful spectroscopic evidence for the qualitative diagnosis of various criminal and traffic accident cases. The results clearly suggest that the infrared microscopic technique offers unmatched advantages and has become an effective method for the authentication of trace material evidence in the field of forensic science.

  18. Novel near-infrared sampling apparatus for single kernel analysis of oil content in maize.

    PubMed

    Janni, James; Weinstock, B André; Hagen, Lisa; Wright, Steve

    2008-04-01

    A method of rapid, nondestructive chemical and physical analysis of individual maize (Zea mays L.) kernels is needed for the development of high value food, feed, and fuel traits. Near-infrared (NIR) spectroscopy offers a robust nondestructive method of trait determination. However, traditional NIR bulk sampling techniques cannot be applied successfully to individual kernels. Obtaining optimized single kernel NIR spectra for applied chemometric predictive analysis requires a novel sampling technique that can account for the heterogeneous forms, morphologies, and opacities exhibited in individual maize kernels. In this study such a novel technique is described and compared to less effective means of single kernel NIR analysis. Results of the application of a partial least squares (PLS) derived model for predictive determination of percent oil content per individual kernel are shown.
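
    The chemometric step, a PLS model mapping single-kernel NIR spectra to oil content, can be sketched with scikit-learn; the data shapes and component count are assumptions, not the study's calibration settings.

    ```python
    from sklearn.cross_decomposition import PLSRegression

    def fit_oil_model(spectra, oil_pct, n_components=10):
        """spectra: (n_kernels, n_wavelengths) NIR spectra;
        oil_pct: (n_kernels,) reference oil contents."""
        pls = PLSRegression(n_components=n_components)
        pls.fit(spectra, oil_pct)
        return pls  # pls.predict(new_spectra) yields per-kernel oil estimates
    ```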

  19. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-05-30

    In this research, we examine the Naval Sea Logistics Command’s Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and... the feasibility of program re-implementation. We conduct an analysis of this allowancing method’s effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  20. Calculation of three-dimensional, inviscid, supersonic, steady flows

    NASA Technical Reports Server (NTRS)

    Moretti, G.

    1981-01-01

    A detailed description of a computational program for the evaluation of three dimensional supersonic, inviscid, steady flow past airplanes is presented. Emphasis was put on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) was carefully coded and described. Results of computations based on sample geometries, together with discussions, are also presented.

  1. A tandem regression-outlier analysis of a ligand cellular system for key structural modifications around ligand binding.

    PubMed

    Lin, Ying-Ting

    2013-04-30

    In the chemical analysis of a single cell, a tandem hardware technique is often used to first isolate and then detect the identities of interest: the first step separates the wanted chemicals from the bulk of the cell; the second step is the actual detection of the important identities. To identify the key structural modifications around ligand binding, the present study aims to develop a cheminformatics counterpart of this tandem technique. A statistical regression and its outliers act as the computational means of separation. A PPARγ (peroxisome proliferator-activated receptor gamma) agonist cellular system was subjected to such an investigation. Results show that this tandem regression-outlier analysis, i.e., the prioritization of the context equations tagged with features of the outliers, is an effective cheminformatics regression technique for detecting key structural modifications, as well as the tendency of their impact on ligand binding. The key structural modifications around ligand binding are effectively extracted or characterized from the cellular reactions. This is because molecular binding is the paramount factor in such a ligand cellular system, and key structural modifications around ligand binding are expected to create outliers. Therefore, such outliers can be captured by this tandem regression-outlier analysis.
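
    The "separation" half of the tandem scheme can be sketched as an ordinary regression whose flagged outliers become the candidates for key structural modifications; the |studentized residual| > 2 rule is a common convention assumed here, not necessarily the author's cutoff.

    ```python
    import numpy as np
    import statsmodels.api as sm

    def regression_outliers(X, y, threshold=2.0):
        """Fit OLS of activity y on descriptor matrix X; return outlier indices
        (candidate key modifications) and the fitted model."""
        model = sm.OLS(y, sm.add_constant(X)).fit()
        resid = model.get_influence().resid_studentized_internal
        return np.where(np.abs(resid) > threshold)[0], model
    ```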

  2. Acupuncture-Related Techniques for Psoriasis: A Systematic Review with Pairwise and Network Meta-Analyses of Randomized Controlled Trials.

    PubMed

    Yeh, Mei-Ling; Ko, Shu-Hua; Wang, Mei-Hua; Chi, Ching-Chi; Chung, Yu-Chu

    2017-12-01

    There has been a large body of evidence on the pharmacological treatments for psoriasis, but whether nonpharmacological interventions are effective in managing psoriasis remains largely unclear. This systematic review conducted pairwise and network meta-analyses to determine the effects of acupuncture-related techniques of acupoint stimulation for the treatment of psoriasis and to determine the order of effectiveness of these remedies. This study searched the following databases from inception to March 15, 2016: Medline, PubMed, Cochrane Central Register of Controlled Trials, EBSCO (including Academic Search Premier, American Doctoral Dissertations, and CINAHL), Airiti Library, and China National Knowledge Infrastructure. Randomized controlled trials (RCTs) on the effects of acupuncture-related techniques of acupoint stimulation as an intervention for psoriasis were independently reviewed by two researchers. A total of 13 RCTs with 1,060 participants were included. The methodological quality of the included studies was not rigorous. Acupoint stimulation, compared with nonacupoint stimulation, showed a significant treatment effect for psoriasis. However, the most common adverse events were thirst and dry mouth. Subgroup analysis further confirmed that the short-term treatment effect was superior to the long-term effect in treating psoriasis. Network meta-analysis identified that acupressure or acupoint catgut embedding, compared with medication, had a significant effect in improving psoriasis, with acupressure being the most effective treatment. Acupuncture-related techniques could be considered as an alternative or adjuvant therapy for psoriasis in the short term, especially acupressure and acupoint catgut embedding. This study recommends further well-designed, methodologically rigorous, and more head-to-head randomized trials to explore the effects of acupuncture-related techniques for treating psoriasis.

  3. Extracranial glioblastoma diagnosed by examination of pleural effusion using the cell block technique: case report.

    PubMed

    Hori, Yusuke S; Fukuhara, Toru; Aoi, Mizuho; Oda, Kazunori; Shinno, Yoko

    2018-06-01

    Metastatic glioblastoma is a rare condition, and several studies have reported the involvement of multiple organs including the lymph nodes, liver, and lung. The lung and pleura are reportedly the most frequent sites of metastasis, and diagnosis using less invasive tools such as cytological analysis with fine needle aspiration biopsy is challenging. Cytological analysis of fluid specimens tends to be negative because of the small number of cells obtained, whereas the cell block technique reportedly has higher sensitivity because of a decrease in cellular dispersion. Herein, the authors describe a patient with a history of diffuse astrocytoma who developed intractable, progressive accumulation of pleural fluid. Initial cytological analysis of the pleural effusion obtained by thoracocentesis was negative, but reanalysis using the cell block technique revealed the presence of glioblastoma cells. This is the first report to suggest the effectiveness of the cell block technique in the diagnosis of extracranial glioblastoma using pleural effusion. In patients with a history of glioma, the presence of extremely intractable pleural effusion warrants cytological analysis of the fluid using this technique in order to initiate appropriate chemotherapy.

  4. Analysis of Complex Intervention Effects in Time-Series Experiments.

    ERIC Educational Resources Information Center

    Bower, Cathleen

    An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper…

  5. Using Dual Regression to Investigate Network Shape and Amplitude in Functional Connectivity Analyses

    PubMed Central

    Nickerson, Lisa D.; Smith, Stephen M.; Öngür, Döst; Beckmann, Christian F.

    2017-01-01

    Independent Component Analysis (ICA) is one of the most popular techniques for the analysis of resting state FMRI data because it has several advantageous properties when compared with other techniques. Most notably, in contrast to a conventional seed-based correlation analysis, it is model-free and multivariate, thus switching the focus from evaluating the functional connectivity of single brain regions identified a priori to evaluating brain connectivity in terms of all brain resting state networks (RSNs) that simultaneously engage in oscillatory activity. Furthermore, typical seed-based analysis characterizes RSNs in terms of spatially distributed patterns of correlation (typically by means of simple Pearson's coefficients) and thereby confounds together amplitude information of oscillatory activity and noise. ICA and other regression techniques, on the other hand, retain magnitude information and therefore can be sensitive to both changes in the spatially distributed nature of correlations (differences in the spatial pattern or “shape”) as well as the amplitude of the network activity. Furthermore, motion can mimic amplitude effects so it is crucial to use a technique that retains such information to ensure that connectivity differences are accurately localized. In this work, we investigate the dual regression approach that is frequently applied with group ICA to assess group differences in resting state functional connectivity of brain networks. We show how ignoring amplitude effects and how excessive motion corrupts connectivity maps and results in spurious connectivity differences. We also show how to implement the dual regression to retain amplitude information and how to use dual regression outputs to identify potential motion effects. Two key findings are that using a technique that retains magnitude information, e.g., dual regression, and using strict motion criteria are crucial for controlling both network amplitude and motion-related amplitude effects, respectively, in resting state connectivity analyses. We illustrate these concepts using realistic simulated resting state FMRI data and in vivo data acquired in healthy subjects and patients with bipolar disorder and schizophrenia. PMID:28348512
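
    A minimal Python sketch of the two dual-regression stages described above, on random stand-in matrices rather than real FMRI data. Skipping the variance normalization of the stage-1 time courses is what retains the amplitude information the authors emphasize.

        import numpy as np

        def dual_regression(data, group_maps, normalize=False):
            """data: T x V subject time series; group_maps: K x V group ICA maps.
            Stage 1 (spatial regression) -> subject time courses, T x K.
            Stage 2 (temporal regression) -> subject spatial maps, K x V."""
            tcs = np.linalg.lstsq(group_maps.T, data.T, rcond=None)[0].T
            if normalize:  # normalizing discards amplitude, leaving "shape only"
                tcs = tcs / tcs.std(axis=0, keepdims=True)
            maps = np.linalg.lstsq(tcs, data, rcond=None)[0]
            return tcs, maps

        rng = np.random.default_rng(2)
        group_maps = rng.normal(size=(5, 500))    # 5 group RSN maps over 500 voxels
        data = rng.normal(size=(200, 500))        # one subject's 200-timepoint run
        tcs, subject_maps = dual_regression(data, group_maps)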

  6. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broadly used applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis, and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size). In this case, high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis alone. In our study, we investigate the possibilities and specific techniques for delineating erosion microrelief structures, and mapping techniques for the microrelief derivatives (e.g. quantitative parameters of microrelief). Our toolset includes the analysis of spatial redistribution of soil pollutants and phyto-indication analysis, which complement the common DEM modelling and geomorphometric analysis. We use field surveys conducted in the test area, an arable territory with high erosion risk. Our main conclusion at the current stage is that the indicative methods (i.e. radiocesium and phyto-indication methods) are effective for delineating the erosion microrelief structures. However, these methods need to be formalized for convenient use.

  7. Applications of Advanced, Waveform Based AE Techniques for Testing Composite Materials

    NASA Technical Reports Server (NTRS)

    Prosser, William H.

    1996-01-01

    Advanced, waveform based acoustic emission (AE) techniques have been previously used to evaluate damage progression in laboratory tests of composite coupons. In these tests, broad band, high fidelity acoustic sensors were used to detect signals which were then digitized and stored for analysis. Analysis techniques were based on plate mode wave propagation characteristics. This approach, more recently referred to as Modal AE, provides an enhanced capability to discriminate and eliminate noise signals from those generated by damage mechanisms. This technique also allows much more precise source location than conventional, threshold crossing arrival time determination techniques. To apply Modal AE concepts to the interpretation of AE on larger composite structures, the effects of wave propagation over larger distances and through structural complexities must be well characterized and understood. In this research, measurements were made of the attenuation of the extensional and flexural plate mode components of broad band simulated AE signals in large composite panels. As these materials have applications in a cryogenic environment, the effects of cryogenic insulation on the attenuation of plate mode AE signals were also documented.

  8. Infrared spectroscopy as a screening technique for colitis

    NASA Astrophysics Data System (ADS)

    Titus, Jitto; Ghimire, Hemendra; Viennois, Emilie; Merlin, Didier; Perera, A. G. Unil

    2017-05-01

    There remains a great need for diagnosis of inflammatory bowel disease (IBD), for which the current technique, colonoscopy, is not cost-effective and presents a non-negligible risk of complications. Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy is a new screening technique to evaluate colitis. Comparing infrared spectra of sera to study the differences between them can prove challenging due to the complexity of their biological constituents, which gives rise to a plethora of vibrational modes. Overcoming these inherent difficulties of infrared spectral analysis, which involve highly overlapping absorbance peaks, by curve fitting the data to improve the resolution is discussed. The proposed technique uses dried serum from colitic and normal wild-type mice to obtain ATR/FTIR spectra that effectively differentiate colitic mice from normal mice. Using this method, the Amide I group frequency (specifically, the alpha-helix to beta-sheet ratio of the protein secondary structure) was identified as a disease-associated spectral signature, in addition to the previously reported glucose and mannose signatures in sera of chronic and acute mouse models of colitis. Hence, this technique will be able to identify changes in sera due to various diseases.
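
    The curve-fitting approach to overlapping absorbance peaks can be sketched as follows. This is an illustration only, with two synthetic Gaussian components standing in for the alpha-helix and beta-sheet contributions to the Amide I band; it is not the authors' fitting protocol.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_gaussians(x, a1, c1, w1, a2, c2, w2):
            return (a1 * np.exp(-((x - c1) / w1) ** 2)
                    + a2 * np.exp(-((x - c2) / w2) ** 2))

        # Hypothetical Amide I band: overlapping alpha-helix (~1655 cm-1)
        # and beta-sheet (~1630 cm-1) components plus noise.
        wn = np.linspace(1600, 1700, 400)
        spec = (two_gaussians(wn, 1.0, 1655, 12, 0.6, 1630, 10)
                + np.random.default_rng(1).normal(0, 0.01, wn.size))

        p, _ = curve_fit(two_gaussians, wn, spec, p0=(1, 1650, 10, 0.5, 1635, 10))
        helix_area = p[0] * abs(p[2]) * np.sqrt(np.pi)   # Gaussian area = a * w * sqrt(pi)
        sheet_area = p[3] * abs(p[5]) * np.sqrt(np.pi)
        print(helix_area / sheet_area)                   # alpha/beta ratio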

  9. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant, allowing identification and quantification of its main constituents. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis, and to investigate the effect of different extraction techniques on the composition of C. officinalis extracts from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting, as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Practical Guidance for Conducting Mediation Analysis With Multiple Mediators Using Inverse Odds Ratio Weighting

    PubMed Central

    Nguyen, Quynh C.; Osypuk, Theresa L.; Schmidt, Nicole M.; Glymour, M. Maria; Tchetgen Tchetgen, Eric J.

    2015-01-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994–2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided. PMID:25693776
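
    A simplified Python sketch of the IORW recipe on simulated data (the variable names, single mediator, and data-generating model are hypothetical; the published article provides the authoritative Stata code): regress exposure on the mediator and covariates, weight the exposed by the inverse odds ratio, then read the natural direct effect off a weighted outcome regression.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(11)
        n = 2000
        df = pd.DataFrame({"age": rng.uniform(20, 60, n)})
        df["treat"] = rng.binomial(1, 0.5, n)                  # randomized exposure
        df["m1"] = 0.5 * df["treat"] + rng.normal(0, 1, n)     # mediator
        df["y"] = (0.3 * df["treat"] + 0.4 * df["m1"]
                   + 0.01 * df["age"] + rng.normal(0, 1, n))   # outcome

        # Step 1: logistic regression of exposure on mediator(s) and covariates.
        med = smf.logit("treat ~ m1 + age", data=df).fit(disp=0)
        # Step 2: inverse odds ratio weights deactivate the exposure-mediator pathway.
        ior = np.exp(-med.params["m1"] * df["m1"])
        df["w"] = np.where(df["treat"] == 1, ior, 1.0)
        # Step 3: weighted outcome regression -> natural direct effect (NDE);
        # indirect effect = total effect - direct effect.
        nde = smf.wls("y ~ treat + age", data=df, weights=df["w"]).fit().params["treat"]
        tot = smf.ols("y ~ treat + age", data=df).fit().params["treat"]
        print("NDE:", nde, "NIE:", tot - nde)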

  11. Maltodextrin: a novel excipient used in sugar-based orally disintegrating tablets and phase transition process.

    PubMed

    Elnaggar, Yosra Shaaban R; El-Massik, Magda A; Abdallah, Ossama Y; Ebian, Abd Elazim R

    2010-06-01

    The recent challenge in orally disintegrating tablets (ODT) manufacturing encompasses the compromise between instantaneous disintegration, sufficient hardness, and standard processing equipment. The current investigation constitutes one attempt to fulfill this challenge. Maltodextrin, in the present work, was utilized as a novel excipient to prepare ODT of meclizine. Tablets were prepared by both direct compression and wet granulation techniques. The effect of maltodextrin concentrations on ODT characteristics--manifested as hardness and disintegration time--was studied. The effect of conditioning (40 degrees C and 75% relative humidity) as a post-compression treatment on ODT characteristics was also assessed. Furthermore, maltodextrin-pronounced hardening effect was investigated using differential scanning calorimetry (DSC) and X-ray analysis. Results revealed that in both techniques, rapid disintegration (30-40 s) would be achieved on the cost of tablet hardness (about 1 kg). Post-compression conditioning of tablets resulted in an increase in hardness (3 kg), while keeping rapid disintegration (30-40 s) according to guidance of the FDA for ODT. However, direct compression-conditioning technique exhibited drawbacks of long conditioning time and appearance of the so-called patch effect. These problems were, yet, absent in wet granulation-conditioning technique. DSC and X-ray analysis suggested involvement of glass-elastic deformation in maltodextrin hardening effect. High-performance liquid chromatography analysis of meclizine ODT suggested no degradation of the drug by the applied conditions of temperature and humidity. Overall results proposed that maltodextrin is a promising saccharide for production of ODT with accepted hardness-disintegration time compromise, utilizing standard processing equipment and phenomena of phase transition.

  12. EXPERIMENTAL MODELLING OF AORTIC ANEURYSMS

    PubMed Central

    Doyle, Barry J; Corbett, Timothy J; Cloonan, Aidan J; O’Donnell, Michael R; Walsh, Michael T; Vorp, David A; McGloughlin, Timothy M

    2009-01-01

    A range of silicone rubbers were created based on existing commercially available materials. These silicones were designed to be visually different from one another and have distinct material properties, in particular, ultimate tensile strengths and tear strengths. In total, eleven silicone rubbers were manufactured, with the materials designed to have a range of increasing tensile strengths from approximately 2-4 MPa, and increasing tear strengths from approximately 0.45-0.7 N/mm. The variations in silicones were detected using a standard colour analysis technique. Calibration curves were then created relating colour intensity to individual material properties. All eleven materials were characterised and a 1st order Ogden strain energy function applied. Material coefficients were determined and examined for effectiveness. Six idealised abdominal aortic aneurysm models were also created using the two base materials of the study, with a further model created using a new mixing technique to create a rubber model with randomly assigned material properties. These models were then examined using videoextensometry and compared to numerical results. Colour analysis revealed a statistically significant linear relationship (p<0.0009) with both tensile strength and tear strength, allowing material strength to be determined using a non-destructive experimental technique. The effectiveness of this technique was assessed by comparing predicted material properties to experimentally measured methods, with good agreement in the results. Videoextensometry and numerical modelling revealed minor percentage differences, with all results achieving significance (p<0.0009). This study has successfully designed and developed a range of silicone rubbers that have unique colour intensities and material strengths. Strengths can be readily determined using a non-destructive analysis technique with proven effectiveness. These silicones may further aid towards an improved understanding of the biomechanical behaviour of aneurysms using experimental techniques. PMID:19595622

  13. Inferior or double joint spaces injection versus superior joint space injection for temporomandibular disorders: a systematic review and meta-analysis.

    PubMed

    Li, Chunjie; Zhang, Yifan; Lv, Jun; Shi, Zongdao

    2012-01-01

    To compare the effect and safety of inferior or double temporomandibular joint spaces drug injection versus superior temporomandibular joint space injection in the treatment of temporomandibular disorders. MEDLINE (via Ovid, 1948 to March 2011), CENTRAL (Issue 1, 2011), Embase (1984 to March 2011), CBM (1978 to March 2011), and the World Health Organization International Clinical Trials Registry Platform were searched electronically; relevant journals as well as references of included studies were hand-searched for randomized controlled trials comparing the effect or safety of inferior or double joint spaces drug injection with those of superior space injection. Risk of bias assessment with the tool recommended by the Cochrane Collaboration, reporting quality assessment with CONSORT, and data extraction were carried out independently by 2 reviewers. Meta-analysis was performed with RevMan 5.0.23. Four trials with 349 participants were included. All the included studies had moderate risk of bias. Meta-analysis showed that the inferior or double spaces injection technique could significantly increase maximal mouth opening by 2.88 mm (P = .0001) and reduce pain intensity in the temporomandibular area by an average of 9.01 mm on the visual analog scale (P = .0001) compared with the superior space injection technique, but could not markedly change the synthesized clinical index (P = .05) in the short term; nevertheless, it showed more beneficial maximal mouth opening (P = .002), pain relief (P < .0001), and synthesized clinical variable (P < .0001) in the long term than superior space injection. No serious adverse events were reported. The inferior or double temporomandibular joint spaces drug injection technique shows better effect than the superior space injection technique, and its safety is acceptable. However, more high-quality studies are still needed to test and verify the evidence. Crown Copyright © 2012. Published by Elsevier Inc. All rights reserved.

  14. Quantization error of CCD cameras and their influence on phase calculation in fringe pattern analysis.

    PubMed

    Skydan, Oleksandr A; Lilley, Francis; Lalor, Michael J; Burton, David R

    2003-09-10

    We present an investigation into the phase errors that occur in fringe pattern analysis that are caused by quantization effects. When acquisition devices with a limited value of camera bit depth are used, there are a limited number of quantization levels available to record the signal. This may adversely affect the recorded signal and adds a potential source of instrumental error to the measurement system. Quantization effects also determine the accuracy that may be achieved by acquisition devices in a measurement system. We used the Fourier fringe analysis measurement technique. However, the principles can be applied equally well for other phase measuring techniques to yield a phase error distribution that is caused by the camera bit depth.
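
    The mechanism can be reproduced in a few lines of Python: quantize a synthetic fringe pattern to a given bit depth, recover the phase by Fourier fringe analysis, and compare against the unquantized result. The carrier frequency, filter band, and test phase below are arbitrary choices for illustration.

        import numpy as np

        def fringe_phase(sig):
            """Fourier fringe analysis: isolate the carrier lobe, return wrapped phase."""
            F = np.fft.fft(sig)
            F[:32] = 0
            F[128:] = 0              # keep a band around the 64-cycle carrier
            return np.angle(np.fft.ifft(F))

        x = np.linspace(0.0, 1.0, 1024, endpoint=False)
        phi = 0.8 * np.sin(2 * np.pi * x)                      # hypothetical test phase
        fringe = 0.5 + 0.4 * np.cos(2 * np.pi * 64 * x + phi)
        ref = fringe_phase(fringe)
        for bits in (4, 6, 8, 10, 12):
            levels = 2 ** bits - 1
            q = np.round(fringe * levels) / levels             # model the camera bit depth
            err = np.angle(np.exp(1j * (fringe_phase(q) - ref)))  # wrapped phase error
            print(bits, err.std())                             # error shrinks with bit depth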

  15. Acoustic mode measurements in the inlet of a model turbofan using a continuously rotating rake: Data collection/analysis techniques

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Heidelberg, Laurence; Konno, Kevin

    1993-01-01

    The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design is included of the data analysis software and the performance of the rotating rake apparatus. The effect of experiment errors is also discussed.

  17. Analysis of polarization in hydrogen bonded complexes: An asymptotic projection approach

    NASA Astrophysics Data System (ADS)

    Drici, Nedjoua

    2018-03-01

    The asymptotic projection technique is used to investigate the polarization effect that arises from the interaction between the relaxed and frozen monomeric charge densities of a set of neutral and charged hydrogen-bonded complexes. The AP technique, based on the resolution of the original Kohn-Sham equations, can give an acceptable qualitative description of the polarization effect in neutral complexes. The significant overlap of the electron densities in charged and π-conjugated complexes requires the further development of a new functional, describing the coupling between constrained and non-constrained electron densities within the AP technique, to provide an accurate representation of the polarization effect.

  18. Instantiating the art of war for effects-based operations

    NASA Astrophysics Data System (ADS)

    Burns, Carla L.

    2002-07-01

    Effects-Based Operations (EBO) is a mindset, a philosophy and an approach for planning, executing and assessing military operations for the effects they produce rather than the targets or even objectives they deal with. An EBO approach strives to provide economy of force, dynamic tasking, and reduced collateral damage. The notion of EBO is not new. Military Commanders certainly have desired effects in mind when conducting military operations. However, to date EBO has been an art of war that lacks automated techniques and tools that enable effects-based analysis and assessment. Modeling and simulation is at the heart of this challenge. The Air Force Research Laboratory (AFRL) EBO Program is developing modeling techniques and corresponding tool capabilities that can be brought to bear against the challenges presented by effects-based analysis and assessment. Effects-based course-of-action development, center of gravity/target system analysis, and wargaming capabilities are being developed and integrated to help give Commanders the information decision support required to achieve desired national security objectives. This paper presents an introduction to effects-based operations, discusses the benefits of an EBO approach, and focuses on modeling and analysis for effects-based strategy development. An overview of modeling and simulation challenges for EBO is presented, setting the stage for the detailed technical papers in the subject session.

  19. Measuring Response to Intervention: Comparing Three Effect Size Calculation Techniques for Single-Case Design Analysis

    ERIC Educational Resources Information Center

    Ross, Sarah Gwen

    2012-01-01

    Response to intervention (RTI) is increasingly being used in educational settings to make high-stakes, special education decisions. Because of this, the accurate use and analysis of single-case designs to monitor intervention effectiveness has become important to the RTI process. Effect size methods for single-case designs provide a useful way to…
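
    For readers unfamiliar with single-case effect sizes, here is a minimal Python sketch of two common calculation techniques: percentage of non-overlapping data (PND) and a baseline-referenced standardized mean difference. The abstract does not specify which three techniques the study compared, so these are generic examples only.

        import numpy as np

        def pnd(baseline, treatment, expect_increase=True):
            """Percentage of Non-overlapping Data: share of treatment points
            beyond the most extreme baseline point."""
            ref = max(baseline) if expect_increase else min(baseline)
            hits = [(t > ref) if expect_increase else (t < ref) for t in treatment]
            return 100.0 * np.mean(hits)

        def smd(baseline, treatment):
            """Standardized mean difference against baseline variability."""
            return (np.mean(treatment) - np.mean(baseline)) / np.std(baseline, ddof=1)

        base = [3, 4, 2, 5, 4]      # hypothetical weekly probe scores
        treat = [6, 7, 5, 8, 9]
        print(pnd(base, treat), smd(base, treat))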

  20. Planning for Cost Effectiveness.

    ERIC Educational Resources Information Center

    Schlaebitz, William D.

    1984-01-01

    A heat pump life-cycle cost analysis is used to explain the technique. Items suggested for the life-cycle analysis approach include lighting, longer-life batteries, site maintenance, and retaining experts to inspect specific building components. (MLF)
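
    The life-cycle cost technique reduces to discounting each year's operating cost to present value and adding the capital cost. A minimal Python sketch with hypothetical heat pump numbers:

        def life_cycle_cost(capital, annual_energy, annual_maint, years, discount):
            """Present value of owning and operating a component over its life."""
            pv = capital
            for t in range(1, years + 1):
                pv += (annual_energy + annual_maint) / (1 + discount) ** t
            return pv

        # Hypothetical comparison: pricier, efficient unit vs. cheaper baseline.
        print(life_cycle_cost(12000, 900, 150, years=20, discount=0.05))
        print(life_cycle_cost(8000, 1500, 200, years=20, discount=0.05))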

  1. A study of the feasibility of statistical analysis of airport performance simulation

    NASA Technical Reports Server (NTRS)

    Myers, R. H.

    1982-01-01

    The feasibility of conducting a statistical analysis of simulation experiments to study airport capacity is investigated. First, the form of the distribution of airport capacity is studied. Since the distribution is non-Gaussian, it is important to determine the effect of this distribution on standard analysis of variance techniques and power calculations. Next, power computations are made in order to determine how economical simulation experiments would be if they were designed to detect capacity changes from condition to condition. Many of the conclusions drawn are results of Monte Carlo techniques.
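
    A Monte Carlo power calculation of the kind described can be sketched as follows, with a skewed gamma distribution standing in for the non-Gaussian capacity distribution (the actual distribution studied is not given in the abstract):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)

        def power(delta, n, reps=2000, alpha=0.05):
            """Monte Carlo estimate of the power to detect a capacity shift of delta."""
            hits = 0
            for _ in range(reps):
                a = rng.gamma(shape=20.0, scale=3.0, size=n)          # skewed baseline
                b = rng.gamma(shape=20.0, scale=3.0, size=n) + delta  # shifted condition
                if stats.ttest_ind(a, b).pvalue < alpha:
                    hits += 1
            return hits / reps

        print(power(delta=5.0, n=30))  # power for a 5-unit shift, 30 runs per condition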

  2. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  3. Integrated analysis of remote sensing products from basic geological surveys. [Brazil

    NASA Technical Reports Server (NTRS)

    Dasilvafagundesfilho, E. (Principal Investigator)

    1984-01-01

    Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.

  4. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    NASA Astrophysics Data System (ADS)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.

  5. Can cognitive processes help explain the success of instructional techniques recommended by behavior analysts?

    NASA Astrophysics Data System (ADS)

    Markovits, Rebecca A.; Weinstein, Yana

    2018-01-01

    The fields of cognitive psychology and behavior analysis have undertaken separate investigations into effective learning strategies. These studies have led to several recommendations from both fields regarding teaching techniques that have been shown to enhance student performance. While cognitive psychology and behavior analysis have studied student performance independently from their different perspectives, the recommendations they make are remarkably similar. The lack of discussion between the two fields, despite these similarities, is surprising. The current paper seeks to remedy this oversight in two ways: first, by reviewing two techniques recommended by behavior analysts—guided notes and response cards—and comparing them to their counterparts in cognitive psychology that are potentially responsible for their effectiveness; and second, by outlining some other areas of overlap that could benefit from collaboration. By starting the discussion with the comparison of two specific recommendations for teaching techniques, we hope to galvanize a more extensive collaboration that will not only further the progression of both fields, but also extend the practical applications of the ensuing research.

  6. Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.

    PubMed

    Ritz, Christian; Van der Vliet, Leana

    2009-09-01

    The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions-variance homogeneity and normality-that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
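
    A minimal transform-both-sides sketch in Python on simulated heteroscedastic dose-response data. The data-generating model, lambda selection, and starting values are illustrative assumptions, not the authors' protocol.

        import numpy as np
        from scipy import stats, optimize
        from scipy.special import boxcox  # boxcox(x, lmbda): fixed-lambda transform

        rng = np.random.default_rng(7)
        dose = np.repeat([0.25, 0.5, 1, 2, 4, 8, 16], 4)
        true = 100.0 / (1.0 + (dose / 3.0) ** 2)            # log-logistic response
        resp = true * rng.lognormal(0.0, 0.08, dose.size)   # heteroscedastic raw scale

        lam = stats.boxcox(resp)[1]                         # estimate lambda from data

        def model(d, top, ec50, slope):
            return top / (1.0 + (d / ec50) ** slope)

        # "Transform both sides" so residuals are closer to normal, equal variance.
        popt, _ = optimize.curve_fit(
            lambda d, *p: boxcox(model(d, *p), lam),
            dose, boxcox(resp, lam), p0=(90.0, 2.0, 1.5))
        print(popt)  # fitted top, EC50, slope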

  7. Studies of EGRET sources with a novel image restoration technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajima, Hiroyasu; Cohen-Tanugi, Johann; Kamae, Tuneyoshi

    2007-07-12

    We have developed an image restoration technique based on the Richardson-Lucy algorithm optimized for GLAST-LAT image analysis. Our algorithm is original in that it utilizes the PSF (point spread function) calculated for each event. This is critical for EGRET and GLAST-LAT image analysis, since the PSF depends on the energy and angle of incident gamma-rays and varies by more than one order of magnitude. EGRET and GLAST-LAT image analysis also faces Poisson noise due to low photon statistics. Our technique incorporates wavelet filtering to minimize noise effects. We present studies of EGRET sources using this novel image restoration technique for possible identification of extended gamma-ray sources.
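
    For reference, the core Richardson-Lucy iteration looks as follows in Python; this plain version omits the paper's two refinements (per-event PSFs and wavelet filtering of the Poisson noise).

        import numpy as np
        from scipy.signal import fftconvolve

        def richardson_lucy(image, psf, iters=50):
            """Basic Richardson-Lucy deconvolution of a 2-D counts image."""
            est = np.full_like(image, image.mean())     # flat starting estimate
            psf_mirror = psf[::-1, ::-1]
            for _ in range(iters):
                conv = fftconvolve(est, psf, mode="same")
                ratio = image / np.maximum(conv, 1e-12)  # guard against divide-by-zero
                est = est * fftconvolve(ratio, psf_mirror, mode="same")
            return est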

  8. Physiologic Waveform Analysis for Early Detection of Hemorrhage during Transport and Higher Echelon Medical Care of Combat Casualties

    DTIC Science & Technology

    2014-03-01

    [Only fragments of this abstract are indexed:] ... waveforms that are easier to measure than ABP (e.g., pulse oximeter waveforms); (3) an NIH SBIR Phase I proposal with Retia Medical to develop automated ... the training dataset. Integrating the technique with non-invasive pulse transit time (PTT) was most effective. The integrated technique specifically ... the peripheral ABP waveforms in the training dataset. These techniques included the rudimentary mean ABP technique, the classic pulse pressure times ...

  9. A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.

    The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis is discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
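
    A toy sketch of the semantic-graph representation using networkx, with hypothetical job records: jobs, users, and compute nodes become vertices, and shared nodes create the inter-job relationships the report exploits.

        import networkx as nx

        # Hypothetical job records pulled from a queue database.
        jobs = [
            {"id": 1, "user": "alice", "nodes": ["n01", "n02"], "failed": False},
            {"id": 2, "user": "bob",   "nodes": ["n02"],        "failed": True},
        ]

        G = nx.Graph()
        for j in jobs:
            jid = ("job", j["id"])
            G.add_node(jid, failed=j["failed"])
            G.add_edge(jid, ("user", j["user"]), relation="submitted_by", weight=1.0)
            for n in j["nodes"]:  # shared nodes create paths between related jobs
                G.add_edge(jid, ("node", n), relation="ran_on", weight=1.0)

        # Jobs within two hops of job 1 share a compute node: a simple relationship query.
        print(list(nx.single_source_shortest_path_length(G, ("job", 1), cutoff=2)))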

  10. Mixed Models and Reduction Techniques for Large-Rotation, Nonlinear Analysis of Shells of Revolution with Application to Tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.; Tanner, J. A.

    1984-01-01

    An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.

  11. Characterization of emission microscopy and liquid crystal thermography in IC fault localization

    NASA Astrophysics Data System (ADS)

    Lau, C. K.; Sim, K. S.

    2013-05-01

    This paper characterizes two fault localization techniques - Emission Microscopy (EMMI) and Liquid Crystal Thermography (LCT) - by using integrated circuit (IC) leakage failures. The majority of today's semiconductor failures do not reveal a clear visual defect on the die surface and therefore require fault localization tools to identify the fault location. Among the various fault localization tools, liquid crystal thermography and frontside emission microscopy are commonly used in most semiconductor failure analysis laboratories. The two techniques are often mistakenly assumed to be the same, both detecting hot spots in chips failing with shorts or leakage. As a result, analysts tend to use only LCT, since this technique involves a very simple test setup compared to EMMI. The omission of EMMI as the alternative technique in fault localization leads to incomplete analysis when LCT fails to localize any hot spot on a failing chip. Therefore, this research was established to characterize and compare both techniques in terms of their sensitivity in detecting the fault location in common semiconductor failures. A new method was also proposed as an alternative technique, i.e. the backside LCT technique. The research observed that both techniques successfully detected the defect locations resulting from the leakage failures. LCT was observed to be more sensitive than EMMI in the frontside analysis approach. On the other hand, EMMI performed better in the backside analysis approach. LCT was more sensitive in localizing ESD defect locations, and EMMI was more sensitive in detecting non-ESD defect locations. Backside LCT was proven to work as effectively as frontside LCT and is ready to serve as an alternative technique to backside EMMI. The research confirmed that LCT detects heat generation and EMMI detects photon emission (recombination radiation). The analysis results also suggested that the two techniques complement each other in IC fault localization. It is necessary for a failure analyst to use both techniques when one of the techniques produces no result.

  12. The Impact of Multiple Endpoint Dependency on "Q" and "I"[superscript 2] in Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Christopher Glen; Becker, Betsy Jane

    2014-01-01

    A common assumption in meta-analysis is that effect sizes are independent. When correlated effect sizes are analyzed using traditional univariate techniques, this assumption is violated. This research assesses the impact of dependence arising from treatment-control studies with multiple endpoints on homogeneity measures "Q" and…
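
    For context, the two homogeneity measures are simple to compute from per-study effect sizes and variances. A minimal Python sketch with fixed-effect weights and illustrative numbers:

        import numpy as np

        def q_and_i2(effects, variances):
            """Cochran's Q and Higgins' I^2 from per-study effects and variances."""
            theta = np.asarray(effects, float)
            w = 1.0 / np.asarray(variances, float)
            pooled = np.sum(w * theta) / np.sum(w)       # fixed-effect pooled estimate
            Q = np.sum(w * (theta - pooled) ** 2)
            df = theta.size - 1
            I2 = 0.0 if Q == 0 else max(0.0, (Q - df) / Q) * 100.0
            return Q, I2

        print(q_and_i2([0.30, 0.55, 0.80], [0.02, 0.03, 0.05]))  # illustrative inputs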

  13. Establishing a Common Vocabulary of Key Concepts for the Effective Implementation of Applied Behavior Analysis

    ERIC Educational Resources Information Center

    Cihon, Traci M.; Cihon, Joseph H.; Bedient, Guy M.

    2016-01-01

    The technical language of behavior analysis is arguably necessary to share ideas and research with precision among each other. However, it can hinder effective implementation of behavior analytic techniques when it prevents clear communication between the supervising behavior analyst and behavior technicians. The present paper provides a case…

  14. The Use of Cognitive Task Analysis to Capture Expertise for Tracheal Extubation Training in Anesthesiology

    ERIC Educational Resources Information Center

    Embrey, Karen K.

    2012-01-01

    Cognitive task analysis (CTA) is a knowledge elicitation technique employed for acquiring expertise from domain specialists to support the effective instruction of novices. CTA guided instruction has proven effective in improving surgical skills training for medical students and surgical residents. The standard, current method of teaching clinical…

  15. Proposal for the Development of a Standardized Protocol for Assessing the Economic Costs of HIV Prevention Interventions

    PubMed Central

    Pinkerton, Steven D.; Pearson, Cynthia R.; Eachus, Susan R.; Berg, Karina M.; Grimes, Richard M.

    2008-01-01

    Maximizing our economic investment in HIV prevention requires balancing the costs of candidate interventions against their effects and selecting the most cost-effective interventions for implementation. However, many HIV prevention intervention trials do not collect cost information, and those that do use a variety of cost data collection methods and analysis techniques. Standardized cost data collection procedures, instrumentation, and analysis techniques are needed to facilitate the task of assessing intervention costs and to ensure comparability across intervention trials. This article describes the basic elements of a standardized cost data collection and analysis protocol and outlines a computer-based approach to implementing this protocol. Ultimately, the development of such a protocol would require contributions and “buy-in” from a diverse range of stakeholders, including HIV prevention researchers, cost-effectiveness analysts, community collaborators, public health decision makers, and funding agencies. PMID:18301128

  16. Fourier transform infrared microspectroscopy for the analysis of the biochemical composition of C. elegans worms.

    PubMed

    Sheng, Ming; Gorzsás, András; Tuck, Simon

    2016-01-01

    Changes in intermediary metabolism have profound effects on many aspects of C. elegans biology including growth, development and behavior. However, many traditional biochemical techniques for analyzing chemical composition require relatively large amounts of starting material precluding the analysis of mutants that cannot be grown in large amounts as homozygotes. Here we describe a technique for detecting changes in the chemical compositions of C. elegans worms by Fourier transform infrared microspectroscopy. We demonstrate that the technique can be used to detect changes in the relative levels of carbohydrates, proteins and lipids in one and the same worm. We suggest that Fourier transform infrared microspectroscopy represents a useful addition to the arsenal of techniques for metabolic studies of C. elegans worms.

  17. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    PubMed

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10 per technique) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and the Tukey test (α = .05). The one-piece casted frameworks presented significantly higher vertical misfit values than those found for the framework cemented on prepared abutments and laser welding techniques (P < .001 and P < .003, respectively). Laser welding and framework cemented on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.

  18. Tourism English Teaching Techniques Converged from Two Different Angles.

    ERIC Educational Resources Information Center

    Seong, Myeong-Hee

    2001-01-01

    Provides techniques converged from two different angles (learners and tourism English features) for effective tourism English teaching in a junior college in Korea. Used a questionnaire, needs analysis, an instrument for measuring learners' strategies for oral communication, a small-scale classroom study for learners' preferred teaching…

  19. Retest Reliability of the Rosenzweig Picture-Frustration Study and Similar Semiprojective Techniques

    ERIC Educational Resources Information Center

    Rosenzweig, Saul; And Others

    1975-01-01

    The research dealing with the reliability of the Rosenzweig Picture-Frustration Study is surveyed. Analysis of various split-half, and retest procedures are reviewed and their relative effectiveness evaluated. Reliability measures as applied to projective techniques in general are discussed. (Author/DEP)

  20. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
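
    The approach amounts to sampling each component's disturbance distribution and propagating the samples through the system model. A toy Python sketch with hypothetical disturbance statistics and response function:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000
        # Hypothetical disturbance statistics for three component parts.
        tilt_a = rng.normal(0.0, 1e-3, N)      # radians
        tilt_b = rng.normal(0.0, 2e-3, N)      # radians
        gain   = rng.uniform(0.98, 1.02, N)    # dimensionless
        # Hypothetical system response combining the component disturbances.
        error = gain * (tilt_a + 0.5 * tilt_b)
        print("full system spread:", error.std())
        # Effect of one component: rerun with its disturbance fixed at nominal.
        print("tilt_b at nominal:  ", (gain * tilt_a).std())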

  1. Waveforms and Sonic Boom Perception and Response (WSPR): Low-Boom Community Response Program Pilot Test Design, Execution, and Analysis

    NASA Technical Reports Server (NTRS)

    Page, Juliet A.; Hodgdon, Kathleen K.; Krecker, Peg; Cowart, Robbie; Hobbs, Chris; Wilmer, Clif; Koening, Carrie; Holmes, Theresa; Gaugler, Trent; Shumway, Durland L.

    2014-01-01

    The Waveforms and Sonic boom Perception and Response (WSPR) Program was designed to test and demonstrate the applicability and effectiveness of techniques to gather data relating human subjective response to multiple low-amplitude sonic booms. It was in essence a practice session for future wider scale testing on naive communities, using a purpose built low-boom demonstrator aircraft. The low-boom community response pilot experiment was conducted in California in November 2011. The WSPR team acquired sufficient data to assess and evaluate the effectiveness of the various physical and psychological data gathering techniques and analysis methods.

  2. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    PubMed

    Clond, Morgan

    2016-05-01

    Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < 0.001), whereas the effect size for combined controls was 0.41 (95% confidence interval, 0.17-0.67; p = 0.001). Emotional freedom technique treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT to established protocols.
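
    An inverse-variance weighted meta-analysis of the kind reported reduces to a weighted average of per-study effect sizes. A minimal Python sketch with made-up inputs:

        import numpy as np

        def pooled_effect(d, se):
            """Fixed-effect inverse-variance pooling of per-study effect sizes."""
            d, w = np.asarray(d, float), 1.0 / np.square(np.asarray(se, float))
            est = np.sum(w * d) / np.sum(w)
            se_p = np.sqrt(1.0 / np.sum(w))
            return est, (est - 1.96 * se_p, est + 1.96 * se_p)

        # Hypothetical pre-post effect sizes and standard errors from three trials.
        print(pooled_effect([1.1, 1.4, 0.9], [0.2, 0.3, 0.25]))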

  3. Inverse analysis of aerodynamic loads from strain information using structural models and neural networks

    NASA Astrophysics Data System (ADS)

    Wada, Daichi; Sugimoto, Yohei

    2017-04-01

    Aerodynamic loads on aircraft wings are one of the key parameters to be monitored for reliable and effective aircraft operations and management. Flight data on the aerodynamic loads would be used onboard to control the aircraft, and accumulated data would be used for condition-based maintenance and as feedback for fatigue and critical load modeling. Effective sensing techniques such as fiber optic distributed sensing have been developed and have demonstrated promising capability for monitoring structural responses, i.e., strains on the surface of aircraft wings. By using the developed techniques, load identification methods for structural health monitoring are expected to be established. The typical inverse analysis for load identification using strains calculates the loads in the discrete form of concentrated forces; however, the distributed form of the loads is essential for accurate and reliable estimation of the critical stress at structural parts. In this study, we demonstrate an inverse analysis to identify the distributed loads from measured strain information. The introduced inverse analysis technique calculates aerodynamic loads not in a discrete but in a distributed manner based on a finite element model. In order to verify the technique through numerical simulations, we apply static aerodynamic loads to a flat panel model and conduct the inverse identification of the load distributions. We take two approaches to build the inverse system between loads and strains: the first uses structural models, and the second uses neural networks. We compare the performance of the two approaches and discuss the effect of the amount of strain sensing information.
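
    The structural-model approach can be caricatured as solving a linear strain-to-load system derived from the finite element model. The following Python sketch uses Tikhonov regularization, one common way to stabilize such ill-posed inversions; the authors' exact formulation and the neural-network variant are not reproduced here.

        import numpy as np

        def identify_loads(A, strain, alpha=1e-3):
            """Tikhonov-regularized least squares for strain = A @ load,
            where A (n_gauges x n_loads) comes from the FE model."""
            n = A.shape[1]
            return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ strain)

        rng = np.random.default_rng(3)
        A = rng.normal(size=(60, 12))           # 60 strain gauges, 12 load patches
        load_true = np.linspace(1.0, 0.2, 12)   # smoothly distributed load
        strain = A @ load_true + rng.normal(0, 0.01, 60)
        print(np.round(identify_loads(A, strain), 2))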

  4. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner. Copyright © 2016. Published by Elsevier Ltd.
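
    A minimal Python sketch of the waveform-PCA pipeline with the 90% trace criterion and single component reconstruction, on random stand-in data shaped like trials x time-normalized samples:

        import numpy as np

        rng = np.random.default_rng(5)
        X = rng.normal(size=(75, 101))   # 75 lifts x 101 time samples (hypothetical)

        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        explained = s ** 2 / np.sum(s ** 2)
        k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1  # 90% trace criterion
        scores = Xc @ Vt[:k].T     # PC scores per lift, entered into the ANOVA
        recon1 = np.outer(scores[:, 0], Vt[0]) + X.mean(axis=0)   # single component reconstruction
        print(k, scores.shape)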

  5. Efficient calculation of the polarizability: a simplified effective-energy technique

    NASA Astrophysics Data System (ADS)

    Berger, J. A.; Reining, L.; Sottile, F.

    2012-09-01

    In a recent publication [J.A. Berger, L. Reining, F. Sottile, Phys. Rev. B 82, 041103(R) (2010)] we introduced the effective-energy technique to calculate in an accurate and numerically efficient manner the GW self-energy as well as the polarizability, which is required to evaluate the screened Coulomb interaction W. In this work we show that the effective-energy technique can be used to further simplify the expression for the polarizability without a significant loss of accuracy. In contrast to standard sum-over-state methods where huge summations over empty states are required, our approach only requires summations over occupied states. The three simplest approximations we obtain for the polarizability are explicit functionals of an independent- or quasi-particle one-body reduced density matrix. We provide evidence of the numerical accuracy of this simplified effective-energy technique as well as an analysis of our method.

  6. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis.

    PubMed

    van Genugten, Lenneke; Dusseldorp, Elise; Massey, Emma K; van Empelen, Pepijn

    2017-03-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known on the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary prevention interventions on mental wellbeing in adolescents. Forty interventions were included in the analyses. Techniques were coded into nine categories of SRTs. Meta-analyses were conducted to identify the effectiveness of SRTs, examining three different outcomes: internalising behaviour, externalising behaviour, and self-esteem. Primary interventions had a small-to-medium ([Formula: see text] = 0.16-0.29) on self-esteem and internalising behaviour. Secondary interventions had a medium-to-large short-term effect (average [Formula: see text] = 0.56) on internalising behaviour and self-esteem. In secondary interventions, interventions including asking for social support [Formula: see text] 95% confidence interval, CI = 1.11-1.98) had a great effect on internalising behaviour. Interventions including monitoring and evaluation had a greater effect on self-esteem [Formula: see text] 95% CI = 0.21-0.57). For primary interventions, there was not a single SRT that was associated with a greater intervention effect on internalising behaviour or self-esteem. No effects were found for externalising behaviours. Self-regulation interventions are moderately effective at improving mental wellbeing among adolescents. Secondary interventions promoting 'asking for social support' and promoting 'monitoring and evaluation' were associated with improved outcomes. More research is needed to identify other SRTs or combinations of SRTs that could improve understanding or optimise mental wellbeing interventions.

  7. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about the dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by the use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides information beyond that contained in a time history and increases the effectiveness of post-flight analysis of low-gravity experiments.
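
    Both manipulations described above are compact in code. A Python sketch with synthetic accelerometer data; the sample rate, disturbance frequency, and rotation angle are arbitrary choices for illustration:

        import numpy as np

        rng = np.random.default_rng(9)
        fs, n = 100.0, 4096                     # hypothetical 100 Hz record
        t = np.arange(n) / fs
        acc = rng.normal(0, 1e-6, (n, 3))       # residual acceleration, 3 axes
        acc[:, 0] += 5e-6 * np.sin(2 * np.pi * 17.0 * t)   # a 17 Hz disturbance

        # Fourier analysis of one data window: dominant frequency per axis.
        spec = np.abs(np.fft.rfft(acc - acc.mean(axis=0), axis=0))
        freqs = np.fft.rfftfreq(n, 1.0 / fs)
        print(freqs[spec.argmax(axis=0)])

        # Transform into an experiment-fixed frame (rotation about z by 30 degrees).
        th = np.deg2rad(30.0)
        R = np.array([[np.cos(th), -np.sin(th), 0.0],
                      [np.sin(th),  np.cos(th), 0.0],
                      [0.0,         0.0,        1.0]])
        acc_exp = acc @ R.T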

  8. Influence of cross section variations on the structural behaviour of composite rotor blades

    NASA Astrophysics Data System (ADS)

    Rapp, Helmut; Woerndle, Rudolf

    1991-09-01

    A highly sophisticated structural analysis is required for helicopter rotor blades with nonhomogeneous cross sections made from nonisotropic material. Combinations of suitable analytical techniques with FEM-based techniques permit a cost effective and sufficiently accurate analysis of these complicated structures. It is determined that in general the 1D engineering theory of bending combined with 2D theories for determining the cross section properties is sufficient to describe the structural blade behavior.

  9. Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis.

    PubMed

    Cohnstaedt, Lee W; Rochon, Kateryn; Duehl, Adrian J; Anderson, John F; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C; Obenauer, Peter J; Campbell, James F; Lysyk, Tim J; Allan, Sandra A

    2012-03-01

    Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium "Advancements in arthropod monitoring technology, techniques, and analysis" presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, real-world population numbers can be extrapolated from trap capture data with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles.

  10. Optical analysis of thermal induced structural distortions

    NASA Technical Reports Server (NTRS)

    Weinswig, Shepard; Hookman, Robert A.

    1991-01-01

    The techniques used for the analysis of thermally induced structural distortions of optical components such as scanning mirrors and telescope optics are outlined. Particular attention is given to the methodology used in the thermal and structural analysis of the GOES scan mirror, the optical analysis using Zernike coefficients, and the optical system performance evaluation. It is pointed out that the use of Zernike coefficients allows an accurate, effective, and simple linkage between thermal/mechanical effects and the optical design.

  11. Health Lifestyles: Audience Segmentation Analysis for Public Health Interventions.

    ERIC Educational Resources Information Center

    Slater, Michael D.; Flora, June A.

    This paper is concerned with the application of market research techniques to segment large populations into homogeneous units in order to improve the reach, utilization, and effectiveness of health programs. The paper identifies seven distinctive patterns of health attitudes, social influences, and behaviors using cluster analytic techniques in a…

  12. Analysis of defect structure in silicon. Effect of grain boundary density on carrier mobility in UCP material

    NASA Technical Reports Server (NTRS)

    Dunn, J.; Stringfellow, G. B.; Natesh, R.

    1982-01-01

    The relationships between hole mobility and grain boundary density were studied. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using a quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.

  13. Linear Programming for Vocational Education Planning. Interim Report.

    ERIC Educational Resources Information Center

    Young, Robert C.; And Others

    The purpose of the paper is to define for potential users of vocational education management information systems a quantitative analysis technique and its utilization to facilitate more effective planning of vocational education programs. Defining linear programming (LP) as a management technique used to solve complex resource allocation problems…

  14. Arthropod surveillance programs: Basic components, strategies, and analysis

    USDA-ARS?s Scientific Manuscript database

    Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthro...

  15. Meta-Analysis: Application to Clinical Dentistry and Dental Education.

    ERIC Educational Resources Information Center

    Cohen, Peter A.

    1992-01-01

    Meta-analysis is proposed as an effective alternative to conventional narrative review for extracting trends from research findings. This type of analysis is explained, advantages over more traditional review techniques are discussed, basic procedures and limitations are outlined, and potential applications in dental education and clinical…

  16. Financial Ratio Analysis Comes to Nonprofits.

    ERIC Educational Resources Information Center

    Chabotar, Kent John

    1989-01-01

    To evaluate their financial health, a growing number of colleges, universities, and other nonprofit organizations are using financial ratio analysis, a technique used in business. The strengths and weaknesses of ratio analysis are assessed and suggestions are made on how nonprofits can use it most effectively. (Author/MLW)

  17. Spectroscopic analysis of solar and cosmic X-ray spectra. 1: The nature of cosmic X-ray spectra and proposed analytical techniques

    NASA Technical Reports Server (NTRS)

    Walker, A. B. C., Jr.

    1975-01-01

    Techniques for the study of the solar corona are reviewed as an introduction to a discussion of modifications required for the study of cosmic sources. Spectroscopic analysis of individual sources and the interstellar medium is considered. The latter was studied via analysis of its effect on the spectra of selected individual sources. The effects of various characteristics of the ISM, including the presence of grains, molecules, and ionization, are first discussed, and the development of ISM models is described. The expected spectral structure of individual cosmic sources is then reviewed with emphasis on supernovae remnants and binary X-ray sources. The observational and analytical requirements imposed by the characteristics of these sources are identified, and prospects for the analysis of abundances and the study of physical parameters within them are assessed. Prospects for the spectroscopic study of other classes of X-ray sources are also discussed.

  18. Effect size calculation in meta-analyses of psychotherapy outcome research.

    PubMed

    Hoyt, William T; Del Re, A C

    2018-05-01

    Meta-analysis of psychotherapy intervention research normally examines differences between treatment groups and some form of comparison group (e.g., wait list control; alternative treatment group). The effect of treatment is normally quantified as a standardized mean difference (SMD). We describe procedures for computing unbiased estimates of the population SMD from sample data (e.g., group Ms and SDs), and provide guidance about a number of complications that may arise related to effect size computation. These complications include (a) incomplete data in research reports; (b) use of baseline data in computing SMDs and estimating the population standard deviation (σ); (c) combining effect size data from studies using different research designs; and (d) appropriate techniques for analysis of data from studies providing multiple estimates of the effect of interest (i.e., dependent effect sizes). Clinical or Methodological Significance of this article: Meta-analysis is a set of techniques for producing valid summaries of existing research. The initial computational step for meta-analyses of research on intervention outcomes involves computing an effect size quantifying the change attributable to the intervention. We discuss common issues in the computation of effect sizes and provide recommended procedures to address them.
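
    A minimal sketch of the unbiased SMD computation described above (Hedges' g with the usual small-sample correction and its common large-sample variance; not the authors' code):

    ```python
    import numpy as np

    def hedges_g(m_t, sd_t, n_t, m_c, sd_c, n_c):
        """Unbiased standardized mean difference for treatment vs. comparison group."""
        # Pooled standard deviation across the two groups
        sp = np.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
        d = (m_t - m_c) / sp                        # Cohen's d
        j = 1 - 3 / (4 * (n_t + n_c) - 9)           # small-sample bias correction
        g = j * d                                   # Hedges' g
        # Approximate sampling variance of g (common large-sample formula)
        var_g = j**2 * ((n_t + n_c) / (n_t * n_c) + d**2 / (2 * (n_t + n_c)))
        return g, var_g

    # Illustrative group summary statistics (means, SDs, sample sizes):
    print(hedges_g(m_t=24.1, sd_t=6.0, n_t=40, m_c=20.3, sd_c=6.4, n_c=38))
    ```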

  19. Comparative analysis of 2D spatio-temporal visualisation techniques for the pulsed THz-radiation field using an electro-optic crystal

    NASA Astrophysics Data System (ADS)

    Ushakov, A. A.; Chizhov, P. A.; Bukin, V. V.; Garnov, S. V.; Savel'ev, A. B.

    2018-05-01

    Two 2D techniques for visualising the field of pulsed THz radiation ('shadow' and 'interferometric'), which are based on the linear electro-optical effect with application of a ZnTe detector crystal 1 × 1 cm in size, are compared. The noise level and dynamic range for the aforementioned techniques are analysed and their applicability limits are discussed.

  20. A Comparison of the Glass Meta-Analytic Technique with the Hunter-Schmidt Meta-Analytic Technique on Three Studies from the Education Literature.

    ERIC Educational Resources Information Center

    Hough, Susan L.; Hall, Bruce W.

    The meta-analytic techniques of G. V. Glass (1976) and J. E. Hunter and F. L. Schmidt (1977) were compared through their application to three meta-analytic studies from education literature. The following hypotheses were explored: (1) the overall mean effect size would be larger in a Hunter-Schmidt meta-analysis (HSMA) than in a Glass…

  1. Second principle approach to the analysis of unsteady flow and heat transfer in a tube with arc-shaped corrugation

    NASA Astrophysics Data System (ADS)

    Pagliarini, G.; Vocale, P.; Mocerino, A.; Rainieri, S.

    2017-01-01

    Passive convective heat transfer enhancement techniques are well-known and widespread tools for increasing the efficiency of heat transfer equipment. In spite of the ability of the first-principle approach to forecast the macroscopic effects of the passive techniques for heat transfer enhancement, namely the increase of both the overall heat exchanged and the head losses, a first-principle analysis based on energy, momentum, and mass local conservation equations is hardly able to give a comprehensive explanation of how local modifications in the boundary layers contribute to the overall effect. A deeper insight into the heat transfer enhancement mechanisms can instead be obtained within a second-principle approach, through the analysis of the local exergy dissipation phenomena related to heat transfer and fluid flow. To this aim, an analysis based on the second-principle approach, implemented through a careful consideration of the local entropy generation rate, seems the most suitable, since it makes it possible to identify more precisely the causes of the loss of efficiency in the heat transfer process, thus providing a useful guide in the choice of the most suitable heat transfer enhancement techniques.

  2. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

    Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ('capping') and injection of the sample into an immiscible fluid stream ('extraction'). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  3. TRAC Innovative Visualization Techniques

    DTIC Science & Technology

    2016-11-14

    Therefore, TRAC analysts need a way to analyze the effectiveness of their visualization design choices. Currently, TRAC does not have a methodology to analyze visualizations used to support an analysis story. Our research team developed a visualization design methodology to create effective visualizations that support an analysis story. First, we based our methodology on the latest research on design thinking, cognitive learning, and…

  4. The evaluation of meta-analysis techniques for quantifying prescribed fire effects on fuel loadings.

    Treesearch

    Karen E. Kopper; Donald McKenzie; David L. Peterson

    2009-01-01

    Models and effect-size metrics for meta-analysis were compared in four separate meta-analyses quantifying surface fuels after prescribed fires in ponderosa pine (Pinus ponderosa Dougl. ex Laws.) forests of the Western United States. An aggregated data set was compiled from eight published reports that contained data from 65 fire treatment units....

  5. Automated X-Ray Diffraction of Irradiated Materials

    DOE PAGES

    Rodman, John; Lin, Yuewei; Sprouster, David; ...

    2017-10-26

    Synchrotron-based X-ray diffraction (XRD) and small-angle X-ray scattering (SAXS) characterization techniques used on unirradiated and irradiated reactor pressure vessel steels yield large amounts of data. Machine learning techniques, including PCA, offer a novel method of analyzing and visualizing these large data sets in order to determine the effects of chemistry and irradiation conditions on the formation of radiation-induced precipitates. In order to run analysis on these data sets, preprocessing must be carried out to convert the data to a usable format and mask the 2-D detector images to account for experimental variations. Once the data has been preprocessed, it can be organized and visualized using principal component analysis (PCA), multi-dimensional scaling, and k-means clustering. From these techniques, it is shown that sample chemistry has a notable effect on the formation of the radiation-induced precipitates in reactor pressure vessel steels.
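
    A minimal sketch of the PCA-plus-clustering step described above, with a random placeholder matrix standing in for the preprocessed scattering profiles (the real pipeline, data shapes, and cluster count differ):

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 500))    # placeholder: one preprocessed profile per row

    X_std = StandardScaler().fit_transform(X)           # common scale per bin
    scores = PCA(n_components=3).fit_transform(X_std)   # low-dimensional embedding
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
    # 'labels' can then be cross-tabulated against chemistry / irradiation condition.
    ```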

  6. Applicability of NASA contract quality management and failure mode effect analysis procedures to the USGS Outer Continental Shelf oil and gas lease management program

    NASA Technical Reports Server (NTRS)

    Dyer, M. K.; Little, D. G.; Hoard, E. G.; Taylor, A. C.; Campbell, R.

    1972-01-01

    An approach that might be used for determining the applicability of NASA management techniques to benefit almost any type of down-to-earth enterprise is presented. A study was made to determine the following: (1) the practicality of adapting NASA contractual quality management techniques to the U.S. Geological Survey Outer Continental Shelf lease management function; (2) the applicability of failure mode effects analysis to the drilling, production, and delivery systems in use offshore; (3) the impact on industrial offshore operations and onshore management operations required to apply recommended NASA techniques; and (4) the probable changes required in laws or regulations in order to implement recommendations. Several management activities that have been applied to space programs are identified, and their institution for improved management of offshore and onshore oil and gas operations is recommended.

  7. Investigating effects of communications modulation technique on targeting performance

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Eusebio, Gerald; Huling, Edward

    2006-05-01

    One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they would need reliable data. In order to facilitate reliable computational intelligence, we seek to explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we explore the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass phase-shift keying (PSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed as to the effects incurred in targeting performance.

  8. Sampling methods for microbiological analysis of red meat and poultry carcasses.

    PubMed

    Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos

    2004-06-01

    Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.

  9. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than that found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.

  10. Short-term effect of aniline on soil microbial activity: a combined study by isothermal microcalorimetry, glucose analysis, and enzyme assay techniques.

    PubMed

    Chen, Huilun; Zhuang, Rensheng; Yao, Jun; Wang, Fei; Qian, Yiguang; Masakorala, Kanaji; Cai, Minmin; Liu, Haijun

    2014-01-01

    Aniline spill and explosion accidents occur almost every year in China, yet the toxic effect of aniline on soil microbial activity remains largely unexplored. In this study, an isothermal microcalorimetric technique, glucose analysis, and soil enzyme assay techniques were employed to investigate the toxic effect of aniline on microbial activity in Chinese soil for the first time. Soil samples were treated with aniline from 0 to 2.5 mg/g soil to reflect concentrations realistic for a spill. Results from microcalorimetric analysis showed that the introduction of aniline had a significant adverse effect on soil microbial activity at exposure concentrations ≥0.4 mg/g soil (p < 0.05) and ≥0.8 mg/g soil (p < 0.01), and the activity was totally inhibited when the concentration increased to 2.5 mg/g soil. The glucose analysis indicated that aniline significantly decreased the soil microbial respiratory activity at concentrations ≥0.8 mg/g soil (p < 0.05) and ≥1.5 mg/g soil (p < 0.01). Soil enzyme activities for β-glucosidase, urease, acid phosphatase, and dehydrogenase revealed that aniline had a significant effect (p < 0.05) on the nutrient cycling of C, N, and P as well as on the oxidative capacity of soil microorganisms. All of these results showed an intensely toxic effect of aniline on soil microbial activity. The proposed methods can provide toxicological information on aniline's effects on soil microbes from metabolic and biochemical points of view, which are consistent with and correlated to each other.

  11. Novel permutation measures for image encryption algorithms

    NASA Astrophysics Data System (ADS)

    Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.

    2016-10-01

    This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are then introduced to evaluate the effectiveness of permutation techniques. These measures are (1) the Percentage of Adjacent Pixels Count (PAPC) and (2) the Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied to several standard images, and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of the permutation technique.
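
    The paper's exact definitions of PAPC and DBAP are not reproduced in the abstract; the sketch below is one plausible reading, scoring how far originally adjacent pixels end up after a permutation of an n × n image:

    ```python
    import numpy as np

    def adjacency_measures(perm, n):
        """One interpretation of PAPC/DBAP for a flat-index permutation of an
        n x n image: PAPC is read here as the percentage of originally adjacent
        (horizontal) pixel pairs that remain adjacent after permutation, DBAP as
        their mean post-permutation distance. The paper's definitions may differ.
        """
        rows, cols = np.divmod(perm, n)          # new coordinates of each pixel
        # Horizontally adjacent pairs in the original image: (i, i+1) per row
        src = np.arange(n * n).reshape(n, n)[:, :-1].ravel()
        dst = src + 1
        d = np.hypot(rows[src] - rows[dst], cols[src] - cols[dst])
        papc = 100.0 * np.mean(d <= 1.0)         # pairs still 4-adjacent, percent
        dbap = d.mean()                          # mean distance between old neighbors
        return papc, dbap

    rng = np.random.default_rng(1)
    print(adjacency_measures(rng.permutation(64 * 64), 64))
    ```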

  12. Simulation/Emulation Techniques: Compressing Schedules With Parallel (HW/SW) Development

    NASA Technical Reports Server (NTRS)

    Mangieri, Mark L.; Hoang, June

    2014-01-01

    NASA has always been in the business of balancing new technologies and techniques to achieve human space travel objectives. NASA's Kedalion engineering analysis lab has been validating and using many contemporary avionics HW/SW development and integration techniques, which represent new paradigms to NASA's heritage culture. Kedalion has validated many of the Orion HW/SW engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, inserting new techniques and skills into the Multi-Purpose Crew Vehicle (MPCV) Orion program. Using contemporary agile techniques, commercial-off-the-shelf (COTS) products, early rapid prototyping, in-house expertise and tools, and extensive use of simulators and emulators, NASA has achieved cost-effective paradigms that are currently serving the Orion program effectively. Elements of long-lead custom hardware on the Orion program have necessitated early use of simulators and emulators in advance of deliverable hardware to achieve parallel design and development on a compressed schedule.

  13. Interest rate next-day variation prediction based on hybrid feedforward neural network, particle swarm optimization, and multiresolution techniques

    NASA Astrophysics Data System (ADS)

    Lahmiri, Salim

    2016-02-01

    Multiresolution analysis techniques, including the continuous wavelet transform, empirical mode decomposition, and variational mode decomposition, are tested in the context of interest rate next-day variation prediction. In particular, multiresolution analysis techniques are used to decompose the actual interest rate variation, and a feedforward neural network is used for training and prediction. A particle swarm optimization technique is adopted to optimize its initial weights. For comparison purposes, an autoregressive moving average model, a random walk process, and the naive model are used as the main reference models. In order to show the feasibility of the presented hybrid models that combine multiresolution analysis techniques and a feedforward neural network optimized by particle swarm optimization, we used a set of six illustrative interest rates, including Moody's seasoned Aaa corporate bond yield, Moody's seasoned Baa corporate bond yield, 3-month, 6-month, and 1-year treasury bills, and the effective federal fund rate. The forecasting results show that all multiresolution-based prediction systems outperform the conventional reference models on the criteria of mean absolute error, mean absolute deviation, and root mean-squared error. Therefore, it is advantageous to adopt hybrid multiresolution techniques and soft computing models to forecast interest rate daily variations, as they provide good forecasting performance.

  14. Investigation of historical metal objects using Laser Induced Breakdown Spectroscopy (LIBS) technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Kareem, O.; Ghoneim, M.; Harith, M. A.

    2011-09-22

    Analysis of metal objects is a necessary step in establishing an appropriate conservation treatment for an object or in following up the results of suggested treatments. The main considerations in selecting a method for the investigation and analysis of metal objects are the diagnostic power, representative sampling, reproducibility, destructive nature/invasiveness of the analysis, and accessibility of the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal objects. In this study, various historical metal objects collected from different museums and excavations in Egypt were investigated using the LIBS technique. For evaluating the usefulness of the suggested analytical protocol, the same metal objects were also investigated by other methods, such as scanning electron microscopy with an energy-dispersive X-ray analyzer (SEM-EDX) and X-ray diffraction (XRD). This study confirms that the LIBS technique can be used safely for investigating historical metal objects. LIBS analysis can quickly provide information on the qualitative and semi-quantitative elemental content of different metal objects and support their characterization and classification. It is a practically non-destructive technique with the critical advantage of being applicable in situ, thereby avoiding sampling and sample preparation. It can be a dependable, satisfactory, and effective method for the low-cost study of archaeological and historical metals. However, it must be taken into consideration that the corrosion of metal leads to material alteration and possible loss of certain metals in the form of soluble salts. Certain corrosion products are known to leach out of the object, and therefore their low content does not necessarily reflect the composition of the metal at the time of the object's manufacture. Another point to take into consideration is the heterogeneity of a metal alloy object, which often results from poor mixing of the alloy constituents. Further research is needed to investigate and determine the most appropriate and effective approaches and methods for the conservation of these metal objects.

  15. Induction motor broken rotor bar fault location detection through envelope analysis of start-up current using Hilbert transform

    NASA Astrophysics Data System (ADS)

    Abd-el-Malek, Mina; Abdelsalam, Ahmed K.; Hassan, Ola E.

    2017-09-01

    Robustness, low running cost, and reduced maintenance have made induction motors (IMs) the leading choice in industrial drive systems. Broken rotor bars (BRBs) are an important fault that needs to be assessed early to minimize maintenance cost and labor time. The majority of recent BRB fault diagnostic techniques focus on differentiating between healthy and faulty rotor cages. In this paper, a new technique is proposed for detecting the location of the broken bar in the rotor. The proposed technique relies on monitoring certain statistical parameters estimated from the analysis of the start-up stator current envelope. The envelope of the signal is obtained using the Hilbert transform (HT). The proposed technique offers a non-invasive, computationally fast, and accurate location diagnostic process. Various simulation scenarios are presented that validate the effectiveness of the proposed technique.
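
    A minimal sketch of the envelope step only (not the full diagnostic), using SciPy's Hilbert transform on a synthetic start-up current with a fault-like modulation:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 5000.0                                   # assumed sampling rate, Hz
    t = np.arange(0, 1.0, 1 / fs)
    # Synthetic stator current: 50 Hz carrier with a slow fault-like modulation
    current = (1 + 0.08 * np.sin(2 * np.pi * 4 * t)) * np.sin(2 * np.pi * 50 * t)

    envelope = np.abs(hilbert(current))           # instantaneous amplitude
    features = {                                  # statistics one might monitor
        "mean": envelope.mean(),
        "std": envelope.std(),
        "skewness": ((envelope - envelope.mean()) ** 3).mean() / envelope.std() ** 3,
    }
    print(features)
    ```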

  16. 38 CFR 1.921 - Analysis of costs.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    38 Pensions, Bonuses, and Veterans' Relief; Standards for Collection of Claims; § 1.921 Analysis of costs. VA collection procedures should provide for... effectiveness of alternative collection techniques, establish guidelines with respect to points at which costs...

  17. A Review of Meta-Analysis Packages in R

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Hennessy, Emily A.; Tanner-Smith, Emily E.

    2017-01-01

    Meta-analysis is a statistical technique that allows an analyst to synthesize effect sizes from multiple primary studies. To estimate meta-analysis models, the open-source statistical environment R is quickly becoming a popular choice. The meta-analytic community has contributed to this growth by developing numerous packages specific to…

  18. A Comparison of Holistic versus Decomposed Rating of Position Analysis Questionnaire Work Dimensions.

    ERIC Educational Resources Information Center

    Butler, Stephanie K.; Harvey, Robert J.

    1988-01-01

    Examined technique for improving cost-effectiveness of Position Analysis Questionnaire (PAQ) in job analysis. Professional job analysts, industrial psychology graduate students familiar with PAQ, and PAQ-unfamiliar undergraduates made direct holistic ratings of PAQ dimensions for four familiar jobs. Comparison of holistic ratings with decomposed…

  19. Analysis of local delaminations caused by angle ply matrix cracks

    NASA Technical Reports Server (NTRS)

    Salpekar, Satish A.; Obrien, T. Kevin; Shivakumar, K. N.

    1993-01-01

    Two different families of graphite/epoxy laminates with similar layups but different stacking sequences, (0/θ/−θ)_s and (−θ/θ/0)_s, were analyzed using three-dimensional finite element analysis for θ = 15 and 30 degrees. Delaminations were modeled in the −θ/θ interface, bounded by a matrix crack and the stress-free edge. The total strain energy release rate, G, along the delamination front was computed using three different techniques: the virtual crack closure technique (VCCT), the equivalent domain integral (EDI) technique, and a global energy balance technique. The opening-mode component of the strain energy release rate, G_I, along the delamination front was also computed for various delamination lengths using VCCT. The effect of residual thermal and moisture stresses on G was evaluated.

  20. Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Guyer, T.; Stringfellow, G. B.

    1982-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed among the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells made from Semix material. Quantitative microscopy measurements give statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.

  1. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (a C-arm X-ray machine) is described.
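
    The fuzzy and grey relational extensions are not detailed in the abstract; as a baseline, the crisp RPN scoring that improved FMEA methods build on looks like this (the failure modes and ratings below are illustrative only):

    ```python
    # Classical FMEA: Risk Priority Number = severity x occurrence x detectability,
    # each rated on a 1-10 scale. The cited article replaces this crisp product
    # with fuzzy membership functions and grey relational grades (not shown here).
    failure_modes = {
        "wrong exposure setting":   (8, 4, 3),
        "patient mispositioning":   (6, 5, 4),
        "footswitch misactivation": (7, 2, 6),
    }

    rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
    for name, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
        print(f"RPN {value:4d}  {name}")   # highest-priority risks first
    ```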

  2. Performance Analysis of Garbage Collection and Dynamic Reordering in a Lisp System. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Llames, Rene Lim

    1991-01-01

    Generation-based garbage collection and dynamic reordering of objects are two techniques for improving the efficiency of memory management in Lisp and similar dynamic language systems. An analysis of the effect of generation configuration is presented, focusing on the effect of the number of generations and their capacities. Analytic timing and survival models are used to represent garbage collection runtime and to derive structural results on its behavior. The survival model provides bounds on the age of objects surviving a garbage collection at a particular level. Empirical results show that execution time is most sensitive to the capacity of the youngest generation. A technique called scanning for transport statistics, for evaluating the effectiveness of reordering independent of main memory size, is presented.

  3. Application of the shifted excitation Raman difference spectroscopy (SERDS) to the analysis of trace amounts of methanol in red wines

    NASA Astrophysics Data System (ADS)

    Volodin, Boris; Dolgy, Sergei; Ban, Vladimir S.; Gracin, Davor; Juraić, Krunoslav; Gracin, Leo

    2014-03-01

    Shifted excitation Raman difference spectroscopy (SERDS) has proven an effective method for performing Raman analysis of fluorescent samples. This technique achieves excellent signal-to-noise performance with shorter excitation wavelengths, thus taking full advantage of the superior signal strength afforded by shorter excitation wavelengths and the superior performance, combined with lower cost, delivered by silicon CCDs. The technique is enabled by the use of two closely spaced fixed-wavelength laser diode sources stabilized with volume Bragg gratings (VBGs). A side-by-side comparison reveals that the SERDS technique delivers a superior signal-to-noise ratio and better detection limits in most situations, even when a longer excitation wavelength is employed to eliminate the fluorescence. We have applied the SERDS technique to the quantitative analysis of trace amounts of methanol in red wines, which is an important task in quality control within the wine industry and is currently difficult to perform in the field. So far, conventional Raman spectroscopy analysis of red wines has been impractical due to the high degree of fluorescence.
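
    A toy illustration of the SERDS principle, assuming synthetic spectra: two acquisitions with slightly shifted excitation share the same broad fluorescence background, so subtracting them cancels the background and retains only the shifted Raman peaks (real processing also reconstructs a conventional spectrum from the difference, which is omitted here):

    ```python
    import numpy as np

    wn = np.linspace(200, 1800, 1600)                 # Raman shift axis, cm^-1
    rng = np.random.default_rng(0)

    def peak(center):
        """Narrow Gaussian Raman band at the given shift."""
        return np.exp(-0.5 * ((wn - center) / 6.0) ** 2)

    background = 50 * np.exp(-wn / 900)               # broad fluorescence background
    shift = 8.0                                       # excitation shift, cm^-1

    spec_a = background + peak(880) + 0.7 * peak(1050) \
             + rng.normal(0, 0.05, wn.size)
    spec_b = background + peak(880 - shift) + 0.7 * peak(1050 - shift) \
             + rng.normal(0, 0.05, wn.size)

    serds = spec_a - spec_b                           # fluorescence cancels here
    ```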

  4. [Application of text mining approach to pre-education prior to clinical practice].

    PubMed

    Koinuma, Masayoshi; Koike, Katsuya; Nakamura, Hitoshi

    2008-06-01

    We developed a new survey analysis technique to understand students' actual aims, for effective pretraining prior to clinical practice. We asked third-year undergraduate students to write fixed-style complete and free sentences on "preparation of drug dispensing." Then, we converted their sentence data into text format and performed Japanese-language morphologic analysis on the data using language analysis software. We classified key words, which were created on the basis of the word-class information from the Japanese-language morphologic analysis, into categories based on causes and characteristics. In addition, we classified the characteristics into six categories comprising concepts such as "knowledge," "skill and attitude," and "image" with the KJ method technique. The results showed that students' awareness of "preparation of drug dispensing" tended to be approximately three-fold more frequent in "skill and attitude," "risk," etc. than in "knowledge." Regarding the characteristics in the "image" category, words like "hard," "challenging," "responsibility," and "life" occurred frequently. The results of correspondence analysis showed that the characteristics of the words "knowledge" and "skill and attitude" were independent. As a result of developing a cause-and-effect diagram, it was demonstrated that the phrase "hanging tough" described most of the various factors. We thus could understand students' actual feelings by applying text mining as a new survey analysis technique.

  5. Edge compression techniques for visualization of dense directed graphs.

    PubMed

    Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher

    2013-12-01

    We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules' (groups of nodes) such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition, which permits internal structure in modules and allows them to be nested; and Power Graph Analysis, which further allows edges to cross module boundaries. These techniques all share the same goal, to compress the set of edges that need to be rendered to fully convey connectivity, but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothesized trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by, and discuss in particular, the application to software dependency analysis.
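
    A minimal sketch of the simplest technique named above, grouping nodes of a directed graph whose incoming and outgoing neighbor sets are identical (the edge list is illustrative only):

    ```python
    from collections import defaultdict

    edges = [("a", "x"), ("b", "x"), ("a", "y"), ("b", "y"), ("c", "y")]

    out_nbrs, in_nbrs = defaultdict(set), defaultdict(set)
    for u, v in edges:
        out_nbrs[u].add(v)
        in_nbrs[v].add(u)

    nodes = set(out_nbrs) | set(in_nbrs)
    modules = defaultdict(list)
    for n in nodes:
        sig = (frozenset(out_nbrs[n]), frozenset(in_nbrs[n]))
        modules[sig].append(n)          # identical neighbor sets -> same module

    # Groups 'a' and 'b' into one module; order of modules may vary.
    print([sorted(m) for m in modules.values()])
    ```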

  6. Concrete Condition Assessment Using Impact-Echo Method and Extreme Learning Machines

    PubMed Central

    Zhang, Jing-Kui; Yan, Weizhong; Cui, De-Mi

    2016-01-01

    The impact-echo (IE) method is a popular non-destructive testing (NDT) technique widely used for measuring the thickness of plate-like structures and for detecting certain defects inside concrete elements or structures. However, the IE method is not effective for full condition assessment (i.e., defect detection, defect diagnosis, defect sizing and location), because the simple frequency spectrum analysis involved in the existing IE method is not sufficient to capture the IE signal patterns associated with different conditions. In this paper, we attempt to enhance the IE technique and enable it to perform full condition assessment of concrete elements by introducing advanced machine learning techniques for comprehensive analysis and pattern recognition of IE signals. Specifically, we use wavelet decomposition to extract signatures or features from the raw IE signals and apply the extreme learning machine, one of the recently developed machine learning techniques, as the classification model for full condition assessment. To validate the capabilities of the proposed method, we build a number of specimens with various types, sizes, and locations of defects and perform IE testing on these specimens in a lab environment. Based on analysis of the collected IE signals using the proposed machine-learning-based IE method, we demonstrate that the proposed method is effective in performing full condition assessment of concrete elements or structures. PMID:27023563
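
    A minimal sketch of the two stages described above, assuming PyWavelets for the feature step; the ELM here is a generic random-hidden-layer implementation, not the authors' model (for classification, y would be one-hot label vectors):

    ```python
    import numpy as np
    import pywt

    def wavelet_features(signal, wavelet="db4", level=4):
        """Energy of each wavelet sub-band as a compact signature of one IE signal."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return np.array([np.sum(c ** 2) for c in coeffs])

    class ELM:
        """Minimal extreme learning machine: random hidden layer + linear readout."""
        def __init__(self, n_hidden=50, seed=0):
            self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

        def fit(self, X, y):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            H = np.tanh(X @ self.W + self.b)          # random nonlinear features
            # Output weights by least squares -- the only trained parameters
            self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
            return self

        def predict(self, X):
            return np.tanh(X @ self.W + self.b) @ self.beta
    ```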

  7. Using multiple group modeling to test moderators in meta-analysis.

    PubMed

    Schoemann, Alexander M

    2016-12-01

    Meta-analysis is a popular and flexible analysis that can be fit in many modeling frameworks. Two methods of fitting meta-analyses that are growing in popularity are structural equation modeling (SEM) and multilevel modeling (MLM). By using SEM or MLM to fit a meta-analysis, researchers have access to powerful techniques associated with SEM and MLM. This paper details how to use one such technique, multiple group analysis, to test categorical moderators in meta-analysis. In a multiple group meta-analysis, a model is fit to each level of the moderator simultaneously. By constraining parameters across groups, any model parameter can be tested for equality. Using multiple groups to test for moderators is especially relevant in random-effects meta-analysis, where both the mean and the between-studies variance of the effect size may be compared across groups. A simulation study and the analysis of a real data set are used to illustrate multiple group modeling with both SEM and MLM. Issues related to multiple group meta-analysis and future directions for research are discussed. Copyright © 2016 John Wiley & Sons, Ltd.
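
    The paper works in SEM and MLM frameworks, which are not reproduced here; as a compact stand-in, the sketch below tests a categorical moderator with a fixed-effect between-groups Q statistic (the data are illustrative):

    ```python
    import numpy as np

    def pooled_fixed(effects, variances):
        """Inverse-variance pooled effect for one subgroup and its variance."""
        w = 1.0 / np.asarray(variances)
        return np.sum(w * np.asarray(effects)) / np.sum(w), 1.0 / np.sum(w)

    def q_between(groups):
        """Moderator test: Q_between ~ chi-square with (k - 1) df under the null
        of equal subgroup effects (fixed-effect version, for brevity).
        groups: list of (effects, variances) tuples, one per moderator level.
        """
        ests, vars_ = zip(*(pooled_fixed(e, v) for e, v in groups))
        w = 1.0 / np.asarray(vars_)
        grand = np.sum(w * ests) / np.sum(w)
        return float(np.sum(w * (np.asarray(ests) - grand) ** 2))

    q = q_between([([0.50, 0.60, 0.40], [0.02, 0.03, 0.02]),
                   ([0.10, 0.20, 0.15], [0.02, 0.02, 0.03])])
    print(f"Q_between = {q:.2f} on 1 df")
    ```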

  8. Modelling and multi objective optimization of WEDM of commercially Monel super alloy using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu

    2016-09-01

    In this research work, a multi-response optimization technique has been developed using traditional desirability analysis and non-traditional particle swarm optimization techniques (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters such as pulse-on time (TON), pulse-off time (TOFF), peak current (IP), and wire feed (WF) on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis. The developed models can be used by machinists to predict the MRR and SR over a wide range of input parameters. The optimization of multiple responses has been done to satisfy the priorities of multiple users by using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) is also applied to investigate the effect of influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement has been demonstrated.

  9. Methodologies for launcher-payload coupled dynamic analysis

    NASA Astrophysics Data System (ADS)

    Fransen, S. H. J. A.

    2012-06-01

    An important step in the design and verification process of spacecraft structures is the coupled dynamic analysis with the launch vehicle in the low-frequency domain, also referred to as coupled loads analysis (CLA). The objective of such analyses is the computation of the dynamic environment of the spacecraft (payload) in terms of interface accelerations, interface forces, center of gravity (CoG) accelerations, as well as the internal state of stress. In order to perform an efficient, fast, and accurate launcher-payload coupled dynamic analysis, various methodologies have been applied and developed. The methods are related to substructuring techniques, data recovery techniques, the effects of prestress and fluids, and time integration problems. The aim of this paper is to give an overview of these methodologies and to show why, how, and where these techniques can be used in the process of launcher-payload coupled dynamic analysis. In addition, it is shown how these methodologies fit together in a library of procedures which can be used with the MSC.Nastran™ solution sequences.

  10. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated quality risk management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. Hazard Analysis Critical Control Point, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished-product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each critical control point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HSC manipulation process performed at our blood center. The data analysis showed that the hazards with higher RPN values, and thus greater impact on the process, were loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while the other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was completed by a labeling system with forms designed to comply with the standards in force and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.

  11. On-Line Monitoring and Diagnostics of the Integrity of Nuclear Plant Steam Generators and Heat Exchangers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belle R. Upadhyaya; J. Wesley Hines

    2004-09-27

    Integrity monitoring and flaw diagnostics of flat beams and tubular structures were investigated in this research task using guided acoustic signals. A piezo-sensor suite was deployed to activate and collect Lamb wave signals that propagate along metallic specimens. The dispersion curves of Lamb waves along plate and tubular structures were generated through numerical analysis. Several advanced techniques were explored to extract representative features from acoustic time series. Among them, the Hilbert-Huang transform (HHT) is a recently developed technique for the analysis of nonlinear and transient signals. A moving window method was introduced to generate local peak characteristics from acoustic time series, and a zooming window technique was developed to localize structural flaws. Time-frequency analysis and pattern recognition techniques were combined to classify structural defects in brass tubes. Several types of flaws in brass tubes were tested, both in air and in water. The techniques also proved to be effective under background/process noise. A detailed theoretical analysis of Lamb wave propagation was performed, and simulations were carried out using the finite element software system ABAQUS. This analytical study confirmed the behavior of the acoustic signals acquired from the experimental studies. The report presents the background for the analysis of acoustic signals acquired from piezo-electric transducers for structural defect monitoring. A comparison of the use of time-frequency techniques, including the Hilbert-Huang transform, is presented. The report presents the theoretical study of Lamb wave propagation in flat beams and tubular structures, and the need for mode separation in order to effectively perform defect diagnosis. The results of an extensive experimental study of detection, location, and isolation of structural defects in flat aluminum beams and brass tubes are presented. The results of this research show the feasibility of on-line monitoring of small structural flaws by the use of transient and nonlinear acoustic signal analysis, and its implementation by the proper design of a piezo-electric transducer suite.

  12. Effects of implant system, impression technique, and impression material on accuracy of the working cast.

    PubMed

    Wegner, Kerstin; Weskott, Katharina; Zenginel, Martha; Rehmann, Peter; Wöstmann, Bernd

    2013-01-01

    This in vitro study aimed to identify the effects of the implant system, impression technique, and impression material on the transfer accuracy of implant impressions. The null hypothesis tested was that, in vitro and within the parameters of the experiment, the spatial relationship of a working cast to the placement of implants is not related to (1) the implant system, (2) the impression technique, or (3) the impression material. A steel maxilla was used as a reference model. Six implants of two different implant systems (Standard Plus, Straumann; Semados, Bego) were fixed in the reference model. The target variables were: three-dimensional (3D) shift in all directions, implant axis direction, and rotation. The target variables were assessed using a 3D coordinate measuring machine, and the respective deviations of the plaster models from the nominal values of the reference model were calculated. Two different impression techniques (reposition/pickup) and four impression materials (Aquasil Ultra, Flexitime, Impregum Penta, P2 Magnum 360) were investigated. In all, 80 implant impressions for each implant system were taken. Statistical analysis was performed using multivariate analysis of variance. The implant system significantly influenced the transfer accuracy for most spatial dimensions, including the overall 3D shift and implant axis direction. There was no significant difference between the two implant systems with regard to rotation. Multivariate analysis of variance showed a significant effect on transfer accuracy only for the implant system. Within the limits of the present study, it can be concluded that the transfer accuracy of the intraoral implant position on the working cast is far more dependent on the implant system than on the selection of a specific impression technique or material.

  13. The Search for an Effective Clinical Behavior Analysis: The Nonlinear Thinking of Israel Goldiamond

    ERIC Educational Resources Information Center

    Layng, T. V. Joe

    2009-01-01

    This paper has two purposes; the first is to reintroduce Goldiamond's constructional approach to clinical behavior analysis and to the field of behavior analysis as a whole, which, unfortunately, remains largely unaware of his nonlinear functional analysis and its implications. The approach is not simply a set of clinical techniques; instead it…

  14. Propagating Resource Constraints Using Mutual Exclusion Reasoning

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Sanchez, Romeo; Do, Minh B.; Clancy, Daniel (Technical Monitor)

    2001-01-01

    One of the most recent techniques for propagating resource constraints in constraint-based scheduling is the Energy Constraint. This technique focuses on precedence-based scheduling, where precedence relations are taken into account rather than the absolute positions of activities. Although this particular technique proved to be efficient on discrete unary resources, it provides only loose bounds for jobs using discrete multi-capacity resources. In this paper we show how mutual exclusion reasoning can be used to propagate time bounds for activities using discrete resources. We show, through both examples and an empirical study, that our technique, based on critical path analysis and mutex reasoning, is just as effective on unary resources and more effective on multi-capacity resources.

  15. ANALYSIS OF METHODS FOR DETECTING THE PROXIMITY EFFECT IN QUASAR SPECTRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Aglio, Aldo; Gnedin, Nickolay Y., E-mail: adaglio@aip.d

    Using numerical simulations of structure formation, we investigate several methods for determining the strength of the proximity effect in the H I Lyα forest. We analyze three high-resolution (≈10 kpc) redshift snapshots (z̄ = 4, 3, and 2.25) of a Hydro-Particle-Mesh simulation to obtain realistic absorption spectra of the H I Lyα forest. We model the proximity effect along the simulated sight lines with a simple analytical prescription based on the assumed quasar luminosity and the intensity of the cosmic UV background (UVB). We begin our analysis investigating the intrinsic biases thought to arise in the widely adopted standard technique of combining multiple lines of sight when searching for the proximity effect. We confirm the existence of these biases, albeit smaller than previously predicted with simple Monte Carlo simulations. We then concentrate on the analysis of the proximity effect along individual lines of sight. After determining its strength with a fiducial value of the UVB intensity, we construct the proximity effect strength distribution (PESD). We confirm that the PESD inferred from the simple averaging technique accurately recovers the input strength of the proximity effect at all redshifts. Moreover, the PESD closely follows the behaviors found in observed samples of quasar spectra. However, the PESD obtained from our new simulated sight lines presents some differences to that of simple Monte Carlo simulations. At all redshifts, we find a smaller dispersion of the strength parameters, the source of the corresponding smaller biases found when combining multiple lines of sight. After developing three new theoretical methods for recovering the strength of the proximity effect on individual lines of sight, we compare their accuracy to the PESD from the simple averaging technique. All our new approaches are based on the maximization of the likelihood function, albeit invoking some modifications. The new techniques presented here, in spite of their complexity, fail to recover the input proximity effect in an unbiased way, presumably due to some (unknown) higher-order correlations in the spectrum. Thus, employing complex three-dimensional simulations, we provide strong evidence in favor of the PESD obtained from the simple averaging technique, as a method of estimating the UVB intensity, free of any intrinsic biases.

  16. Effect of geometrical parameters on pressure distributions of impulse manufacturing technologies

    NASA Astrophysics Data System (ADS)

    Brune, Ryan Carl

    Impulse manufacturing techniques constitute a growing field of methods that utilize high-intensity pressure events to conduct useful mechanical operations. As interest in applying this technology continues to grow, greater understanding must be achieved with respect to output pressure events in both magnitude and distribution. In order to address this need, a novel pressure measurement has been developed called the Profile Indentation Pressure Evaluation (PIPE) method that systematically analyzes indentation patterns created with impulse events. Correlation with quasi-static test data and use of software-assisted analysis techniques allows for colorized pressure maps to be generated for both electromagnetic and vaporizing foil actuator (VFA) impulse forming events. Development of this technique aided introduction of a design method for electromagnetic path actuator systems, where key geometrical variables are considered using a newly developed analysis method, which is called the Path Actuator Proximal Array (PAPA) pressure model. This model considers key current distribution and proximity effects and interprets generated pressure by considering the adjacent conductor surfaces as proximal arrays of individual conductors. According to PIPE output pressure analysis, the PAPA model provides a reliable prediction of generated pressure for path actuator systems as local geometry is changed. Associated mechanical calculations allow for pressure requirements to be calculated for shearing, flanging, and hemming operations, providing a design process for such cases. Additionally, geometry effect is investigated through a formability enhancement study using VFA metalworking techniques. A conical die assembly is utilized with both VFA high velocity and traditional quasi-static test methods on varied Hasek-type sample geometries to elicit strain states consistent with different locations on a forming limit diagram. Digital image correlation techniques are utilized to measure major and minor strains for each sample type to compare limit strain results. Overall testing indicated decreased formability at high velocity for 304 DDQ stainless steel and increased formability at high velocity for 3003-H14 aluminum. Microstructural and fractographic analysis helped dissect and analyze the observed differences in these cases. Overall, these studies comprehensively explore the effects of geometrical parameters on magnitude and distribution of impulse manufacturing generated pressure, establishing key guidelines and models for continued development and implementation in commercial applications.

  17. Language Sample Analysis and Elicitation Technique Effects in Bilingual Children with and without Language Impairment

    ERIC Educational Resources Information Center

    Kapantzoglou, Maria; Fergadiotis, Gerasimos; Restrepo, M. Adelaida

    2017-01-01

    Purpose: This study examined whether the language sample elicitation technique (i.e., storytelling and story-retelling tasks with pictorial support) affects lexical diversity (D), grammaticality (grammatical errors per communication unit [GE/CU]), sentence length (mean length of utterance in words [MLUw]), and sentence complexity (subordination…

  18. UMEL: a new regression tool to identify measurement peaks in LIDAR/DIAL systems for environmental physics applications.

    PubMed

    Gelfusa, M; Gaudio, P; Malizia, A; Murari, A; Vega, J; Richetta, M; Gonzalez, S

    2014-06-01

    Recently, surveying large areas in an automatic way, for early detection of both harmful chemical agents and forest fires, has become a strategic objective of defence and public health organisations. The LIDAR and DIAL techniques are widely recognized as a cost-effective alternative to monitor large portions of the atmosphere. To maximize the effectiveness of the measurements and to guarantee reliable monitoring of large areas, new data analysis techniques are required. In this paper, an original tool, the Universal Multi Event Locator, is applied to the problem of automatically identifying the time location of peaks in LIDAR and DIAL measurements for environmental physics applications. This analysis technique improves various aspects of the measurements, ranging from the resilience to drift in the laser sources to the increase of the system sensitivity. The method is also fully general, purely software, and can therefore be applied to a large variety of problems without any additional cost. The potential of the proposed technique is exemplified with the help of data of various instruments acquired during several experimental campaigns in the field.
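
    The abstract does not spell out UMEL's internals, so the sketch below only illustrates the task it automates: locating echo peaks in a noisy lidar return. A prominence-based detector from scipy stands in for the regression-based locator, and the signal is synthetic.

```python
# Generic peak location in a synthetic lidar/DIAL return (stand-in for UMEL).
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
t = np.linspace(0.0, 100.0, 2000)                     # time / range gate [us]
signal = (np.exp(-0.5 * ((t - 30) / 1.5) ** 2) +      # plume echo
          0.6 * np.exp(-0.5 * ((t - 62) / 2.5) ** 2)) # second, weaker echo
noisy = signal + rng.normal(0.0, 0.05, t.size)

# Prominence-based detection is robust to slow drifts in the laser baseline
peaks, props = find_peaks(noisy, prominence=0.3, width=5)
for i, p in enumerate(peaks):
    print(f"peak {i}: t = {t[p]:.1f} us, prominence = {props['prominences'][i]:.2f}")
```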

  19. Production of cromolyn sodium microparticles for aerosol delivery by supercritical assisted atomization.

    PubMed

    Reverchon, Ernesto; Adami, Renata; Caputo, Giuseppe

    2007-12-21

    The purpose of this study was to produce cromolyn sodium (CS) micrometric particles with controlled particle size (PS) and PS distribution (PSD) suitable for aerosol delivery, using a supercritical fluids-based process. CS was micronized using the supercritical assisted atomization (SAA) technique at different solute concentrations in water and different precipitation temperatures. Two techniques were used to measure PS and PSD of produced particles: scanning electron microscopy image analysis and laser scattering analysis. The 2 techniques were compared to provide a complete description of the powder obtained. High-performance liquid chromatography analysis was used to verify the absence of degradation of CS after micronization; differential scanning calorimetry, thermogravimetric analysis (TGA), and X-ray analysis were performed to study the effect of operative conditions on the crystalline structure and on the water content of SAA micronized particles. The CS particles obtained were spherical, with 50% to 66% of the particle volume at diameters between 1 and 5 microm. The precipitation temperature had no significant effect on PSD, but high drying temperatures led to product degradation. Increasing the concentration of CS in water solution produced an increase in PS of the micronized particles. TGA showed that the micronized CS had a different hydration state than the untreated CS did. The micronized product was stable after 12 months of storage, and no modifications in structure, morphology, or crystallinity were detected. In conclusion, SAA is an efficient technique for micronization of CS, and stable spherical amorphous particles suitable for aerosol delivery can be produced.

  20. Thermal Analysis of Brazing Seal and Sterilizing Technique to Break Contamination Chain for Mars Sample Return

    NASA Technical Reports Server (NTRS)

    Bao, Xiaoqi; Badescu, Mircea; Bar-Cohen, Yoseph

    2015-01-01

    The potential to return Martian samples to Earth for extensive analysis is of great interest to the planetary science community. It is important to ensure that the mission would securely contain any microbes that may possibly exist on Mars, so that they could not cause any adverse effects on Earth's environment. A brazing sealing and sterilizing technique has been proposed to break the Mars-to-Earth contamination chain. Thermal analysis of the brazing process was conducted for several conceptual designs that apply the technique. Control of the increase of the temperature of the Martian samples is a challenge. The temperature profiles of the Martian samples being sealed in the container were predicted by finite element thermal models. The results show that the sealing and sterilization process can be controlled such that the samples' temperature is maintained below the potentially required level, and that the brazing technique is a feasible approach to breaking the contamination chain.

  1. Preliminary geological investigation of AIS data at Mary Kathleen, Queensland, Australia

    NASA Technical Reports Server (NTRS)

    Huntington, J. F.; Green, A. A.; Craig, M. D.; Cocks, T. D.

    1986-01-01

    The Airborne Imaging Spectrometer (AIS) was flown over granitic, volcanic, and calc-silicate terrain around the Mary Kathleen Uranium Mine in Queensland, in a test of its mineralogical mapping capabilities. An analysis strategy and restoration and enhancement techniques were developed to process the 128-band AIS data. A preliminary analysis of one of three AIS flight lines shows that the data contain considerable spectral variation but are also contaminated by second-order leakage of radiation from the near-infrared region. This makes the recognition of expected spectral absorption shapes very difficult. The effect appears worst in terrains containing considerable vegetation. Techniques that try to predict this supplementary radiation, coupled with the log residual analytical technique, show that expected mineral absorption spectra can be derived. The techniques suggest that, with additional refinement of the correction procedures, the Australian AIS data may be revised. Application of the log residual analysis method has proved very successful on the Cuprite, Nevada data set, and for highlighting the alunite, kaolinite, and SiOH mineralogy.

  2. Atomic spectrometry methods for wine analysis: a critical evaluation and discussion of recent applications.

    PubMed

    Grindlay, Guillermo; Mora, Juan; Gras, Luis; de Loos-Vollebregt, Margaretha T C

    2011-04-08

    The analysis of wine is of great importance since wine components strongly determine its stability and its organoleptic and nutritional characteristics. In addition, wine analysis is also important to prevent fraud and to assess toxicological issues. Among the different analytical techniques described in the literature, atomic spectrometry has been traditionally employed for elemental wine analysis due to its simplicity and good analytical figures of merit. The scope of this review is to summarize the main advantages and drawbacks of various atomic spectrometry techniques for elemental wine analysis. Special attention is paid to interferences (i.e. matrix effects) affecting the analysis as well as the strategies available to mitigate them. Finally, latest studies about wine speciation are briefly discussed.

  3. Training Staff Serving Clients with Intellectual Disabilities: A Meta-Analysis of Aspects Determining Effectiveness

    ERIC Educational Resources Information Center

    van Oorsouw, Wietske M. W. J.; Embregts, Petri J. C. M.; Bosman, Anna M. T.; Jahoda, Andrew

    2009-01-01

    The last decades have seen increased emphasis on the quality of training for direct-care staff serving people with intellectual disabilities. Nevertheless, it is unclear what the key aspects of effective training are. Therefore, the aim of the present meta-analysis was to establish the ingredients (i.e., goals, format, and techniques) for staff…

  4. The Contribution of Human Factors in Military System Development: Methodological Considerations

    DTIC Science & Technology

    1980-07-01

    [Fragmentary index and table-of-contents entries recovered from this record: Risk/Uncertainty Analysis; Project Scoring; Utility Scales; Relevance Tree Techniques (Reverse Factor Analysis); Computer Simulation; effectiveness of mathematical models for R&D project selection (Management Science, April 1973; Souder, W.E., a scoring methodology); proficiency test scores (written); radiation effects on aircrew performance in radiation environments; reaction time.]

  5. The Effect of Unequal Samples, Heterogeneity of Covariance Matrices, and Number of Variables on Discriminant Analysis Classification Tables and Related Statistics.

    ERIC Educational Resources Information Center

    Spearing, Debra; Woehlke, Paula

    To assess the effect on discriminant analysis in terms of correct classification into two groups, the following parameters were systematically altered using Monte Carlo techniques: sample sizes; proportions of one group to the other; number of independent variables; and covariance matrices. The pairing of the off diagonals (or covariances) with…

  6. The economic evaluation of pharmacotherapies for Parkinson's disease.

    PubMed

    Coyle, D; Barbeau, M; Guttman, M; Baladi, J-F

    2003-06-01

    As well as the significant clinical effects of Parkinson's disease (PD), the disease places a high economic burden on society. Given the scarcity of health care resources, it is becoming increasingly necessary to demonstrate that new therapies for PD provide value for money in comparison with other potential interventions. This paper outlines the basic techniques of cost-effectiveness analysis and its application to PD. These techniques are illustrated by a recent economic evaluation of entacapone for use in Canada.

  7. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
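
    A minimal sketch of that Monte Carlo combination with invented factor distributions (not the FSAR's): sample each factor affecting radiological consequence, combine them per trial, and tabulate the CCDF of the result.

```python
# Monte Carlo combination of consequence factors into a CCDF (toy values).
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
source_term = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # released activity (toy)
dispersion = rng.lognormal(mean=-1.0, sigma=0.5, size=n)   # atmospheric dilution
dose_factor = rng.uniform(0.5, 1.5, size=n)                # dose per unit exposure
consequence = source_term * dispersion * dose_factor       # calculated health effects

levels = np.logspace(-3, 2, 50)
ccdf = [(consequence >= c).mean() for c in levels]         # P(consequence >= c)
for c, p in zip(levels[::10], ccdf[::10]):
    print(f"P(consequence >= {c:9.3g}) = {p:.4f}")
```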

  8. A simple method for principal strata effects when the outcome has been truncated due to death.

    PubMed

    Chiba, Yasutaka; VanderWeele, Tyler J

    2011-04-01

    In randomized trials with follow-up, outcomes such as quality of life may be undefined for individuals who die before the follow-up is complete. In such settings, restricting analysis to those who survive can give rise to biased outcome comparisons. An alternative approach is to consider the "principal strata effect" or "survivor average causal effect" (SACE), defined as the effect of treatment on the outcome among the subpopulation that would have survived under either treatment arm. The authors describe a very simple technique that can be used to assess the SACE. They give both a sensitivity analysis technique and conditions under which a crude comparison provides a conservative estimate of the SACE. The method is illustrated using data from the ARDSnet (Acute Respiratory Distress Syndrome Network) clinical trial comparing low-volume ventilation and traditional ventilation methods for individuals with acute respiratory distress syndrome.
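
    A minimal sketch of the paper's sensitivity analysis, under the simple formulation that the SACE equals the crude survivor comparison minus a user-chosen sensitivity parameter alpha (alpha = 0 reproduces the crude estimate); the trial data below are simulated, not from ARDSnet.

```python
# SACE sensitivity analysis: crude survivor difference minus alpha.
import numpy as np

rng = np.random.default_rng(7)
n = 5000
treat = rng.integers(0, 2, n)                            # randomized arm
survive = rng.random(n) < np.where(treat == 1, 0.85, 0.75)
outcome = np.where(survive, 60 + 5 * treat + rng.normal(0, 10, n), np.nan)

crude = (np.nanmean(outcome[(treat == 1) & survive]) -
         np.nanmean(outcome[(treat == 0) & survive]))    # comparison among survivors

for alpha in (0.0, 1.0, 2.0):      # plausible range chosen by the analyst
    print(f"alpha = {alpha:.1f}  ->  SACE estimate = {crude - alpha:.2f}")
```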

  9. An Effective Technique for Enhancing an Intrauterine Catheter Fetal Electrocardiogram

    NASA Astrophysics Data System (ADS)

    Horner, Steven L.; Holls, William M.

    2003-12-01

    Physicians can obtain fetal heart rate, electrophysiological information, and uterine contraction activity for determining fetal status from an intrauterine catheter's electrocardiogram with the maternal electrocardiogram canceled. In addition, the intrauterine catheter would allow physicians to acquire fetal status with a single biosensor that is non-invasive to the fetus, as compared to the invasive fetal scalp electrode and intrauterine pressure catheter used currently. A real-time technique for canceling the maternal electrocardiogram in the intrauterine catheter's electrocardiogram is discussed, along with an analysis of the method's effectiveness with synthesized and clinical data. The positive results from an original, detailed subjective and objective analysis of synthesized and clinical data clearly indicate that the maternal electrocardiogram cancellation method is effective. The resulting intrauterine catheter electrocardiogram, with the maternal electrocardiogram effectively canceled, could be used for determining fetal heart rate, fetal electrocardiogram electrophysiological information, and uterine contraction activity.
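
    The cancellation algorithm itself is not specified in the abstract; the sketch below uses a generic least-mean-squares (LMS) adaptive canceller of the kind commonly applied to this problem, with fully synthetic maternal and fetal waveforms.

```python
# Generic LMS adaptive cancellation of a maternal ECG reference (synthetic).
import numpy as np

rng = np.random.default_rng(0)
fs = 500
t = np.arange(0, 10, 1 / fs)                            # 10 s at 500 Hz
maternal_ref = np.sin(2 * np.pi * 1.2 * t) ** 15        # maternal beats ~72 bpm (toy)
fetal = 0.3 * np.sin(2 * np.pi * 2.3 * t) ** 21         # fetal beats ~138 bpm (toy)
catheter = 1.5 * maternal_ref + fetal + 0.02 * rng.normal(size=t.size)

# LMS: adapt FIR weights so the filtered maternal reference matches the
# maternal component of the catheter signal; the residual is the fetal estimate.
n_taps, mu = 8, 0.05
w = np.zeros(n_taps)
out = np.zeros_like(catheter)
for k in range(n_taps, t.size):
    x = maternal_ref[k - n_taps:k][::-1]
    e = catheter[k] - w @ x          # residual = fetal estimate
    w += 2 * mu * e * x              # LMS weight update
    out[k] = e
print("signal variance before/after cancellation:", np.var(catheter), "->", np.var(out))
```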

  10. Procedure for detection and measurement of interfaces in remotely acquired data using a digital computer

    NASA Technical Reports Server (NTRS)

    Faller, K. H.

    1976-01-01

    A technique for the detection and measurement of surface feature interfaces in remotely acquired data was developed and evaluated. A computer implementation of this technique was effected to automatically process classified data derived from various sources such as the LANDSAT multispectral scanner and other scanning sensors. The basic elements of the operational theory of the technique are described, followed by the details of the procedure. An example of an application of the technique to the analysis of tidal shoreline length is given with a breakdown of manpower requirements.

  11. Comparison of data inversion techniques for remotely sensed wide-angle observations of Earth emitted radiation

    NASA Technical Reports Server (NTRS)

    Green, R. N.

    1981-01-01

    The shape factor, parameter estimation, and deconvolution data analysis techniques were applied to the same set of Earth emitted radiation measurements to determine the effects of different techniques on the estimated radiation field. All three techniques are defined and their assumptions, advantages, and disadvantages are discussed. Their results are compared globally, zonally, regionally, and on a spatial spectrum basis. The standard deviations of the regional differences in the derived radiant exitance varied from 7.4 W m^-2 to 13.5 W m^-2.

  12. What are the most effective intervention techniques for changing physical activity self-efficacy and physical activity behaviour--and are they the same?

    PubMed

    Williams, S L; French, D P

    2011-04-01

    There is convincing evidence that targeting self-efficacy is an effective means of increasing physical activity. However, evidence concerning which are the most effective techniques for changing self-efficacy and thereby physical activity is lacking. The present review aims to estimate the association between specific intervention techniques used in physical activity interventions and change obtained in both self-efficacy and physical activity behaviour. A systematic search yielded 27 physical activity intervention studies for 'healthy' adults that reported self-efficacy and physical activity data. A small, yet significant (P < 0.01) effect of the interventions was found on change in self-efficacy and physical activity (d = 0.16 and 0.21, respectively). When a technique was associated with a change in effect sizes for self-efficacy, it also tended to be associated with a change (r(s) = 0.690, P < 0.001) in effect size for physical activity. Moderator analyses found that 'action planning', 'provide instruction' and 'reinforcing effort towards behaviour' were associated with significantly higher levels of both self-efficacy and physical activity. 'Relapse prevention' and 'setting graded tasks' were associated with significantly lower self-efficacy and physical activity levels. This meta-analysis provides evidence for which psychological techniques are most effective for changing self-efficacy and physical activity.

  13. NO TIME FOR DEAD TIME: TIMING ANALYSIS OF BRIGHT BLACK HOLE BINARIES WITH NuSTAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bachetti, Matteo; Barret, Didier; Harrison, Fiona A.

    Timing of high-count-rate sources with the NuSTAR Small Explorer Mission requires specialized analysis techniques. NuSTAR was primarily designed for spectroscopic observations of sources with relatively low count rates rather than for timing analysis of bright objects. The instrumental dead time per event is relatively long (∼2.5 msec) and varies event-to-event by a few percent. The most obvious effect is a distortion of the white noise level in the power density spectrum (PDS) that cannot be easily modeled with standard techniques due to the variable nature of the dead time. In this paper, we show that it is possible to exploit the presence of two completely independent focal planes and use the cospectrum, the real part of the cross PDS, to obtain a good proxy of the white-noise-subtracted PDS. Thereafter, one can use a Monte Carlo approach to estimate the remaining effects of dead time, namely, a frequency-dependent modulation of the variance and a frequency-independent drop of the sensitivity to variability. In this way, most of the standard timing analysis can be performed, albeit with a sacrifice in signal-to-noise ratio relative to what would be achieved using more standard techniques. We apply this technique to NuSTAR observations of the black hole binaries GX 339–4, Cyg X-1, and GRS 1915+105.
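
    A minimal sketch of the cospectrum trick: because the two focal planes have independent counting noise, the real part of their cross spectrum averages the white-noise level away, leaving the shared signal. Synthetic Poisson light curves stand in for FPMA/FPMB data.

```python
# Cospectrum of two independent detectors as a white-noise-free PDS proxy.
import numpy as np

rng = np.random.default_rng(3)
dt, n = 1e-3, 2**16
t = np.arange(n) * dt
rate = 200 * (1 + 0.1 * np.sin(2 * np.pi * 5.0 * t))    # 5 Hz QPO-like signal
lc_a = rng.poisson(rate * dt)                           # module A counts
lc_b = rng.poisson(rate * dt)                           # module B counts (independent noise)

fa = np.fft.rfft(lc_a - lc_a.mean())
fb = np.fft.rfft(lc_b - lc_b.mean())
freq = np.fft.rfftfreq(n, dt)
cospectrum = (fa * np.conj(fb)).real                    # noise cross terms average to zero

print("peak cospectrum in 4.5-5.5 Hz :", cospectrum[(freq > 4.5) & (freq < 5.5)].max())
print("median |cospectrum| above 10 Hz:", np.median(abs(cospectrum[freq > 10])))
```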

  14. Relationships between autofocus methods for SAR and self-survey techniques for SONAR. [Synthetic Aperture Radar (SAR)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahl, D.E.; Jakowatz, C.V. Jr.; Ghiglia, D.C.

    1991-01-01

    Autofocus methods in SAR and self-survey techniques in SONAR have a common mathematical basis in that they both involve estimation and correction of phase errors introduced by sensor position uncertainties. Time delay estimation and correlation methods have been shown to be effective in solving the self-survey problem for towed SONAR arrays. Since it can be shown that platform motion errors introduce similar time-delay estimation problems in SAR imaging, the question arises as to whether such techniques could be effectively employed for autofocus of SAR imagery. With a simple mathematical model for motion errors in SAR, we will show why such correlation/time-delay techniques are not nearly as effective as established SAR autofocus algorithms such as phase gradient autofocus or sub-aperture based methods. This analysis forms an important bridge between signal processing methodologies for SAR and SONAR. 5 refs., 4 figs.

  15. AIDS Education for Tanzanian Youth: A Mediation Analysis

    ERIC Educational Resources Information Center

    Stigler, Melissa H.; Kugler, K. C.; Komro, K. A.; Leshabari, M. T.; Klepp, K. I.

    2006-01-01

    Mediation analysis is a statistical technique that can be used to identify mechanisms by which intervention programs achieve their effects. This paper presents the results of a mediation analysis of Ngao, an acquired immunodeficiency syndrome (AIDS) education program that was implemented with school children in Grades 6 and 7 in Tanzania in the…

  16. Simulating the Effects of Common and Specific Abilities on Test Performance: An Evaluation of Factor Analysis

    ERIC Educational Resources Information Center

    McFarland, Dennis J.

    2014-01-01

    Purpose: Factor analysis is a useful technique to aid in organizing multivariate data characterizing speech, language, and auditory abilities. However, knowledge of the limitations of factor analysis is essential for proper interpretation of results. The present study used simulated test scores to illustrate some characteristics of factor…

  17. Investigating the effects of PDC cutters geometry on ROP using the Taguchi technique

    NASA Astrophysics Data System (ADS)

    Jamaludin, A. A.; Mehat, N. M.; Kamaruddin, S.

    2017-10-01

    At times, the polycrystalline diamond compact (PDC) bit's performance drops and affects the rate of penetration (ROP). The objective of this project is to investigate the effect of PDC cutter geometry and optimize it. An intensive study of cutter geometry would further enhance ROP performance. A relatively extended analysis was carried out, and four significant geometry factors that directly improve ROP were identified: cutter size, back rake angle, side rake angle, and chamfer angle. An appropriate optimization technique that effectively controls all influential geometry factors during cutter manufacturing is introduced and adopted in this project. By adopting an L9 Taguchi orthogonal array (OA), a simulation experiment is conducted using explicit dynamics finite element analysis. Through a structured Taguchi analysis, ANOVA confirms that the most significant geometry factor for improving ROP is cutter size (99.16% percentage contribution). The optimized cutter is expected to drill with high ROP, which can reduce rig time and, in turn, total drilling cost.
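
    The sketch below illustrates the Taguchi workflow named above: an L9(3^4) orthogonal array, larger-the-better signal-to-noise ratios, and a sum-of-squares decomposition into percentage contributions. The ROP values are invented, chosen so that the first factor dominates, echoing the paper's finding.

```python
# Taguchi L9 analysis: S/N ratios and percentage contributions (toy ROP data).
import numpy as np

# L9(3^4) orthogonal array: 4 factors at 3 levels (0, 1, 2)
L9 = np.array([[0,0,0,0],[0,1,1,1],[0,2,2,2],
               [1,0,1,2],[1,1,2,0],[1,2,0,1],
               [2,0,2,1],[2,1,0,2],[2,2,1,0]])
rop = np.array([12.1, 12.8, 13.0, 15.9, 16.4, 15.7, 19.8, 20.4, 20.1])  # m/h (toy)

sn = -10 * np.log10(1.0 / rop**2)          # larger-the-better S/N per run
total_ss = ((sn - sn.mean())**2).sum()
for f, name in enumerate(["cutter size", "back rake", "side rake", "chamfer"]):
    means = np.array([sn[L9[:, f] == lvl].mean() for lvl in range(3)])
    ss = 3 * ((means - sn.mean())**2).sum()  # 3 runs per level
    print(f"{name:12s}: {100 * ss / total_ss:5.1f}% contribution")
```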

  18. Arthropod Surveillance Programs: Basic Components, Strategies, and Analysis

    PubMed Central

    Rochon, Kateryn; Duehl, Adrian J.; Anderson, John F.; Barrera, Roberto; Su, Nan-Yao; Gerry, Alec C.; Obenauer, Peter J.; Campbell, James F.; Lysyk, Tim J.; Allan, Sandra A.

    2015-01-01

    Effective entomological surveillance planning stresses a careful consideration of methodology, trapping technologies, and analysis techniques. Herein, the basic principles and technological components of arthropod surveillance plans are described, as promoted in the symposium “Advancements in arthropod monitoring technology, techniques, and analysis” presented at the 58th annual meeting of the Entomological Society of America in San Diego, CA. Interdisciplinary examples of arthropod monitoring for urban, medical, and veterinary applications are reviewed. Arthropod surveillance consists of three components: 1) sampling method, 2) trap technology, and 3) analysis technique. A sampling method consists of selecting the best device or collection technique for a specific location and sampling at the proper spatial distribution, optimal duration, and frequency to achieve the surveillance objective. Optimized sampling methods are discussed for several mosquito species (Diptera: Culicidae) and ticks (Acari: Ixodidae). The advantages and limitations of novel terrestrial and aerial insect traps, artificial pheromones, and kairomones are presented for the capture of red flour beetle (Coleoptera: Tenebrionidae), small hive beetle (Coleoptera: Nitidulidae), bed bugs (Hemiptera: Cimicidae), and Culicoides (Diptera: Ceratopogonidae), respectively. After sampling, extrapolating real world population numbers from trap capture data is possible with the appropriate analysis techniques. Examples of this extrapolation and action thresholds are given for termites (Isoptera: Rhinotermitidae) and red flour beetles. PMID:26543242

  19. Training staff serving clients with intellectual disabilities: a meta-analysis of aspects determining effectiveness.

    PubMed

    van Oorsouw, Wietske M W J; Embregts, Petri J C M; Bosman, Anna M T; Jahoda, Andrew

    2009-01-01

    The last decades have seen increased emphasis on the quality of training for direct-care staff serving people with intellectual disabilities. Nevertheless, it is unclear what the key aspects of effective training are. Therefore, the aim of the present meta-analysis was to establish the ingredients (i.e., goals, format, and techniques) for staff training that are related to improvements in staff behaviour. Our literature search concentrated on studies that were published in a period of 20 years. Fifty-five studies met the criteria, resulting in 502 single-subject designs and 13 n>1 designs. Results revealed important information relevant to further improvement of clinical practice: (a) the combination of in-service training with coaching-on-the-job is the most powerful format, (b) in in-service formats, one should use multiple techniques, and verbal feedback is particularly recommended, and (c) in coaching-on-the-job formats, verbal feedback should be part of the program, as well as praise and correction. To maximize effectiveness, program developers should carefully prepare training goals, training format, and training techniques, which will benefit clinical practice.

  20. Cogging Torque Reduction Techniques for Spoke-type IPMSM

    NASA Astrophysics Data System (ADS)

    Bahrim, F. S.; Sulaiman, E.; Kumar, R.; Jusoh, L. I.

    2017-08-01

    A spoke-type interior permanent magnet synchronous motor (IPMSM) is gaining ground in the industrial arena due to its good flux-weakening capability and high power density. In many applications, the high strength of the permanent magnet causes the undesirable effect of high cogging torque, which can aggravate the performance of the motor. High cogging torque is produced by the IPMSM largely because of the similar length and effectiveness of the magnetic air gap. The aim of this study is to analyze and compare the cogging torque effect and performance of four common cogging torque reduction techniques: skewing, notching, pole pairing, and axial pole pairing. With the aid of 3-D finite element analysis (FEA) using JMAG software, a 6S-4P spoke-type IPMSM with various rotor-PM configurations has been designed. As a result, the cogging torque effect was reduced by up to 69.5% with the skewing technique, followed by 31.96%, 29.6%, and 17.53% for the pole pairing, axial pole pairing, and notching techniques, respectively.

  1. Efficacy of metacognitive therapy in improving mental health: A meta-analysis of single-case studies.

    PubMed

    Rochat, Lucien; Manolov, Rumen; Billieux, Joël

    2018-06-01

    Metacognitive therapy and one of its treatment components, the attention training technique, are increasingly being delivered to improve mental health. We examined the efficacy of metacognitive therapy and/or attention training technique on mental health outcomes from single-case studies. A total of 14 studies (53 patients) were included. We used the d-statistic for multiple baseline data and the percentage change index to compute the effect sizes. Metacognitive therapy has a large effect on depression, anxiety, other psychopathological symptoms, and all outcomes together. Effect sizes were significantly moderated by the number of sessions, the severity and duration of symptoms, and patient gender, but not by study quality or attention training technique when used as a stand-alone treatment. At the follow-up, 77.36% of the individuals were considered recovered or had maintained improvement. Metacognitive therapy and attention training technique strongly contribute to improving mental health outcomes. This study effectively informs evidence-based practice in the clinical milieu.

  2. Advanced fitness landscape analysis and the performance of memetic algorithms.

    PubMed

    Merz, Peter

    2004-01-01

    Memetic algorithms (MAs) have proven very effective in combinatorial optimization. This paper offers explanations as to why this is so by investigating the performance of MAs in terms of efficiency and effectiveness. A special class of MAs is used to discuss efficiency and effectiveness for local search and evolutionary meta-search. It is shown that the efficiency of MAs can be increased drastically with the use of domain knowledge. However, effectiveness highly depends on the structure of the problem. As is well-known, identifying this structure is made easier with the notion of fitness landscapes: the local properties of the fitness landscape strongly influence the effectiveness of the local search, while the global properties strongly influence the effectiveness of the evolutionary meta-search. This paper also introduces new techniques for analyzing the fitness landscapes of combinatorial problems; these techniques focus on the investigation of random walks in the fitness landscape starting at locally optimal solutions, as well as on the escape from the basins of attraction of current local optima. It is shown for NK-landscapes and landscapes of the unconstrained binary quadratic programming problem (BQP) that a random walk to another local optimum can be used to explain the efficiency of recombination in comparison to mutation. Moreover, the paper shows that other aspects, like the size of the basins of attraction of local optima, are important for the efficiency of MAs, and a local search escape analysis is proposed. These simple analysis techniques have several advantages over previously proposed statistical measures and provide valuable insight into the behaviour of MAs on different kinds of landscapes.
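
    The sketch below reproduces the flavour of the escape analysis on a small unconstrained BQP instance: hill-climb to a local optimum, perturb it with random walks of increasing length, re-run local search, and record how often a different basin is reached (toy sizes, not the paper's experiments).

```python
# Local-search escape analysis on a random BQP instance (toy).
import numpy as np

rng = np.random.default_rng(5)
n = 40
Q = rng.normal(size=(n, n)); Q = (Q + Q.T) / 2     # random symmetric BQP matrix

def fitness(x):
    return x @ Q @ x

def hill_climb(x):
    """First-improvement 1-flip local search to a local optimum."""
    x = x.copy()
    f = fitness(x)
    improved = True
    while improved:
        improved = False
        for i in range(n):
            x[i] ^= 1
            fi = fitness(x)
            if fi > f:
                f, improved = fi, True
            else:
                x[i] ^= 1                          # undo non-improving flip
    return x, f

x0, f0 = hill_climb(rng.integers(0, 2, n))

# Perturb the optimum by flipping k random bits, then re-optimize
for k in (1, 2, 4, 8, 16):
    escaped = 0
    for _ in range(50):
        y = x0.copy()
        for i in rng.choice(n, size=k, replace=False):
            y[i] ^= 1
        y_opt, _ = hill_climb(y)
        escaped += int(np.any(y_opt != x0))
    print(f"walk length {k:2d}: escaped basin in {escaped}/50 restarts")
```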

  3. Practical guidance for conducting mediation analysis with multiple mediators using inverse odds ratio weighting.

    PubMed

    Nguyen, Quynh C; Osypuk, Theresa L; Schmidt, Nicole M; Glymour, M Maria; Tchetgen Tchetgen, Eric J

    2015-03-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994-2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided.
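
    A simplified sketch of the IORW recipe for a binary exposure and a single continuous mediator (simulated data; variable names hypothetical): regress exposure on mediator and covariates, weight exposed subjects by the inverse fitted exposure-mediator odds ratio, and read the natural direct effect off the weighted outcome regression; indirect = total - direct.

```python
# Inverse odds ratio weighting (IORW) mediation sketch (simulated data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 4000
c = rng.normal(size=n)                          # covariate
a = rng.integers(0, 2, n)                       # randomized exposure
m = 0.8 * a + 0.3 * c + rng.normal(size=n)      # mediator
y = 1.0 * a + 1.5 * m + 0.5 * c + rng.normal(size=n)

# Step 1: exposure ~ mediator + covariates (logistic)
X_am = sm.add_constant(np.column_stack([m, c]))
beta = sm.Logit(a, X_am).fit(disp=0).params
or_m = np.exp(beta[1] * m)                      # fitted exposure-mediator odds ratio
w = np.where(a == 1, 1.0 / or_m, 1.0)           # IORW weights (unexposed: 1)

# Step 2: total effect (unweighted) and natural direct effect (weighted)
X_y = sm.add_constant(np.column_stack([a, c]))
total = sm.WLS(y, X_y).fit().params[1]
direct = sm.WLS(y, X_y, weights=w).fit().params[1]
print(f"total = {total:.2f}, direct = {direct:.2f}, indirect = {total - direct:.2f}")
```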

  4. The GMAO OSSE for Weather Analysis and Prediction Using the High-Resolution GEOS-5 Nature Run

    NASA Technical Reports Server (NTRS)

    Errico, Ronald; Prive, Nikki; Da Silva Carvalho, David

    2017-01-01

    Applications of OSSEs: 1) estimate the effects of proposed instruments (and their competing designs) on analysis skill by exploiting the simulated environment, and 2) evaluate present and proposed techniques for data assimilation by exploiting the known truth.

  5. Smart Sampling and HPC-based Probabilistic Look-ahead Contingency Analysis Implementation and its Evaluation with Real-world Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. High-performance computing techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.
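
    Illustrative sketch only: a stratified ("smart") sample of load scenarios is paired with process-level parallelism for N-1 contingency screening. The power-flow evaluation is a placeholder stub, and names such as screen() are hypothetical; this is not the BPA application.

```python
# Stratified scenario sampling + parallel contingency screening (toy stub).
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(4)

# Stratified (Latin-hypercube-style) draws of load level: one draw per stratum
strata = np.linspace(0.0, 1.0, 101)[:-1]
load_scenarios = 0.8 + 0.4 * (strata + rng.random(100) / 100)  # 100 scenarios

CONTINGENCIES = [f"line_{k}" for k in range(50)]               # hypothetical N-1 list

def screen(args):
    """Stub contingency evaluation: True if a limit violation is predicted."""
    load, outage = args
    flow = load * (1.0 + 0.02 * (int(outage.split("_")[1]) % 7))  # placeholder physics
    return (outage, load, flow > 1.25)

if __name__ == "__main__":
    jobs = [(l, c) for l in load_scenarios for c in CONTINGENCIES]
    with Pool() as pool:
        violations = [r for r in pool.map(screen, jobs) if r[2]]
    print(f"{len(violations)} violating (scenario, contingency) pairs "
          f"out of {len(jobs)} evaluated")
```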

  6. Interactive Image Analysis System Design,

    DTIC Science & Technology

    1982-12-01

    This report describes a design for an interactive image analysis system (IIAS), which implements terrain data extraction techniques. The design employs commercially available, state-of-the-art minicomputers and image display devices with proven software to achieve a cost-effective, reliable image analysis system. Additionally, the system is fully capable of supporting many generic types of image analysis and data processing, and is modular.

  7. The balance sheet technique. Volume I. The balance sheet analysis technique for preconstruction review of airports and highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaBelle, S.J.; Smith, A.E.; Seymour, D.A.

    1977-02-01

    The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on regional desired emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well-suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to change in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method for state or regional reviewing agencies.

  8. Flux control coefficients determined by inhibitor titration: the design and analysis of experiments to minimize errors.

    PubMed Central

    Small, J R

    1993-01-01

    This paper is a study into the effects of experimental error on the estimated values of flux control coefficients obtained using specific inhibitors. Two possible techniques for analysing the experimental data are compared: a simple extrapolation method (the so-called graph method) and a non-linear function fitting method. For these techniques, the sources of systematic errors are identified and the effects of systematic and random errors are quantified, using both statistical analysis and numerical computation. It is shown that the graph method is very sensitive to random errors and, under all conditions studied, that the fitting method, even under conditions where the assumptions underlying the fitted function do not hold, outperformed the graph method. Possible ways of designing experiments to minimize the effects of experimental errors are analysed and discussed. PMID:8257434
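
    As a toy illustration of the fitting approach the authors favour over graphical extrapolation, the sketch below fits a smooth empirical curve to a simulated inhibitor titration and derives the flux control coefficient from the fitted initial slope via the classical irreversible-inhibitor formula C = -(dJ/dI at I=0) x I_max / J_0. The quadratic fitted form and all numbers are assumptions, not the paper's exact function.

```python
# Function-fitting estimate of a flux control coefficient from a titration.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
i_max = 10.0                                     # inhibitor needed to abolish the enzyme
inhib = np.linspace(0.0, 8.0, 15)
j_true = 100 - 4.0 * inhib + 0.15 * inhib**2     # toy flux response (C ~ 0.4)
flux = j_true + rng.normal(0, 1.5, inhib.size)   # random measurement error

def model(i, j0, slope0, curv):
    """Smooth empirical titration curve with initial slope 'slope0'."""
    return j0 + slope0 * i + curv * i**2

p, _ = curve_fit(model, inhib, flux, p0=[100, -5, 0.1])
j0, slope0 = p[0], p[1]
print(f"flux control coefficient C_J = {-slope0 * i_max / j0:.2f}")   # ~0.40
```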

  9. Two biased estimation techniques in linear regression: Application to aircraft

    NASA Technical Reports Server (NTRS)

    Klein, Vladislav

    1988-01-01

    Several ways for detection and assessment of collinearity in measured data are discussed. Because data collinearity usually results in poor least squares estimates, two estimation techniques which can limit a damaging effect of collinearity are presented. These two techniques, the principal components regression and mixed estimation, belong to a class of biased estimation techniques. Detection and assessment of data collinearity and the two biased estimation techniques are demonstrated in two examples using flight test data from longitudinal maneuvers of an experimental aircraft. The eigensystem analysis and parameter variance decomposition appeared to be a promising tool for collinearity evaluation. The biased estimators had far better accuracy than the results from the ordinary least squares technique.
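
    A minimal sketch of one of the two biased estimators discussed, principal components regression: regress on the leading principal components and discard the near-null direction that collinearity makes unstable (synthetic collinear regressors, not flight-test data).

```python
# Principal components regression vs. OLS on collinear data.
import numpy as np

rng = np.random.default_rng(9)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)              # nearly collinear with x1
X = np.column_stack([x1, x2, rng.normal(size=n)])
y = X @ np.array([1.0, 1.0, 0.5]) + 0.1 * rng.normal(size=n)

Xc, yc = X - X.mean(0), y - y.mean()
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
print("singular values:", np.round(s, 3))        # one tiny value flags collinearity

k = 2                                            # keep only the well-determined components
beta_pcr = Vt[:k].T @ np.diag(1 / s[:k]) @ U[:, :k].T @ yc
print("PCR estimate:", np.round(beta_pcr, 2))
print("OLS estimate:", np.round(np.linalg.lstsq(Xc, yc, rcond=None)[0], 2))
```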

  10. Infrared evaluation of the heat-sink bipolar diathermy dissection technique.

    PubMed

    Allan, J; Dusseldorp, J; Rabey, N G; Malata, C M; Goltsman, D; Phoon, A F

    2015-08-01

    The use of the bipolar diathermy dissection technique is widespread amongst surgeons performing flap perforator dissection and microvascular surgery. The 'heat-sink' modification uses a DeBakey forceps as a heat-sinking interposition between the bipolar tip and the main (vascular or flap) pedicle, aiming to protect it from the thermal effects of the bipolar diathermy. This study examines the thermal effects of bipolar cautery upon the microvasculature and investigates the efficacy of heat sinking as a thermally protective technique in microsurgical dissection. A chicken thigh microsurgical training model was used to examine the effects of bipolar cautery. The effects of bipolar cautery were examined using high-definition, real-time infrared thermographic imaging (FLIR Systems), and temperature was quantitatively assessed at various distances from the point of bipolar cautery. Comparison was made using the heat-sink technique to determine if it conferred a thermoprotective effect compared to the standard technique without heat sink. Using paired t-test analysis (SPSS), the heat-sink modification of the bipolar dissection technique was found to have a highly statistically significant effect (P < 0.000000001) in reducing the conductive temperature along the vascular pedicle. This protective effect kept temperatures comparable to controls. Bipolar cautery is an extremely safe method of electrosurgery; however, when its use is required within 3 mm of important vascular architecture, the heat-sink method is a viable and easy technique to prevent thermal spread and limit potential coagulopathic changes.

  11. Sentence Similarity Analysis with Applications in Automatic Short Answer Grading

    ERIC Educational Resources Information Center

    Mohler, Michael A. G.

    2012-01-01

    In this dissertation, I explore unsupervised techniques for the task of automatic short answer grading. I compare a number of knowledge-based and corpus-based measures of text similarity, evaluate the effect of domain and size on the corpus-based measures, and also introduce a novel technique to improve the performance of the system by integrating…
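
    In the spirit of the corpus-based measures described, the sketch below grades a short answer by TF-IDF cosine similarity to a reference answer; TF-IDF stands in for the dissertation's specific measures, and the answers and grading scale are invented.

```python
# TF-IDF cosine similarity as a toy short-answer grader.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

reference = "A compiler translates source code into machine code before execution."
answers = [
    "It converts source code to machine code ahead of running the program.",
    "A compiler is a person who collects things.",
]

vec = TfidfVectorizer().fit([reference] + answers)
ref_v = vec.transform([reference])
for ans in answers:
    sim = cosine_similarity(ref_v, vec.transform([ans]))[0, 0]
    print(f"similarity = {sim:.2f} -> suggested grade = {5 * sim:.1f}/5")
```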

  12. The Critical Incident Technique: An Effective Tool for Gathering Experience from Practicing Engineers

    ERIC Educational Resources Information Center

    Hanson, James H.; Brophy, Patrick D.

    2012-01-01

    Not all knowledge and skills that educators want to pass to students exists yet in textbooks. Some still resides only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…

  13. QuEChERS, a sample preparation technique that is “catching on”: an up-to-date interview with its inventors

    USDA-ARS?s Scientific Manuscript database

    The technique of QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe) is only 7 years old, yet it is revolutionizing the manner in which multiresidue, multiclass pesticide analysis (and perhaps beyond) is performed. Columnist Ron Majors sits down with inventors Steve Lehotay and Michelangelo An...

  14. Spectroscopic determination of anthraquinone in kraft pulping liquors using a membrane interface

    Treesearch

    X.S. Chai; X.T. Yang; Q.X. Hou; J.Y. Zhu; L.-G. Danielsson

    2003-01-01

    A spectroscopic technique for determining AQ in pulping liquor was developed to effectively separate AQ from dissolved lignin. This technique is based on a flow analysis system with a Nafion membrane interface. The AQ passed through the membrane is converted into its reduced form, AHQ, using sodium hydrosulfite. AHQ has distinguished absorption characteristics in the...

  15. Non-destructive Analysis Reveals Effect of Installation Details on Plywood Siding Performance

    Treesearch

    Christopher G. Hunt; Gregory T. Schueneman; Steven Lacher; Xiping Wang; R. Sam Williams

    2015-01-01

    This study evaluated the influence of a variety of construction techniques on the performance of plywood siding and the applied paint, using both ultrasound and conventional visual inspection techniques. The impact of bottom edge contact, flashing vs. caulking board ends, priming the bottom edge, location (Wisconsin vs. Mississippi) and a gap behind the siding to...

  16. A human factors analysis of EVA time requirements

    NASA Technical Reports Server (NTRS)

    Pate, D. W.

    1996-01-01

    Human Factors Engineering (HFE), also known as Ergonomics, is a discipline whose goal is to engineer a safer, more efficient interface between humans and machines. HFE makes use of a wide range of tools and techniques to fulfill this goal. One of these tools is known as motion and time study, a technique used to develop time standards for given tasks. A human factors motion and time study was initiated with the goal of developing a database of EVA task times and a method of utilizing the database to predict how long an ExtraVehicular Activity (EVA) should take. Initial development relied on the EVA activities performed during the STS-61 mission (Hubble repair). The first step of the analysis was to become familiar with EVAs and with the previous studies and documents produced on EVAs. After reviewing these documents, an initial set of task primitives and task time modifiers was developed. Videotaped footage of STS-61 EVAs was analyzed using these primitives and task time modifiers. Data for two entire EVA missions and portions of several others, each with two EVA astronauts, were collected for analysis. Feedback from the analysis of the data will be used to further refine the primitives and task time modifiers used. Analysis of variance techniques for categorical data will be used to determine which factors may, individually or through interactions, affect the primitive times and how much of an effect they have.

  17. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
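
    The sketch below shows the underlying alternating-least-squares curve resolution that such methods build on, factoring a pixels-by-channels matrix D into non-negative abundances A and spectra S (D ≈ AS) on synthetic data with mostly "pure" pixels; the patent's spatial-simplicity constraints themselves are not reproduced here.

```python
# Plain non-negative alternating least squares for spectral unmixing (toy).
import numpy as np

rng = np.random.default_rng(6)
npix, nchan, ncomp = 500, 64, 2
S_true = np.abs(rng.normal(size=(ncomp, nchan)))        # component spectra
A_true = rng.dirichlet([0.2, 0.2], size=npix)           # mostly near-pure pixels
D = A_true @ S_true + 0.01 * rng.normal(size=(npix, nchan))

A = np.abs(rng.normal(size=(npix, ncomp)))              # random initialization
for _ in range(100):
    # Alternate least-squares updates, projecting onto the non-negative orthant
    S = np.clip(np.linalg.lstsq(A, D, rcond=None)[0], 0, None)
    A = np.clip(np.linalg.lstsq(S.T, D.T, rcond=None)[0].T, 0, None)
print("relative residual:", np.linalg.norm(D - A @ S) / np.linalg.norm(D))
```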

  18. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  19. Analysis of Thick Sandwich Shells with Embedded Ceramic Tiles

    NASA Technical Reports Server (NTRS)

    Davila, Carlos G.; Smith, C.; Lumban-Tobing, F.

    1996-01-01

    The Composite Armored Vehicle (CAV) is an advanced technology demonstrator of an all-composite ground combat vehicle. The CAV upper hull is made of a tough light-weight S2-glass/epoxy laminate with embedded ceramic tiles that serve as armor. The tiles are bonded to a rubber mat with a carefully selected, highly viscoelastic adhesive. The integration of armor and structure offers an efficient combination of ballistic protection and structural performance. The analysis of this anisotropic construction, with its inherent discontinuous and periodic nature, however, poses several challenges. The present paper describes a shell-based 'element-layering' technique that properly accounts for these effects and for the concentrated transverse shear flexibility in the rubber mat. One of the most important advantages of the element-layering technique over advanced higher-order elements is that it is based on conventional elements. This advantage allows the models to be portable to other structural analysis codes, a prerequisite in a program that involves the computational facilities of several manufacturers and government laboratories. The element-layering technique was implemented into an auto-layering program that automatically transforms a conventional shell model into a multi-layered model. The effects of tile layer homogenization, tile placement patterns, and tile gap size on the analysis results are described.

  20. SU-F-T-380: Comparing the Effect of Respiration On Dose Distribution Between Conventional Tangent Pair and IMRT Techniques for Adjuvant Radiotherapy in Early Stage Breast Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M; Ramaseshan, R

    2016-06-15

    Purpose: In this project, we compared the conventional tangent pair technique to the IMRT technique by analyzing the dose distribution. We also investigated the effect of respiration on planning target volume (PTV) dose coverage with both techniques. Methods: In order to implement the IMRT technique, a template-based planning protocol, dose constraints, and treatment process were developed. Two open fields with optimized field weights were combined with two beamlet-optimized fields in the IMRT plans. We compared the dose distribution between the standard tangent pair and IMRT. The improvement in dose distribution was measured by parameters such as the conformity index, homogeneity index, and coverage index. Another end point was whether the IMRT technique would reduce planning time for staff. The effect of the patient's respiration on dose distribution was also estimated. Four-dimensional computed tomography (4DCT) over the different phases of the breathing cycle was used to evaluate the effect of respiration on the IMRT-planned dose distribution. Results: We have accumulated 10 patients who underwent 4DCT and were planned with both techniques. Based on the preliminary analysis, the dose distribution with the IMRT technique was better than with the conventional tangent pair technique. Furthermore, the effect of respiration on the IMRT plan was not significant, as evident from the 95% isodose line coverage of the PTV drawn on all phases of the 4DCT. Conclusion: Based on the 4DCT images, the breathing effect on dose distribution was smaller than we expected. We suspect that there are two reasons. First, PTV movement due to respiration was not significant, possibly because we used a tilted breast board to set up patients. Second, the open fields with optimized field weights in the IMRT technique might reduce the breathing effect on dose distribution. A further investigation is necessary.
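
    For reference, the sketch below computes the named plan-quality indices from dose samples using common definitions (an RTOG-style conformity index and the ICRU 83 homogeneity index); the dose arrays are random stand-ins, and the abstract does not state which exact definitions were used.

```python
# Coverage, conformity, and homogeneity indices from sampled doses (toy).
import numpy as np

rng = np.random.default_rng(8)
ptv_dose = rng.normal(50.0, 1.5, 10_000)        # dose sampled inside the PTV [Gy]
body_dose = np.concatenate([ptv_dose, rng.normal(30.0, 10.0, 40_000)])
rx = 47.5                                       # 95% of a 50 Gy prescription

v_ptv_rx = (ptv_dose >= rx).sum()               # PTV voxels covered by 95% isodose
v_body_rx = (body_dose >= rx).sum()             # all voxels covered by 95% isodose

coverage = v_ptv_rx / ptv_dose.size                       # fraction of PTV covered
conformity = v_body_rx / ptv_dose.size                    # RTOG-style CI (ideal = 1)
d2, d50, d98 = np.percentile(ptv_dose, [98, 50, 2])       # D2% = 98th dose percentile
homogeneity = (d2 - d98) / d50                            # ICRU 83 HI (ideal = 0)
print(f"coverage = {coverage:.2f}, CI = {conformity:.2f}, HI = {homogeneity:.3f}")
```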

  1. Polarization-based material classification technique using passive millimeter-wave polarimetric imagery.

    PubMed

    Hu, Fei; Cheng, Yayun; Gui, Liangqi; Wu, Liang; Zhang, Xinyi; Peng, Xiaohui; Su, Jinlong

    2016-11-01

    The polarization properties of thermal millimeter-wave emission capture inherent information about objects, e.g., material composition, shape, and surface features. In this paper, a polarization-based material-classification technique using passive millimeter-wave polarimetric imagery is presented. The linear polarization ratio (LPR) is introduced as a new feature discriminator that is sensitive to material type and removes the effect of reflected ambient radiation. The LPR characteristics of several common natural and artificial materials are investigated by theoretical and experimental analysis. Based on a priori information about LPR characteristics, the optimal range of incident angles and the classification criterion are discussed. Simulation and measurement results indicate that the presented classification technique is effective for distinguishing between metals and dielectrics. This technique suggests possible applications for outdoor metal target detection in open scenes.
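
    The abstract does not define the LPR precisely; the sketch below assumes it is the ratio of vertically to horizontally polarized brightness temperatures, with metals (near-zero emissivity in both polarizations, so Tv ≈ Th) distinguished from dielectrics (stronger V-polarized emission at oblique incidence). The tolerance and temperatures are invented.

```python
# Toy LPR-based metal/dielectric classification of polarimetric pixels.
def classify(t_v, t_h, tol=0.05):
    """Classify a pixel from V/H polarimetric brightness temperatures [K]."""
    lpr = t_v / t_h                      # assumed LPR definition (toy)
    # Metals emit almost nothing in either polarization, so Tv ~ Th (LPR ~ 1);
    # dielectrics emit more strongly in V at oblique incidence (LPR > 1).
    return ("metal" if abs(lpr - 1.0) < tol else "dielectric"), lpr

for t_v, t_h in [(180.0, 178.0), (260.0, 230.0)]:
    label, lpr = classify(t_v, t_h)
    print(f"Tv = {t_v:.0f} K, Th = {t_h:.0f} K -> LPR = {lpr:.2f} ({label})")
```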

  2. Thyroid Radiofrequency Ablation: Updates on Innovative Devices and Techniques

    PubMed Central

    Park, Hye Sun; Park, Auh Whan; Chung, Sae Rom; Choi, Young Jun; Lee, Jeong Hyun

    2017-01-01

    Radiofrequency ablation (RFA) is a well-known, effective, and safe method for treating benign thyroid nodules and recurrent thyroid cancers. Thyroid-dedicated devices and basic techniques for thyroid RFA were introduced by the Korean Society of Thyroid Radiology (KSThR) in 2012. Thyroid RFA has now been adopted worldwide, with subsequent advances in devices and techniques. To optimize the treatment efficacy and patient safety, understanding the basic and advanced RFA techniques and selecting the optimal treatment strategy are critical. The goal of this review is to therefore provide updates and analysis of current devices and advanced techniques for RFA treatment of benign thyroid nodules and recurrent thyroid cancers. PMID:28670156

  3. [The actual approaches to the economic analysis of effectiveness of functioning of multi-profile curative preventive medical institution].

    PubMed

    Gaĭdarov, G M; Alekseeva, N Iu; Latysheva, E A

    2010-01-01

    The article presents a technique for the economic analysis of the effectiveness of a multi-profile curative preventive medical institution under the transition to payment according to completed cases of treatment. The necessity of measures to prevent financial losses under this new form of payment for hospital care is demonstrated.

  4. Life Cycle Costing: A Working Level Approach

    DTIC Science & Technology

    1981-06-01

    [Fragmentary table-of-contents entries recovered from this record: Failure Mode and Effects Analysis (FMEA); Logistics Performance Factors (LPFs); Planning the Use of Life Cycle Cost in the Demonstration phase.] Description: FMEA is a technique that attempts to improve the design of any particular unit. The FMEA identifies failure modes and also eliminates extra parts or ones that are used to achieve more performance than is necessary [16:5-14]. Advantages: FMEA forces...
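
    As a worked illustration of the FMEA technique the report describes, the sketch below ranks invented failure modes by risk priority number (RPN = severity x occurrence x detection), the usual way an FMEA worksheet focuses redesign effort.

```python
# Classic FMEA worksheet calculation: rank failure modes by RPN (toy entries).
failure_modes = [
    # (failure mode, severity 1-10, occurrence 1-10, detection 1-10)
    ("seal leak",         8, 4, 6),
    ("connector fatigue", 6, 7, 3),
    ("sensor drift",      4, 5, 8),
]
for name, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"{name:18s} RPN = {s * o * d:3d}")
```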

  5. Growth and surface analysis of SiO2 on 4H-SiC for MOS devices

    NASA Astrophysics Data System (ADS)

    Kodigala, Subba Ramaiah; Chattopadhyay, Somnath; Overton, Charles; Ardoin, Ira; Gordon, B. J.; Johnstone, D.; Roy, D.; Barone, D.

    2015-03-01

    The SiO2 layers have been grown onto C-face and Si-face 4H-SiC substrates by two different techniques: a wet thermal oxidation process and sputtering. The deposition recipes of these techniques were carefully optimized by a trial-and-error method. The growth effects of SiO2 on the C-face and Si-face 4H-SiC substrates are thoroughly investigated by AFM analysis. A growth mechanism for the different species involved in the growth of SiO2 by wet thermal oxidation is proposed by adopting two-body classical projectile scattering. This mechanism accounts for the growth of secondary phases, such as α-CH nano-islands, in the grown SiO2 layer. The effect of HF etching on the SiO2 layers grown by both techniques, and on both the C-face and Si-face substrates, is systematically studied. The layer thicknesses determined by AFM and ellipsometry are compared. MOS capacitors were made on the Si-face 4H-SiC wafers by the wet oxidation and sputtering processes and studied by the capacitance-voltage (C-V) technique. From C-V measurements, the density of trap states as a function of trap level is estimated for the MOS devices.

  6. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on the miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, the shape and changes in the states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial- and high-digital-resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for the analysis and optimization of MEMS devices. The capabilities of the opto-electronic techniques are illustrated with representative applications, demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  7. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.
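
    The description above combines Monte Carlo statistics with fracture mechanics. As a rough illustration of that combination (not the cited program's actual model), the sketch below propagates an uncertain initial crack size through Paris-law crack growth to estimate the probability of failure before a return-to-service interval; all material constants and distributions are assumed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative Paris-law inputs (assumed, not from the cited program):
    C, m = 1e-12, 3.0          # da/dN = C * (dK)^m, dK in MPa*sqrt(m), a in m
    Y = 1.12                   # crack geometry factor
    dsigma = 120.0             # cyclic stress range, MPa
    a_crit = 0.02              # critical crack size, m

    def cycles_to_failure(a0, n=1000):
        """N = integral from a0 to a_crit of da / (C * dK(a)^m), trapezoid rule."""
        a = np.linspace(a0, a_crit, n)
        g = 1.0 / (C * (Y * dsigma * np.sqrt(np.pi * a)) ** m)
        return np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(a))

    # Monte Carlo over an uncertain initial flaw size (lognormal, assumed)
    a0 = rng.lognormal(mean=np.log(1e-3), sigma=0.4, size=5000)
    life = np.array([cycles_to_failure(x) for x in a0])

    interval = 2e6  # cycles between inspections (assumed)
    print(f"median life: {np.median(life):.3g} cycles")
    print(f"P(failure before interval): {np.mean(life < interval):.4f}")
    ```

    Varying `interval` or the flaw-size distribution then shows how return-to-service intervals and nondestructive-evaluation capability trade off against risk, which is the kind of study the abstract describes.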

  8. NASA/ASEE Summer Faculty Fellowship Program, 1990, Volume 1

    NASA Technical Reports Server (NTRS)

    Bannerot, Richard B. (Editor); Goldstein, Stanley H. (Editor)

    1990-01-01

    The 1990 Johnson Space Center (JSC) NASA/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program was conducted by the University of Houston-University Park and JSC. A compilation of the final reports on the research projects is presented. The topics covered include: the Space Station; the Space Shuttle; exobiology; cell biology; culture techniques; control systems design; laser induced fluorescence; spacecraft reliability analysis; reduced gravity; biotechnology; microgravity applications; regenerative life support systems; imaging techniques; cardiovascular system; physiological effects; extravehicular mobility units; mathematical models; bioreactors; computerized simulation; microgravity simulation; and dynamic structural analysis.

  9. Comparison of the Joel-Cohen-based technique and the transverse Pfannenstiel for caesarean section for safety and effectiveness: A systematic review and meta-analysis.

    PubMed

    Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba

    2017-01-01

    Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed at assessing the safety and effectiveness of the Joel-Cohen-based technique and comparing the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, reliable databases including PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were searched. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers against the inclusion criteria, and their quality was assessed. Their data were then extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, the statistical results showed that the Joel-Cohen-based technique is more effective than the transverse Pfannenstiel incision. Meta-analysis results for the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07, p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26 ml, p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03 day, p<0.001). The statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. It is recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique, according to the surgeons' preferences and the patients' conditions.

  10. Effect of various putty-wash impression techniques on marginal fit of cast crowns.

    PubMed

    Nissan, Joseph; Rosner, Ofir; Bukhari, Mohammed Amin; Ghelfan, Oded; Pilo, Raphael

    2013-01-01

    Marginal fit is an important clinical factor that affects restoration longevity. The accuracy of three polyvinyl siloxane putty-wash impression techniques was compared by marginal fit assessment using the nondestructive method. A stainless steel master cast containing three abutments with three metal crowns matching the three preparations was used to make 45 impressions: group A = single-step technique (putty and wash impression materials used simultaneously), group B = two-step technique with a 2-mm relief (putty as a preliminary impression to create a 2-mm wash space followed by the wash stage), and group C = two-step technique with a polyethylene spacer (plastic spacer used with the putty impression followed by the wash stage). Accuracy was assessed using a toolmaker microscope to measure and compare the marginal gaps between each crown and finish line on the duplicated stone casts. Each abutment was further measured at the mesial, buccal, and distal aspects. One-way analysis of variance was used for statistical analysis. P values and Scheffe post hoc contrasts were calculated. Significance was determined at .05. One-way analysis of variance showed significant differences among the three impression techniques in all three abutments and at all three locations (P < .001). Group B yielded dies with minimal gaps compared to groups A and C. The two-step impression technique with 2-mm relief was the most accurate regarding the crucial clinical factor of marginal fit.

  11. Bring It to the Pitch: Combining Video and Movement Data to Enhance Team Sport Analysis.

    PubMed

    Stein, Manuel; Janetzko, Halldor; Lamprecht, Andreas; Breitkreutz, Thorsten; Zimmermann, Philipp; Goldlucke, Bastian; Schreck, Tobias; Andrienko, Gennady; Grossniklaus, Michael; Keim, Daniel A

    2018-01-01

    Analysts in professional team sport regularly perform analysis to gain strategic and tactical insights into player and team behavior. Goals of team sport analysis regularly include identification of weaknesses of opposing teams, or assessing performance and improvement potential of a coached team. Current analysis workflows are typically based on the analysis of team videos. Also, analysts can rely on techniques from Information Visualization, to depict e.g., player or ball trajectories. However, video analysis is typically a time-consuming process, where the analyst needs to memorize and annotate scenes. In contrast, visualization typically relies on an abstract data model, often using abstract visual mappings, and is not directly linked to the observed movement context anymore. We propose a visual analytics system that tightly integrates team sport video recordings with abstract visualization of underlying trajectory data. We apply appropriate computer vision techniques to extract trajectory data from video input. Furthermore, we apply advanced trajectory and movement analysis techniques to derive relevant team sport analytic measures for region, event and player analysis in the case of soccer analysis. Our system seamlessly integrates video and visualization modalities, enabling analysts to draw on the advantages of both analysis forms. Several expert studies conducted with team sport analysts indicate the effectiveness of our integrated approach.

  12. Leadership for the 1970’s

    DTIC Science & Technology

    1971-07-01

    ... under him worked up to their capabilities. He let the members of his unit know what was expected of them. He communicated effectively with his ... improved and more effective techniques in the fields of organization analysis, position analysis, and personnel placement. The work is still research and ... effectiveness of the Army. The central theme of our study is that both the Army and the soldier must see themselves as parties to an informal contract. In this

  13. Design methodology analysis: design and operational energy studies in a new high-rise office building. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-02-01

    Work on energy consumption in a large office building is reported, including the following tasks: (1) evaluating and testing the effectiveness of the existing ASHRAE 90-75 and 90-80 standards; (2) evaluating the effectiveness of the BEPS; (3) evaluating the effectiveness of some envelope and lighting design variables towards achieving the BEPS budgets; and (4) comparing the computer energy analysis technique, DOE-2.1, with manual calculation procedures. These tasks are the initial activities in the energy analysis of the Park Plaza Building and will serve as the basis for further understanding the results of ongoing data collection and analysis.

  14. The palisade cartilage tympanoplasty technique: a systematic review and meta-analysis.

    PubMed

    Jeffery, Caroline C; Shillington, Cameron; Andrews, Colin; Ho, Allan

    2017-06-17

    Tympanoplasty is a common procedure performed by otolaryngologists. Many types of autologous grafts have been used, with variations of technique and varying results. This is the first systematic review of the literature and meta-analysis aiming to evaluate the effectiveness of one technique that is gaining popularity: the palisade cartilage tympanoplasty. PubMed, EMBASE, and Cochrane databases were searched for "palisade", "cartilage", "tympanoplasty", "perforation" and their synonyms. In total, 199 articles reporting results of palisade cartilage tympanoplasty were identified. Five articles satisfied the following inclusion criteria: adult patients, minimum 6 months of follow-up, and hearing and surgical outcomes reported. Studies with patients undergoing combined mastoidectomy, ossicular chain reconstruction, and/or other middle ear surgery were excluded. Perforation closure, rate of complications, and post-operative pure-tone average change were extracted for pooled analysis. Study failure and complication proportions used to generate odds ratios were pooled, and fixed-effects and random-effects weightings were generated. The resulting pooled odds ratios are reported. Palisade cartilage tympanoplasty has an overall take rate of 96% beyond 6 months and has similar odds of complications compared to temporalis fascia (OR 0.89; 95% CI 0.62, 1.30). The air-bone gap closure is statistically similar to reported results for temporalis fascia tympanoplasty. Cartilage palisade tympanoplasty offers excellent graft take rates and good postoperative hearing outcomes for perforations of various sizes and for both primary and revision cases. The technique has predictable, long-term results with low complication rates, similar to temporalis fascia tympanoplasty.
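
    The abstract pools study-level odds ratios with fixed-effects weightings. A minimal sketch of that step, using inverse-variance (Woolf) pooling of log odds ratios on made-up 2x2 counts (not the review's actual data):

    ```python
    import numpy as np

    # Hypothetical 2x2 counts per study: (failures, successes) for palisade
    # cartilage vs. temporalis fascia. Values are illustrative only.
    studies = [
        # (fail_cartilage, succ_cartilage, fail_fascia, succ_fascia)
        (3, 52, 4, 48),
        (2, 61, 3, 55),
        (5, 88, 5, 83),
    ]

    log_or, var = [], []
    for a, b, c, d in studies:
        # 0.5 continuity correction guards against zero cells
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))
        log_or.append(np.log((a * d) / (b * c)))
        var.append(1/a + 1/b + 1/c + 1/d)    # Woolf variance of the log OR

    log_or, var = np.array(log_or), np.array(var)
    w = 1.0 / var                            # fixed-effect inverse-variance weights
    pooled = np.sum(w * log_or) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
    print(f"pooled OR = {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")
    ```

    A random-effects weighting, as the review also reports, would add a between-study variance term to each study's variance before weighting.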

  15. Do Self-Incentives and Self-Rewards Change Behavior? A Systematic Review and Meta-Analysis.

    PubMed

    Brown, Emma M; Smith, Debbie M; Epton, Tracy; Armitage, Christopher J

    2018-01-01

    Encouraging people to self-incentivize (i.e., to reward themselves in the future if they are successful in changing their behavior) or self-reward (i.e., prompt people to reward themselves once they have successfully changed their behavior) are techniques that are frequently embedded within complex behavior change interventions. However, it is not clear whether self-incentives or self-rewards per se are effective at bringing about behavior change. Nine databases were searched alongside manual searching of systematic reviews and online research registers. One thousand four hundred papers were retrieved, spanning a range of behaviors, though the majority of included papers were in the domain of "health psychology". Ten studies matched the inclusion criteria for self-incentive but no studies were retrieved for self-reward. The present systematic review and meta-analysis is therefore the first to evaluate the unique effect of self-incentives on behavior change. Effect sizes were retrieved from 7 of the 10 studies. Analysis of the 7 studies produced a very small pooled effect size for self-incentives (k = 7, N = 1,161), which was statistically significant, d + = 0.17, CI [0.06, 0.29]. The weak effect size and dearth of studies raises the question of why self-incentivizing is such a widely employed component of behavior change interventions. The present research opens up a new field of inquiry to establish: (a) whether or not self-incentivizing and self-rewarding are effective behavior change techniques, (b) whether self-incentives and self-rewards need to be deployed alongside other behavior change techniques, and, (c) when and for whom self-incentives and self-rewards could support effective behavior change. Copyright © 2017. Published by Elsevier Ltd.

  16. The quantitative analysis of silicon carbide surface smoothing by Ar and Xe cluster ions

    NASA Astrophysics Data System (ADS)

    Ieshkin, A. E.; Kireev, D. S.; Ermakov, Yu. A.; Trifonov, A. S.; Presnov, D. E.; Garshev, A. V.; Anufriev, Yu. V.; Prokhorova, I. G.; Krupenin, V. A.; Chernysh, V. S.

    2018-04-01

    The gas cluster ion beam (GCIB) technique was used for smoothing of a silicon carbide crystal surface. The effects of processing with two inert cluster ion species, argon and xenon, were quantitatively compared. While argon is a standard element for GCIB, results for xenon clusters had not previously been reported. Scanning probe microscopy and high-resolution transmission electron microscopy were used to analyse the surface roughness and the quality of the surface crystal layer. Gas cluster ion beam processing smooths the surface relief down to an average roughness of about 1 nm for both elements. It was shown that xenon as the working gas is more effective: the sputtering rate for xenon clusters is 2.5 times higher than for argon at the same beam energy. High-resolution transmission electron microscopy analysis of the surface defect layer gives values of 7 ± 2 nm and 8 ± 2 nm for treatment with argon and xenon clusters, respectively.

  17. BATSE analysis techniques for probing the GRB spatial and luminosity distributions

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon; Meegan, Charles A.

    1992-01-01

    The Burst And Transient Source Experiment (BATSE) has measured homogeneity and isotropy parameters from an increasingly large sample of observed gamma-ray bursts (GRBs), while also maintaining a summary of the way in which the sky has been sampled. Measurement of both is necessary for any statistical study of the BATSE data, as they take into account the most serious observational selection effects known in the study of GRBs: beam smearing and inhomogeneous, anisotropic sky sampling. Knowledge of these effects is important to the analysis of GRB angular and intensity distributions. In addition to determining whether the bursts are local, it is hoped that analysis of such distributions will allow boundaries to be placed on the true GRB spatial distribution and luminosity function. The technique for studying GRB spatial and luminosity distributions is direct: results of BATSE analyses are compared to Monte Carlo models parameterized by a variety of spatial and luminosity characteristics.

  18. A comprehensive evaluation of various sensitivity analysis methods: A case study with a hydrological model

    DOE PAGES

    Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...

    2014-01-01

    Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be ineffective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400-600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, a minimum of 1050 samples is needed to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
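
    As a concrete illustration of a quantitative SA method of the kind compared above, the sketch below estimates first-order Sobol' indices with a pick-freeze Monte Carlo estimator on the standard Ishigami test function. It is a generic implementation under those assumptions, not PSUADE's.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def ishigami(x, a=7.0, b=0.1):
        """Standard SA test function with known sensitivity indices."""
        return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
                + b * x[:, 2]**4 * np.sin(x[:, 0]))

    d, n = 3, 100_000
    A = rng.uniform(-np.pi, np.pi, (n, d))
    B = rng.uniform(-np.pi, np.pi, (n, d))
    fA, fB = ishigami(A), ishigami(B)
    V = np.var(np.concatenate([fA, fB]))      # total output variance

    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                   # "pick-freeze" column swap
        fABi = ishigami(ABi)
        S1 = np.mean(fB * (fABi - fA)) / V    # first-order Sobol' estimator
        print(f"S{i+1} = {S1:.3f}")           # expect ~0.31, ~0.44, ~0.00
    ```

    The sample cost visible here (two base matrices plus one swapped matrix per input) is exactly why the study reports quantitative methods needing thousands of model runs.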

  19. The effectiveness of the bone bridge transtibial amputation technique: A systematic review of high-quality evidence.

    PubMed

    Kahle, Jason T; Highsmith, M Jason; Kenney, John; Ruth, Tim; Lunseth, Paul A; Ertl, Janos

    2017-06-01

    This literature review was undertaken to determine whether commonly held views about the benefits of a bone bridge technique are supported by the literature. Four databases were searched for articles pertaining to surgical strategies specific to a bone bridge technique in the transtibial amputee. A total of 35 articles were identified as potential articles, and the review methodology sorted them into separate topics. Following identification, articles were excluded if they were determined to be low-quality evidence or not pertinent. Nine articles were identified as pertinent to one of the topics: Perioperative Care, Acute Care, Subjective Analysis, and Function; two articles sorted into multiple topics. Two articles were sorted into the Perioperative Care topic, 4 articles into the Acute Care topic, 2 articles into the Subjective Analysis topic, and 5 articles into the Function topic. There are no high-quality (level one or two) clinical trials reporting comparisons of the bone bridge technique to traditional methods. There is limited evidence supporting the clinical outcomes of the bone bridge technique. There is no agreement supporting or discouraging the perioperative and acute care aspects of the bone bridge technique. There is no evidence defining an interventional comparison of the bone bridge technique. Current level III evidence supports a bone bridge technique as an equivalent option to the non-bone bridge transtibial amputation technique. Formal level I and II clinical trials will need to be considered in the future to guide clinical practice. Clinical relevance: Clinical practice guidelines are evidence based. This systematic literature review identifies the highest-quality evidence to date, which reports a consensus of outcomes agreeing that bone bridge is as safe and effective as the alternatives. The clinical relevance is understanding that bone bridge could additionally provide a mechanistic advantage for the transtibial amputee.

  20. Surface and Thin Film Analysis during Metal Organic Vapour Phase Epitaxial Growth

    NASA Astrophysics Data System (ADS)

    Richter, Wolfgang

    2007-06-01

    In-situ analysis of epitaxial growth is the essential ingredient in order to understand the growth process, to optimize growth and last but not least to monitor or even control the epitaxial growth on a microscopic scale. In MBE (molecular beam epitaxy) in-situ analysis tools existed right from the beginning because this technique developed from Surface Science technology with all its electron based analysis tools (LEED, RHEED, PES etc). Vapour Phase Epitaxy, in contrast, remained for a long time in an empirical stage ("alchemy") because only post growth characterisations like photoluminescence, Hall effect and electrical conductivity were available. Within the last two decades, however, optical techniques were developed which provide similar capabilities as in MBE for Vapour Phase growth. I will discuss in this paper the potential of Reflectance Anisotropy Spectroscopy (RAS) and Spectroscopic Ellipsometry (SE) for the growth of thin epitaxial semiconductor layers with zincblende (GaAs etc) and wurtzite structure (GaN etc). Other techniques and materials will be also mentioned.

  1. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  2. Techniques of lumbar-sacral spine fusion in spondylosis: systematic literature review and meta-analysis of randomized clinical trials.

    PubMed

    Umeta, Ricardo S G; Avanzi, Osmar

    2011-07-01

    Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to treat each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome, were selected. Identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented a significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better rates of fusion and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.

  3. Application Of Laser Induced Breakdown Spectroscopy (LIBS) Technique In Investigation Of Historical Metal Threads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Kareem, O.; Khedr, A.; Abdelhamid, M.

    Analysis of the composition of an object is a necessary step in documenting the properties of the object and estimating its condition. It is also an important task in establishing an appropriate conservation treatment for an object, or in following up the results of applying the suggested treatments. There has been an important evolution in the methods used for the analysis of metal threads since the second half of the twentieth century. Today, the main considerations in selecting a method are its diagnostic power, representative sampling, reproducibility, the destructive nature/invasiveness of the analysis, and access to the appropriate instrument. This study aims at evaluating the usefulness of the Laser Induced Breakdown Spectroscopy (LIBS) technique for the analysis of historical metal threads. In this study, various historical metal threads collected from different museums were investigated using the LIBS technique. To evaluate the usefulness of the suggested analytical protocol, the same metal thread samples were also investigated with a Scanning Electron Microscope (SEM) with energy-dispersive X-ray analyzer (EDX), which is reported in the conservation field as the best method for determining the chemical composition and corrosion of metal threads. The results show that all metal threads investigated in the present study are heavily soiled, strongly damaged, and corroded with different types of corrosion products. The LIBS technique is considered a very useful tool that can be applied safely to historical metal threads; it is, in fact, a very useful noninvasive method for their analysis. The first few laser shots are useful for investigating the corrosion and dirt layer, the following shots are effective for investigating the coating layer, and a higher number of laser shots probes the main composition of the metal thread. Further research is needed to determine the most appropriate and effective approaches and methods for the conservation of these metal threads.

  4. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
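
    The combinatorial half of such a hierarchical solution can be sketched compactly: for modules whose basic events are independent, a static fault tree reduces to recursive probability arithmetic over AND/OR gates (the Markov modules described above would be solved separately and plugged in as leaf probabilities). The tree and probabilities below are hypothetical.

    ```python
    # Combinatorial evaluation of a static fault tree with independent basic
    # events; a sketch of the combinatoric modules only, not the Markov part.

    def prob(node):
        kind = node[0]
        if kind == "basic":                   # ("basic", p)
            return node[1]
        children = [prob(c) for c in node[1]]
        if kind == "and":                     # fails only if all children fail
            out = 1.0
            for p in children:
                out *= p
            return out
        if kind == "or":                      # fails if any child fails
            out = 1.0
            for p in children:
                out *= (1.0 - p)
            return 1.0 - out
        raise ValueError(kind)

    # Hypothetical tree: top event occurs if both redundant channels fail,
    # or the shared power supply fails.
    tree = ("or", [
        ("and", [("basic", 1e-3), ("basic", 1e-3)]),
        ("basic", 1e-5),
    ])
    print(f"P(top event) = {prob(tree):.3e}")   # ~1.1e-5
    ```

    Modularization pays off because only the subtrees with sequence-dependent behavior need the more expensive Markov treatment, keeping the Markov state space small.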

  5. Varietal discrimination of hop pellets by near and mid infrared spectroscopy.

    PubMed

    Machado, Julio C; Faria, Miguel A; Ferreira, Isabel M P L V O; Páscoa, Ricardo N M J; Lopes, João A

    2018-04-01

    Hop is one of the most important ingredients in beer production, and several varieties are commercialized. Therefore, it is important to find an eco-friendly, real-time, low-cost technique to distinguish and discriminate hop varieties. This paper describes the development of a method based on vibrational spectroscopy techniques, namely near- and mid-infrared spectroscopy, for the discrimination of 33 commercial hop varieties. A total of 165 samples (five for each hop variety) were analysed by both techniques. Principal component analysis, hierarchical cluster analysis, and partial least squares discriminant analysis were the chemometric tools used to discriminate the hop varieties. After optimizing the spectral regions and pre-processing methods, correct variety discrimination rates of 94.2% and 96.6% were obtained for near- and mid-infrared spectroscopy, respectively. The results demonstrate the suitability of these vibrational spectroscopy techniques for discriminating different hop varieties and, consequently, their potential as an authenticity tool. Compared with the reference procedures normally used for hop variety discrimination, these techniques are quicker, cost-effective, non-destructive, and eco-friendly. Copyright © 2017 Elsevier B.V. All rights reserved.
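
    As a sketch of the chemometric step, the code below runs principal component analysis via SVD on synthetic "spectra" containing three latent varieties. The data shapes loosely mirror the 165-sample design above, but the numbers are simulated, and the supervised step (PLS-DA) is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Synthetic stand-in for an infrared data set: 165 spectra x 600 channels,
    # three "varieties" differing in two latent spectral components.
    n_per, n_chan = 55, 600
    basis = rng.normal(size=(2, n_chan))
    scores_true = np.vstack([rng.normal(loc=m, scale=0.3, size=(n_per, 2))
                             for m in ([0, 0], [2, 0], [0, 2])])
    X = scores_true @ basis + 0.05 * rng.normal(size=(3 * n_per, n_chan))

    # PCA via SVD of the mean-centred data matrix
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    scores = Xc @ Vt[:2].T                  # project onto first two loadings

    print("variance explained by PC1, PC2: %.2f, %.2f" % tuple(explained[:2]))
    print("scores shape:", scores.shape)    # (165, 2), ready for clustering
    ```

    In practice the pre-processing choices the abstract mentions (spectral region selection, derivatives, scatter correction) are applied to `X` before the decomposition and often matter more than the decomposition itself.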

  6. Child-Parent Interventions for Childhood Anxiety Disorders: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Brendel, Kristen Esposito; Maynard, Brandy R.

    2014-01-01

    Objective: This study compared the effects of direct child-parent interventions to the effects of child-focused interventions on anxiety outcomes for children with anxiety disorders. Method: Systematic review methods and meta-analytic techniques were employed. Eight randomized controlled trials examining effects of family cognitive behavior…

  7. SVD analysis of Aura TES spectral residuals

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Kulawik, Susan S.; Rodgers, Clive D.; Bowman, Kevin W.

    2005-01-01

    Singular Value Decomposition (SVD) analysis is both a powerful diagnostic tool and an effective method of noise filtering. We present the results of an SVD analysis of an ensemble of spectral residuals acquired in September 2004 from a 16-orbit Aura Tropospheric Emission Spectrometer (TES) Global Survey and compare them to alternative methods such as zonal averages. In particular, the technique highlights issues such as the orbital variation of instrument response and incompletely modeled effects of surface emissivity and atmospheric composition.

  8. Multifactorial analysis of human blood cell responses to clinical total body irradiation

    NASA Technical Reports Server (NTRS)

    Yuhas, J. M.; Stokes, T. R.; Lushbaugh, C. C.

    1972-01-01

    Multiple regression analysis techniques are used to study the effects of therapeutic radiation exposure, number of fractions, and time on such quantal responses as tumor control and skin injury. The potential of these methods for the analysis of human blood cell responses is demonstrated and estimates are given of the effects of total amount of exposure and time of protraction in determining the minimum white blood cell concentration observed after exposure of patients from four disease groups.

  9. Scaling range of power laws that originate from fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Grech, Dariusz; Mazur, Zygmunt

    2013-05-01

    We extend our previous study of scaling range properties performed for detrended fluctuation analysis (DFA) [Physica A 392, 2384 (2013), doi:10.1016/j.physa.2013.01.049] to other techniques of fluctuation analysis (FA). A new technique, called modified detrended moving average analysis (MDMA), is introduced, and its scaling range properties are examined and compared with those of detrended moving average analysis (DMA) and DFA. It is shown that, contrary to DFA, the DMA and MDMA techniques exhibit a power law dependence of the scaling range on the length of the analysed signal and on the accuracy R2 of the fit to the scaling law imposed by the DMA or MDMA methods. This power law dependence is satisfied for both uncorrelated and autocorrelated data. We also find a simple generalization of this power law relation for series with different levels of autocorrelation, measured in terms of the Hurst exponent. Basic relations between scaling ranges for the different techniques are also discussed. Our findings should be particularly useful for local FA in, e.g., econophysics, finance, or physiology, where a huge number of short time series has to be examined at once and where a preliminary check of the scaling range regime for each series separately is neither effective nor possible.
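
    A minimal DFA implementation helps fix ideas about where the scaling range enters: the fluctuation function F(s) is computed per box size s, and the exponent comes from a log-log fit, so the choice of scales included in that fit is exactly the scaling-range question the paper studies. The sketch below fits over all computed scales for simplicity.

    ```python
    import numpy as np

    def dfa(x, scales, order=1):
        """Detrended fluctuation analysis: return F(s) for each box size s."""
        y = np.cumsum(x - np.mean(x))             # integrated profile
        F = []
        for s in scales:
            n_box = len(y) // s
            segs = y[:n_box * s].reshape(n_box, s)
            t = np.arange(s)
            ms = []
            for seg in segs:
                coef = np.polyfit(t, seg, order)  # local polynomial trend
                ms.append(np.mean((seg - np.polyval(coef, t))**2))
            F.append(np.sqrt(np.mean(ms)))
        return np.asarray(F)

    rng = np.random.default_rng(3)
    x = rng.normal(size=2**14)                    # white noise: expect alpha ~ 0.5
    scales = np.unique(np.logspace(1, 3, 20).astype(int))
    F = dfa(x, scales)
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    print(f"DFA scaling exponent alpha = {alpha:.2f}")
    ```

    Restricting the `np.polyfit` fit to a subset of `scales`, chosen by a goodness-of-fit threshold such as the R2 criterion the paper uses, is how a scaling-range rule would plug into this code.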

  10. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
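
    Treating the retrieval as a linear regression makes the collinearity diagnosis easy to reproduce. The sketch below compares the condition number and variance inflation factors (VIFs) of a toy sin/cos basis sampled over a narrow azimuthal sector versus a full circle; the basis functions are illustrative stand-ins for the VVP fit parameters, not the actual VVP basis.

    ```python
    import numpy as np

    def collinearity_report(X):
        """Condition number and VIFs of a design matrix (columns standardised)."""
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        cond = np.linalg.cond(Xs)
        vif = np.diag(np.linalg.inv(np.corrcoef(Xs, rowvar=False)))
        return cond, vif

    # Azimuth samples over a narrow 30-degree sector: the sin/cos basis
    # functions become nearly collinear, the VVP failure mode described above.
    az_narrow = np.deg2rad(np.linspace(0, 30, 200))
    az_full = np.deg2rad(np.linspace(0, 360, 200, endpoint=False))

    for name, az in [("30-deg sector", az_narrow), ("full circle", az_full)]:
        X = np.column_stack([np.sin(az), np.cos(az), np.sin(2 * az)])
        cond, vif = collinearity_report(X)
        print(f"{name}: condition number = {cond:10.1f}, max VIF = {vif.max():10.1f}")
    ```

    The narrow-sector case shows the variance inflation (loss of robustness) the abstract attributes to inadequate sampling dispersion, while full-circle sampling keeps the basis functions nearly orthogonal.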

  11. The Efficacy of Movement Representation Techniques for Treatment of Limb Pain--A Systematic Review and Meta-Analysis.

    PubMed

    Thieme, Holm; Morkisch, Nadine; Rietz, Christian; Dohle, Christian; Borgetto, Bernhard

    2016-02-01

    Relatively new evidence suggests that movement representation techniques (ie, therapies that use the observation and/or imagination of normal pain-free movements, such as mirror therapy, motor imagery, or movement and/or action observation) might be effective in reduction of some types of limb pain. To summarize the evidence regarding the efficacy of those techniques, a systematic review with meta-analysis was performed. We searched Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, CINAHL, AMED, PsychINFO, Physiotherapy Evidence Database, and OT-seeker up to August 2014 and hand-searched further relevant resources for randomized controlled trials that studied the efficacy of movement representation techniques in reduction of limb pain. The outcomes of interest were pain, disability, and quality of life. Study selection and data extraction were performed by 2 reviewers independently. We included 15 trials on the effects of mirror therapy, (graded) motor imagery, and action observation in patients with complex regional pain syndrome, phantom limb pain, poststroke pain, and nonpathological (acute) pain. Overall, movement representation techniques were found to be effective in reduction of pain (standardized mean difference [SMD] = -.82, 95% confidence interval [CI], -1.32 to -.31, P = .001) and disability (SMD = .72, 95% CI, .22-1.22, P = .004) and showed a positive but nonsignificant effect on quality of life (SMD = 2.61, 95% CI, -3.32 to 8.54, P = .39). Especially mirror therapy and graded motor imagery should be considered for the treatment of patients with complex regional pain syndrome. Furthermore, the results indicate that motor imagery could be considered as a potential effective treatment in patients with acute pain after trauma and surgery. To date, there is no evidence for a pain reducing effect of movement representation techniques in patients with phantom limb pain and poststroke pain other than complex regional pain syndrome. In this systematic review we synthesize the evidence for the efficacy of movement representation techniques (ie, motor imagery, mirror therapy, or action observation) for treatment of limb pain. Our findings suggest effective pain reduction in some types of limb pain. Further research should address specific questions on the optimal type and dose of therapy. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.
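
    A minimal sketch of the random-effects pooling typically used for such SMD summaries, with the DerSimonian-Laird estimate of between-study variance; the per-study effect sizes below are invented, not the review's data.

    ```python
    import numpy as np

    # Hypothetical per-study standardised mean differences (pain reduction)
    # and their variances; values are illustrative only.
    d = np.array([-1.10, -0.45, -0.90, -0.30, -1.60])
    v = np.array([0.12, 0.08, 0.15, 0.05, 0.30])

    w = 1.0 / v                                   # fixed-effect weights
    d_fe = np.sum(w * d) / np.sum(w)
    Q = np.sum(w * (d - d_fe)**2)                 # Cochran's Q heterogeneity
    df = len(d) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                 # DerSimonian-Laird tau^2

    w_re = 1.0 / (v + tau2)                       # random-effects weights
    d_re = np.sum(w_re * d) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    print(f"tau^2 = {tau2:.3f}")
    print(f"pooled SMD = {d_re:.2f} "
          f"(95% CI {d_re - 1.96*se:.2f} to {d_re + 1.96*se:.2f})")
    ```

    When tau^2 is zero the random-effects result collapses to the fixed-effect one; large heterogeneity widens the confidence interval, which is why reviews report both Q and the pooled estimate.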

  12. Estimating the settling velocity of bioclastic sediment using common grain-size analysis techniques

    USGS Publications Warehouse

    Cuttler, Michael V. W.; Lowe, Ryan J.; Falter, James L.; Buscombe, Daniel D.

    2017-01-01

    Most techniques for estimating settling velocities of natural particles have been developed for siliciclastic sediments. Therefore, to understand how these techniques apply to bioclastic environments, measured settling velocities of bioclastic sedimentary deposits sampled from a nearshore fringing reef in Western Australia were compared with settling velocities calculated using results from several common grain-size analysis techniques (sieve, laser diffraction and image analysis) and established models. The effects of sediment density and shape were also examined using a range of density values and three different models of settling velocity. Sediment density was found to have a significant effect on calculated settling velocity, causing a range in normalized root-mean-square error of up to 28%, depending upon settling velocity model and grain-size method. Accounting for particle shape reduced errors in predicted settling velocity by 3% to 6% and removed any velocity-dependent bias, which is particularly important for the fastest settling fractions. When shape was accounted for and measured density was used, normalized root-mean-square errors were 4%, 10% and 18% for laser diffraction, sieve and image analysis, respectively. The results of this study show that established models of settling velocity that account for particle shape can be used to estimate settling velocity of irregularly shaped, sand-sized bioclastic sediments from sieve, laser diffraction, or image analysis-derived measures of grain size with a limited amount of error. Collectively, these findings will allow for grain-size data measured with different methods to be accurately converted to settling velocity for comparison. This will facilitate greater understanding of the hydraulic properties of bioclastic sediment which can help to increase our general knowledge of sediment dynamics in these environments.
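
    One established settling-velocity model of the kind evaluated above is that of Ferguson and Church (2004), whose two constants absorb particle shape and roughness effects. A sketch follows, with the carbonate-sediment density and constants chosen as plausible assumptions for illustration rather than taken from the study.

    ```python
    import numpy as np

    def settling_velocity(d, rho_s=2710.0, rho_f=1025.0, C1=18.0, C2=1.0,
                          g=9.81, nu=1.05e-6):
        """Ferguson & Church (2004) settling velocity, SI units.

        C1 and C2 absorb shape effects: roughly C1=18, C2=0.4 for smooth
        spheres and C1~18-24, C2~1-1.2 for natural, irregular grains. The
        density and constants here are assumed values, not the study's.
        """
        R = (rho_s - rho_f) / rho_f          # submerged specific gravity
        return R * g * d**2 / (C1 * nu + np.sqrt(0.75 * C2 * R * g * d**3))

    for d_mm in (0.125, 0.25, 0.5, 1.0, 2.0):
        w = settling_velocity(d_mm / 1000.0)
        print(f"d = {d_mm:5.3f} mm  ->  w_s = {w * 100:5.2f} cm/s")
    ```

    The formula reduces to Stokes' law for fine grains and to a turbulent drag law for coarse grains, which is why a single equation can span the sand-sized range discussed above; feeding it sieve-, laser-diffraction-, or image-derived grain sizes reproduces the kind of comparison the study performs.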

  13. PARENT Quick Blind Round-Robin Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Braatz, Brett G.; Heasler, Patrick G.; Meyer, Ryan M.

    The U.S. Nuclear Regulatory Commission has established the Program to Assess the Reliability of Emerging Nondestructive Techniques (PARENT), whose goal is to investigate the effectiveness of current and novel nondestructive examination procedures and techniques for finding flaws in nickel-alloy welds and base materials. This is to be done by conducting a series of open and blind international round-robin tests on a set of piping components that include large-bore dissimilar metal welds, small-bore dissimilar metal welds, and bottom-mounted instrumentation penetration welds. The blind testing is being conducted in two segments, one called Quick-Blind and the other called Blind. The Quick-Blind testing and destructive analysis of the test blocks have been completed. This report describes the four Quick-Blind test blocks used, summarizes their destructive analysis, gives an overview of the nondestructive evaluation (NDE) techniques applied, provides an analysis of the inspection data, and presents the conclusions drawn.

  14. Improved cardiac motion detection from ultrasound images using TDIOF: a combined B-mode/ tissue Doppler approach

    NASA Astrophysics Data System (ADS)

    Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.

    2013-03-01

    Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
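
    In the spirit of TDIOF, though not the authors' exact energy functional, a per-window least-squares sketch can combine brightness-constancy rows from B-mode gradients with one weighted row constraining the beam-projected displacement to the Doppler reading. Everything below (the window, weights, and synthetic data) is an illustrative assumption.

    ```python
    import numpy as np

    def lk_doppler(Ix, Iy, It, beam, v_dopp, lam=1.0):
        """Estimate one window's displacement (u, v) from B-mode gradients
        plus a tissue-Doppler constraint along the beam direction.

        Brightness constancy rows:  Ix*u + Iy*v = -It
        Doppler row (weight lam):   bx*u + by*v = v_dopp
        """
        A = np.column_stack([Ix.ravel(), Iy.ravel()])
        b = -It.ravel()
        A = np.vstack([A, lam * np.asarray(beam)[None, :]])
        b = np.append(b, lam * v_dopp)
        (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
        return u, v

    # Synthetic window: a smooth patch translating by (0.4, -0.2) px/frame
    rng = np.random.default_rng(4)
    x, y = np.meshgrid(np.arange(32), np.arange(32))
    Ix = 0.4 * np.cos(0.4 * x)                    # analytic spatial gradients
    Iy = -0.3 * np.sin(0.3 * y)
    true_uv = np.array([0.4, -0.2])
    It = -(Ix * true_uv[0] + Iy * true_uv[1]) + 0.01 * rng.normal(size=x.shape)

    beam = np.array([0.0, 1.0])                   # beam along the y (axial) axis
    v_dopp = beam @ true_uv                       # ideal axial Doppler reading
    print(lk_doppler(Ix, Iy, It, beam, v_dopp))   # ~ (0.4, -0.2)
    ```

    The Doppler row only constrains the axial component, which is exactly the angle dependency the abstract mentions; the B-mode rows supply the lateral information the Doppler term lacks.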

  15. DataView: a computational visualisation system for multidisciplinary design and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Chengen

    2016-01-01

    Rapidly processing raw data and effectively extracting the underlying information from huge volumes of multivariate data have become essential to decision-making processes in sectors like finance, government, medical care, climate analysis, industry, and science. Remarkably, visualisation is recognised as a fundamental technology that underpins human comprehension, cognition and utilisation of burgeoning amounts of heterogeneous data. This paper presents a computational visualisation system, named DataView, which has been developed for graphically displaying and capturing the outcomes of the multiphysics problem-solvers widely used in engineering fields. DataView is functionally composed of techniques for table/diagram representation and for the graphical illustration of scalar, vector and tensor fields. The field visualisation techniques are implemented on the basis of a range of linear and non-linear meshes, which flexibly adapts to the disparate data representation schemas adopted by a variety of disciplinary problem-solvers. The visualisation system has been successfully applied to a number of engineering problems, of which some illustrations are presented to demonstrate the effectiveness of the visualisation techniques.

  16. The art of spacecraft design: A multidisciplinary challenge

    NASA Technical Reports Server (NTRS)

    Abdi, F.; Ide, H.; Levine, M.; Austel, L.

    1989-01-01

    Actual design turn-around time has become shorter due to the optimization techniques that have been introduced into the design process. What, how, and when to use these optimization techniques may be the key factor for future aircraft engineering operations. Another important aspect of this technique is that complex physical phenomena can be modeled by a simple mathematical equation. The new, powerful multilevel methodology reduces time-consuming analysis significantly while maintaining the coupling effects. This simultaneous analysis method stems from the implicit function theorem and system sensitivity derivatives of input variables. The use of Taylor series expansion and finite-differencing techniques for sensitivity derivatives in each discipline makes this approach unique for screening dominant variables from nondominant ones. In this study, current Computational Fluid Dynamics (CFD) aerodynamic and sensitivity derivative/optimization techniques are applied to a simple cone-type forebody of a high-speed vehicle configuration to understand basic aerodynamic/structure interaction in a hypersonic flight condition.
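
    The screening idea above, using finite-difference sensitivity derivatives to separate dominant from nondominant variables, can be sketched generically. The response function and design point below are invented for illustration only.

    ```python
    import numpy as np

    def sensitivity_derivatives(f, x0, rel_step=1e-3):
        """Central-difference sensitivities df/dx_i at a design point x0,
        scaled to dimensionless form (x_i/f * df/dx_i) for fair ranking."""
        x0 = np.asarray(x0, dtype=float)
        f0 = f(x0)
        sens = np.empty_like(x0)
        for i in range(x0.size):
            h = rel_step * max(abs(x0[i]), 1e-12)
            xp, xm = x0.copy(), x0.copy()
            xp[i] += h
            xm[i] -= h
            sens[i] = (f(xp) - f(xm)) / (2 * h) * x0[i] / f0
        return sens

    # Illustrative response: a drag-like metric of a cone forebody,
    # D ~ q * A(radius) * Cd(angle); the functional form is made up.
    def response(x):
        q, radius, angle = x
        return q * np.pi * radius**2 * (0.1 + 0.02 * angle**1.5)

    s = sensitivity_derivatives(response, [5e4, 0.8, 10.0])
    for name, val in zip(["q", "radius", "angle"], s):
        print(f"{name:7s} logarithmic sensitivity = {val:+.3f}")
    ```

    Variables whose scaled sensitivities are small can be frozen in the inner optimization loops, which is the screening step the multilevel methodology relies on to cut analysis time.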

  17. An analysis of short pulse and dual frequency radar techniques for measuring ocean wave spectra from satellites

    NASA Technical Reports Server (NTRS)

    Jackson, F. C.

    1980-01-01

    Scanning beam microwave radars were used to measure ocean wave directional spectra from satellites. In principle, surface wave spectral resolution in wave number can be obtained using either short pulse (SP) or dual frequency (DF) techniques; in either case, directional resolution is obtained naturally as a consequence of a Bragg-like wave-front matching. A four-frequency-moment characterization of backscatter from the near-vertical, using physical optics in the high-frequency limit, was applied to an analysis of the SP and DF measurement techniques. The intrinsic electromagnetic modulation spectrum was, to first order in wave steepness, proportional to the large-wave directional slope spectrum. Harmonic distortion was small, with a minimum near 10 deg incidence. Non-Gaussian wave statistics can have an effect comparable to that of second-order scattering from a normally distributed sea surface. The SP technique is superior to the DF technique in terms of measurement signal-to-noise ratio and contrast ratio.

  18. Application of remote sensing to land and water resource planning: The Pocomoke River Basin, Maryland

    NASA Technical Reports Server (NTRS)

    Wildesen, S. E.; Phillips, E. P.

    1981-01-01

    Because of the size of the Pocomoke River Basin, the inaccessibility of certain areas, and study time constraints, several remote sensing techniques were used to collect base information on the river corridor (a 23.2 km channel) and on a 1.2 km wooded floodplain. This information provided an adequate understanding of the environment and its resources, thus enabling effective management options to be designed. The remote sensing techniques used for assessment included manual analysis of high-altitude color-infrared photography, computer-assisted analysis of LANDSAT-2 imagery, and the application of airborne oceanographic lidar for topographic mapping. Results show that each technique was valuable in providing the base data necessary for resource planning.

  19. Verification of Orthogrid Finite Element Modeling Techniques

    NASA Technical Reports Server (NTRS)

    Steeve, B. E.

    1996-01-01

    The stress analysis of orthogrid structures, specifically those with I-beam sections, is regularly performed using finite elements. Various modeling techniques are often used to simplify the modeling process while still adequately capturing the actual hardware behavior. The accuracy of such "short cuts" is sometimes in question. This report compares three modeling techniques to actual test results from a loaded orthogrid panel. The finite element models include a beam model, a shell model, and a mixed beam-and-shell element model. Results show that the shell element model performs the best, but that the simpler beam and mixed beam-and-shell element models provide reasonable to conservative results for a stress analysis. When deflection and stiffness are critical, it is important to capture the effect of the orthogrid nodes in the model.

  20. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  1. Developments in ICP-MS: electrochemically modulated liquid chromatography for the clean-up of ICP-MS blanks and reduction of matrix effects by flow injection ICP-MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gross, Cory Thomas

    2008-01-01

    The focus of this dissertation is the development of techniques with which to enhance the existing abilities of inductively coupled plasma mass spectrometry (ICP-MS). ICP-MS is a powerful technique for trace metal analysis in samples of many types, but like any technique it has certain strengths and weaknesses. Attempts are made to improve upon those strengths and to overcome certain weaknesses.

  2. Application of Mathematical Signal Processing Techniques to Mission Systems. (l’Application des techniques mathematiques du traitement du signal aux systemes de conduite des missions)

    DTIC Science & Technology

    1999-11-01

    ... represents the linear time-invariant (LTI) response of the combined analysis/synthesis system, while the second represents the aliasing introduced into ... effectively to implement voice scrambling systems based on time-frequency permutation. The most general form of such a system is shown in Fig. 22, where ...

  3. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that visualization is a precursor to thorough quantitative analysis and the subsequent physical understanding of these flow fields.

  4. Improving the analysis of slug tests

    USGS Publications Warehouse

    McElwee, C.D.

    2002-01-01

    This paper examines several techniques that have the potential to improve the quality of slug test analysis. These techniques are applicable in the range from low hydraulic conductivities with overdamped responses to high hydraulic conductivities with nonlinear oscillatory responses. Four techniques for improving slug test analysis will be discussed: use of an extended capability nonlinear model, sensitivity analysis, correction for acceleration and velocity effects, and use of multiple slug tests. The four-parameter nonlinear slug test model used in this work is shown to allow accurate analysis of slug tests with widely differing character. The parameter β represents a correction to the water column length caused primarily by radius variations in the wellbore and is most useful in matching the oscillation frequency and amplitude. The water column velocity at slug initiation (V0) is an additional model parameter, which would ideally be zero but may not be due to the initiation mechanism. The remaining two model parameters are A (parameter for nonlinear effects) and K (hydraulic conductivity). Sensitivity analysis shows that in general β and V0 have the lowest sensitivity and K usually has the highest. However, for very high K values the sensitivity to A may surpass the sensitivity to K. Oscillatory slug tests involve higher accelerations and velocities of the water column; thus, the pressure transducer responses are affected by these factors and the model response must be corrected to allow maximum accuracy for the analysis. The performance of multiple slug tests will allow some statistical measure of the experimental accuracy and of the reliability of the resulting aquifer parameters. © 2002 Elsevier Science B.V. All rights reserved.
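
    For the oscillatory high-K regime, a linearized analogue illustrates the fitting-plus-sensitivity workflow: fit a damped-cosine head response and read parameter uncertainties from the covariance of the fit. This is a simplified stand-in for the paper's four-parameter nonlinear model (β, V0, A, K), with synthetic data and made-up true values.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Linearised analogue of an oscillatory slug test: damped harmonic decay
    # of the water-level deviation H(t).
    def damped(t, H0, gamma, omega, phi):
        return H0 * np.exp(-gamma * t) * np.cos(omega * t + phi)

    rng = np.random.default_rng(5)
    t = np.linspace(0, 30, 300)                     # seconds
    true = (0.50, 0.12, 1.8, 0.0)                   # m, 1/s, rad/s, rad (assumed)
    H = damped(t, *true) + 0.005 * rng.normal(size=t.size)

    popt, pcov = curve_fit(damped, t, H, p0=(0.4, 0.1, 2.0, 0.0))
    perr = np.sqrt(np.diag(pcov))                   # 1-sigma parameter errors
    for name, val, err in zip(("H0", "gamma", "omega", "phi"), popt, perr):
        print(f"{name:5s} = {val:8.4f} +/- {err:.4f}")
    ```

    Repeating such fits over multiple slug tests, as the paper recommends, gives the statistical measure of parameter reliability directly; the ratio of each parameter's error to its value is a simple proxy for the sensitivity ranking discussed above.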

  5. Problems and Issues in Meta-Analysis.

    ERIC Educational Resources Information Center

    George, Carrie A.

    Single studies, by themselves, rarely explain the effect of treatments or interventions definitively in the social sciences. Researchers created meta-analysis in the 1970s to address this need. Since then, meta-analytic techniques have been used to support certain treatment modalities and to influence policymakers. Although these techniques…

  6. The new Zero-P implant can effectively reduce the risk of postoperative dysphagia and complications compared with the traditional anterior cage and plate: a systematic review and meta-analysis.

    PubMed

    Yin, Mengchen; Ma, Junming; Huang, Quan; Xia, Ye; Shen, Qixing; Zhao, Chenglong; Tao, Jun; Chen, Ni; Yu, Zhingxing; Ye, Jie; Mo, Wen; Xiao, Jianru

    2016-10-18

    The low-profile angle-stable spacer Zero-P is a new kind of cervical fusion system that is claimed to limit the potential drawbacks and complications. The purpose of this meta-analysis was to compare the clinical and radiological results of the new Zero-P implant with those of the traditional anterior cage and plate in the treatment of symptomatic cervical spondylosis, and to provide clinicians with evidence on which to base their clinical decision making. The following electronic databases were searched: Medline, PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, Evidence Based Medicine Reviews, VIP, and CNKI. Conference posters and abstracts were also electronically searched. Efficacy was evaluated in terms of intraoperative time, intraoperative blood loss, fusion rate, and dysphagia. For intraoperative time and intraoperative blood loss, the meta-analysis revealed that the Zero-P surgical technique is not superior to the cage and plate technique. For fusion rate, both techniques achieved good bone fusion, and the difference between them was not statistically significant. For the decrease in JOA score and for dysphagia, the pooled data showed that the Zero-P surgical technique is superior to the cage and plate technique. Zero-P interbody fusion can attain good clinical efficacy and a satisfactory fusion rate in the treatment of symptomatic cervical spondylosis. It can also effectively reduce the risk of postoperative dysphagia and its complications. However, owing to the lack of long-term follow-up, its long-term efficacy remains unknown.

  7. A new methodology based on functional principal component analysis to study postural stability post-stroke.

    PubMed

    Sánchez-Sánchez, M Luz; Belda-Lois, Juan-Manuel; Mena-Del Horno, Silvia; Viosca-Herrero, Enrique; Igual-Camacho, Celedonia; Gisbert-Morant, Beatriz

    2018-05-05

    A major goal in stroke rehabilitation is the establishment of more effective physical therapy techniques to recover postural stability. Functional Principal Component Analysis provides greater insight into recovery trends. However, when missing values exist, obtaining functional data presents some difficulties. The purpose of this study was to reveal an alternative technique for obtaining the Functional Principal Components without requiring the conversion to functional data beforehand and to investigate this methodology to determine the effect of specific physical therapy techniques in balance recovery trends in elderly subjects with hemiplegia post-stroke. A randomized controlled pilot trial was developed. Thirty inpatients post-stroke were included. Control and target groups were treated with the same conventional physical therapy protocol based on functional criteria, but specific techniques were added to the target group depending on the subjects' functional level. Postural stability during standing was quantified by posturography. The assessments were performed once a month from the moment the participants were able to stand up to six months post-stroke. The target group showed a significant improvement in postural control recovery trend six months after stroke that was not present in the control group. Some of the assessed parameters revealed significant differences between treatment groups (P < 0.05). The proposed methodology allows Functional Principal Component Analysis to be performed when data is scarce. Moreover, it allowed the dynamics of recovery of two different treatment groups to be determined, showing that the techniques added in the target group increased postural stability compared to the base protocol. Copyright © 2018 Elsevier Ltd. All rights reserved.
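
    As a point of reference for readers unfamiliar with the method, the sketch below performs ordinary Functional Principal Component Analysis on a fully observed common grid by eigendecomposition of the sample covariance. The paper's contribution, handling scarce or missing observations, is not reproduced here, and the synthetic recovery-like curves are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      t = np.linspace(0, 1, 50)                     # common assessment grid
      n = 30
      # Synthetic recovery-like curves: subject-specific amplitude and rate
      curves = np.array([a * (1 - np.exp(-r * 5 * t)) for a, r in
                         zip(rng.uniform(0.5, 1.5, n), rng.uniform(0.5, 2.0, n))])

      mean = curves.mean(axis=0)
      centered = curves - mean
      cov = centered.T @ centered / (n - 1)         # sample covariance on the grid
      eigval, eigvec = np.linalg.eigh(cov)
      order = np.argsort(eigval)[::-1]              # descending eigenvalues
      explained = eigval[order] / eigval.sum()
      fpc_scores = centered @ eigvec[:, order[:2]]  # per-subject scores on FPC1, FPC2
      print(f"variance explained by FPC1, FPC2: {explained[0]:.2f}, {explained[1]:.2f}")
      print("per-subject FPC scores shape:", fpc_scores.shape)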

  8. Improvement of modulation bandwidth in electroabsorption-modulated laser by utilizing the resonance property in bonding wire.

    PubMed

    Kwon, Oh Kee; Han, Young Tak; Baek, Yong Soon; Chung, Yun C

    2012-05-21

    We present and demonstrate a simple and cost-effective technique for improving the modulation bandwidth of an electroabsorption-modulated laser (EML). This technique utilizes the RF resonance caused by the EML chip (i.e., junction capacitance) and bonding wire (i.e., wire inductance). We analyze the effects of the lengths of the bonding wires on the frequency responses of the EML by using an equivalent circuit model. To verify this analysis, we package a lumped EML chip on the sub-mount and measure its frequency responses. The results show that, by using the proposed technique, we can increase the modulation bandwidth of the EML from ~16 GHz to ~28 GHz.
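
    A rough illustration of the idea: treat the bond wire as a series inductance between a resistive driver and the junction capacitance, and observe how the L-C resonance pushes out the -3 dB point of the voltage developed across the junction. The component values (50 Ω driver, 0.35 pF junction, ~1 nH/mm of wire) are generic textbook assumptions, not the paper's measured chip values, so the bandwidth numbers will not match the reported 16 GHz to 28 GHz improvement.

      import numpy as np

      RS = 50.0       # assumed driver source resistance (ohm)
      CJ = 0.35e-12   # assumed EML junction capacitance (F)

      def bandwidth_ghz(l_wire):
          """-3 dB bandwidth of the series R-L-C divider, where the modulating
          voltage is taken across the junction capacitance CJ."""
          f = np.linspace(1e8, 80e9, 40000)
          w = 2 * np.pi * f
          zc = 1.0 / (1j * w * CJ)
          h = zc / (RS + 1j * w * l_wire + zc)       # voltage across CJ
          resp_db = 20 * np.log10(np.abs(h))         # |H| -> 0 dB at DC
          return f[np.argmax(resp_db < -3.0)] / 1e9  # first -3 dB crossing

      print(f"no wire      : {bandwidth_ghz(0.0):5.1f} GHz")   # plain RC roll-off
      for mm in (0.25, 0.45, 0.90):                  # ~1 nH/mm rule of thumb
          print(f"wire {mm:4.2f} mm : {bandwidth_ghz(mm * 1e-9):5.1f} GHz")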

  9. Sensitivity analysis for linear structural equation models, longitudinal mediation with latent growth models and blended learning in biostatistics education

    NASA Astrophysics Data System (ADS)

    Sullivan, Adam John

    In chapter 1, we consider the biases that may arise when an unmeasured confounder is omitted from a structural equation model (SEM) and sensitivity analysis techniques to correct for such biases. We give an analysis of which effects in an SEM are and are not biased by an unmeasured confounder. It is shown that a single unmeasured confounder will bias not just one but numerous effects in an SEM. We present sensitivity analysis techniques to correct for biases in total, direct, and indirect effects when using SEM analyses, and illustrate these techniques with a study of aging and cognitive function. In chapter 2, we consider longitudinal mediation with latent growth curves. We define the direct and indirect effects using counterfactuals and consider the assumptions needed for identifiability of those effects. We develop models with a binary treatment/exposure followed by a model where treatment/exposure changes with time, allowing for treatment/exposure-mediator interaction. We thus formalize mediation analysis with latent growth curve models using counterfactuals, make clear the assumptions, and extend these methods to allow for exposure-mediator interactions. We present and illustrate the techniques with a study on multiple sclerosis (MS) and depression. In chapter 3, we report on a pilot study in blended learning that took place during the Fall 2013 and Summer 2014 semesters at Harvard. We blended the traditional BIO 200: Principles of Biostatistics and created ID 200: Principles of Biostatistics and Epidemiology. We used materials from the edX course PH207x: Health in Numbers: Quantitative Methods in Clinical & Public Health Research as a video textbook, in which students would watch a given number of these videos prior to class. Using surveys as well as exam data, we informally assess these blended classes from the students' perspective and compare these students with students in another course, BIO 201: Introduction to Statistical Methods in Fall 2013, as well as with students from BIO 200 in the Fall semesters of 1992 and 1993. We then suggest improvements upon our original course designs and follow up with an informal look at how these implemented changes affected the second offering of the newly blended ID 200 in Summer 2014.

  10. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design. Typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling for solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their applicability. Techniques currently used in practical applications were tested, especially to see which technique is best suited to the computer-aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. Based on the results, suggestions are given to help obtain reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  11. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two distinct merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. In computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
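
    The details of the authors' higher-order APA scheme are not given in this record, but the core ASA idea, growing or shrinking the step according to a local error estimate, can be sketched generically. Below is a step-doubling controller wrapped around a classical RK4 step, applied to a hypothetical single-equation stand-in for an amplifier power equation; it is not the authors' algorithm.

      import numpy as np

      def rk4_step(f, z, y, h):
          k1 = f(z, y)
          k2 = f(z + h / 2, y + h / 2 * k1)
          k3 = f(z + h / 2, y + h / 2 * k2)
          k4 = f(z + h, y + h * k3)
          return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      def integrate_adaptive(f, z0, z1, y0, h0=1e-3, tol=1e-8):
          """Step-doubling error control: compare one step of size h against
          two steps of size h/2 and adapt h to keep the difference near tol."""
          z, y, h = z0, np.asarray(y0, float), h0
          while z < z1:
              h = min(h, z1 - z)
              big = rk4_step(f, z, y, h)
              small = rk4_step(f, z + h / 2, rk4_step(f, z, y, h / 2), h / 2)
              err = np.max(np.abs(big - small)) + 1e-30
              if err <= tol:                 # accept, then try a larger step
                  z, y = z + h, small
                  h *= min(2.0, 0.9 * (tol / err) ** 0.2)
              else:                          # reject and shrink
                  h *= max(0.1, 0.9 * (tol / err) ** 0.2)
          return y

      # Toy stand-in for an amplifier power equation: dP/dz = g(z) * P
      g = lambda z: 0.5 * np.exp(-z)         # hypothetical z-dependent gain
      P = integrate_adaptive(lambda z, y: g(z) * y, 0.0, 10.0, [1e-3])
      exact = 1e-3 * np.exp(0.5 * (1 - np.exp(-10)))
      print(f"output power ~ {P[0]:.6e} (analytic {exact:.6e})")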

  12. SMALL COLOUR VISION VARIATIONS AND THEIR EFFECT IN VISUAL COLORIMETRY,

    DTIC Science & Technology

    COLOR VISION, PERFORMANCE(HUMAN), TEST EQUIPMENT, CORRELATION TECHNIQUES, STATISTICAL PROCESSES, COLORS, ANALYSIS OF VARIANCE, AGING(MATERIALS), COLORIMETRY, BRIGHTNESS, ANOMALIES, PLASTICS, UNITED KINGDOM.

  13. An Analysis of a Comprehensive Evaluation Model for Guided Group Interaction Techniques with Juvenile Delinquents. Final Report.

    ERIC Educational Resources Information Center

    Silverman, Mitchell

    Reported are the first phase activities of a longitudinal project designed to evaluate the effectiveness of Guided Group Interaction (GGI) technique as a meaningful approach in the field of corrections. The main findings relate to the establishment of reliability for the main components of the Revised Behavior Scores System developed to assess the…

  14. An investigation of a mathematical model for atmospheric absorption spectra

    NASA Technical Reports Server (NTRS)

    Niple, E. R.

    1979-01-01

    A computer program that calculates absorption spectra for slant paths through the atmosphere is described. The program uses an efficient convolution technique (Romberg integration) to simulate instrument resolution effects. A brief information analysis is performed on a set of calculated spectra to illustrate how such techniques may be used to explore the quality of the information in a spectrum.
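
    The record does not reproduce the program itself, but the instrument-resolution step it describes amounts to smearing a high-resolution transmittance spectrum with a normalized instrument line shape. A minimal sketch, assuming a Gaussian line shape and a synthetic two-line Lorentzian spectrum (the actual program performs the convolution by Romberg integration rather than discrete convolution):

      import numpy as np

      # Synthetic high-resolution spectrum: two absorption lines on a fine grid
      wn = np.linspace(1000.0, 1010.0, 4001)          # wavenumber grid (cm^-1)
      def lorentz(wn0, s, hw):                        # simple Lorentzian line
          return s * (hw / np.pi) / ((wn - wn0) ** 2 + hw ** 2)
      tau = lorentz(1003.0, 0.8, 0.05) + lorentz(1006.5, 0.4, 0.08)
      transmittance = np.exp(-tau)

      # Smear with a normalized Gaussian instrument line shape (assumed FWHM)
      fwhm = 0.5                                      # instrument resolution (cm^-1)
      sigma = fwhm / (2 * np.sqrt(2 * np.log(2)))
      dx = wn[1] - wn[0]
      kx = np.arange(-4 * sigma, 4 * sigma + dx, dx)
      kernel = np.exp(-0.5 * (kx / sigma) ** 2)
      kernel /= kernel.sum()                          # unit area preserves continuum
      observed = np.convolve(transmittance, kernel, mode="same")
      print(f"line-core depth: true {1 - transmittance.min():.3f}, "
            f"observed {1 - observed.min():.3f}")     # resolution fills in the core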

  15. Interactive Exploration of Big Scientific Data: New Representations and Techniques.

    PubMed

    Hjelmervik, Jon M; Barrowclough, Oliver J D

    2016-01-01

    Although splines have been in popular use in CAD for more than half a century, spline research is still an active field, driven by the challenges we are facing today within isogeometric analysis and big data. Splines are likely to play a vital future role in enabling effective big data exploration techniques in 3D, 4D, and beyond.

  16. Effective Management Selection: The Analysis of Behavior by Simulation Techniques.

    ERIC Educational Resources Information Center

    Jaffee, Cabot L.

    This book presents a system by which feedback might be generated and used as a basis for organizational change. The major areas covered consist of the development of a rationale for the use of simulation in the selection of supervisors, a description of actual techniques, and a method for training individuals in the use of the material. The…

  17. Finite-size effect and the components of multifractality in transport economics volatility based on multifractal detrending moving average method

    NASA Astrophysics Data System (ADS)

    Chen, Feier; Tian, Kang; Ding, Xiaoxu; Miao, Yuqi; Lu, Chunxia

    2016-11-01

    Analysis of freight rate volatility characteristics attracts more attention after year 2008 due to the effect of credit crunch and slowdown in marine transportation. The multifractal detrended fluctuation analysis technique is employed to analyze the time series of Baltic Dry Bulk Freight Rate Index and the market trend of two bulk ship sizes, namely Capesize and Panamax for the period: March 1st 1999-February 26th 2015. In this paper, the degree of the multifractality with different fluctuation sizes is calculated. Besides, multifractal detrending moving average (MF-DMA) counting technique has been developed to quantify the components of multifractal spectrum with the finite-size effect taken into consideration. Numerical results show that both Capesize and Panamax freight rate index time series are of multifractal nature. The origin of multifractality for the bulk freight rate market series is found mostly due to nonlinear correlation.

  18. A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.

    PubMed

    O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A

    2016-02-01

    The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess for publication bias, it was determined it would take 290 nonsignificant studies to nullify these findings. Copyright © 2016 International Society for Traumatic Stress Studies.
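
    For readers unfamiliar with the fail-safe N mentioned above, Rosenthal's classic formula estimates how many unpublished null studies would be needed to pull the combined result above alpha = .05: N_fs = (sum of z)^2 / 2.706 - k. A sketch with hypothetical per-study z-scores (not the six studies analyzed here, whose fail-safe N was 290):

      import numpy as np

      # Hypothetical per-study z-scores for k studies (illustrative only)
      z = np.array([2.8, 3.4, 2.1, 4.0, 2.6, 3.1])
      k = len(z)
      n_fs = (z.sum() ** 2) / 2.706 - k     # 2.706 = 1.645**2 for one-tailed .05
      print(f"fail-safe N ~ {n_fs:.0f} null studies needed to nullify the result")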

  19. Performance evaluation of the RITG148+ set of TomoTherapy quality assurance tools using RTQA2 radiochromic film.

    PubMed

    Lobb, Eric C

    2016-07-08

    Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine, and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings, due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.

  20. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology, and statistics, are positioned at the top of the evidence-based practice hierarchy. They are important tools for drug approvals, for formulating clinical protocols and guidelines, and for decision-making. However, this traditional technique only partially yields the information that clinicians, patients, and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In practice, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied in head-to-head trials. This scenario precludes conclusions from being drawn about the full profile (e.g., efficacy and safety) of all interventions. The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over the last years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting, and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions from pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how to conduct a network meta-analysis, highlighting its risks and benefits for evidence-based practice, including information on the evolution of statistical methods, assumptions, and the steps for performing the analysis. PMID:28503228
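
    The simplest building block of the indirect side of a network meta-analysis is the Bucher adjusted indirect comparison: if treatments A and B have each been compared with a common comparator C, the A-versus-B effect is estimated as the difference of the two pooled effects, with variances added. A minimal sketch with hypothetical log odds ratios (a full network model fits all comparisons jointly instead):

      import numpy as np

      def pooled(d, se):
          """Fixed-effect inverse-variance pooling of study effects."""
          w = 1.0 / np.asarray(se) ** 2
          return np.sum(w * np.asarray(d)) / np.sum(w), np.sqrt(1.0 / np.sum(w))

      # Hypothetical log odds ratios vs. a common comparator C
      d_AC, se_AC = pooled([-0.45, -0.30], [0.15, 0.20])   # A vs C trials
      d_BC, se_BC = pooled([-0.10, -0.20], [0.18, 0.25])   # B vs C trials

      # Bucher adjusted indirect comparison of A vs B through C
      d_AB = d_AC - d_BC
      se_AB = np.sqrt(se_AC ** 2 + se_BC ** 2)
      lo, hi = d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB
      print(f"A vs B: OR = {np.exp(d_AB):.2f} "
            f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")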

  1. Detrended Cross Correlation Analysis: a new way to figure out the underlying cause of global warming

    NASA Astrophysics Data System (ADS)

    Hazra, S.; Bera, S. K.

    2016-12-01

    Analysing non-stationary time series is a challenging task in earth science, seismology, solar physics, climate, biology, finance, etc. In most cases, external noise such as oscillations, high-frequency noise, and low-frequency noise at different scales leads to erroneous results. Many statistical methods have been proposed to find the correlation between two non-stationary time series. N. Scafetta and B. J. West, Phys. Rev. Lett. 90, 248701 (2003), reported a strong relationship between solar flare intermittency (SFI) and global temperature anomalies (GTA) using diffusion entropy analysis. It has recently been shown that detrended cross-correlation analysis (DCCA) is a better technique for removing the effects of unwanted signals as well as local and periodic trends. DCCA is thus well suited to finding the correlation between two non-stationary time series, and it allows the correlation coefficient to be estimated at different scales. Motivated by this, we have applied a new DCCA technique to examine the relationship between SFI and GTA. We have also applied this technique to the relationships between GTA and atmospheric carbon dioxide density, and between GTA and atmospheric methane density. In future work we will examine the relationships between GTA and atmospheric aerosols, water vapour density, ozone depletion, etc. This analysis should improve understanding of the causes of global warming.
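
    A minimal sketch of the DCCA coefficient, assuming the common formulation with non-overlapping windows and linear detrending (the abstract's exact variant may differ): integrate both series, detrend each window, and form the ratio of the detrended covariance to the two detrended variances. The synthetic pair below shares a long-range component, so the coefficient grows with scale.

      import numpy as np

      def dcca_rho(x, y, n):
          """Detrended cross-correlation coefficient at window size n:
          rho = F2_xy / sqrt(F2_xx * F2_yy), linear detrending per window."""
          X = np.cumsum(x - np.mean(x))           # integrated profiles
          Y = np.cumsum(y - np.mean(y))
          f_xx = f_yy = f_xy = 0.0
          t = np.arange(n)
          for i in range(len(X) // n):
              xs, ys = X[i * n:(i + 1) * n], Y[i * n:(i + 1) * n]
              rx = xs - np.polyval(np.polyfit(t, xs, 1), t)   # residuals
              ry = ys - np.polyval(np.polyfit(t, ys, 1), t)
              f_xx += np.mean(rx * rx)
              f_yy += np.mean(ry * ry)
              f_xy += np.mean(rx * ry)
          return f_xy / np.sqrt(f_xx * f_yy)

      rng = np.random.default_rng(0)
      common = np.cumsum(rng.normal(size=4000))       # shared long-range component
      x = common + rng.normal(scale=5.0, size=4000)   # two noisy observations
      y = common + rng.normal(scale=5.0, size=4000)
      for n in (8, 32, 128):
          print(f"scale {n:4d}: rho_DCCA = {dcca_rho(x, y, n):+.3f}")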

  2. Cost-effectiveness study of the microbiological diagnosis of tuberculosis using geneXpert MTB/RIF®.

    PubMed

    Herráez, Óscar; Asencio-Egea, María Ángeles; Huertas-Vaquero, María; Carranza-González, Rafael; Castellanos-Monedero, Jesús; Franco-Huerta, María; Barberá-Farré, José Ramón; Tenías-Burillo, José María

    To perform a cost-effectiveness analysis of a molecular biology technique for the diagnosis of tuberculosis compared to the classical diagnostic alternative. A cost-effectiveness analysis was performed to evaluate the theoretical implementation of a molecular biology method including two alternative techniques for early detection of the Mycobacterium tuberculosis complex and resistance to rifampicin (alternative 1: one determination in selected patients; alternative 2: two determinations in all patients). Both alternatives were compared with the usual procedure for microbiological diagnosis of tuberculosis (staining and microbiological culture) in 1,972 patients over the period 2008-2012. Effectiveness was measured in QALYs, and uncertainty was assessed by univariate, multivariate, and probabilistic sensitivity analysis. A value of €8,588/QALY was obtained for the usual method. Total expenditure with alternative 1 was €8,487/QALY, whereas with alternative 2 the cost-effectiveness ratio amounted to €2,960/QALY. Greater diagnostic efficiency was observed with alternative 2, reaching a 75% reduction in the number of days that a patient with tuberculosis remains without adequate treatment, and a 70% reduction in the number of days that a patient without tuberculosis remains in hospital. The implementation of a molecular microbiological technique in the diagnosis of tuberculosis is extremely cost-effective compared to the usual method. Its introduction into the routine diagnostic procedure could improve quality of care for patients, given that it would avoid unnecessary hospitalisations and treatments, and would be reflected in economic savings for the hospital. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
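
    The comparison at the heart of such a study is the incremental cost-effectiveness ratio (ICER), the extra cost per QALY gained when switching strategies. A toy computation with hypothetical cohort totals (not the study's figures, which are reported as cost per QALY by strategy):

      def icer(cost_new, qaly_new, cost_old, qaly_old):
          """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
          return (cost_new - cost_old) / (qaly_new - qaly_old)

      # Hypothetical inputs for illustration only (not the study's data)
      cost_usual, qaly_usual = 510_000.0, 59.0   # cohort totals, usual method
      cost_xpert, qaly_xpert = 490_000.0, 62.0   # cohort totals, molecular method
      delta = icer(cost_xpert, qaly_xpert, cost_usual, qaly_usual)
      # A negative ICER means the new strategy is cheaper and more effective
      print(f"ICER = {delta:,.0f} EUR per QALY gained")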

  3. Coupling Analysis of Heat Island Effects, Vegetation Coverage and Urban Flood in Wuhan

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Liu, Q.; Fan, W.; Wang, G.

    2018-04-01

    In this paper, satellite imagery, remote sensing techniques, and geographic information system techniques form the main technical basis. Comprehensive analysis of spectral and other factors, together with visual interpretation, are the main methods. We use GF-1 and Landsat 8 remote sensing satellite images of Wuhan as data sources, from which we extract the vegetation distribution, the urban heat island relative intensity distribution map, and the urban flood submergence range. Based on the extracted information, through spatial analysis and regression analysis, we find correlations among the heat island effect, vegetation coverage, and urban flooding. The results show a high degree of overlap between the urban heat island and urban flood areas. The urban heat island areas contain buildings with little vegetation cover, which may be one of the reasons for the local heavy rainstorms. Furthermore, urban heat island intensity is negatively correlated with vegetation coverage, and the heat island effect can be alleviated by vegetation to a certain extent. It is therefore easy to understand that the new industrial zones and commercial areas under construction throughout the city, whose land surfaces are bare or have low vegetation coverage, can easily form new heat islands.
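
    Vegetation coverage in studies like this one is typically derived from the red and near-infrared bands as NDVI, which can then be correlated against a land-surface-temperature raster. A minimal sketch on stand-in 3x3 reflectance patches (real inputs would be the GF-1/Landsat 8 bands):

      import numpy as np

      def ndvi(nir, red):
          """Normalized difference vegetation index from NIR/red reflectance."""
          return (nir - red) / (nir + red + 1e-9)

      # Stand-in 3x3 patches: reflectances and land-surface temperature (K)
      red = np.array([[0.10, 0.12, 0.30], [0.11, 0.28, 0.32], [0.09, 0.27, 0.35]])
      nir = np.array([[0.45, 0.48, 0.33], [0.47, 0.30, 0.31], [0.50, 0.29, 0.30]])
      lst = np.array([[298.0, 297.5, 305.0], [298.2, 304.1, 306.0], [297.0, 303.8, 307.2]])

      v = ndvi(nir, red).ravel()
      r = np.corrcoef(v, lst.ravel())[0, 1]    # vegetation vs. surface temperature
      print(f"NDVI range {v.min():.2f}..{v.max():.2f}, corr(NDVI, LST) = {r:.2f}")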

  4. The Meditative Mind: A Comprehensive Meta-Analysis of MRI Studies

    PubMed Central

    2015-01-01

    Over the past decade mind and body practices, such as yoga and meditation, have raised interest in different scientific fields; in particular, the physiological mechanisms underlying the beneficial effects observed in meditators have been investigated. Neuroimaging studies have studied the effects of meditation on brain structure and function and findings have helped clarify the biological underpinnings of the positive effects of meditation practice and the possible integration of this technique in standard therapy. The large amount of data collected thus far allows drawing some conclusions about the neural effects of meditation practice. In the present study we used activation likelihood estimation (ALE) analysis to make a coordinate-based meta-analysis of neuroimaging data on the effects of meditation on brain structure and function. Results indicate that meditation leads to activation in brain areas involved in processing self-relevant information, self-regulation, focused problem-solving, adaptive behavior, and interoception. Results also show that meditation practice induces functional and structural brain modifications in expert meditators, especially in areas involved in self-referential processes such as self-awareness and self-regulation. These results demonstrate that a biological substrate underlies the positive pervasive effect of meditation practice and suggest that meditation techniques could be adopted in clinical populations and to prevent disease. PMID:26146618

  5. Effective Analysis of Reaction Time Data

    ERIC Educational Resources Information Center

    Whelan, Robert

    2008-01-01

    Most analyses of reaction time (RT) data are conducted by using the statistical techniques with which psychologists are most familiar, such as analysis of variance on the sample mean. Unfortunately, these methods are usually inappropriate for RT data, because they have little power to detect genuine differences in RT between conditions. In…

  6. The Effect of Literature Circles on Text Analysis and Reading Desire

    ERIC Educational Resources Information Center

    Karatay, Halit

    2017-01-01

    In order to make teaching activities more appealing, different techniques and strategies have been constantly employed. This study utilized the strategy of "literature circles" to improve the text-analysis skills, reading desires, and interests of prospective teachers of Turkish. "Literature circles" was not chosen to be used…

  7. Implementation and evaluation of ILLIAC 4 algorithms for multispectral image processing

    NASA Technical Reports Server (NTRS)

    Swain, P. H.

    1974-01-01

    Data concerning a multidisciplinary and multi-organizational effort to implement multispectral data analysis algorithms on a revolutionary computer, the Illiac 4, are reported. The effectiveness and efficiency of implementing the digital multispectral data analysis techniques for producing useful land use classifications from satellite collected data were demonstrated.

  8. The Campbell Collaboration's Systematic Review and Meta-Analysis Online Training Videos

    ERIC Educational Resources Information Center

    Polanin, Joshua R.; Pigott, Terri D.

    2013-01-01

    Systematic reviews and meta-analysis are techniques for synthesizing primary empirical studies to produce a summary of effects. To facilitate this goal, the Campbell Collaboration (C2) supports reviews within the disciplines of crime and justice, education, international development, and social welfare. At the annual Campbell Colloquium, experts…

  9. Corpus Use in Language Learning: A Meta-Analysis

    ERIC Educational Resources Information Center

    Boulton, Alex; Cobb, Tom

    2017-01-01

    This study applied systematic meta-analytic procedures to summarize findings from experimental and quasi-experimental investigations into the effectiveness of using the tools and techniques of corpus linguistics for second language learning or use, here referred to as data-driven learning (DDL). Analysis of 64 separate studies representing 88…

  10. Estimation of Errors in Force Platform Data

    ERIC Educational Resources Information Center

    Psycharakis, Stelios G.; Miller, Stuart

    2006-01-01

    Force platforms (FPs) are regularly used in the biomechanical analysis of sport and exercise techniques, often in combination with image-based motion analysis. Force time data, particularly when combined with joint positions and segmental inertia parameters, can be used to evaluate the effectiveness of a wide range of movement patterns in sport…

  11. Treating technology as a luxury? 10 necessary tools.

    PubMed

    Berger, Steven H

    2007-02-01

    Technology and techniques that every hospital should acquire and use for effective financial management include: daily dashboards; balanced scorecards; benchmarking; flexible budgeting and monitoring; labor management systems; nonlabor management analysis; service-line, physician, and patient-level reporting and analysis; cost accounting technology; contract management technology; and denials management software.

  12. Analysis and Identification of Acid-Base Indicator Dyes by Thin-Layer Chromatography

    ERIC Educational Resources Information Center

    Clark, Daniel D.

    2007-01-01

    Thin-layer chromatography (TLC) is a very simple and effective technique that is used by chemists for different purposes, including monitoring the progress of a reaction. TLC can also easily be used for the analysis and identification of various acid-base indicator dyes.

  13. Optical analysis of crystal growth

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Passeur, Andrea; Harper, Sabrina

    1994-01-01

    Processing and data reduction of holographic images from Spacelab presents some interesting challenges in determining the effects of microgravity on crystal growth processes. Evaluation of several processing techniques, including the Computerized Holographic Image Processing System and the image processing software ITEX150, will provide fundamental information for holographic analysis of the space flight data.

  14. Drug Synthesis and Analysis on a Dime: A Capstone Medicinal Chemistry Experience for the Undergraduate Biochemistry Laboratory

    ERIC Educational Resources Information Center

    Streu, Craig N.; Reif, Randall D.; Neiles, Kelly Y.; Schech, Amanda J.; Mertz, Pamela S.

    2016-01-01

    Integrative, research-based experiences have shown tremendous potential as effective pedagogical approaches. Pharmaceutical development is an exciting field that draws heavily on organic chemistry and biochemistry techniques. A capstone drug synthesis/analysis laboratory is described where biochemistry students synthesize azo-stilbenoid compounds…

  15. Approaches to the Analysis of School Costs, an Introduction.

    ERIC Educational Resources Information Center

    Payzant, Thomas

    A review and general discussion of quantitative and qualitative techniques for the analysis of economic problems outside of education is presented to help educators discover new tools for planning, allocating, and evaluating educational resources. The pamphlet covers some major components of cost accounting, cost effectiveness, cost-benefit…

  16. Foresight begins with FMEA. Delivering accurate risk assessments.

    PubMed

    Passey, R D

    1999-03-01

    If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis (FMEA) represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.
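
    The usual single-stage FMEA bookkeeping, which the article builds on, ranks failure modes by a risk priority number, severity times occurrence times detection. A toy sketch with hypothetical ratings (the article advocates going beyond this with two- or three-stage analysis):

      # Minimal FMEA sketch: risk priority number = severity x occurrence x detection
      failure_modes = [
          # (description, severity 1-10, occurrence 1-10, detection 1-10)
          ("seal leak",         8, 4, 3),
          ("sensor drift",      5, 6, 7),
          ("connector fatigue", 7, 3, 4),
      ]
      ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
      for desc, s, o, d in ranked:
          print(f"{desc:18s} RPN = {s * o * d:4d}")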

  17. Probabilistic bias analysis in pharmacoepidemiology and comparative effectiveness research: a systematic review.

    PubMed

    Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-12-01

    We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence of, and concern about, bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
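
    A minimal sketch of the unmeasured-confounder case, assuming the simple external-adjustment formula for a binary confounder and illustrative prior distributions on the bias parameters (choosing those priors defensibly is exactly the reporting gap this review identifies):

      import numpy as np

      rng = np.random.default_rng(1)
      rr_obs = 1.50                  # observed exposure-outcome risk ratio (hypothetical)

      n_sim = 100_000
      # Priors on bias parameters (illustrative choices only):
      rr_cd = rng.lognormal(np.log(2.0), 0.2, n_sim)   # confounder-disease risk ratio
      p1 = rng.beta(40, 60, n_sim)                     # confounder prevalence, exposed
      p0 = rng.beta(20, 80, n_sim)                     # confounder prevalence, unexposed

      # Simple external-adjustment formula for an unmeasured binary confounder
      bias = (rr_cd * p1 + (1 - p1)) / (rr_cd * p0 + (1 - p0))
      rr_adj = rr_obs / bias
      lo, med, hi = np.percentile(rr_adj, [2.5, 50, 97.5])
      print(f"bias-adjusted RR: {med:.2f} (95% simulation interval {lo:.2f}-{hi:.2f})")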

  18. Survival analysis: A consumer-friendly method to estimate the optimum sucrose level in probiotic petit suisse.

    PubMed

    Esmerino, E A; Paixão, J A; Cruz, A G; Garitta, L; Hough, G; Bolini, H M A

    2015-11-01

    For years, just-about-right (JAR) scales have been among the most used techniques to obtain sensory information about consumer perception, but recently, some researchers have harshly criticized the technique. The present study aimed to apply survival analysis to estimate the optimum sucrose concentration in probiotic petit suisse cheese and compare the survival analysis to JAR scales to verify which technique more accurately predicted the optimum sucrose concentration according to consumer acceptability. Two panels of consumers (total=170) performed affective tests to determine the optimal concentration of sucrose in probiotic petit suisse using 2 different methods of analysis: JAR scales (n=85) and survival analysis (n=85). Then an acceptance test was conducted using naïve consumers (n=100) between 18 and 60 yr old, with 2 samples of petit suisse, one with the ideal sucrose determined by JAR scales and the other with the ideal sucrose content determined by survival analysis, to determine which formulation was in accordance with consumer acceptability. The results indicate that the 2 sensory methods were equally effective in predicting the optimum sucrose level in probiotic petit suisse cheese, and no significant differences were detected in any of the characteristics related to liking evaluated. However, survival analysis has important advantages over the JAR scales. Survival analysis has shown the potential to be an advantageous tool for dairy companies because it was able to accurately predict the optimum sucrose content in a consumer-friendly way and was also practical for researchers because experimental sensory work is simpler and has been shown to be more cost effective than JAR scales without losses of consumer acceptability. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  19. Advantages of Synthetic Noise and Machine Learning for Analyzing Radioecological Data Sets.

    PubMed

    Shuryak, Igor

    2017-01-01

    The ecological effects of accidental or malicious radioactive contamination are insufficiently understood because of the hazards and difficulties associated with conducting studies in radioactively-polluted areas. Data sets from severely contaminated locations can therefore be small. Moreover, many potentially important factors, such as soil concentrations of toxic chemicals, pH, and temperature, can be correlated with radiation levels and with each other. In such situations, commonly-used statistical techniques like generalized linear models (GLMs) may not be able to provide useful information about how radiation and/or these other variables affect the outcome (e.g., abundance of the studied organisms). Ensemble machine learning methods such as random forests offer powerful alternatives. We propose that analysis of small radioecological data sets by GLMs and/or machine learning can be made more informative by using the following techniques: (1) adding synthetic noise variables to provide benchmarks for distinguishing the performances of valuable predictors from irrelevant ones; (2) adding noise directly to the predictors and/or to the outcome to test the robustness of analysis results against random data fluctuations; (3) adding artificial effects to selected predictors to test the sensitivity of the analysis methods in detecting predictor effects; (4) running a selected machine learning method multiple times (with different random-number seeds) to test the robustness of the detected "signal"; (5) using several machine learning methods to test the "signal's" sensitivity to differences in analysis techniques. Here, we applied these approaches to simulated data, and to two published examples of small radioecological data sets: (I) counts of fungal taxa in samples of soil contaminated by the Chernobyl nuclear power plant accident (Ukraine), and (II) bacterial abundance in soil samples under a ruptured nuclear waste storage tank (USA). We show that the proposed techniques were advantageous compared with the methodology used in the original publications where the data sets were presented. Specifically, our approach identified a negative effect of radioactive contamination in data set I, and suggested that in data set II stable chromium could have been a stronger limiting factor for bacterial abundance than the radionuclides 137Cs and 99Tc. This new information, which was extracted from these data sets using the proposed techniques, can potentially enhance the design of radioactive waste bioremediation.
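
    Technique (1), the synthetic-noise benchmark, is easy to sketch: append randomly permuted copies of the real predictors and check that the real columns outrank their shuffled twins in the fitted model's importances. The simulated radiation/pH/abundance data below are assumptions for illustration, not the Chernobyl or storage-tank data sets.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor

      rng = np.random.default_rng(42)
      n = 120                                    # small data set, as in radioecology
      radiation = rng.lognormal(0.0, 1.0, n)
      ph = 0.3 * np.log(radiation) + rng.normal(0, 0.5, n)    # correlated covariate
      abundance = np.exp(-0.8 * np.log(radiation)) + rng.normal(0, 0.3, n)

      X = np.column_stack([radiation, ph])
      # Append shuffled copies of the real predictors as noise benchmarks
      noise = np.column_stack([rng.permutation(col) for col in X.T])
      X_aug = np.column_stack([X, noise])
      names = ["radiation", "pH", "noise_radiation", "noise_pH"]

      rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_aug, abundance)
      for name, imp in sorted(zip(names, rf.feature_importances_), key=lambda t: -t[1]):
          print(f"{name:16s} importance = {imp:.3f}")   # real columns should beat noise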

  1. System analysis in rotorcraft design: The past decade

    NASA Technical Reports Server (NTRS)

    Galloway, Thomas L.

    1988-01-01

    Rapid advances in the technology of electronic digital computers and the need for an integrated synthesis approach in developing future rotorcraft programs have led to increased emphasis on system analysis techniques in rotorcraft design. The task in systems analysis is to deal with complex, interdependent, and conflicting requirements in a structured manner so that rational and objective decisions can be made. Whether the results are wisdom or rubbish depends upon the validity and, sometimes more importantly, the consistency of the inputs, the correctness of the analysis, and a sensible choice of measures of effectiveness for drawing conclusions. In rotorcraft design this means combining design requirements, technology assessment, sensitivity analysis, and review techniques currently in use by NASA and Army organizations in developing research programs and vehicle specifications for rotorcraft. These procedures span simple graphical approaches to comprehensive analyses on large mainframe computers. Examples of recent applications to military and civil missions are highlighted.

  2. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments, and thereby addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions, allowing the comparison and integration of results from different diagnostics. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA's capabilities for nonlinear error propagation, the inclusion of systematic effects, and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. In order to cope with next-step fusion device requirements, appropriate techniques are being explored for fast analysis applications.
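
    The core IDA move, combining diagnostics as probability densities, reduces in the simplest case to multiplying Gaussian likelihoods, i.e., a precision-weighted posterior under a flat prior. A toy sketch with two hypothetical measurements of the same quantity; a large consistency z-score would flag a faulty measurement or an underestimated systematic effect.

      import numpy as np

      # Two diagnostics measuring the same quantity, with Gaussian likelihoods
      measurements = [(4.8, 0.40), (5.3, 0.25)]     # (value, 1-sigma uncertainty)

      w = np.array([1.0 / s ** 2 for _, s in measurements])   # precisions
      m = np.array([v for v, _ in measurements])
      post_mean = np.sum(w * m) / np.sum(w)         # precision-weighted mean
      post_sigma = np.sqrt(1.0 / np.sum(w))
      print(f"combined estimate: {post_mean:.2f} +/- {post_sigma:.2f}")

      # Mismatch between the likelihoods, relative to their combined sigma
      z = abs(m[0] - m[1]) / np.sqrt(sum(s ** 2 for _, s in measurements))
      print(f"consistency z-score: {z:.2f}")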

  3. Random-Effects Models for Meta-Analytic Structural Equation Modeling: Review, Issues, and Illustrations

    ERIC Educational Resources Information Center

    Cheung, Mike W.-L.; Cheung, Shu Fai

    2016-01-01

    Meta-analytic structural equation modeling (MASEM) combines the techniques of meta-analysis and structural equation modeling for the purpose of synthesizing correlation or covariance matrices and fitting structural equation models on the pooled correlation or covariance matrix. Both fixed-effects and random-effects models can be defined in MASEM.…

  4. Mixed strategies for energy conservation and alternative energy utilization (solar) in buildings. Final report. Volume II. Detailed results. [New York, Atlanta, Omaha, and Albuquerque

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1977-06-01

    The mixed-strategy analysis was a tradeoff analysis between energy-conservation methods and an alternative energy source (solar), considering technical and economic benefits. The objective of the analysis was to develop guidelines for reducing energy requirements, reducing conventional fuel use, and identifying economic alternatives for building owners. The analysis was done with a solar system in place. This makes the study unique in that it determines the interaction of energy conservation with a solar system. The study therefore established guidelines on how to minimize capital investment while reducing conventional fuel consumption through either a larger solar system or an energy-conserving technique. To focus the scope of energy-conservation techniques and alternative energy sources considered, five building types (houses, apartment buildings, commercial buildings, schools, and office buildings) were selected. The lists of energy-conservation techniques and alternative energy sources were then reduced to lists of manageable size by using technical attributes to select the best candidates for further study. The resulting energy-conservation techniques were described in detail and installed costs determined. The alternative energy source reduced to solar. Building construction characteristics were defined for each building for each of four geographic regions of the country. A mixed strategy consisting of an energy-conservation technique and a solar heating/hot water/cooling system was analyzed, using computer simulation to determine the interaction between energy conservation and the solar system. Finally, using FEA fuel-price scenarios and installed costs for the solar system and energy-conservation techniques, an economic analysis was performed to determine the cost effectiveness of the combination. (MCW)

  5. Assessing the Effectiveness of Public Research Universities: Using NSF/NCES Data and Data Envelopment Analysis Technique. AIR 2000 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Zheng, Henry Y.; Stewart, Alice A.

    This study explores data envelopment analysis (DEA) as a tool for assessing and benchmarking the performance of public research universities. Using national databases such as those conducted by the National Science Foundation and the National Center for Education Statistics, DEA analysis was performed on the research and instructional outcomes…

  6. Calculations for interpretation of solar vector magnetograph data. [sunspots - spectrum analysis/data correlation

    NASA Technical Reports Server (NTRS)

    Dunn, A. R.

    1975-01-01

    Computer techniques for data analysis of sunspot observations are presented. Photographic spectra were converted to digital form and analyzed. Methods of determining magnetic field strengths, i.e., the Zeeman effect, are discussed. Errors originating with telescope equipment and the magnetograph are treated. Flow charts of test programs and procedures of the data analysis are shown.

  7. Effects of synthesis techniques on chemical composition, microstructure and dielectric properties of Mg-doped calcium titanate

    NASA Astrophysics Data System (ADS)

    Jongprateep, Oratai; Sato, Nicha

    2018-04-01

    Calcium titanate (CaTiO3) has been recognized as a material for fabrication of dielectric components, owing to its moderate dielectric constant and excellent microwave response. Enhancement of dielectric properties of the material can be achieved through doping, compositional and microstructural control. This study, therefore, aimed at investigating effects of powder synthesis techniques on compositions, microstructure, and dielectric properties of Mg-doped CaTiO3. Solution combustion and solid-state reaction were powder synthesis techniques employed in preparation of undoped CaTiO3 and CaTiO3 doped with 5-20 at% Mg. Compositional analysis revealed that powder synthesis techniques did not exhibit a significant effect on formation of secondary phases. When Mg concentration did not exceed 5 at%, the powders prepared by both techniques contained only a single phase. An increase of MgO secondary phase was observed as Mg concentrations increased from 10 to 20 at%. Experimental results, on the contrary, revealed that powder synthesis techniques contributed to significant differences in microstructure. Solution combustion technique produced powders with finer particle sizes, which consequently led to finer grain sizes and density enhancement. High-density specimens with fine microstructure generally exhibit improved dielectric properties. Dielectric measurements revealed that dielectric constants of all samples ranged between 231 and 327 at 1 MHz, and that superior dielectric constants were observed in samples prepared by the solution combustion technique.

  8. Techniques for assessing the socio-economic effects of vehicle mileage fees.

    DOT National Transportation Integrated Search

    2008-06-01

    The purpose of this study was to develop tools for assessing the distributional effects of alternative highway user fees for light vehicles : in Oregon. The analysis focused on a change from the current gasoline tax to a VMT fee structure for collect...

  9. Estimating Interaction Effects With Incomplete Predictor Variables

    PubMed Central

    Enders, Craig K.; Baraldi, Amanda N.; Cham, Heining

    2014-01-01

    The existing missing data literature does not provide a clear prescription for estimating interaction effects with missing data, particularly when the interaction involves a pair of continuous variables. In this article, we describe maximum likelihood and multiple imputation procedures for this common analysis problem. We outline 3 latent variable model specifications for interaction analyses with missing data. These models apply procedures from the latent variable interaction literature to analyses with a single indicator per construct (e.g., a regression analysis with scale scores). We also discuss multiple imputation for interaction effects, emphasizing an approach that applies standard imputation procedures to the product of 2 raw score predictors. We thoroughly describe the process of probing interaction effects with maximum likelihood and multiple imputation. For both missing data handling techniques, we outline centering and transformation strategies that researchers can implement in popular software packages, and we use a series of real data analyses to illustrate these methods. Finally, we use computer simulations to evaluate the performance of the proposed techniques. PMID:24707955
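
    The product-term imputation strategy described above can be sketched with off-the-shelf tools: treat the raw product XZ as just another variable to impute, generate several imputed data sets, and pool the regression coefficients. The sketch below uses scikit-learn's IterativeImputer with posterior sampling and pools only the point estimates (a full analysis would also combine variances via Rubin's rules); the data are simulated.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(7)
      n = 500
      x = rng.normal(size=n)
      z = rng.normal(size=n)
      y = 0.4 * x + 0.3 * z + 0.5 * x * z + rng.normal(size=n)

      data = np.column_stack([x, z, x * z, y])   # impute the raw product itself
      mask = rng.random((n, 2)) < 0.2            # 20% MCAR missingness on x and z
      data[:, 0][mask[:, 0]] = np.nan
      data[:, 1][mask[:, 1]] = np.nan
      data[:, 2][mask.any(axis=1)] = np.nan      # product missing if either factor is

      coefs = []
      for m in range(20):                        # 20 imputed data sets
          imp = IterativeImputer(sample_posterior=True, random_state=m)
          filled = imp.fit_transform(data)
          coefs.append(LinearRegression().fit(filled[:, :3], filled[:, 3]).coef_)
      print("pooled [x, z, x*z] coefficients:", np.round(np.mean(coefs, axis=0), 3))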

  10. Evaluation of the matrix effect on gas chromatography--mass spectrometry with carrier gas containing ethylene glycol as an analyte protectant.

    PubMed

    Fujiyoshi, Tomoharu; Ikami, Takahito; Sato, Takashi; Kikukawa, Koji; Kobayashi, Masato; Ito, Hiroshi; Yamamoto, Atsushi

    2016-02-19

    The consequences of matrix effects in GC are a major issue of concern in pesticide residue analysis. The aim of this study was to evaluate the applicability of an analyte protectant generator in pesticide residue analysis using a GC-MS system. The technique is based on continuous introduction of ethylene glycol into the carrier gas. Ethylene glycol as an analyte protectant effectively compensated the matrix effects in agricultural product extracts. All peak intensities were increased by this technique without affecting the GC-MS performance. Calibration curves for ethylene glycol in the GC-MS system with various degrees of pollution were compared and similar response enhancements were observed. This result suggests a convenient multi-residue GC-MS method using an analyte protectant generator instead of the conventional compensation method for matrix-induced response enhancement adding the mixture of analyte protectants into both neat and sample solutions. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Identification and description of the momentum effect in studies of learning: An abstract science concept

    NASA Astrophysics Data System (ADS)

    Kwon, Jae-Sool; Mayer, Victor J.

    Several studies of the validity of the intensive time series design have revealed a post-intervention increase in the level of achievement data. This so-called momentum effect has not previously been demonstrated through the application of an appropriate analysis technique. The purpose of this study was to identify and apply a technique that would adequately represent and describe such an effect if indeed it does occur, and to use that technique to study the momentum effect as it is observed in several data sets on the learning of the concept of plate tectonics. After trials of several different analyses, a segmented straight-line regression analysis was chosen and used on three different data sets. Each set revealed similar patterns of inflection points between lines, with similar time intervals between inflections, for those data from students with formal cognitive tendencies. These results indicate that this method will indeed be useful in representing and identifying the presence and duration of the momentum effect in time series data on achievement. Since the momentum effect could be described in each of the data sets, and since its presence seems a function of similar circumstances, support is given for its presence in the learning of abstract scientific concepts by students with formal cognitive tendencies. The results indicate that the duration of the momentum effect is related to the level of student understanding tested and the cognitive level of the learners.
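
    A minimal version of segmented straight-line regression is an exhaustive breakpoint scan: fit a separate line to each side of every candidate split and keep the split with the lowest total squared error. The sketch below (simulated achievement-like data, continuity between segments not enforced) recovers a built-in slope change of the kind the momentum effect produces.

      import numpy as np

      def two_segment_fit(t, y):
          """Scan candidate breakpoints; fit a line to each segment and keep
          the breakpoint with the smallest total squared error."""
          best = (np.inf, None)
          for k in range(3, len(t) - 3):            # need >= 3 points per segment
              sse = 0.0
              for seg_t, seg_y in ((t[:k], y[:k]), (t[k:], y[k:])):
                  coef = np.polyfit(seg_t, seg_y, 1)
                  sse += np.sum((seg_y - np.polyval(coef, seg_t)) ** 2)
              if sse < best[0]:
                  best = (sse, k)
          return best[1]

      rng = np.random.default_rng(3)
      t = np.arange(40, dtype=float)                # e.g. sequential achievement scores
      y = np.where(t < 22, 0.2 * t, 0.2 * 22 + 1.1 * (t - 22)) + rng.normal(0, 0.8, 40)
      k = two_segment_fit(t, y)
      print(f"estimated inflection at t = {t[k]:.0f} (true change at t = 22)")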

  12. Analysis and synthesis of laughter

    NASA Astrophysics Data System (ADS)

    Sundaram, Shiva; Narayanan, Shrikanth

    2004-10-01

    There is much enthusiasm in the text-to-speech community for synthesis of emotional and natural speech. One idea being proposed is to include emotion-dependent paralinguistic cues during synthesis to convey emotions effectively. This requires modeling and synthesis techniques for the various cues of different emotions. Motivated by this, a technique to synthesize human laughter is proposed. Laughter is a complex mechanism of expression and has high variability in terms of types and usage in human-human communication. People have their own characteristic ways of laughing. Laughter can be seen as a controlled/uncontrolled physiological process of a person resulting from an initial excitation in context. A parametric model based on damped simple harmonic motion is developed here to effectively capture these diversities while maintaining individual characteristics. Limited laughter/speech data from actual humans and ease of synthesis are the constraints imposed on the accuracy of the model. Analysis techniques are also developed to determine the parameters of the model for a given individual or laughter type. Finally, the effectiveness of the model in capturing individual characteristics and naturalness, compared to real human laughter, has been analyzed. Through this, the factors involved in individual human laughter and their importance can be better understood.

  13. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  14. A study for high accuracy measurement of residual stress by deep hole drilling technique

    NASA Astrophysics Data System (ADS)

    Kitano, Houichi; Okano, Shigetaka; Mochizuki, Masahito

    2012-08-01

    The deep hole drilling (DHD) technique has received much attention in recent years as a method for measuring through-thickness residual stresses. However, some accuracy problems occur when residual stress evaluation is performed by the DHD technique. One reason is that the traditional DHD evaluation formula applies to the plane stress condition. The second is that the effects of the plastic deformation produced in the drilling process and the deformation produced in the trepanning process are ignored. In this study, a modified evaluation formula, which applies to the plane strain condition, is proposed. In addition, a new procedure is proposed that can account for the effects of the deformation produced in the DHD process, these effects having been investigated in detail by finite element (FE) analysis. The evaluation results obtained by the new procedure are then compared with those obtained by the traditional DHD procedure using FE analysis. As a result, the new procedure evaluates the residual stress fields better than the traditional DHD procedure when the measured object is thick enough that the stress condition can be assumed to be the plane strain condition, as in the model used in this study.

  15. Evaluation of SLAR and thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1980-01-01

    The column normalizing technique was used to adjust the data for variations in signal amplitude due to look angle effects with respect to the solar zenith angle along the scan lines (i.e., across columns). Evaluation of the data set containing the geometric and radiometric adjustments indicates that it should be satisfactory for further processing and analysis. Software was developed for degrading the spatial resolution of the aircraft data to produce a total of four data sets for further analysis. The quality of the LANDSAT 2 CCT data for the test site is good for channels four, five, and six; channel seven was not present on the tape. The data received were reformatted, and analysis of the test site area was initiated.
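
    A minimal sketch of the column-normalizing idea, assuming one gain factor per scan column that equalizes each column mean to the global band mean; the report's exact adjustment may differ in detail.

      import numpy as np

      def column_normalize(band):
          """band: 2D array (scan lines x columns) for one spectral channel."""
          col_means = band.mean(axis=0)
          gain = band.mean() / np.where(col_means == 0, 1, col_means)
          return band * gain  # one multiplicative gain per column removes look-angle banding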

  16. Evaluation of different screw fixation techniques and screw diameters in sagittal split ramus osteotomy: finite element analysis method.

    PubMed

    Sindel, A; Demiralp, S; Colok, G

    2014-09-01

    Sagittal split ramus osteotomy (SSRO) is used for correction of numerous congenital or acquired deformities in the facial region. Several techniques have been developed and used to maintain fixation and stabilisation following SSRO. In this study, the effects of the insertion configurations of bicortical screws of different sizes on the stresses generated by applied forces were studied. Three-dimensional finite element analysis (FEA) and static linear analysis methods were used to investigate the differences, between different application areas, in the forces acting on the screws and transmitted to the bone. No significant difference was found between 1.5- and 2-mm screws used in SSRO fixation. It was also found that the 'inverted L' configuration was the most successful, followed by the 'L' and 'linear' configurations, which showed similar results to each other. Few studies have investigated the effect of the thickness and application areas of bicortical screws. This study was performed on both advanced and regressed jaw positions. © 2014 John Wiley & Sons Ltd.

  17. Innovative acoustic techniques for studying new materials and new developments in solid state physics

    NASA Astrophysics Data System (ADS)

    Maynard, Julian D.

    1994-06-01

    The goals of this project involve the use of innovative acoustic techniques to study new materials and new developments in solid state physics. Major accomplishments include (a) the preparation and publication of a number of papers and book chapters, (b) the measurement and new analysis of more samples of aluminum quasicrystal and its cubic approximant to eliminate the possibility of sample artifacts, (c) the use of resonant ultrasound to measure acoustic attenuation and determine the effects of heat treatment on ceramics, (d) the extension of our technique for measuring even lower (possibly the lowest) infrared optical absorption coefficients, (e) the measurement of the effects of disorder on the propagation of a nonlinear pulse, and (f) the observation of statistical effects in measurements of individual bond-breaking events in fracture.

  18. Metabolomics study on the hepatoprotective effect of scoparone using ultra-performance liquid chromatography/electrospray ionization quadruple time-of-flight mass spectrometry.

    PubMed

    Zhang, Aihua; Sun, Hui; Dou, Shengshan; Sun, Wenjun; Wu, Xiuhong; Wang, Ping; Wang, Xijun

    2013-01-07

    Scoparone is an important constituent of Yinchenhao (Artemisia annua L.), a famous medicinal plant, and shows promise for the prevention and treatment of liver injury. However, the precise molecular mechanism of its hepatoprotective effects has not been comprehensively explored. Metabolomics techniques comprehensively assess the endogenous metabolites of a biological system and may provide additional insight into such mechanisms. The present investigation was designed to assess the effects and possible mechanisms of scoparone against carbon tetrachloride-induced liver injury. Ultra-performance liquid chromatography/electrospray ionization quadruple time-of-flight mass spectrometry (UPLC/ESI-Q-TOF/MS) combined with pattern recognition approaches, including principal component analysis (PCA) and partial least squares-discriminant analysis (PLS-DA), was used to discover differentiating metabolites. The results indicate five ions in the positive mode as differentiating metabolites. Functional pathway analysis revealed that the alterations in these metabolites were associated with primary bile acid biosynthesis and pyrimidine metabolism. Of note, scoparone appears to exert its pharmacological effect by regulating multiple perturbed pathways back toward the normal state. Our findings also show that robust metabolomics techniques are promising for identifying biomarkers and clarifying mechanisms of disease, offering insights for drug discovery.
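
    A minimal sketch of the unsupervised pattern-recognition step, using PCA scores to look for separation between control and dosed samples; the matrix shape and group labels are invented for illustration, and PLS-DA would follow analogously using the labels.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA

      X = np.random.rand(20, 500)                 # 20 samples x 500 ion-intensity features (placeholder)
      groups = ["control"] * 10 + ["scoparone"] * 10

      scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
      # Group separation along PC1/PC2 points to differentiating ions via the loadings.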

  19. Focus characterization at an X-ray free-electron laser by coherent scattering and speckle analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sikorski, Marcin; Song, Sanghoon; Schropp, Andreas

    2015-04-14

    X-ray focus optimization and characterization based on coherent scattering and quantitative speckle size measurements was demonstrated at the Linac Coherent Light Source. Its performance as a single-pulse free-electron laser beam diagnostic was tested for two typical focusing configurations. The results derived from the speckle size/shape analysis show the effectiveness of this technique in finding the location, size, and shape of the focus. In addition, its single-pulse compatibility enables users to capture pulse-to-pulse fluctuations in focus properties, unlike other techniques that require scanning and averaging.
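
    A minimal sketch of the speckle-size estimate that such diagnostics rest on: the width of the intensity autocorrelation of a single scattering pattern, which scales inversely with the focal spot size. The FWHM estimator below is an illustrative simplification.

      import numpy as np

      def speckle_size_px(pattern):
          """FWHM (in pixels) of the central autocorrelation peak of a speckle image."""
          f = np.fft.fft2(pattern - pattern.mean())
          acf = np.fft.fftshift(np.fft.ifft2(np.abs(f) ** 2).real)  # Wiener-Khinchin
          row = acf[acf.shape[0] // 2]
          row = row / row.max()
          above = np.where(row >= 0.5)[0]        # samples inside the half-maximum contour
          return above[-1] - above[0]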

  20. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

    Early detection and treatment of lung cancer is one of the most effective means to reduce cancer mortality, and chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system allow respiratory kinetics to be obtained with a flat panel detector (FPD), an extension of chest X-ray radiography. Functional evaluation of respiratory kinetics in the chest has thereby become available, and its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for detecting lung nodules and evaluating quantitative kinetics. Breathing chest radiographs obtained with a modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after a breath synchronization process utilizing diaphragmatic analysis of the vector movement. An artificial neural network used to analyze the density patterns detected the true nodules in these static images and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, the method showed sufficient detection capability and kinetic imaging function, with no statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting a nodule on a breathing chest radiograph. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.

  1. Simulating muscular thin films using thermal contraction capabilities in finite element analysis tools.

    PubMed

    Webster, Victoria A; Nieto, Santiago G; Grosberg, Anna; Akkus, Ozan; Chiel, Hillel J; Quinn, Roger D

    2016-10-01

    In this study, new techniques for approximating the contractile properties of cells in biohybrid devices using Finite Element Analysis (FEA) have been investigated. Many current techniques for modeling biohybrid devices use individual cell forces to simulate the cellular contraction. However, such techniques result in long simulation runtimes. In this study we investigated the effect of the use of thermal contraction on simulation runtime. The thermal contraction model was significantly faster than models using individual cell forces, making it beneficial for rapidly designing or optimizing devices. Three techniques, Stoney's Approximation, a Modified Stoney's Approximation, and a Thermostat Model, were explored for calibrating thermal expansion/contraction parameters (TECPs) needed to simulate cellular contraction using thermal contraction. The TECP values were calibrated by using published data on the deflections of muscular thin films (MTFs). Using these techniques, TECP values that suitably approximate experimental deflections can be determined by using experimental data obtained from cardiomyocyte MTFs. Furthermore, a sensitivity analysis was performed in order to investigate the contribution of individual variables, such as elastic modulus and layer thickness, to the final calibrated TECP for each calibration technique. Additionally, the TECP values are applicable to other types of biohybrid devices. Two non-MTF models were simulated based on devices reported in the existing literature. Copyright © 2016 Elsevier Ltd. All rights reserved.
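
    A minimal sketch of one plausible TECP calibration path, assuming Stoney's formula converts a measured MTF curvature into an equivalent film stress and a biaxial thermal-stress relation is then inverted for the contraction coefficient; all material values are placeholders, and the paper's three calibration techniques differ in detail.

      def stoney_film_stress(E_s, nu_s, t_s, t_f, kappa):
          """Stoney's formula: film stress from substrate curvature kappa (1/m)."""
          return E_s * t_s**2 * kappa / (6.0 * t_f * (1.0 - nu_s))

      def tecp_from_stress(sigma_f, E_f, nu_f, delta_T=-1.0):
          """Invert biaxial thermal stress sigma = E*alpha*dT/(1-nu) for alpha."""
          return sigma_f * (1.0 - nu_f) / (E_f * delta_T)

      sigma = stoney_film_stress(E_s=1.5e6, nu_s=0.49, t_s=20e-6, t_f=5e-6, kappa=50.0)
      alpha = tecp_from_stress(sigma, E_f=1.0e5, nu_f=0.49)  # TECP for a unit temperature drop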

  2. The role of chemometrics in single and sequential extraction assays: a review. Part II. Cluster analysis, multiple linear regression, mixture resolution, experimental design and other techniques.

    PubMed

    Giacomino, Agnese; Abollino, Ornella; Malandrino, Mery; Mentasti, Edoardo

    2011-03-04

    Single and sequential extraction procedures are used for studying element mobility and availability in solid matrices, such as soils, sediments, sludge, and airborne particulate matter. In the first part of this review we gave an overview of these procedures and described the applications of chemometric uni- and bivariate techniques, and of multivariate pattern recognition techniques based on variable reduction, to the experimental results obtained. The second part of the review deals with the use of chemometrics not only for the visualization and interpretation of data, but also for the investigation of the effects of experimental conditions on the response, the optimization of their values, and the calculation of element fractionation. We describe the principles of the multivariate chemometric techniques considered, the aims for which they were applied, and the key findings obtained. The following topics are critically addressed: pattern recognition by cluster analysis (CA), linear discriminant analysis (LDA), and other less common techniques; modelling by multiple linear regression (MLR); investigation of the spatial distribution of variables by geostatistics; calculation of fractionation patterns by a mixture resolution method (Chemometric Identification of Substrates and Element Distributions, CISED); optimization and characterization of extraction procedures by experimental design; and other multivariate techniques less commonly applied. Copyright © 2010 Elsevier B.V. All rights reserved.

  3. The Scientific Status of Projective Techniques.

    PubMed

    Lilienfeld, S O; Wood, J M; Garb, H N

    2000-11-01

    Although projective techniques continue to be widely used in clinical and forensic settings, their scientific status remains highly controversial. In this monograph, we review the current state of the literature concerning the psychometric properties (norms, reliability, validity, incremental validity, treatment utility) of three major projective instruments: Rorschach Inkblot Test, Thematic Apperception Test (TAT), and human figure drawings. We conclude that there is empirical support for the validity of a small number of indexes derived from the Rorschach and TAT. However, the substantial majority of Rorschach and TAT indexes are not empirically supported. The validity evidence for human figure drawings is even more limited. With a few exceptions, projective indexes have not consistently demonstrated incremental validity above and beyond other psychometric data. In addition, we summarize the results of a new meta-analysis intended to examine the capacity of these three instruments to detect child sexual abuse. Although some projective instruments were better than chance at detecting child sexual abuse, there were virtually no replicated findings across independent investigative teams. This meta-analysis also provides the first clear evidence of substantial file drawer effects in the projectives literature, as the effect sizes from published studies markedly exceeded those from unpublished studies. We conclude with recommendations regarding the (a) construction of projective techniques with adequate validity, (b) forensic and clinical use of projective techniques, and (c) education and training of future psychologists regarding projective techniques. © 2000 Association for Psychological Science.

  4. Effect of scene illumination conditions on digital enhancement techniques of multispectral scanner LANDSAT images

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J.; Novo, E. M. L. M.

    1983-01-01

    Two sets of MSS/LANDSAT data with solar elevation ranging from 22 deg to 41 deg were used on the Image-100 System to implement the Eliason et al. technique for extracting the topographic modulation component. An unsupervised cluster analysis was used to obtain an average brightness image for each channel. Analysis of the enhanced images shows that the technique for extracting the topographic modulation component is more appropriate for MSS data obtained under high sun elevation angles. Low sun elevation increases the variance of each cluster, so that the average brightness does not represent its albedo properties. The topographic modulation component applied to low sun elevation angles degrades rather than enhances topographic information. Better results were produced for channels 4 and 5 than for channels 6 and 7.

  5. Nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel and stainless steel alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moore, D.G.; Sorensen, N.R.

    1998-02-01

    This report presents a nondestructive inspection assessment of eddy current and electrochemical analysis to separate inconel alloys from stainless steel alloys, as well as an evaluation of cleaning techniques to remove a thermal oxide layer on aircraft exhaust components. The results of this assessment are presented in terms of how effectively each technique classifies a known exhaust material. Results indicate that either inspection technique can separate inconel and stainless steel alloys. Based on the experiments conducted, the electrochemical spot test is the optimum for use by airframe and powerplant mechanics. A spot test procedure is proposed for incorporation into the Federal Aviation Administration Advisory Circular 65-9A Airframe & Powerplant Mechanic - General Handbook. 3 refs., 70 figs., 7 tabs.

  6. Analysis of Variance in Statistical Image Processing

    NASA Astrophysics Data System (ADS)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
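
    As a toy illustration of the book's theme, the sketch below applies a one-way ANOVA F test to the pixel populations on either side of a candidate edge; the image and window split are invented, and the book's detectors are considerably more elaborate.

      import numpy as np
      from scipy.stats import f_oneway

      rng = np.random.default_rng(0)
      img = rng.normal(0.5, 0.1, (64, 64))    # placeholder noisy image
      img[:, 32:] += 0.2                      # synthetic step edge down the middle
      F, p = f_oneway(img[:, :32].ravel(), img[:, 32:].ravel())
      # A large F (small p) indicates the two regions differ in mean: edge evidence.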

  7. Improved Technique for Finding Vibration Parameters

    NASA Technical Reports Server (NTRS)

    Andrew, L. V.; Park, C. C.

    1986-01-01

    Filtering and sample manipulation reduce noise effects. Analysis technique improves extraction of vibrational frequencies and damping rates from measurements of vibrations of complicated structure. Structural vibrations measured by accelerometers. Outputs digitized at frequency high enough to cover all modes of interest. Use of method on set of vibrational measurements from Space Shuttle raised level of coherence from previous values below 50 percent to values between 90 and 99 percent.

  8. Splash evaluation of SRB designs

    NASA Technical Reports Server (NTRS)

    Counter, D. N.

    1974-01-01

    A technique is developed to optimize the shuttle solid rocket booster (SRB) design for water impact loads. The SRB is dropped by parachute and recovered at sea for reuse. Loads experienced at water impact are design critical. The probability of each water impact load is determined using a Monte Carlo technique and an aerodynamic analysis of the SRB parachute system. Meteorological effects are included and four configurations are evaluated.
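
    A minimal sketch of the Monte Carlo step, assuming simple placeholder distributions for the impact conditions and an invented load model; the actual aerodynamic analysis of the SRB/parachute system is far more detailed.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000
      v_vert = rng.normal(25.0, 2.0, n)       # vertical impact velocity, m/s (placeholder)
      v_horiz = rng.rayleigh(5.0, n)          # wind-driven horizontal velocity, m/s (placeholder)
      angle = rng.normal(0.0, 5.0, n)         # impact angle, degrees (placeholder)

      load = 0.5 * (v_vert**2 + 0.3 * v_horiz**2) * (1 + 0.02 * np.abs(angle))  # invented load model
      design_load = np.quantile(load, 0.997)  # load with ~0.3% exceedance probability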

  9. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  10. Evaluation of publicly available documents to trace chiropractic technique systems that advocate radiography for subluxation analysis: a proposed genealogy.

    PubMed

    Young, Kenneth J

    2014-12-01

    The purpose of this study was to evaluate publicly available information of chiropractic technique systems that advocate radiography for subluxation detection to identify links between chiropractic technique systems and to describe claims made of the health effects of the osseous misalignment component of the chiropractic subluxation and radiographic paradigms. The Internet and publicly available documents were searched for information representing chiropractic technique systems that advocate radiography for subluxation detection. Key phrases including chiropractic, x-ray, radiography, and technique were identified from a Google search between April 2013 and March 2014. Phrases in Web sites and public documents were examined for any information about origins and potential links between these techniques, including the type of connection to B.J. Palmer, who was the first chiropractor to advocate radiography for subluxation detection. Quotes were gathered to identify claims of health effects from osseous misalignment (subluxation) and paradigms of radiography. Techniques were grouped by region of the spine and how they could be traced back to B.J. Palmer. A genealogy model and summary table of information on each technique were created. Patterns in year of origination and radiographic paradigms were noted, and percentages were calculated on elements of the techniques' characteristics in comparison to the entire group. Twenty-three techniques were identified on the Internet: 6 full spine, 17 upper cervical, and 2 techniques generating other lineage. Most of the upper cervical techniques (14/16) traced their origins to a time when the Palmer School was teaching upper cervical technique, and all the full spine techniques (6/6) originated before or after this phase. All the technique systems' documents attributed broad health effects to their methods. Many (21/23) of the techniques used spinal realignment on radiographs as one of their outcome measures. Chiropractic technique systems in this study (i.e., those that advocate radiography for subluxation misalignment detection) seem to be closely related by descent, their claims of a variety of health effects associated with chiropractic subluxation, and their radiographic paradigms.

  11. Analysis of flexible aircraft longitudinal dynamics and handling qualities. Volume 2: Data

    NASA Technical Reports Server (NTRS)

    Waszak, M. R.; Schmidt, D. K.

    1985-01-01

    Two analysis methods are applied to a family of flexible aircraft in order to investigate how and when structural (especially dynamic aeroelastic) effects affect the dynamic characteristics of aircraft. The first is an open loop modal analysis technique, which considers the effect of modal residue magnitudes on determining vehicle handling qualities. The second is a pilot-in-the-loop analysis procedure that considers several closed loop system characteristics. Both analyses indicated that dynamic aeroelastic effects caused a degradation in vehicle tracking performance, based on the evaluation of some simulation results. Volume 2 presents the state variable models of the flexible aircraft configurations used in the analysis applications, mode shape plots for the structural modes, numerical results from the modal analysis, frequency response plots from the pilot-in-the-loop analysis, and a listing of the modal analysis computer program.

  12. BAYESIAN SEMI-BLIND COMPONENT SEPARATION FOR FOREGROUND REMOVAL IN INTERFEROMETRIC 21 cm OBSERVATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Timbie, Peter T.; Bunn, Emory F.

    In this paper, we present a new Bayesian semi-blind approach for foreground removal in observations of the 21 cm signal measured by interferometers. The technique, which we call H I Expectation-Maximization Independent Component Analysis (HIEMICA), is an extension of the Independent Component Analysis technique developed for two-dimensional (2D) cosmic microwave background maps to three-dimensional (3D) 21 cm cosmological signals measured by interferometers. This technique provides a fully Bayesian inference of power spectra and maps and separates the foregrounds from the signal based on the diversity of their power spectra. Relying only on the statistical independence of the components, this approach can jointly estimate the 3D power spectrum of the 21 cm signal, as well as the 2D angular power spectrum and the frequency dependence of each foreground component, without any prior assumptions about the foregrounds. This approach has been tested extensively by applying it to mock data from interferometric 21 cm intensity mapping observations under idealized assumptions of instrumental effects. We also discuss the impact when the noise properties are not known completely. As a first step toward solving the 21 cm power spectrum analysis problem, we compare the semi-blind HIEMICA technique to the commonly used Principal Component Analysis. Under the same idealized circumstances, the proposed technique provides significantly improved recovery of the power spectrum. This technique can be applied in a straightforward manner to all 21 cm interferometric observations, including epoch of reionization measurements, and can be extended to single-dish observations as well.
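
    For contrast, a minimal sketch of the PCA baseline the paper compares against: remove the few spectrally dominant frequency modes from a frequency-by-pixel data cube and keep the residual as the 21 cm estimate. The mode count is a tuning choice, not a value from the paper.

      import numpy as np

      def pca_foreground_removal(cube, n_modes=3):
          """cube: (n_freq, n_pix). Project out the n_modes leading frequency modes."""
          X = cube - cube.mean(axis=1, keepdims=True)
          cov = X @ X.T / X.shape[1]             # frequency-frequency covariance
          _, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
          fg = vecs[:, -n_modes:]                # largest-eigenvalue (foreground) modes
          return X - fg @ (fg.T @ X)             # residual approximates the 21 cm signal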

  13. Failure Modes and Effects Analysis (FMEA): A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, which helps managers understand vulnerabilities of systems, as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.

  14. Are Effective Counselors Made or Born? A Critical Review.

    ERIC Educational Resources Information Center

    DeCsipkes, Robert A.; And Others

    The purpose of this review was to investigate the relationship between counselor characteristics and reports of effectiveness. The theoretical position appears to focus on two opposing views. The humanists emphasize the influence of intuition, genuineness, and spontaneity, while the behaviorists place importance on technique, analysis of…

  15. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

    Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to the currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample pre-preparation, fast and easy operation, and chemical-free processing. It is therefore of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis of liquid samples, covering LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting the accuracy of analysis results. Although many studies have focused on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.

  16. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    PubMed

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been extensively used in economic assessments in fields related to animal health, notably in human health, where it provides a decision-making framework for choices about the allocation of healthcare resources. Conversely, in animal health, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare, and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation stages to assess alternative courses of action in real time.
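
    The core CEA quantity is the incremental cost-effectiveness ratio; a minimal sketch with invented numbers, where effects are counted in natural units such as cases averted rather than monetary terms:

      def icer(cost_new, effect_new, cost_old, effect_old):
          """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
          return (cost_new - cost_old) / (effect_new - effect_old)

      # Hypothetical programmes: 40,000 of extra spend buys 130 extra cases averted.
      cost_per_case_averted = icer(120_000.0, 340.0, 80_000.0, 210.0)  # ~307.7 per case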

  17. Investigation of advanced phase-shifting projected fringe profilometry techniques

    NASA Astrophysics Data System (ADS)

    Liu, Hongyu

    1999-11-01

    The phase-shifting projected fringe profilometry (PSPFP) technique is a powerful tool in the profile measurement of rough engineering surfaces. Compared with other competing techniques, it is notable for its full-field measurement capacity, system simplicity, high measurement speed, and low environmental vulnerability. The main purpose of this dissertation is to tackle, with some new approaches, three important problems that severely limit the capability and accuracy of the PSPFP technique. Chapter 1 briefly introduces background information on the PSPFP technique, including the measurement principles, basic features, and related techniques; the objectives and organization of the thesis are also outlined. Chapter 2 gives a theoretical treatment of absolute PSPFP measurement. The mathematical formulations and basic requirements of the absolute PSPFP measurement and its supporting techniques are discussed in detail. Chapter 3 introduces the experimental verification of the proposed absolute PSPFP technique. Some design details of a prototype system are discussed as supplements to the preceding theoretical analysis, and various fundamental experiments performed for concept verification and accuracy evaluation are introduced together with some brief comments. Chapter 4 presents a theoretical study of speckle-induced phase measurement errors. In this analysis, the expression for speckle-induced phase errors is first derived based on the multiplicative noise model of image-plane speckles. The statistics and system dependence of speckle-induced phase errors are then thoroughly studied through numerical simulations and analytical derivations. Based on the analysis, some suggestions on system design are given to improve measurement accuracy. Chapter 5 discusses a new technique for combating surface reflectivity variations. The formula used for error compensation is first derived based on a simplified model of the detection process. The techniques coping with two major effects of surface reflectivity variations are then introduced, and some fundamental problems in the proposed technique are studied through simulations. Chapter 6 briefly summarizes the major contributions of the current work and provides some suggestions for future research.
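
    A minimal sketch of the classical four-step phase-shifting calculation underlying PSPFP, assuming four fringe images captured with 0, pi/2, pi, and 3pi/2 phase shifts; phase unwrapping and phase-to-height calibration are separate steps not shown.

      import numpy as np

      def four_step_phase(I1, I2, I3, I4):
          """Wrapped phase from frames I_k = A + B*cos(phi + k*pi/2), k = 0..3."""
          return np.arctan2(I4 - I2, I1 - I3)   # (I4-I2) = 2B sin(phi), (I1-I3) = 2B cos(phi)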

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schaeffel, J.A.; Mullinix, B.R.; Ranson, W.F.

    An experimental technique to simulate and evaluate the effects of high concentrations of x-rays resulting from a nuclear detonation on missile structures is presented. Data from 34 tests are included to demonstrate the technique. The effects of variations in the foil thickness, capacitor voltage, and plate thickness on the total impulse and maximum strain in the structure were determined. The experimental technique utilizes a high energy capacitor discharge unit to explode an aluminum foil on the surface of the structure. The structural response is evaluated by optical methods using the grid slope deflection method. The fringe patterns were recorded using a high-speed framing camera. The data were digitized using an optical comparator with an x-y table. The analysis was performed on a CDC 6600 computer.

  19. Nanostructured surfaces for analysis of anticancer drug and cell diagnosis based on electrochemical and SERS tools.

    PubMed

    El-Said, Waleed A; Yoon, Jinho; Choi, Jeong-Woo

    2018-01-01

    Discovering new anticancer drugs and screening their efficacy requires huge amounts of resources and time-consuming processes. The development of fast, sensitive, and nondestructive methods for the in vitro and in vivo detection of anticancer drugs' effects and action mechanisms has therefore been pursued to reduce the time and resources required to discover new anticancer drugs. For the in vitro and in vivo detection of the efficiency, distribution, and action mechanism of anticancer drugs, applications of electrochemical techniques such as electrochemical cell chips and optical techniques such as surface-enhanced Raman spectroscopy (SERS) have been developed based on nanostructured surfaces. Research focused on electrochemical cell chips and the SERS technique is reviewed here. Electrochemical cell chips based on nanostructured surfaces have been developed for the in vitro detection of cell viability and the evaluation of the effects of anticancer drugs, and have shown a high capability to evaluate the cytotoxic effects of several chemicals at low concentrations. The SERS technique based on nanostructured surfaces has been used as a label-free, simple, and nondestructive technique for the in vitro and in vivo monitoring of the distribution, mechanism, and metabolism of different anticancer drugs at the cellular level. Electrochemical cell chips and the SERS technique based on nanostructured surfaces should thus be good tools to detect the effects and action mechanisms of anticancer drugs.

  20. Nanostructured surfaces for analysis of anticancer drug and cell diagnosis based on electrochemical and SERS tools

    NASA Astrophysics Data System (ADS)

    El-Said, Waleed A.; Yoon, Jinho; Choi, Jeong-Woo

    2018-04-01

    Discovering new anticancer drugs and screening their efficacy requires huge amounts of resources and time-consuming processes. The development of fast, sensitive, and nondestructive methods for the in vitro and in vivo detection of anticancer drugs' effects and action mechanisms has therefore been pursued to reduce the time and resources required to discover new anticancer drugs. For the in vitro and in vivo detection of the efficiency, distribution, and action mechanism of anticancer drugs, applications of electrochemical techniques such as electrochemical cell chips and optical techniques such as surface-enhanced Raman spectroscopy (SERS) have been developed based on nanostructured surfaces. Research focused on electrochemical cell chips and the SERS technique is reviewed here. Electrochemical cell chips based on nanostructured surfaces have been developed for the in vitro detection of cell viability and the evaluation of the effects of anticancer drugs, and have shown a high capability to evaluate the cytotoxic effects of several chemicals at low concentrations. The SERS technique based on nanostructured surfaces has been used as a label-free, simple, and nondestructive technique for the in vitro and in vivo monitoring of the distribution, mechanism, and metabolism of different anticancer drugs at the cellular level. Electrochemical cell chips and the SERS technique based on nanostructured surfaces should thus be good tools to detect the effects and action mechanisms of anticancer drugs.

  1. Fenestrated and Chimney Technique for Juxtarenal Aortic Aneurysm: A Systematic Review and Pooled Data Analysis

    PubMed Central

    Li, Yue; Hu, Zhongzhou; Bai, Chujie; Liu, Jie; Zhang, Tao; Ge, Yangyang; Luan, Shaoliang; Guo, Wei

    2016-01-01

    Juxtarenal aortic aneurysms (JAA) account for approximately 15% of abdominal aortic aneurysms. Fenestrated endovascular aneurysm repair (FEVAR) and chimney endovascular aneurysm repair (CH-EVAR) are both effective methods to treat JAAs, but the comparative effectiveness of these treatment modalities is unclear. We searched the PubMed, Medline, Embase, and Cochrane databases to identify English language articles published between January 2005 and September 2013 on the management of JAA with fenestrated and chimney techniques, in order to conduct a systematic review comparing outcomes of JAA patients treated with the two techniques. We compared nine FEVAR cohort studies including 542 JAA patients and 8 CH-EVAR cohorts with 158 JAA patients regarding technique success rates, 30-day mortality, late mortality, endoleak events, and secondary intervention rates. The results of this systematic review indicate that both fenestrated and chimney techniques are attractive options for JAA treatment, with encouraging early and mid-term outcomes. PMID:26869488

  2. Yield enhancement with DFM

    NASA Astrophysics Data System (ADS)

    Paek, Seung Weon; Kang, Jae Hyun; Ha, Naya; Kim, Byung-Moo; Jang, Dae-Hyun; Jeon, Junsu; Kim, DaeWook; Chung, Kun Young; Yu, Sung-eun; Park, Joo Hyun; Bae, SangMin; Song, DongSup; Noh, WooYoung; Kim, YoungDuck; Song, HyunSeok; Choi, HungBok; Kim, Kee Sup; Choi, Kyu-Myung; Choi, Woonhyuk; Jeon, JoongWon; Lee, JinWoo; Kim, Ki-Su; Park, SeongHo; Chung, No-Young; Lee, KangDuck; Hong, YoungKi; Kim, BongSeok

    2012-03-01

    A set of design for manufacturing (DFM) techniques have been developed and applied to 45nm, 32nm and 28nm logic process technologies. A novel methodology combined a number of potentially conflicting DFM techniques into a comprehensive solution. These techniques work in three phases for design optimization and one phase for silicon diagnostics. In the DFM prevention phase, foundation IP such as standard cells, IO, and memory, and the P&R tech file are optimized. In the DFM solution phase, which happens during the ECO step, automatic fixing of process-weak patterns and advanced RC extraction are performed. In the DFM polishing phase, post-layout tuning is done to improve manufacturability. DFM analysis enables prioritization of random and systematic failures. The DFM technique presented in this paper has been silicon-proven with three successful tape-outs in Samsung 32nm processes; about 5% improvement in yield was achieved without any notable side effects. Visual inspection of silicon also confirmed the positive effect of the DFM techniques.

  3. Bioremediation techniques applied to aqueous media contaminated with mercury.

    PubMed

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.

  4. Some aspects of optical feedback with cadmium sulfide and related photoconductors. [for extended frequency response

    NASA Technical Reports Server (NTRS)

    Katzberg, S. J.

    1974-01-01

    A primary limitation of many solid state photoconductors used in electro-optical systems is their slow response in converting varying light intensities into electrical signals. An optical feedback technique is presented which can extend the frequency response of systems that use these detectors by orders of magnitude without adversely affecting overall signal-to-noise ratio performance. The technique is analyzed to predict the improvement possible and a system is implemented using cadmium sulfide to demonstrate the effectiveness of the technique and the validity of the analysis.

  5. Revealing the beneficial effect of protease supplementation to high gravity beer fermentations using "-omics" techniques

    PubMed Central

    2011-01-01

    Background Addition of sugar syrups to the basic wort is a popular technique to achieve higher gravity in beer fermentations, but it results in dilution of the free amino nitrogen (FAN) content in the medium. The multicomponent protease enzyme Flavourzyme has a beneficial effect on the brewer's yeast fermentation performance during high gravity fermentations, as it increases the initial FAN value and results in higher FAN uptake, higher specific growth rate, higher ethanol yield, and an improved flavour profile. Results In the present study, transcriptome and metabolome analyses were used to elucidate the effect of the addition of the multicomponent protease enzyme Flavourzyme and its influence on the metabolism of the brewer's yeast strain Weihenstephan 34/70. The study underlines the importance of sufficient nitrogen availability during the course of beer fermentation. The applied metabolome and transcriptome analyses allowed the effect of the wort sugar composition on nitrogen uptake to be mapped. Conclusion Both the transcriptome and the metabolome analyses revealed a significantly higher impact of protease addition for maltose syrup supplemented fermentations, while addition of glucose syrup to increase the gravity of the wort resulted in increased glucose repression that led to inhibition of amino acid uptake and thereby inhibited the effect of the protease addition. PMID:21513553

  6. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention properties of the human visual system. Because of spatial non-uniformity, different locations in an image are of different importance for perception; in other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROIs). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM, based on image structural information; VIF, based on the information that the human brain can ideally gain from the reference image; and FSIM, which utilizes low-level features to assign different importance to each location in the image. Still, none of these objective metrics utilizes an analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROIs were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROIs in fine quality while the rest of the image is reconstructed in low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
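
    A minimal sketch of the direction the paper argues for: weight a per-pixel distortion map by an ROI mask so that errors inside the regions of interest dominate the score. The weighting scheme below is an assumption for illustration, not one of the cited metrics.

      import numpy as np

      def roi_weighted_mse(ref, test, roi_mask, roi_weight=10.0):
          """MSE in which ROI pixels count roi_weight times more than background."""
          w = np.where(roi_mask, roi_weight, 1.0)
          err = (ref.astype(float) - test.astype(float)) ** 2
          return float(np.sum(w * err) / np.sum(w))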

  7. Comparison of the Joel-Cohen-based technique and the transverse Pfannenstiel for caesarean section for safety and effectiveness: A systematic review and meta-analysis

    PubMed Central

    Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba

    2017-01-01

    Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is crucially important to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed at assessing the safety and effectiveness of the Joel-Cohen-based technique and comparing the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, reliable databases including PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were searched. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers based on the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results showed that the Joel-Cohen-based technique is more effective than the transverse Pfannenstiel incision. Meta-analysis results for the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07; p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26 ml; p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03 day; p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. It is recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences and the patients' conditions. PMID:29445683
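
    A minimal sketch of the inverse-variance fixed-effect pooling behind such WMD figures; the per-trial numbers below are placeholders, not the review's data.

      import numpy as np

      def pooled_wmd(diffs, ses):
          """diffs: per-trial mean differences; ses: their standard errors."""
          w = 1.0 / np.asarray(ses, float) ** 2        # inverse-variance weights
          wmd = np.sum(w * np.asarray(diffs, float)) / np.sum(w)
          se = np.sqrt(1.0 / np.sum(w))
          return wmd, (wmd - 1.96 * se, wmd + 1.96 * se)   # point estimate, 95% CI

      wmd, ci = pooled_wmd(diffs=[-8.0, -12.0, -9.5], ses=[2.0, 3.0, 2.5])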

  8. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extraction complexes belong to energy-intensive industry. Accidents there can result in catastrophes and great social, environmental, and economic losses. Annually, according to official data, several dozen large accidents take place at pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development, and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical, and human factors. Mathematical and computer simulations are safe, rather effective, and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of a failure's occurrence and development, to assess its consequences, and to give recommendations to prevent it. Besides the investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostic results and in the construction of mathematical prognostic simulations of an object's behavior in the period between two inspections. In solving diagnostics tasks and in the analysis of failure cases, techniques from theoretical mechanics, the qualitative theory of differential equations, continuum mechanics, chemical macro-kinetics, and optimization are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated against simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that over many years of work a fruitful and effective collaboration of the institute's theoreticians, mathematicians, and experimentalists has been established to solve such tasks.

  9. On line biomonitors used as a tool for toxicity reduction evaluation of in situ groundwater remediation techniques.

    PubMed

    Küster, Eberhard; Dorusch, Falk; Vogt, Carsten; Weiss, Holger; Altenburger, Rolf

    2004-07-15

    Success of groundwater remediation is typically monitored via snapshot analysis of selected chemical substances or physical parameters; biological parameters, i.e., ecotoxicological assays, are rarely employed. Hence, the aim of the study was to develop a bioassay tool that allows on-line monitoring of contaminated groundwater as well as toxicity reduction evaluation (TRE) of different remediation techniques in parallel, and that may furthermore be used as an additional process-control tool to supervise remediation techniques in real time. Parallel testing of groundwater remediation techniques was accomplished over short and long time periods by using the energy-dependent luminescence of the bacterium Vibrio fischeri as the biological monitoring parameter. One data point per hour for each remediation technique was generated by an automated biomonitor. The bacteria proved to be highly sensitive to the contaminated groundwater, and the biomonitor showed a long service life despite the highly corrosive groundwater present in Bitterfeld, Germany. The bacterial biomonitor is demonstrated to be a valuable tool for the evaluation of remediation success. Dose-response relationships were generated for the six quantitatively dominant groundwater contaminants (2-chlorotoluene, 1,2- and 1,4-dichlorobenzene, monochlorobenzene, ethylbenzene, and benzene). The concentrations of the individual volatile organic chemicals (VOCs) could not explain the observed effects in the bacteria. An expected mixture toxicity was calculated for the six components using the concept of concentration addition. The calculated EC(50) for the mixture was still one order of magnitude lower than the observed EC(50) of the actual groundwater. The results point out that chemical analysis of the six quantitatively dominant substances alone cannot explain the effects observed with the bacteria; thus chemical analysis alone may not be an adequate tool for evaluating remediation success in terms of toxicity reduction.
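
    A minimal sketch of the concentration-addition prediction used above: the mixture EC(50) follows from each component's fraction of the total concentration and its single-substance EC(50). Component fractions and EC(50) values are invented for illustration.

      import numpy as np

      def ca_mixture_ec50(fractions, ec50s):
          """Concentration addition: 1/EC50_mix = sum_i p_i / EC50_i."""
          p = np.asarray(fractions, float)       # fractions p_i must sum to 1
          return 1.0 / np.sum(p / np.asarray(ec50s, float))

      ec50_mix = ca_mixture_ec50([0.40, 0.30, 0.15, 0.08, 0.05, 0.02],
                                 [12.0, 30.0, 8.0, 15.0, 50.0, 20.0])  # mg/L, invented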

  10. Evaluation of Publicly Available Documents to Trace Chiropractic Technique Systems That Advocate Radiography for Subluxation Analysis: A Proposed Genealogy

    PubMed Central

    Young, Kenneth J.

    2014-01-01

    Objective The purpose of this study was to evaluate publicly available information of chiropractic technique systems that advocate radiography for subluxation detection to identify links between chiropractic technique systems and to describe claims made of the health effects of the osseous misalignment component of the chiropractic subluxation and radiographic paradigms. Methods The Internet and publicly available documents were searched for information representing chiropractic technique systems that advocate radiography for subluxation detection. Key phrases including chiropractic, x-ray, radiography, and technique were identified from a Google search between April 2013 and March 2014. Phrases in Web sites and public documents were examined for any information about origins and potential links between these techniques, including the type of connection to B.J. Palmer, who was the first chiropractor to advocate radiography for subluxation detection. Quotes were gathered to identify claims of health effects from osseous misalignment (subluxation) and paradigms of radiography. Techniques were grouped by region of the spine and how they could be traced back to B.J. Palmer. A genealogy model and summary table of information on each technique were created. Patterns in year of origination and radiographic paradigms were noted, and percentages were calculated on elements of the techniques' characteristics in comparison to the entire group. Results Twenty-three techniques were identified on the Internet: 6 full spine, 17 upper cervical, and 2 techniques generating other lineage. Most of the upper cervical techniques (14/16) traced their origins to a time when the Palmer School was teaching upper cervical technique, and all the full spine techniques (6/6) originated before or after this phase. All the technique systems' documents attributed broad health effects to their methods. Many (21/23) of the techniques used spinal realignment on radiographs as one of their outcome measures. Conclusion Chiropractic technique systems in this study (i.e., those that advocate radiography for subluxation misalignment detection) seem to be closely related by descent, their claims of a variety of health effects associated with chiropractic subluxation, and their radiographic paradigms. PMID:25431540

  11. A comparative critical study between FMEA and FTA risk analysis methods

    NASA Astrophysics Data System (ADS)

    Cristea, G.; Constantinescu, DM

    2017-10-01

    Today an overwhelming number of different risk analysis techniques are in use, with acronyms such as FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most widely used analysis techniques in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and impacts and prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure that can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and for this reason they often are not, with the consequence that possible failure modes may not be identified. To address these shortcomings, a combination of FTA and FMEA is proposed.
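
    A minimal sketch of the RPN bookkeeping described above, with invented failure modes; severity (S), occurrence (O), and detection (D) are each scored 1-10, so the RPN spans 1-1000.

      failure_modes = [
          {"mode": "connector corrosion", "S": 7, "O": 4, "D": 5},
          {"mode": "solder joint crack",  "S": 9, "O": 2, "D": 6},
      ]
      for fm in failure_modes:
          fm["RPN"] = fm["S"] * fm["O"] * fm["D"]   # 1..1000 priority score

      ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)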

  12. Conjoint Analysis: A Study of the Effects of Using Person Variables.

    ERIC Educational Resources Information Center

    Fraas, John W.; Newman, Isadore

    Three statistical techniques--conjoint analysis, a multiple linear regression model, and a multiple linear regression model with a surrogate person variable--were used to estimate the relative importance of five university attributes for students in the process of selecting a college. The five attributes include: availability and variety of…

  13. ACCOUNTING FOR ERROR PROPAGATION IN THE DEVELOPMENT OF A LEAF AREA INDEX (LAI) REFERENCE MAP TO ASSESS THE MODIS LAI MODI5A LAI PRODUCT

    EPA Science Inventory

    The ability to effectively use remotely sensed data for environmental spatial analysis is dependent on understanding the underlying procedures and associated variances attributed to the data processing and image analysis technique. Equally important, also, is understanding the er...

  14. The effect of extraction, storage, and analysis techniques on the measurement of airborne endotoxin from a large dairy

    USDA-ARS?s Scientific Manuscript database

    The objective of this study was to fill in additional knowledge gaps with respect to the extraction, storage, and analysis of airborne endotoxin, with a specific focus on samples from a dairy production facility. We utilized polycarbonate filters to collect total airborne endotoxins, sonication as ...

  15. Economic Analysis of Education: A Conceptual Framework. Theoretical Paper No. 68.

    ERIC Educational Resources Information Center

    Rossmiller, Richard A.; Geske, Terry G.

    This paper discusses several concepts and techniques from the areas of systems theory and economic analysis that can be used as tools in an effort to improve the productivity of the educational enterprise. Several studies investigating productivity in education are reviewed, and the analytical problems in conducting cost-effectiveness studies are…

  16. Programmed Instruction in Secondary Education: A Meta-Analysis of the Impact of Class Size on Its Effectiveness.

    ERIC Educational Resources Information Center

    Boden, Andrea; Archwamety, Teara; McFarland, Max

    This review used meta-analytic techniques to integrate findings from 30 independent studies that compared programmed instruction to conventional methods of instruction at the secondary level. The meta-analysis demonstrated that programmed instruction resulted in higher achievement when compared to conventional methods of instruction (average…

  17. The Impact of Teacher Questioning on Creating Interaction in EFL: A Discourse Analysis

    ERIC Educational Resources Information Center

    Al-Zahrani, Mona Yousef; Al-Bargi, Abdullah

    2017-01-01

    This study examines the effect of questions on fostering interaction in English as a Foreign Language (EFL) classrooms. It also seeks to determine the characteristics of questions that promote increased classroom interaction. Data were collected through video recordings of EFL classrooms which were analyzed using Discourse Analysis techniques.…

  18. Separation of cucurbitane triterpenoids from bitter melon drinks and determination of partition coefficients using vortex-assisted dispersive liquid-phase microextraction followed by UHPLC analysis

    USDA-ARS?s Scientific Manuscript database

    A rapid, effective technique applying vortex-assisted liquid–liquid microextraction (VALLME) prior to ultra-high-performance liquid chromatography with evaporative light scattering detection/mass spectrometry (UHPLC-ELSD/MS) was developed for the analysis of four cucurbitane triterpenoi...

  19. Can Smoking Cessation Services Be Better Targeted to Tackle Health Inequalities? Evidence from a Cross-Sectional Study

    ERIC Educational Resources Information Center

    Blackman, Tim

    2008-01-01

    Objective: To investigate how smoking cessation services could be more effectively targeted to tackle socioeconomic inequalities in health. Design: Secondary analysis of data from a household interview survey undertaken for Middlesbrough Council in north east England using the technique of Qualitative Comparative Analysis. Setting: Home-based…

  20. A Meta-Analysis of Previous Research on the Treatment of Hyperactivity. Final Report.

    ERIC Educational Resources Information Center

    White, Karl R.; And Others

    Using meta-analysis techniques, the study sought to identify, integrate, and synthesize the literature from 61 articles which review the efficacy of various treatments for hyperactive children. The major objectives were to determine if drugs can be used effectively with hyperactive children, what child and intervention characteristics covary with…

  1. Treatment of Autism in Young Children: Behavioral Intervention and Applied Behavior Analysis.

    ERIC Educational Resources Information Center

    Jensen, Vanessa K.; Sinclair, Leslie V.

    2002-01-01

    This article discusses the etiology and scope of autism in young children, screening and diagnosis, intervention options, and the use of applied behavior analysis. Supporting evidence of the efficacy of intensive behavioral intervention is cited, and variations in treatments and techniques are reviewed. Barriers to effective services are also…

  2. An image analysis of TLC patterns for quality control of saffron based on soil salinity effect: A strategy for data (pre)-processing.

    PubMed

    Sereshti, Hassan; Poursorkh, Zahra; Aliakbarzadeh, Ghazaleh; Zarre, Shahin; Ataolahi, Sahar

    2018-01-15

    Quality of saffron, a valuable food additive, could considerably affect consumers' health. In this work, a novel preprocessing strategy for image analysis of saffron thin-layer chromatographic (TLC) patterns was introduced. This includes performing a series of image pre-processing techniques on TLC images, such as compression, inversion, elimination of the general baseline (using asymmetric least squares (AsLS)), removal of spot shifts and concavity (by correlation optimized warping (COW)), and finally conversion to RGB chromatograms. Subsequently, unsupervised multivariate data analysis including principal component analysis (PCA) and k-means clustering was utilized to investigate the effect of soil salinity, as a cultivation parameter, on saffron TLC patterns. This method was used as a rapid and simple technique to obtain chemical fingerprints of saffron TLC images. Finally, the separated TLC spots were chemically identified using high-performance liquid chromatography-diode array detection (HPLC-DAD). Accordingly, the saffron quality from different areas of Iran was evaluated and classified. Copyright © 2017 Elsevier Ltd. All rights reserved.
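
    As a hedged sketch of the unsupervised stage described above (PCA followed by k-means), and not the authors' actual pipeline, the Python example below clusters synthetic chromatogram-like curves; the two simulated groups stand in for different soil-salinity conditions.

      # Sketch of the unsupervised stage: PCA to compress curves, k-means to
      # group samples. All data are synthetic placeholders.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      x = np.linspace(0, 1, 300)
      # Two groups of 15 curves with slightly shifted peaks plus noise.
      group_a = np.exp(-(x - 0.4) ** 2 / 0.002) + 0.05 * rng.standard_normal((15, 300))
      group_b = np.exp(-(x - 0.6) ** 2 / 0.002) + 0.05 * rng.standard_normal((15, 300))
      chromatograms = np.vstack([group_a, group_b])

      scores = PCA(n_components=2).fit_transform(chromatograms)
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      print(labels)   # the two simulated conditions separate into two clusters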

  3. Insulation commonality assessment (phase 1). Volume 2: Section 7.0 through 16.0. [evaluation of materials used for spacecraft thermal insulation

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The heat transfer characteristics of various materials used for the thermal insulation of spacecraft are discussed. Techniques for conducting thermal performance analysis, structural performance analysis, and dynamic analysis are described. Processes for producing and finishing the materials are explained. The methods for determining reliability, system safety, materials tests, and design effectiveness are explained.

  4. EOS imaging versus current radiography: A health technology assessment study

    PubMed Central

    Mahboub-Ahari, Alireza; Hajebrahimi, Sakineh; Yusefi, Mahmoud; Velayati, Ashraf

    2016-01-01

    Background: EOS is a 2D/3D musculoskeletal diagnostic imaging system. The device has been developed to produce high-quality 2D full-body radiographs in standing, sitting and squatting positions. Three-dimensional images can be reconstructed via sterEOS software. This health technology assessment study aimed to investigate the efficacy, effectiveness and cost-effectiveness of the newly emerged EOS imaging system in comparison with conventional x-ray radiographic techniques. Methods: All cost and outcome data were assessed from the perspective of Iran's Ministry of Health. Data on clinical effectiveness were extracted using a rigorous systematic review. As clinical outcomes, the rate of x-ray emission and related quality of life were compared with those of Computed Radiography (CR) and Digital Radiography (DR). A standard costing method was used to find the related direct medical costs. To examine the robustness of the calculated Incremental Cost-Effectiveness Ratios (ICERs), we used two-way sensitivity analysis. The GDP per capita of the Islamic Republic of Iran (2012) was adopted as the cost-effectiveness threshold. Results: Review of the related literature highlighted the lack of rigorous evidence for clinical outcomes. The ultra-low-dose EOS imaging device is considered a safe intervention because of its FDA, CE and CSA certificates. The rate of emitted x-rays was 2- to 18-fold lower for EOS compared with the conventional techniques (p<0.001). The Incremental Cost-Effectiveness Ratio for EOS relative to CR was calculated at $50,706 in the baseline analysis (the first scenario) and at $50,714 and $9,446 for the second and third scenarios, respectively. Considering the value of $42,146 as the upper limit, neither the first nor the second scenario could pass the cost-effectiveness threshold for Iran. Conclusion: The EOS imaging technique might not be considered a cost-effective intervention in routine practice of the health system, especially within in-patient wards. Scenario analysis shows that only under optimum conditions, such as lower assembling costs and higher utilization rates, can the device be deployed for research and therapeutic purposes in pediatric orthopedic centers. PMID:27390701
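
    For readers unfamiliar with the ICER used above, the short Python sketch below shows the calculation ICER = ΔC/ΔE and the threshold comparison; all costs and effects are invented placeholders, with only the $42,146 threshold taken from the abstract.

      # Illustrative ICER calculation; numbers are placeholders, not study data.
      def icer(cost_new, cost_old, effect_new, effect_old):
          return (cost_new - cost_old) / (effect_new - effect_old)

      threshold = 42146.0                          # willingness-to-pay (USD)
      ratio = icer(cost_new=120000.0, cost_old=70000.0,
                   effect_new=2.1, effect_old=1.1)
      print(f"ICER = ${ratio:,.0f} per unit of effect")
      print("cost-effective" if ratio <= threshold else "not cost-effective")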

  5. The influence of surface finishing methods on touch-sensitive reactions

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.

    2017-02-01

    This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and updated finishing techniques are appearing. Today's information-oriented society heightens the visual aesthetics of new jewelry forms, decoration techniques (depth and surface) and the synthesis of different materials, which together reveal a bias towards the positive effects of visual design. The jewelry industry now includes not only traditional techniques but also such improved techniques as computer-assisted design, 3D prototyping and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface) and a comparative analysis of different approaches to surface finishing. Identifying the appearance or effect of a piece of jewelry is based on the proposed evaluation criteria, with the basis for advanced visual aesthetics predicated on touch-sensitive responses.

  6. Wavelet Analyses of Oil Prices, USD Variations and Impact on Logistics

    NASA Astrophysics Data System (ADS)

    Melek, M.; Tokgozlu, A.; Aslan, Z.

    2009-07-01

    This paper deals with temporal variations of historical oil prices and of the Dollar and Euro in Turkey. Daily data based on OECD and Central Bank of Turkey records, beginning from 1946, have been considered. 1D continuous wavelet and wavelet packet analysis techniques have been applied to the data. Wavelet techniques help to detect abrupt changes and increasing and decreasing trends in data. Estimation of the variables is presented using linear regression techniques. The results of this study have been compared with small- and large-scale effects. Truck transportation costs show a variation similar to fuel prices. The second part of the paper deals with estimation of imports, exports, costs, total number of vehicles and annual variations by considering the temporal variation of oil prices and the Dollar currency in Turkey. Wavelet techniques offer a user-friendly methodology to interpret some local effects on the increasing trend of imports and exports data.
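
    A minimal sketch of the change-detection idea, assuming the PyWavelets package and a synthetic price series in place of the study's data: a 1D continuous wavelet transform flags the abrupt jump through large small-scale coefficients.

      # Wavelet-based detection of an abrupt change in a synthetic series.
      import numpy as np
      import pywt

      rng = np.random.default_rng(1)
      prices = np.cumsum(rng.standard_normal(500)) + 50
      prices[300:] += 15                     # an abrupt jump (price shock)

      coeffs, _ = pywt.cwt(prices, np.arange(1, 64), "morl")

      # Large coefficients at the smallest scale flag sharp local changes.
      print(f"largest small-scale response near index {np.argmax(np.abs(coeffs[0]))}")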

  7. Maternal and neonatal effects of methoxyflurane, nitrous oxide and lumbar epidural anaesthesia for Caesarean section.

    PubMed

    Palahniuk, R J; Scatliff, J; Biehl, D; Wiebe, H; Sankaran, K

    1977-09-01

    General anaesthetic techniques continue to be used for Caesarean section despite the possible increased incidence of foetal acidosis and neonatal depression. Two techniques of general anaesthesia (methoxyflurane-oxygen and nitrous oxide-oxygen) and lumbar epidural anaesthesia were compared in 37 patients undergoing elective Caesarean section. Apgar scores at birth were similar in all three groups. Neurophysiological testing of the neonates at six hours and twenty-four hours of age revealed a superiority for the methoxyflurane-oxygen and lumbar epidural techniques, although the babies in the epidural group tended to be hypotonic. Cord blood gas analysis showed the babies in the methoxyflurane group to have a higher PaO2 with less metabolic acidosis than the babies from the other two groups. The maternal effects of the three anaesthetic techniques were similar, with only a small rise in serum fluoride levels noted in the methoxyflurane group.

  8. Effect Size in Single-Case Research: A Review of Nine Nonoverlap Techniques

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.; Davis, John L.

    2011-01-01

    With rapid advances in the analysis of data from single-case research designs, the various behavior-change indices, that is, effect sizes, can be confusing. To reduce this confusion, nine effect-size indices are described and compared. Each of these indices examines data nonoverlap between phases. Similarities and differences, both conceptual and…

  9. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
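
    As a toy illustration at the simplest end of the hierarchy of methods surveyed above, the Python sketch below fits a classical power-law learning curve t(n) = a·n^(−b) to synthetic per-case operation times; the data and parameter values are invented, and the review's more sophisticated options (multilevel or latent curve models) are not shown.

      # Fit a power-law learning curve to synthetic operation times.
      import numpy as np
      from scipy.optimize import curve_fit

      def power_curve(n, a, b):
          return a * n ** (-b)

      rng = np.random.default_rng(2)
      case = np.arange(1, 191).astype(float)      # e.g., 190 consecutive cases
      time = power_curve(case, 120.0, 0.2) * rng.lognormal(0.0, 0.1, case.size)

      (a, b), _ = curve_fit(power_curve, case, time, p0=(100.0, 0.1))
      print(f"initial time ~ {a:.0f} min, learning exponent b ~ {b:.2f}")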

  10. Practical aspects of a maximum likelihood estimation method to extract stability and control derivatives from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1976-01-01

    A maximum likelihood estimation method was applied to flight data, and procedures to facilitate the routine analysis of a large amount of flight data are described. Techniques that can be used to obtain stability and control derivatives from aircraft maneuvers that are less than ideal for this purpose are presented. The techniques involve detecting and correcting the effects of dependent or nearly dependent variables, structural vibration, data drift, inadequate instrumentation, and difficulties with the data acquisition system and the mathematical model. The use of uncertainty levels and multiple maneuver analysis also proved to be useful in improving the quality of the estimated coefficients. The procedures used for editing the data and for overall analysis are also discussed.
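
    As a generic illustration of the statistical core of this method (not the flight-data implementation, whose aircraft model is far richer), the sketch below obtains maximum likelihood estimates by minimizing a negative log-likelihood with scipy.

      # Generic MLE sketch: minimize the negative log-likelihood of a model.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(3)
      data = rng.normal(loc=2.0, scale=0.5, size=400)   # stand-in measurements

      def neg_log_likelihood(theta):
          mu, log_sigma = theta
          sigma = np.exp(log_sigma)                     # keeps sigma positive
          return 0.5 * np.sum(((data - mu) / sigma) ** 2) + data.size * log_sigma

      result = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]))
      mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
      print(f"mu ~ {mu_hat:.3f}, sigma ~ {sigma_hat:.3f}")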

  11. Characterisation of the PXIE Allison-type emittance scanner

    DOE PAGES

    D'Arcy, R.; Alvarez, M.; Gaynier, J.; ...

    2016-01-26

    An Allison-type emittance scanner has been designed for PXIE at FNAL with the goal of providing fast and accurate phase space reconstruction. The device has been modified from previous LBNL/SNS designs to operate in both pulsed and DC modes with the addition of water-cooled front slits. Extensive calibration techniques and error analysis allowed confinement of uncertainty to the <5% level (with known caveats). With a 16-bit, 1 MHz electronics scheme the device is able to analyse a pulse with a resolution of 1 μs, allowing for analysis of neutralisation effects. As a result, this paper describes a detailed breakdown of the R&D, as well as post-run analysis techniques.

  12. Polarimetric Thomson scattering for high Te fusion plasmas

    NASA Astrophysics Data System (ADS)

    Giudicotti, L.

    2017-11-01

    Polarimetric Thomson scattering (TS) is a technique for the analysis of TS spectra in which the electron temperature Te is determined from the depolarization of the scattered radiation, a relativistic effect noticeable only in very hot (Te >= 10 keV) fusion plasmas. It has been proposed as a complementary technique to supplement conventional spectral analysis in the ITER CPTS (Core Plasma Thomson Scattering) system for measurements in high Te, low ne plasma conditions. In this paper we review the characteristics of the depolarized TS radiation with special emphasis on the conditions of the ITER CPTS system, and we describe a possible implementation of this diagnostic method suitable to significantly improve the performance of conventional TS spectral analysis in the high Te range.

  13. Quantitative assessment of human motion using video motion analysis

    NASA Technical Reports Server (NTRS)

    Probe, John D.

    1990-01-01

    In the study of the dynamics and kinematics of the human body, a wide variety of technologies has been developed. Photogrammetric techniques are well documented and are known to provide reliable positional data from recorded images. Often these techniques are used in conjunction with cinematography and videography for analysis of planar motion, and to a lesser degree three-dimensional motion. Cinematography has been the most widely used medium for movement analysis. Excessive operating costs and the lag time required for film development, coupled with recent advances in video technology, have allowed video-based motion analysis systems to emerge as a cost-effective method of collecting and analyzing human movement. The Anthropometric and Biomechanics Lab at Johnson Space Center utilizes the video-based Ariel Performance Analysis System to develop data on shirt-sleeved and space-suited human performance in order to plan efficient on-orbit intravehicular and extravehicular activities. The system is described.

  14. Infrared Analysis Using Tissue Paper.

    ERIC Educational Resources Information Center

    Owen, Noel L.; Wood, Steven G.

    1987-01-01

    Described is a quick, easy, and cheap, but effective method of obtaining infrared spectra of solids and nonvolatile liquids by Fourier transform infrared spectroscopy. The technique uses tissue paper as a support matrix. (RH)

  15. Measurement of ultrafast optical Kerr effect of Ge-Sb-Se chalcogenide slab waveguides by the beam self-trapping technique

    NASA Astrophysics Data System (ADS)

    Kuriakose, Tintu; Baudet, Emeline; Halenkovič, Tomáš; Elsawy, Mahmoud M. R.; Němec, Petr; Nazabal, Virginie; Renversez, Gilles; Chauvet, Mathieu

    2017-11-01

    We present a reliable and original experimental technique based on the analysis of beam self-trapping to measure ultrafast optical nonlinearities in planar waveguides. The technique is applied to the characterization of Ge-Sb-Se chalcogenide films that allow Kerr-induced self-focusing and soliton formation. Linear and nonlinear optical constants of three different chalcogenide waveguides are studied at 1200 and 1550 nm in the femtosecond regime. Waveguide propagation loss and two-photon absorption coefficients are determined by transmission analysis. Beam broadening and narrowing results are compared with simulations of the nonlinear Schrödinger equation solved by the beam propagation method (BPM) to deduce the Kerr n2 coefficients. Kerr optical nonlinearities obtained by our original technique compare favorably with the values obtained by the Z-scan technique. A nonlinear refractive index as high as (69 ± 11) × 10⁻¹⁸ m²/W is measured in Ge12.5Sb25Se62.5 at 1200 nm, with low nonlinear absorption and low propagation losses, which underlines the suitability of our waveguides for ultrafast all-optical switching and integrated photonic devices.

  16. Cocrystal screening of hydroxybenzamides with benzoic acid derivatives: a comparative study of thermal and solution-based methods.

    PubMed

    Manin, Alex N; Voronin, Alexander P; Drozd, Ksenia V; Manin, Nikolay G; Bauer-Brandl, Annette; Perlovich, German L

    2014-12-18

    The main problem occurring at the early stages of a cocrystal search is the choice of an effective screening technique. Among the most popular techniques for obtaining cocrystals are crystallization from solution, crystallization from melt, and solvent-drop grinding. This paper presents a comparative analysis of the following screening techniques: the DSC cocrystal screening method, thermal microscopy and the saturation temperature method. The efficiency of the different cocrystal screening techniques was checked in 18 systems. Benzamide and benzoic acid derivatives were chosen as model systems due to their ability to form the acid-amide supramolecular heterosynthon. The screening confirmed the formation of 6 new cocrystals. Screening by the saturation temperature method has the highest screen-out rate but the smallest range of application. DSC screening has satisfactory accuracy and allows screening over a short time. Thermal microscopy is most efficient as an additional technique used to interpret ambiguous DSC screening results. The study also included an analysis of the influence of solvent type and component solubility on cocrystal formation. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Quantitative CT: technique dependence of volume estimation on pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2012-03-01

    Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.

  18. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of multifaceted and raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective at overcoming the limitations of traditional in-vitro experiments on the constantly increasing sequence data. The most critical problems that have caught the attention of researchers include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, analysis of microarray gene expression data, etc. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees and HMM-based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy and decreasing the running time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.
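
    To make one such pipeline concrete, the hedged Python sketch below encodes toy protein sequences as 2-mer count vectors and trains an SVM classifier; the sequences, labels and choice of k are invented for illustration and do not come from the overview.

      # k-mer feature encoding plus SVM classification on toy sequences.
      from itertools import product
      from sklearn.svm import SVC

      AMINO = "ACDEFGHIKLMNPQRSTVWY"
      K = 2
      VOCAB = {"".join(p): i for i, p in enumerate(product(AMINO, repeat=K))}

      def kmer_counts(seq):
          vec = [0] * len(VOCAB)                  # 400 dipeptide counts
          for i in range(len(seq) - K + 1):
              idx = VOCAB.get(seq[i:i + K])
              if idx is not None:
                  vec[idx] += 1
          return vec

      seqs = ["MKTAYIAKQR", "MKSAYIAKQR", "GGLLVVAAPP", "GGLLIVAAPP"]
      labels = [0, 0, 1, 1]                       # toy class labels
      clf = SVC(kernel="linear").fit([kmer_counts(s) for s in seqs], labels)
      print(clf.predict([kmer_counts("MKTAYLAKQR")]))  # expected: [0]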

  19. Analytical method for predicting the pressure distribution about a nacelle at transonic speeds

    NASA Technical Reports Server (NTRS)

    Keith, J. S.; Ferguson, D. R.; Merkle, C. L.; Heck, P. H.; Lahti, D. J.

    1973-01-01

    The formulation and development of a computer analysis for the calculation of streamlines and pressure distributions around two-dimensional (planar and axisymmetric) isolated nacelles at transonic speeds are described. The computerized flow field analysis is designed to predict the transonic flow around long and short high-bypass-ratio fan duct nacelles with inlet flows and with exhaust flows having appropriate aerothermodynamic properties. The flow field boundaries are located as far upstream and downstream as necessary to obtain minimum disturbances at the boundary. The far-field lateral flow field boundary is analytically defined to exactly represent free-flight conditions or solid wind tunnel wall effects. The inviscid solution technique is based on a Streamtube Curvature Analysis. The computer program utilizes an automatic grid refinement procedure and solves the flow field equations with a matrix relaxation technique. The boundary layer displacement effects and the onset of turbulent separation are included, based on the compressible turbulent boundary layer solution method of Stratford and Beavers and on the turbulent separation prediction method of Stratford.

  20. Social marketing and public health intervention.

    PubMed

    Lefebvre, R C; Flora, J A

    1988-01-01

    The rapid proliferation of community-based health education programs has out-paced the knowledge base of behavior change strategies that are appropriate and effective for public health interventions. However, experiences from a variety of large-scale studies suggest that principles and techniques of social marketing may help bridge this gap. This article discusses eight essential aspects of the social marketing process: the use of a consumer orientation to develop and market intervention techniques, exchange theory as a model from which to conceptualize service delivery and program participation, audience analysis and segmentation strategies, the use of formative research in program design and pretesting of intervention materials, channel analysis for devising distribution systems and promotional campaigns, employment of the "marketing mix" concept in intervention planning and implementation, development of a process tracking system, and a management process of problem analysis, planning, implementation, feedback and control functions. Attention to such variables could result in more cost-effective programs that reach larger numbers of the target audience.

  1. The Role of Training Providers in Manpower Planning.

    ERIC Educational Resources Information Center

    Gray, Lynton

    1993-01-01

    Research in Nigeria and Thailand is used to demonstrate that, where vocational training is cost effective (graduates get appropriate jobs), links with employers are closer than in other labor markets. Techniques such as reverse tracer studies, labor market signaling, and skills analysis can be used to improve training effectiveness. (SK)

  2. Effectiveness of Web-Based Psychological Interventions for Depression: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cowpertwait, Louise; Clarke, Dave

    2013-01-01

    Web-based psychological interventions aim to make psychological treatments more accessible and minimize clinician input, but their effectiveness requires further examination. The purposes of the present study are to evaluate the outcomes of web-based interventions for treating depressed adults using meta-analytic techniques, and to examine…

  3. On the Bias-Amplifying Effect of Near Instruments in Observational Studies

    ERIC Educational Resources Information Center

    Steiner, Peter M.; Kim, Yongnam

    2014-01-01

    In contrast to randomized experiments, the estimation of unbiased treatment effects from observational data requires an analysis that conditions on all confounding covariates. Conditioning on covariates can be done via standard parametric regression techniques or nonparametric matching like propensity score (PS) matching. The regression or…

  4. The effects of impact fees in urban form and congestion in Florida for period 4/1/2010 to 11/30/2011.

    DOT National Transportation Integrated Search

    2011-11-01

    This study analyzes the effect of impact fees in urban form and congestion through a combination of methods including econometric analysis, GIS techniques, and interviews with planning officials. The results show that there is some evidence that impa...

  5. PERIPHYTON AND SEDIMENT BIOASSESSMENT AS INDICATORS OF THE EFFECT OF A COASTAL PULP MILL WASTEWATER

    EPA Science Inventory

    A two year study was conducted near Port St. Joe, Florida, in a coastal transportation canal and bay receiving combined municipal and pulp mill wastewater. The objective of the study was to determine the effectiveness of periphyton analysis techniques and sediment toxicity as ind...

  6. Making Definitions Explicit and Capturing Evaluation Policies.

    ERIC Educational Resources Information Center

    Houston, Samuel R.

    Judgment ANalysis (JAN) is described as a technique for identifying the rating policies that exist within a group of judges. Studies are presented in which JAN has been used in evaluating teacher effectiveness by capturing both student and faculty policies of teacher effectiveness at the University of Northern Colorado. In addition, research…

  7. EFFECTS OF HEAVY METALS IN SEDIMENTS OF THE MACROINVERTEBRATE COMMUNITY IN THE SHORT CREEK/EMPIRE LAKE AQUATIC SYSTEM, CHEROKEE COUNTY, KANSAS: A RECOMMENDATION FOR SITE-SPECIFIC CRITERIA.

    EPA Science Inventory

    The study uses statistical analysis techniques to determine the effects of four heavy metals (cadmium, lead, manganese, and zinc) on the macroinvertebrate community using the data collected in the fall 1987.

  8. Digression and Value Concatenation to Enable Privacy-Preserving Regression.

    PubMed

    Li, Xiao-Bai; Sarkar, Sumit

    2014-09-01

    Regression techniques can be used not only for legitimate data analysis, but also to infer private information about individuals. In this paper, we demonstrate that regression trees, a popular data-analysis and data-mining technique, can be used to effectively reveal individuals' sensitive data. This problem, which we call a "regression attack," has not been addressed in the data privacy literature, and existing privacy-preserving techniques are not appropriate in coping with this problem. We propose a new approach to counter regression attacks. To protect against privacy disclosure, our approach introduces a novel measure, called digression, which assesses the sensitive value disclosure risk in the process of building a regression tree model. Specifically, we develop an algorithm that uses the measure for pruning the tree to limit disclosure of sensitive data. We also propose a dynamic value-concatenation method for anonymizing data, which better preserves data utility than a user-defined generalization scheme commonly used in existing approaches. Our approach can be used for anonymizing both numeric and categorical data. An experimental study is conducted using real-world financial, economic and healthcare data. The results of the experiments demonstrate that the proposed approach is very effective in protecting data privacy while preserving data quality for research and analysis.
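
    As a rough illustration of the attack model described above (not the authors' algorithm or data), the Python sketch below trains a regression tree on synthetic released attributes and shows how it can predict a sensitive value for a target whose quasi-identifiers are known.

      # Conceptual "regression attack" sketch on synthetic data.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(4)
      n = 1000
      age = rng.integers(20, 70, n)
      zip_region = rng.integers(0, 10, n)
      income = 20000 + 900 * age + 3000 * zip_region + rng.normal(0, 5000, n)

      X = np.column_stack([age, zip_region])       # released attributes
      tree = DecisionTreeRegressor(min_samples_leaf=5).fit(X, income)

      # An attacker who knows a target's quasi-identifiers infers income:
      print(tree.predict([[45, 7]]))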

  9. Performance Analysis of MIMO Relay Network via Propagation Measurement in L-Shaped Corridor Environment

    NASA Astrophysics Data System (ADS)

    Lertwiram, Namzilp; Tran, Gia Khanh; Mizutani, Keiichi; Sakaguchi, Kei; Araki, Kiyomichi

    Deploying relays can address the shadowing problem between a transmitter (Tx) and a receiver (Rx). Moreover, the Multiple-Input Multiple-Output (MIMO) technique has been introduced to improve wireless link capacity. The MIMO technique can be applied in a relay network to enhance system performance. However, the efficiency of relaying schemes and relay placement has not been well investigated in experiment-based studies. This paper provides a propagation measurement campaign of a MIMO two-hop relay network in the 5 GHz band in an L-shaped corridor environment with various relay locations. Furthermore, this paper proposes a Relay Placement Estimation (RPE) scheme to identify the optimum relay location, i.e. the point at which the network performance is highest. Analysis of channel capacity shows that the relaying technique is beneficial over direct transmission in a strong-shadowing environment, while it is ineffective in a non-shadowing environment. In addition, the optimum relay location estimated with the RPE scheme agrees with the location where the network achieves the highest performance as identified by network capacity. Finally, the capacity analysis shows that two-way MIMO relaying employing network coding has the best performance, while the cooperative relaying scheme is not effective because shadowing weakens the signal strength of the direct link.
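
    For reference, capacity comparisons of this kind rest on the standard MIMO capacity formula C = log2 det(I + (SNR/Nt)·HH^H); the Python sketch below evaluates it for a random 2×2 Rayleigh channel placeholder, not the channels measured in the paper.

      # Standard MIMO capacity for a random Rayleigh channel (placeholder).
      import numpy as np

      rng = np.random.default_rng(5)
      nt, nr, snr = 2, 2, 10.0                   # 2x2 MIMO, linear SNR = 10
      H = (rng.standard_normal((nr, nt))
           + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)

      M = np.eye(nr) + (snr / nt) * H @ H.conj().T
      capacity = np.log2(np.linalg.det(M).real)  # det is real and positive here
      print(f"capacity ~ {capacity:.2f} bit/s/Hz")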

  10. Biotechnological advances in the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp.

    PubMed

    Zhao, Guang-Hui; Li, Juan; Blair, David; Li, Xiao-Yan; Elsheikha, Hany M; Lin, Rui-Qing; Zou, Feng-Cai; Zhu, Xing-Quan

    2012-01-01

    Schistosomiasis is a serious parasitic disease caused by blood-dwelling flukes of the genus Schistosoma. Throughout the world, schistosomiasis is associated with high rates of morbidity and mortality, with close to 800 million people at risk of infection. Precise methods for identification of Schistosoma species and diagnosis of schistosomiasis are crucial for an enhanced understanding of parasite epidemiology that informs effective antiparasitic treatment and preventive measures. Traditional approaches for the diagnosis of schistosomiasis include etiological, immunological and imaging techniques. Diagnosis of schistosomiasis has been revolutionized by the advent of new molecular technologies to amplify parasite nucleic acids. Among these, polymerase chain reaction-based methods have been useful in the analysis of genetic variation among Schistosoma spp. Mass spectrometry is now extending the range of biological molecules that can be detected. In this review, we summarize traditional, non-DNA-based diagnostic methods and then describe and discuss the current and developing molecular techniques for the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp. These exciting techniques provide foundations for further development of more effective and precise approaches to differentiate schistosomes and diagnose schistosomiasis in the clinic, and also have important implications for exploring novel measures to control schistosomiasis in the near future. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. HomER: a review of time-series analysis methods for near-infrared spectroscopy of the brain

    PubMed Central

    Huppert, Theodore J.; Diamond, Solomon G.; Franceschini, Maria A.; Boas, David A.

    2009-01-01

    Near-infrared spectroscopy (NIRS) is a noninvasive neuroimaging tool for studying evoked hemodynamic changes within the brain. By this technique, changes in the optical absorption of light are recorded over time and are used to estimate the functionally evoked changes in cerebral oxyhemoglobin and deoxyhemoglobin concentrations that result from local cerebral vascular and oxygen metabolic effects during brain activity. Over the past three decades this technology has continued to grow, and today NIRS studies have found many niche applications in the fields of psychology, physiology, and cerebral pathology. The growing popularity of this technique is in part associated with a lower cost and increased portability of NIRS equipment when compared with other imaging modalities, such as functional magnetic resonance imaging and positron emission tomography. With this increasing number of applications, new techniques for the processing, analysis, and interpretation of NIRS data are continually being developed. We review some of the time-series and functional analysis techniques that are currently used in NIRS studies, we describe the practical implementation of various signal processing techniques for removing physiological, instrumental, and motion-artifact noise from optical data, and we discuss the unique aspects of NIRS analysis in comparison with other brain imaging modalities. These methods are described within the context of the MATLAB-based graphical user interface program, HomER, which we have developed and distributed to facilitate the processing of optical functional brain data. PMID:19340120
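
    As a small illustration of the signal-processing step mentioned above (removing physiological noise from optical time series), the Python sketch below band-pass filters a synthetic NIRS-like signal; it is a generic scipy example, not HomER's implementation, and the cutoffs and signal content are illustrative.

      # Generic band-pass filtering sketch for a NIRS-like time series.
      import numpy as np
      from scipy.signal import butter, filtfilt

      fs = 10.0                                         # sampling rate (Hz)
      t = np.arange(0, 60, 1 / fs)
      hemodynamic = 0.5 * np.sin(2 * np.pi * 0.05 * t)  # slow evoked response
      cardiac = 0.2 * np.sin(2 * np.pi * 1.2 * t)       # ~72 bpm artifact
      drift = 0.01 * t                                  # instrumental drift
      raw = hemodynamic + cardiac + drift

      b, a = butter(3, [0.01, 0.5], btype="bandpass", fs=fs)
      cleaned = filtfilt(b, a, raw)                     # zero-phase filtering

      # The cleaned trace should track the slow evoked response closely.
      print(np.corrcoef(cleaned[100:-100], hemodynamic[100:-100])[0, 1])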

  12. A Unifying Review of Bioassay-Guided Fractionation, Effect-Directed Analysis and Related Techniques

    PubMed Central

    Weller, Michael G.

    2012-01-01

    The success of modern methods in analytical chemistry sometimes obscures the problem that the ever increasing amount of analytical data does not necessarily give more insight of practical relevance. As alternative approaches, toxicity- and bioactivity-based assays can deliver valuable information about biological effects of complex materials in humans, other species or even ecosystems. However, the observed effects often cannot be clearly assigned to specific chemical compounds. In these cases, the establishment of an unambiguous cause-effect relationship is not possible. Effect-directed analysis tries to interconnect instrumental analytical techniques with a biological/biochemical entity, which identifies or isolates substances of biological relevance. Successful application has been demonstrated in many fields, either as proof-of-principle studies or even for complex samples. This review discusses the different approaches, advantages and limitations and finally shows some practical examples. The broad emergence of effect-directed analytical concepts might lead to a true paradigm shift in analytical chemistry, away from ever growing lists of chemical compounds. The connection of biological effects with the identification and quantification of molecular entities leads to relevant answers to many real life questions. PMID:23012539

  13. Functional data analysis on ground reaction force of military load carriage increment

    NASA Astrophysics Data System (ADS)

    Din, Wan Rozita Wan; Rambely, Azmin Sham

    2014-06-01

    Analysis of the ground reaction force (GRF) in military load carriage is done through the functional data analysis (FDA) statistical technique. The main objective of the research is to investigate the effect of 10% load increments and to find the maximum suitable load for the Malaysian military. Ten military soldiers aged 31 ± 6.2 years, weighing 71.6 ± 10.4 kg and with a height of 166.3 ± 5.9 cm, carrying different military loads ranging from 0% body weight (BW) up to 40% BW, participated in an experiment to gather the GRF and kinematic data using the Vicon Motion Analysis System, Kistler force plates and thirty-nine body markers. The analysis is conducted in the sagittal, medial-lateral and anterior-posterior planes. The results show that a 10% BW load increment has an effect at heel strike and toe-off for all three planes analyzed, with P-values less than 0.001 at the 0.05 significance level. FDA proves to be one of the best statistical techniques for analyzing functional data, with the ability to handle filtering, smoothing and curve alignment according to curve features and points of interest.
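
    The smoothing step at the heart of FDA can be illustrated with a generic smoothing spline; the Python sketch below smooths a synthetic double-peaked vertical GRF curve before any pointwise comparison would be made. The data, noise level and smoothing factor are invented.

      # Generic smoothing-spline sketch of the FDA curve-smoothing step.
      import numpy as np
      from scipy.interpolate import UnivariateSpline

      t = np.linspace(0, 1, 101)                    # normalized stance phase
      grf = (800 * np.exp(-((t - 0.25) ** 2) / 0.01)
             + 900 * np.exp(-((t - 0.75) ** 2) / 0.01))  # vertical GRF (N)
      noisy = grf + np.random.default_rng(6).normal(0, 30, t.size)

      smooth = UnivariateSpline(t, noisy, s=t.size * 30 ** 2)  # smoothing spline
      curve = smooth(t)
      print(f"smoothed peak GRF ~ {curve.max():.0f} N at t = {t[curve.argmax()]:.2f}")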

  14. Predictive analysis effectiveness in determining the epidemic disease infected area

    NASA Astrophysics Data System (ADS)

    Ibrahim, Najihah; Akhir, Nur Shazwani Md.; Hassan, Fadratul Hafinaz

    2017-10-01

    Epidemic disease outbreaks have led communities to pay close attention to methods for controlling, preventing and handling infectious disease, in order to reduce the percentage of disease dissemination and the extent of the infected area. The backpropagation method was used for countermeasure and prediction analysis of epidemic disease. Predictive analysis based on the backpropagation method can be carried out via a machine learning process that applies artificial intelligence to pattern recognition, statistics and feature selection. This computational learning process is integrated with data mining by measuring the score output as the classifier for a given set of input features through a classification technique. The classification step selects, as features, the disease dissemination factors that are likely to be strongly interconnected in causing infectious disease outbreaks. This preliminary study introduces the predictive analysis of epidemic disease for determining the infected area, using the backpropagation method and drawing on others' findings. The study classifies epidemic disease dissemination factors as features for weight adjustment in the prediction of epidemic disease outbreaks. Through this preliminary study, predictive analysis is shown to be an effective method for determining the epidemic disease infected area by minimizing the error value through feature classification.
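
    A hedged sketch of the idea described above: a backpropagation-trained network classifying whether an area becomes infected from dissemination factors. The features, labeling rule and all numbers below are invented placeholders, not the study's data.

      # Backpropagation-trained classifier on synthetic dissemination features.
      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(7)
      n = 500
      # Illustrative features: population density, mobility index, sanitation score.
      X = rng.random((n, 3))
      y = ((0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.4 * X[:, 2]) > 0.25).astype(int)

      model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      model.fit(X, y)                    # weights adjusted via backpropagation
      print(f"training accuracy: {model.score(X, y):.2f}")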

  15. Restoration of recto-verso colour documents using correlated component analysis

    NASA Astrophysics Data System (ADS)

    Tonazzini, Anna; Bedini, Luigi

    2013-12-01

    In this article, we consider the problem of removing see-through interferences from pairs of recto-verso documents acquired either in grayscale or RGB modality. The see-through effect is a typical degradation of historical and archival documents or manuscripts, and is caused by transparency or seeping of ink from the reverse side of the page. We formulate the problem as one of separating two individual texts, overlapped in the recto and verso maps of the colour channels through a linear convolutional mixing operator, where the mixing coefficients are unknown, while the blur kernels are assumed known a priori or estimated off-line. We exploit statistical techniques of blind source separation to estimate both the unknown model parameters and the ideal, uncorrupted images of the two document sides. We show that recently proposed correlated component analysis techniques overcome the already satisfactory performance of independent component analysis techniques and colour decorrelation, when the two texts are even sensibly correlated.
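
    As a baseline illustration of the separation formulation above (not the paper's method), the Python sketch below treats the recto and verso scans as linear mixtures of two text layers and unmixes them with FastICA; the paper's correlated component analysis goes beyond this independence assumption, and the images here are synthetic flattened stand-ins.

      # Baseline blind source separation of two "text layers" with FastICA.
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(8)
      recto_text = (rng.random(10000) > 0.9).astype(float)  # sparse "ink" layer
      verso_text = (rng.random(10000) > 0.9).astype(float)

      A = np.array([[1.0, 0.4],    # each side shows its own text plus an
                    [0.35, 1.0]])  # attenuated show-through of the other
      observed = A @ np.vstack([recto_text, verso_text])

      sources = FastICA(n_components=2, random_state=0).fit_transform(observed.T)
      print(sources.shape)  # (10000, 2): estimated layers, up to sign and scale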

  16. The effect of uncertainties in distance-based ranking methods for multi-criteria decision making

    NASA Astrophysics Data System (ADS)

    Jaini, Nor I.; Utyuzhnikov, Sergei V.

    2017-08-01

    Data in multi-criteria decision making are often imprecise and changeable; therefore, it is important to carry out a sensitivity analysis for the multi-criteria decision making problem. This paper presents a sensitivity analysis for several ranking techniques based on distance measures in multi-criteria decision making. Two types of uncertainty are considered for the sensitivity analysis. The first relates to the input data, while the second concerns the Decision Maker's preferences (weights). The ranking techniques considered in this study are TOPSIS, the relative distance method and the trade-off ranking method. TOPSIS and the relative distance method measure a distance from an alternative to the ideal and anti-ideal solutions. In turn, trade-off ranking calculates the distance of an alternative to the extreme solutions and to the other alternatives. Several test cases are considered to study the performance of each ranking technique under both types of uncertainty.
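
    For concreteness, the Python sketch below implements the TOPSIS steps named above: normalize the decision matrix, weight it, and rank alternatives by closeness to the ideal versus anti-ideal solutions. The matrix, weights and benefit/cost flags are toy values, not the paper's test cases.

      # Compact TOPSIS sketch on a toy decision matrix.
      import numpy as np

      X = np.array([[250.0, 16.0, 12.0],     # alternatives x criteria
                    [200.0, 20.0, 8.0],
                    [300.0, 11.0, 16.0]])
      weights = np.array([0.4, 0.35, 0.25])
      benefit = np.array([False, True, True])  # criterion 0 is a cost

      V = (X / np.linalg.norm(X, axis=0)) * weights   # normalized, weighted
      ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
      anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

      d_pos = np.linalg.norm(V - ideal, axis=1)       # distance to ideal
      d_neg = np.linalg.norm(V - anti, axis=1)        # distance to anti-ideal
      closeness = d_neg / (d_pos + d_neg)
      print(np.argsort(-closeness))                   # best alternative first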

  17. Adaptive noise cancelling and time-frequency techniques for rail surface defect detection

    NASA Astrophysics Data System (ADS)

    Liang, B.; Iwnicki, S.; Ball, A.; Young, A. E.

    2015-03-01

    Adaptive noise cancelling (ANC) is a technique that is very effective at removing additive noise from contaminated signals. It has been widely used in the fields of telecommunication, radar and sonar signal processing. However, it was seldom used for the surveillance and diagnosis of mechanical systems before the late 1990s. As a promising technique it has gradually been exploited for the purposes of condition monitoring and fault diagnosis. Time-frequency analysis is another useful tool for condition monitoring and fault diagnosis, since it retains both time and frequency information simultaneously. This paper presents an ANC and time-frequency application for railway wheel flat and rail surface defect detection. The experimental results from a scaled roller test rig show that this approach can significantly reduce unwanted interference and extract weak signals from strong background noise. The combination of ANC and time-frequency analysis may thus provide a useful tool for condition monitoring and fault diagnosis of railway vehicles.
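
    A minimal LMS adaptive noise canceller captures the core of the ANC technique described above: a reference noise channel is filtered adaptively and subtracted from the primary (signal + noise) channel. The Python sketch below uses toy signals and step size, not the rig's data.

      # Minimal LMS adaptive noise canceller on synthetic signals.
      import numpy as np

      rng = np.random.default_rng(9)
      n = 5000
      t = np.arange(n)
      signal = np.sin(2 * np.pi * 0.01 * t)            # wanted component
      noise_ref = rng.standard_normal(n)               # reference noise pickup
      primary = signal + np.convolve(noise_ref, [0.8, -0.3], mode="same")

      taps, mu = 4, 0.01
      w = np.zeros(taps)
      cleaned = np.zeros(n)
      for i in range(taps, n):
          x = noise_ref[i - taps + 1:i + 1][::-1]      # current + past samples
          y = w @ x                                    # adaptive noise estimate
          e = primary[i] - y                           # error = cleaned output
          w += 2 * mu * e * x                          # LMS weight update
          cleaned[i] = e

      print(f"residual noise power: {np.mean((cleaned[1000:] - signal[1000:]) ** 2):.4f}")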

  18. Evaluation of electrochemical, UV/VIS and Raman spectroelectrochemical detection of Naratriptan with screen-printed electrodes.

    PubMed

    Hernández, Carla Navarro; Martín-Yerga, Daniel; González-García, María Begoña; Hernández-Santos, David; Fanjul-Bolado, Pablo

    2018-02-01

    Naratriptan, an active pharmaceutical ingredient with antimigraine activity, was electrochemically detected on untreated screen-printed carbon electrodes (SPCEs). Cyclic voltammetry and differential pulse voltammetry were used to carry out quantitative analysis of this molecule (in a Britton-Robinson buffer solution at pH 3.0) through its irreversible, diffusion-controlled oxidation at a potential of +0.75 V (vs. the Ag pseudoreference electrode). The Naratriptan oxidation product is an indole-based dimer with a yellowish colour (maximum absorption at 320 nm), so the UV/VIS spectroelectrochemistry technique was used for the first time as an in situ characterization and quantification technique for this molecule. A reflection configuration approach allowed its measurement on the untreated carbon-based electrode. Finally, time-resolved Raman spectroelectrochemistry is used as a powerful technique to carry out qualitative and quantitative analysis of Naratriptan. Electrochemically treated silver screen-printed electrodes are shown to be easy-to-use and cost-effective SERS substrates for the analysis of Naratriptan. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. A reference web architecture and patterns for real-time visual analytics on large streaming data

    NASA Astrophysics Data System (ADS)

    Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer

    2013-12-01

    Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.

  20. Speckle noise reduction technique for Lidar echo signal based on self-adaptive pulse-matching independent component analysis

    NASA Astrophysics Data System (ADS)

    Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi

    2018-04-01

    Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Currently, effective speckle de-noising techniques are extremely scarce and should be further developed. In this study, a speckle noise reduction technique is proposed based on independent component analysis (ICA). Since normally few changes happen in the shape of the laser pulse itself, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. To achieve self-adaptability of the algorithm, a local Mean Square Error (MSE) has been defined as an appropriate criterion for assessing the iteration results. The experimental results demonstrated that the self-adaptive pulse-matching ICA (PM-ICA) method could effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves 4 dB more improvement in signal-to-noise ratio (SNR) than a traditional homomorphic wavelet method.

  1. Correlative visualization techniques for multidimensional data

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Goettsche, Craig

    1989-01-01

    Critical to the understanding of data is the ability to provide pictorial or visual representations of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, computer science domains outside of computer graphics, such as data management, are required to make visualization effective. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformation, etc., to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, which can range from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses illustrate the importance of an effective pipeline in a data visualization system.

  2. An Analysis Of The Benefits And Application Of Earned Value Management (EVM) Project Management Techniques For Dod Programs That Do Not Meet Dod Policy Thresholds

    DTIC Science & Technology

    2017-12-01

    carefully to ensure only the minimum information needed for effective management control is requested.  Requires cost-benefit analysis and PM...baseline offers metrics that highlight performance trends and program variances. This information provides Program Managers and higher levels of...The existing training philosophy is effective only if the managers using the information have well-trained and experienced personnel that can

  3. The effect of moisture on the dynamic thermomechanical properties of a graphite/epoxy composite

    NASA Technical Reports Server (NTRS)

    Sykes, G. F.; Burks, H. D.; Nelson, J. B.

    1977-01-01

    A study has been made of the effect of moisture absorption on the dynamic thermomechanical properties of a graphite/epoxy composite recently considered for building primary aircraft structures. Torsional braid analysis (TBA) and thermomechanical analysis (TMA) techniques were used to measure changes in the glass transition temperature (Tg) and the initial softening temperature (heat distortion temperature, HDT) of T-300/5209 graphite/epoxy composites exposed to room temperature water soak.

  4. Test report for single event effects of the 80386DX microprocessor

    NASA Technical Reports Server (NTRS)

    Watson, R. Kevin; Schwartz, Harvey R.; Nichols, Donald K.

    1993-01-01

    The Jet Propulsion Laboratory Section 514 Single Event Effects (SEE) Testing and Analysis Group has performed a series of SEE tests of certain strategic registers of Intel's 80386DX CHMOS 4 microprocessor. Following a summary of the test techniques and hardware used to gather the data, we present the SEE heavy ion and proton test results. We also describe the registers tested, along with a system impact analysis should these registers experience a single event upset.

  5. The effect of two mobilization techniques on dorsiflexion in people with chronic ankle instability.

    PubMed

    Marrón-Gómez, David; Rodríguez-Fernández, Ángel L; Martín-Urrialde, José A

    2015-02-01

    To compare the effect of two manual therapy techniques, mobilization with movement (WB-MWM) and talocrural manipulation (HVLA), on the improvement of ankle dorsiflexion in people with chronic ankle instability (CAI) over 48 h. Randomized controlled clinical trial. University research laboratory. Fifty-two participants (mean ± SD age, 20.7 ± 3.4 years) with CAI were randomized to a WB-MWM (n = 18), HVLA (n = 19) or placebo group (n = 15). Weight-bearing ankle dorsiflexion was measured with the weight-bearing lunge. Measurements were obtained prior to intervention, immediately after intervention, and 10 min, 24 h and 48 h post-intervention. There was a significant main effect of time (F(4,192) = 20.65; P < 0.001) and a significant time × group interaction (F(8,192) = 6.34; P < 0.001). Post hoc analysis showed a significant increase of ankle dorsiflexion in both the WB-MWM and HVLA groups with respect to the placebo group, with no differences between the two active treatment groups. A single application of the WB-MWM or HVLA manual technique improves ankle dorsiflexion in people with CAI, and the effects persist for at least two days. Both techniques have similar effectiveness for improving ankle dorsiflexion, although WB-MWM demonstrated greater effect sizes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Classroom Thought, Teacher Questions, and Student Analysis

    ERIC Educational Resources Information Center

    Wilen, William W.; Hogg, James

    1976-01-01

    Discussed is the need for teachers to improve their effectiveness in classroom skills such as questioning techniques. An instructor cognitive operation index is presented. For journal availability, see SO 505 192. (Author/DB)

  7. Effect of chromium and phosphorus on the physical properties of iron and titanium-based amorphous metallic alloy films

    NASA Technical Reports Server (NTRS)

    Distefano, S.; Rameshan, R.; Fitzgerald, D. J.

    1991-01-01

    Amorphous iron and titanium-based alloys containing various amounts of chromium, phosphorus, and boron exhibit high corrosion resistance. Some physical properties of Fe and Ti-based metallic alloy films deposited on a glass substrate by a dc-magnetron sputtering technique are reported. The films were characterized using differential scanning calorimetry, stress analysis, SEM, XRD, SIMS, electron microprobe, and potentiodynamic polarization techniques.

  8. Intercomparison of Lab-Based Soil Water Extraction Methods for Stable Water Isotope Analysis

    NASA Astrophysics Data System (ADS)

    Pratt, D.; Orlowski, N.; McDonnell, J.

    2016-12-01

    The effect of the pore water extraction technique on the resultant isotopic signature is poorly understood. Here we present results of an intercomparison of five common lab-based soil water extraction techniques: high pressure mechanical squeezing, centrifugation, direct vapor equilibration, microwave extraction, and cryogenic extraction. We applied the five extraction methods to two physicochemically different standard soil types (silty sand and clayey loam) that were oven-dried and rewetted with water of known isotopic composition at three different gravimetric water contents (8, 20, and 30%). We tested the null hypothesis that all extraction techniques would provide the same isotopic result independent of soil type and water content. Our results showed that the extraction technique had a significant effect on the soil water isotopic composition. Each method exhibited deviations from the spiked reference water, with soil type and water content showing a secondary effect. Cryogenic extraction showed the largest deviations from the reference water, whereas mechanical squeezing and centrifugation provided the closest match to the reference water for both soil types. We also compared results for each extraction technique that produced liquid water on both an OA-ICOS and an IRMS; differences between them were negligible.
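
    The null-hypothesis test described above can be illustrated with a one-way ANOVA across methods; the sketch below uses invented per-method deviations from the reference water, not the study's data.

    ```python
    # Hypothetical test of the null hypothesis that all extraction methods
    # yield the same isotopic composition: one-way ANOVA on made-up
    # per-method deviations from the spiked reference water (permil).
    from scipy import stats

    squeezing      = [-0.2,  0.1, -0.1,  0.0, -0.3]
    centrifugation = [-0.1,  0.2,  0.0, -0.2,  0.1]
    vapor_equil    = [ 0.8,  1.1,  0.9,  1.3,  0.7]
    microwave      = [ 0.5,  0.9,  0.6,  1.0,  0.4]
    cryogenic      = [ 1.9,  2.4,  2.1,  2.6,  1.8]

    f_stat, p_value = stats.f_oneway(squeezing, centrifugation,
                                     vapor_equil, microwave, cryogenic)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p rejects the null
    ```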

  9. Non-Intrusive Measurement Techniques Applied to the Hybrid Solid Fuel Degradation

    NASA Astrophysics Data System (ADS)

    Cauty, F.

    2004-10-01

    The knowledge of the solid fuel regression rate and the time evolution of the grain geometry is required for hybrid motor design and control of its operating conditions. Two non-intrusive techniques (NDT) have been applied to hybrid propulsion: both are based on wave propagation through the materials, using X-rays and ultrasound. X-ray techniques allow local thickness measurements (attenuated signal level) using small probes or 2D images (Real Time Radiography), with a link between the size of the field of view and the accuracy. Besides the safety hazards associated with high-intensity X-ray systems, the image analysis requires the use of quite complex post-processing techniques. The ultrasound technique is more widely used in energetic material applications, including hybrid fuels. Depending upon the transducer size and the associated equipment, the application domain is large, from tiny samples to the quad-port wagon wheel grain of the 1.1 MN thrust HPDP motor. The effect of the physical quantities has to be taken into account in the wave propagation analysis. With respect to the various applications, there is no unique and perfect experimental method to measure the fuel regression rate. The best solution could be obtained by combining two techniques at the same time, each technique enhancing the quality of the global data.

  10. Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.

    PubMed

    Card, Alan J; Ward, James R; Clarkson, P John

    2014-01-01

    After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before-and-after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. Outcomes measured were the quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of the risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.

  11. Cerebrovascular pattern improved by ozone autohemotherapy: an entropy-based study on multiple sclerosis patients.

    PubMed

    Molinari, Filippo; Rimini, Daniele; Liboni, William; Acharya, U Rajendra; Franzini, Marianno; Pandolfi, Sergio; Ricevuti, Giovanni; Vaiano, Francesco; Valdenassi, Luigi; Simonetti, Vincenzo

    2017-08-01

    Ozone major autohemotherapy is effective in reducing the symptoms of multiple sclerosis (MS) patients, but its effects on the brain are still not clear. In this work, we monitored the changes in the cerebrovascular pattern of MS patients and normal subjects during major ozone autohemotherapy by using near-infrared spectroscopy (NIRS) as a functional and vascular technique. NIRS signals are analyzed using a combination of time-domain, time-frequency, and nonlinear analysis of the intrinsic mode function signals obtained from the empirical mode decomposition technique. Our results show an improvement in the cerebrovascular pattern of all subjects, indicated by an increase in the entropy of the NIRS signals. Hence, we can conclude that the ozone therapy increases brain metabolism and helps recovery from the lower activity levels that are predominant in MS patients.
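
    As one hedged illustration of an entropy measure of signal irregularity, the sketch below implements generic sample entropy on a synthetic signal; the authors' analysis operates on intrinsic mode functions from empirical mode decomposition, which is not reproduced here.

    ```python
    # Generic sample entropy SampEn(m, r) on a synthetic signal; higher values
    # indicate a more irregular (higher-entropy) signal.
    import numpy as np

    def sample_entropy(x, m=2, r=None):
        x = np.asarray(x, dtype=float)
        if r is None:
            r = 0.2 * x.std()                     # common tolerance choice
        def match_count(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            # pairwise Chebyshev distances between templates
            d = np.abs(templates[:, None, :] - templates[None, :, :]).max(axis=2)
            return ((d <= r).sum() - len(templates)) / 2.0  # drop self-matches
        b, a = match_count(m), match_count(m + 1)
        return -np.log(a / b)

    rng = np.random.default_rng(1)
    signal = np.sin(np.linspace(0, 20, 500)) + 0.3 * rng.standard_normal(500)
    print(f"SampEn = {sample_entropy(signal):.3f}")
    ```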

  12. Diagnostics of wear in aeronautical systems

    NASA Technical Reports Server (NTRS)

    Wedeven, L. D.

    1979-01-01

    The use of appropriate diagnostic tools for aircraft oil wetted components is reviewed, noting that it can reduce direct operating costs through reduced unscheduled maintenance, particularly in helicopter engine and transmission systems where bearing failures are a significant cost factor. Engine and transmission wear modes are described, and diagnostic methods for oil and wet particle analysis, the spectrometric oil analysis program, chip detectors, ferrography, in-line oil monitor and radioactive isotope tagging are discussed, noting that they are effective over a limited range of particle sizes but complement each other if used in parallel. Fine filtration can potentially increase time between overhauls, but reduces the effectiveness of conventional oil monitoring techniques so that alternative diagnostic techniques must be used. It is concluded that the development of a diagnostic system should be parallel and integral with the development of a mechanical system.

  13. Sensitivity analysis for oblique incidence reflectometry using Monte Carlo simulations.

    PubMed

    Kamran, Faisal; Andersen, Peter E

    2015-08-10

    Oblique incidence reflectometry has developed into an effective, noncontact, and noninvasive measurement technology for the quantification of both the reduced scattering and absorption coefficients of a sample. The optical properties are deduced by analyzing only the shape of the reflectance profiles. This article presents a sensitivity analysis of the technique in turbid media. Monte Carlo simulations are used to investigate the technique and its potential to distinguish the small changes between different levels of scattering. We present various regions of the dynamic range of optical properties in which the system demands vary for detecting subtle changes in the structure of the medium, translated into measured optical properties. Effects of variation in anisotropy are discussed and results presented. Finally, experimental data from milk products with different fat content are considered as examples for comparison.
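
    A minimal Monte Carlo flavor of the approach can be sketched as follows, assuming a semi-infinite medium, normal incidence, and isotropic scattering; the paper's simulations handle oblique incidence and anisotropy, so this is only a simplified stand-in.

    ```python
    # Simplified Monte Carlo estimate of diffuse reflectance from a semi-infinite
    # turbid medium (normal incidence, isotropic scattering). Illustrative only.
    import numpy as np

    def diffuse_reflectance(mu_s, mu_a, n_photons=5000, seed=0):
        rng = np.random.default_rng(seed)
        mu_t, albedo = mu_s + mu_a, mu_s / (mu_s + mu_a)
        reflected = 0.0
        for _ in range(n_photons):
            z, uz, w = 0.0, 1.0, 1.0          # depth, direction cosine, weight
            while w > 1e-3:                   # terminate low-weight photons
                z += uz * rng.exponential(1.0 / mu_t)  # propagate one free path
                if z <= 0.0:                  # escaped back through the surface
                    reflected += w
                    break
                w *= albedo                   # partial absorption at each event
                uz = rng.uniform(-1.0, 1.0)   # isotropic scattering direction
        return reflected / n_photons

    for mu_s in (5.0, 10.0, 20.0):            # cm^-1, fixed mu_a = 0.1 cm^-1
        print(mu_s, round(diffuse_reflectance(mu_s, 0.1), 3))
    ```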

  14. Fertility preservation for social indications: a cost-based decision analysis.

    PubMed

    Hirshfeld-Cytron, Jennifer; Grobman, William A; Milad, Magdy P

    2012-03-01

    Age-related infertility remains a problem that assisted reproductive techniques (ART) have limited ability to overcome. Correspondingly, because an increasing number of women are choosing to delay childbearing, fertility preservation strategies, initially intended for patients undergoing gonadotoxic therapies, are being applied to this group of healthy women. Studies supporting the effectiveness of this practice are lacking. Decision analytic techniques. We compared the cost-effectiveness of three strategies for women planning delayed childbearing until age 40: oocyte cryopreservation at age 25, ovarian tissue cryopreservation (OTC) at age 25, and no assisted reproduction until spontaneous conception had been attempted. Not applicable. Not applicable. Cost-effectiveness, which was defined as the cost per live birth. In this analysis, the strategy of foregoing fertility preservation at age 25 and then choosing ART only after not spontaneously conceiving at age 40 was the most cost-effective option. OTC was dominated by the other strategies. Sensitivity analyses demonstrated the robustness of the model; no analysis existed in which OTC was not dominated by oocyte cryopreservation. Increasing the cost of an IVF cycle beyond $22,000 was the only situation in which oocyte cryopreservation was the most preferred strategy. Neither oocyte cryopreservation nor OTC appear to be cost-effective under current circumstances for otherwise healthy women planning delayed childbearing. This analysis should give pause to the current practice of offering fertility preservation based only on the desire for delayed childbearing. Copyright © 2012 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.
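
    The decision-analytic comparison boils down to an expected cost per live birth for each strategy; the sketch below shows the arithmetic with invented placeholder probabilities and costs, not the study's inputs.

    ```python
    # Expected cost per live birth for three strategies. All probabilities and
    # costs are invented placeholders, not the study's model inputs.
    strategies = {
        # name: (upfront cost $, probability of live birth, downstream cost $)
        "no_preservation_then_ART": (0,     0.40, 24000),
        "oocyte_cryopreservation":  (12000, 0.55, 10000),
        "ovarian_tissue_cryo":      (15000, 0.30, 12000),
    }

    for name, (upfront, p_birth, downstream) in strategies.items():
        cost_per_birth = (upfront + downstream) / p_birth
        print(f"{name}: ${cost_per_birth:,.0f} per live birth")
    ```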

  15. Pneumatic jigging: Influence of operating parameters on separation efficiency of solid waste materials.

    PubMed

    Abd Aziz, Mohd Aizudin; Md Isa, Khairuddin; Ab Rashid, Radzuwan

    2017-06-01

    This article aims to provide insights into the factors that contribute to the separation efficiency of solid particles. In this study, a pneumatic jigging technique was used to assess the separation of solid waste materials that consisted of copper, glass and rubber insulator. Several initial experiments were carried out to evaluate the strengths and limitations of the technique. It is found that despite some limitations of the technique, all the samples prepared for the experiments were successfully separated. The follow-up experiments were then carried out to further assess the separation of copper wire and rubber insulator. The effects of air flow and pulse rates on the separation process were examined. The data for these follow-up experiments were analysed using a sink float analysis technique. The analysis shows that the air flow rate was very important in determining the separation efficiency. However, the separation efficiency may be influenced by the type of materials used.

  16. Women cannot discriminate between different paracervical block techniques applied to opposite sides of the cervix.

    PubMed

    Grossman, R A

    1995-09-01

    The purpose of this study was to determine whether women can discriminate better from less effective paracervical block techniques applied to opposite sides of the cervix. If this discrimination could be made, it would be possible to compare different techniques and thus improve the quality of paracervical anesthesia. Two milliliters of local anesthetic was applied to one side and 6 ml to the other side of volunteers' cervices before cervical dilation. Statistical examination was by sequential analysis. The study was stopped after 47 subjects had entered, when sequential analysis found that there was no significant difference in women's perception of pain. Nine women reported more pain on the side with more anesthesia and eight reported more pain on the side with less anesthesia. Because the amount of anesthesia did not make a difference, the null hypothesis (that women cannot discriminate between different anesthetic techniques) was accepted. Women are not able to discriminate different doses of local anesthetic when applied to opposite sides of the cervix.

  17. Aerodynamics of a linear oscillating cascade

    NASA Technical Reports Server (NTRS)

    Buffum, Daniel H.; Fleeter, Sanford

    1990-01-01

    The steady and unsteady aerodynamics of a linear oscillating cascade are investigated using experimental and computational methods. Experiments are performed to quantify the torsion mode oscillating cascade aerodynamics of the NASA Lewis Transonic Oscillating Cascade for subsonic inlet flowfields using two methods: simultaneous oscillation of all the cascaded airfoils at various values of interblade phase angle, and the unsteady aerodynamic influence coefficient technique. Analysis of these data and correlation with classical linearized unsteady aerodynamic analysis predictions indicate that the wind tunnel walls enclosing the cascade have, in some cases, a detrimental effect on the cascade unsteady aerodynamics. An Euler code for oscillating cascade aerodynamics is modified to incorporate improved upstream and downstream boundary conditions and also the unsteady aerodynamic influence coefficient technique. The new boundary conditions are shown to improve the unsteady aerodynamic predictions of the code, and the computational unsteady aerodynamic influence coefficient technique is shown to be a viable alternative for calculation of oscillating cascade aerodynamics.

  18. A review on the applications of portable near-infrared spectrometers in the agro-food industry.

    PubMed

    dos Santos, Cláudia A Teixeira; Lopo, Miguel; Páscoa, Ricardo N M J; Lopes, João A

    2013-11-01

    Industry has created the need for a cost-effective and nondestructive quality-control analysis system. This requirement has increased interest in near-infrared (NIR) spectroscopy, leading to the development and marketing of handheld devices that enable new applications that can be implemented in situ. Portable NIR spectrometers are powerful instruments offering several advantages for nondestructive, online, or in situ analysis: small size, low cost, robustness, simplicity of analysis, simple user interface, portability, and ergonomic design. Several studies of on-site NIR applications are presented: characterization of internal and external parameters of fruits and vegetables; conservation state and fat content of meat and fish; distinguishing among and quality evaluation of beverages and dairy products; protein content of cereals; evaluation of grape ripeness in vineyards; and soil analysis. Chemometrics is an essential part of NIR spectroscopy manipulation because wavelength-dependent scattering effects, instrumental noise, ambient effects, and other sources of variability may complicate the spectra. As a consequence, it is difficult to assign specific absorption bands to specific functional groups. To achieve useful and meaningful results, multivariate statistical techniques (essentially involving regression techniques coupled with spectral preprocessing) are therefore required to extract the information hidden in the spectra. This work reviews the evolution of the use of portable near-infrared spectrometers in the agro-food industry.
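
    A typical chemometrics workflow of the kind described, preprocessing followed by a multivariate regression, might look like the sketch below; the synthetic spectra, band position, and noise model are invented for illustration.

    ```python
    # Chemometrics sketch: spectral preprocessing + PLS regression on
    # synthetic NIR-like spectra (band position and noise are invented).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    wavelengths = np.linspace(900, 1700, 200)             # nm
    concentration = rng.uniform(0, 10, 60)                # e.g., % protein
    band = np.exp(-(wavelengths - 1200) ** 2 / 2000.0)
    spectra = (concentration[:, None] * band
               + rng.normal(0, 0.05, (60, 200))           # instrumental noise
               + rng.uniform(0, 1, (60, 1)))              # baseline offsets

    # Preprocessing: first differences remove additive baseline effects.
    X = np.diff(spectra, axis=1)

    pls = PLSRegression(n_components=3).fit(X, concentration)
    print("R^2 =", pls.score(X, concentration))
    ```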

  19. Bridging the gap between high and low acceleration for planetary escape

    NASA Astrophysics Data System (ADS)

    Indrikis, Janis; Preble, Jeffrey C.

    With the exception of the often time-consuming analysis by numerical optimization, no single orbit transfer analysis technique exists that can be applied over a wide range of accelerations. Using the simple planetary escape (parabolic trajectory) mission, some of the more common techniques are considered as the limiting bastions at the high and the extremely low acceleration regimes. The brachistochrone, the minimum time-of-flight path, is proposed as the technique to bridge the gap between the high and low acceleration regions, providing a smooth bridge over the entire acceleration spectrum. A smooth and continuous velocity requirement is established for the planetary escape mission. By using these results, it becomes possible to determine the effect of finite accelerations on mission performance and to target propulsion and power system designs which are consistent with a desired mission objective.
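
    For context, the two limiting cases bracketing the gap can be computed directly; the sketch below uses the classical impulsive-burn and low-thrust-spiral results for escape from a circular orbit, not the paper's brachistochrone analysis.

    ```python
    # Back-of-envelope limits for escape from a circular orbit (not the paper's
    # brachistochrone analysis): impulsive burn vs. low-thrust spiral.
    import math

    mu = 3.986e14           # Earth's gravitational parameter, m^3/s^2
    r = 6_778_000           # 400 km circular orbit radius, m
    v_c = math.sqrt(mu / r) # circular orbit speed

    dv_impulsive = (math.sqrt(2) - 1) * v_c   # high-acceleration limit
    dv_low_thrust = v_c                       # classical low-thrust spiral limit

    print(f"circular speed      : {v_c:7.0f} m/s")
    print(f"impulsive escape dv : {dv_impulsive:7.0f} m/s")
    print(f"low-thrust escape dv: {dv_low_thrust:7.0f} m/s")
    ```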

  20. Direct solar energy conversion for large scale terrestrial use

    NASA Technical Reports Server (NTRS)

    Boeer, K. W.; Meakin, J. D.

    1975-01-01

    Various techniques to increase the open circuit voltage are being explored. It had been previously observed that cells made on CdS deposited from a single source gave a consistently higher Voc. Further tests have now shown that this effect may in fact relate to differences in source and substrate temperatures. The resulting differences in CdS structure and crystallinity are being documented. Deposits of mixed CdS and ZnS are being produced and will be initially made into cells using the conventional barriering technique. Analysis of I-V characteristics at temperatures between 25 and 110 C is being perfected to provide nondestructive analysis of the Cu2S. Changes due to vacuum heat treatments and exposure to oxygen are also being monitored by the same technique. Detailed spectral response measurements are being made.

  1. A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde

    NASA Astrophysics Data System (ADS)

    Cofer, Wesley R.; Edahl, Robert A.

    This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water-soluble trace gases (e.g. CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (~0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography (HPLC) with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.

  2. A new technique for collection, concentration and determination of gaseous tropospheric formaldehyde

    NASA Technical Reports Server (NTRS)

    Cofer, W. R., III; Edahl, R. A., Jr.

    1986-01-01

    This article describes an improved technique for making in situ measurements of gaseous tropospheric formaldehyde (CH2O). The new technique is based on nebulization/reflux principles that have proved very effective in quantitatively scrubbing water soluble trace gases (e.g., CH2O) into aqueous mediums, which are subsequently analyzed. Atmospheric formaldehyde extractions and analyses have been performed with the nebulization/reflux concentrator using an acidified dinitrophenylhydrazine solution that indicate that quantitative analysis of CH2O at global background levels (about 0.1 ppbv) is feasible with 20-min extractions. Analysis of CH2O, once concentrated, is accomplished using high performance liquid chromatography with ultraviolet photometric detection. The CH2O-hydrazone derivative, produced by the reaction of 2,4-dinitrophenylhydrazine in H2SO4 acidified aqueous solution, is detected as CH2O.

  3. Designing optical metamaterial with hyperbolic dispersion based on Al:ZnO/ZnO nano-layered structure using Atomic Layer Deposition technique

    DOE PAGES

    Kelly, Priscilla; Liu, Mingzhao; Kuznetsova, Lyuba

    2016-04-07

    In this study, a nano-layered Al:ZnO/ZnO hyperbolic dispersion metamaterial with a large number of layers was fabricated using the atomic layer deposition (ALD) technique. Experimental dielectric functions for the Al:ZnO/ZnO structures are obtained by an ellipsometry technique in the visible and near-infrared spectral ranges. The theoretical modeling of the Al:ZnO/ZnO dielectric permittivity is done using the effective medium approximation. A method for the analysis of spectroscopic ellipsometry data is demonstrated to extract the optical permittivity of this highly anisotropic nano-layered metamaterial. The results of the ellipsometry analysis show that Al:ZnO/ZnO structures with a 1:9 ALD cycle ratio exhibit a hyperbolic dispersion transition near a wavelength of 1.8 μm.
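
    The effective medium approximation used for such layered structures reduces to two closed-form permittivity components; the sketch below evaluates them with illustrative permittivity values rather than measured Al:ZnO/ZnO data.

    ```python
    # Effective medium approximation for a two-component layered (uniaxial) stack:
    # eps_par = f*eps1 + (1-f)*eps2 ; eps_perp = 1 / (f/eps1 + (1-f)/eps2).
    # Permittivity values below are illustrative, not measured Al:ZnO/ZnO data.

    def ema(eps1, eps2, f):
        eps_par = f * eps1 + (1 - f) * eps2           # in-plane component
        eps_perp = 1.0 / (f / eps1 + (1 - f) / eps2)  # out-of-plane component
        return eps_par, eps_perp

    # ~10% metallic-like filling, echoing a 1:9 cycle ratio; the dispersion is
    # hyperbolic when the two tensor components have opposite signs.
    eps_par, eps_perp = ema(eps1=-40.0 + 2.0j, eps2=3.8 + 0.01j, f=0.1)
    print(eps_par.real, eps_perp.real)                # opposite signs here
    ```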

  4. Advances in contact algorithms and their application to tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Tanner, John A.

    1988-01-01

    Currently used techniques for tire contact analysis are reviewed. Discussion focuses on the different techniques used in modeling frictional forces and the treatment of contact conditions. A status report is presented on a new computational strategy for the modeling and analysis of tires, including the solution of the contact problem. The key elements of the proposed strategy are: (1) use of semianalytic mixed finite elements in which the shell variables are represented by Fourier series in the circumferential direction and piecewise polynomials in the meridional direction; (2) use of perturbed Lagrangian formulation for the determination of the contact area and pressure; and (3) application of multilevel iterative procedures and reduction techniques to generate the response of the tire. Numerical results are presented to demonstrate the effectiveness of a proposed procedure for generating the tire response associated with different Fourier harmonics.

  5. Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Basu, S. N.

    1984-01-01

    Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fraction of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled and the effect of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile was studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer. SIMS was used to obtain the tracer concentration profile. The formation of lacy oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.

  6. iLift: A health behavior change support system for lifting and transfer techniques to prevent lower-back injuries in healthcare.

    PubMed

    Kuipers, Derek A; Wartena, Bard O; Dijkstra, Boudewijn H; Terlouw, Gijs; van T Veer, Job T B; van Dijk, Hylke W; Prins, Jelle T; Pierie, Jean Pierre E N

    2016-12-01

    Lower back problems are a common cause of sick leave among employees in Dutch care homes and hospitals. In the Netherlands over 40% of reported sick leave is due to back problems, mainly caused by carrying out heavy work. The goal of the iLift project was to develop a game for nursing personnel to train them in lifting and transfer techniques. The main focus was not on testing the effectiveness of the game itself, but rather on the design of the game as an autogenous trigger and its place in a behavioral change support system. In this article, the design and development of such a health behavior change support system is addressed, describing cycles of design and evaluation. (a) To define the problem space, use context and user context, focus group interviews were conducted with Occupational Therapists (n=4), Nurses (n=10) and Caregivers (n=12), and a thematic analysis was performed. We interviewed experts (n=5) on the subject of lifting and transferring techniques. (b) A design science research approach resulted in a playable prototype. An expert panel conducted an analysis of video-recorded playing activities. (c) Field experiment: we performed a dynamic analysis in order to investigate the feasibility of the prototype through biometric data from player sessions (n=620) by healthcare professionals (n=37). (a) Occupational Therapists, Nurses and Caregivers did not recognise a lack of knowledge of lifting and transferring techniques. All groups considered their workload, time pressure and a culturally determined habit of placing the patient's well-being above their own as the main reasons for not applying appropriate lifting and transferring techniques. This led to a shift in focus from a serious game teaching lifting and transferring techniques to a health behavior change support system containing a game intended to influence behavior. (b) Building and testing (subcomponents of) the prototype resulted in design choices regarding the player's perspective, auditory and visual feedback, overall playability and perceived immersiveness. This design process also addressed the behavior-shaping capacities of the game and its place within the health behavior change support system. An expert panel on lifting and transferring techniques validated the provoked in-game activities as authentic. (c) Regression analysis showed an increase of the game score and dashboard score as more sessions were played, indicating an in-game training effect. A post-hoc test revealed that from an average of 10 playing sessions onward, the dashboard score and the game score align, which indicates behavioral change toward executing appropriate static lifting and transferring techniques. Data gathered in the final field test show an in-game training effect, causing players to exhibit correct static lifting and transferring techniques, but also revealed the necessity of future development of the social system, especially regarding intervention acceptance. Social system factors showed a strong impact on the game's persuasive capacities and its autogenous intent. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  7. Laparoscopic versus Open Peritoneal Dialysis Catheter Insertion: A Meta-Analysis

    PubMed Central

    Hagen, Sander M.; Lafranca, Jeffrey A.; Steyerberg, Ewout W.; IJzermans, Jan N. M.; Dor, Frank J. M. F.

    2013-01-01

    Background Peritoneal dialysis is an effective treatment for end-stage renal disease. Key to successful peritoneal dialysis is a well-functioning catheter, so the different insertion techniques may be of great importance. The standard operative approach is mostly the open technique; however, laparoscopic insertion is increasingly popular. Catheter malfunction is reported in up to 35% of cases for the open technique and up to 13% for the laparoscopic technique. However, evidence is lacking to conclude definitely that the laparoscopic approach is to be preferred. This review and meta-analysis was carried out to investigate whether one of the techniques is superior to the other. Methods Comprehensive searches were conducted in MEDLINE, Embase and CENTRAL (the Cochrane Library 2012, issue 10). Reference lists were searched manually. The methodology was in accordance with the Cochrane Handbook for interventional systematic reviews, and written based on the PRISMA statement. Results Three randomized controlled trials and eight cohort studies were identified. Nine postoperative outcome measures were meta-analyzed; of these, seven did not differ between operation techniques. Based on the meta-analysis, the proportion of migrating catheters was lower (odds ratio (OR) 0.21, confidence interval (CI) 0.07 to 0.63; P = 0.006), and the one-year catheter survival was higher, in the laparoscopic group (OR 3.93, CI 1.80 to 8.57; P = 0.0006). Conclusions Based on these results there is some evidence in favour of the laparoscopic insertion technique, which has a higher one-year catheter survival and less migration, which would be clinically relevant. PMID:23457554
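
    For reference, a single-study odds ratio with a 95% confidence interval can be computed with the standard log (Woolf) method, as sketched below with illustrative counts rather than data from the included studies.

    ```python
    # Generic odds ratio with 95% CI (Woolf/log method) for one 2x2 table;
    # the counts are illustrative, not data from the included studies.
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """a,b = events/non-events (laparoscopic); c,d = events/non-events (open)."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    print(odds_ratio_ci(4, 96, 18, 82))   # e.g., hypothetical migration counts
    ```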

  8. An analysis of the effect of defect structures on catalytic surfaces by the boundary element technique

    NASA Astrophysics Data System (ADS)

    Peirce, Anthony P.; Rabitz, Herschel

    1988-08-01

    The boundary element (BE) technique is used to analyze the effect of defects on one-dimensional chemically active surfaces. The standard BE algorithm for diffusion is modified to include the effects of bulk desorption by making use of an asymptotic expansion technique to evaluate influences near boundaries and defect sites. An explicit time evolution scheme is proposed to treat the non-linear equations associated with defect sites. The proposed BE algorithm is shown to provide an efficient and convergent algorithm for modelling localized non-linear behavior. Since it exploits the actual Green's function of the linear diffusion-desorption process that takes place on the surface, the BE algorithm is extremely stable. The BE algorithm is applied to a number of interesting physical problems in which non-linear reactions occur at localized defects. The Lotka-Volterra system is considered in which the source, sink and predator-prey interaction terms are distributed at different defect sites in the domain and in which the defects are coupled by diffusion. This example provides a stringent test of the stability of the numerical algorithm. Marginal stability oscillations are analyzed for the Prigogine-Lefever reaction that occurs on a lattice of defects. Dissipative effects are observed for large perturbations to the marginal stability state, and rapid spatial reorganization of uniformly distributed initial perturbations is seen to take place. In another series of examples the effect of defect locations on the balance between desorptive processes on chemically active surfaces is considered. The effect of dynamic pulsing at various time-scales is considered for a one species reactive trapping model. Similar competitive behavior between neighboring defects previously observed for static adsorption levels is shown to persist for dynamic loading of the surface. The analysis of a more complex three species reaction process also provides evidence of competitive behavior between neighboring defect sites. The proposed BE algorithm is shown to provide a useful technique for analyzing the effect of defect sites on chemically active surfaces.
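
    A finite-difference stand-in (explicitly not the paper's boundary-element formulation, which uses the diffusion-desorption Green's function) conveys the flavor of a linear surface with one nonlinear defect site:

    ```python
    # 1-D diffusion-desorption surface with a nonlinear reaction at a single
    # defect site, integrated by explicit finite differences (a stand-in for
    # the paper's boundary-element treatment). Parameters are arbitrary.
    import numpy as np

    nx, dx, dt = 200, 1.0, 0.2
    D, k_des = 1.0, 0.01                 # surface diffusion, bulk desorption
    defect = nx // 2
    u = np.zeros(nx)                     # adsorbate coverage

    for _ in range(5000):
        lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx ** 2  # periodic ends
        u += dt * (D * lap - k_des * u)                 # diffusion + desorption
        u[defect] += dt * (0.5 - 2.0 * u[defect] ** 2)  # nonlinear defect kinetics

    print(f"steady coverage at the defect: {u[defect]:.3f}")
    ```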

  9. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  10. Comparison of different techniques for in microgravity-a simple mathematic estimation of cardiopulmonary resuscitation quality for space environment.

    PubMed

    Braunecker, S; Douglas, B; Hinkelbein, J

    2015-07-01

    Since astronauts are selected carefully, are usually young, and are intensively observed before and during training, relevant medical problems are rare. Nevertheless, there is a certain risk of cardiac arrest in space requiring cardiopulmonary resuscitation (CPR). Up to now, 5 techniques are known for performing CPR in microgravity. The aim of the present study was to analyze the quality of CPR achievable with the different techniques in microgravity. To identify relevant publications on CPR quality in microgravity, a systematic analysis with defined search criteria was performed in the PubMed database (http://www.pubmed.com). For the analysis, the keywords ("reanimation" or "CPR" or "resuscitation") and ("space" or "microgravity" or "weightlessness") and the specific names of the techniques ("Standard-technique" or "Straddling-manoeuvre" or "Reverse-bear-hug-technique" or "Evetts-Russomano-technique" or "Hand-stand-technique") were used. To compare the quality and effectiveness of the different techniques, we used the compression product (CP), a mathematical estimation of cardiac output. Using the predefined keywords for the literature search, 4 different publications were identified (performed during parabolic flight or under simulated conditions on Earth) dealing with CPR efforts in microgravity and giving specific numbers. No study was performed under real space conditions. Regarding compression depth, the handstand (HS) technique as well as the reverse bear hug (RBH) technique met the parameters of the guidelines for CPR in 1G environments best (HS ratio, 0.91 ± 0.07; RBH ratio, 0.82 ± 0.13). Concerning compression rate, 4 of the 5 techniques reached the required compression rate (ratio: HS, 1.08 ± 0.11; Evetts-Russomano [ER], 1.01 ± 0.06; standard side straddle, 1.00 ± 0.03; and straddling maneuver, 1.03 ± 0.12). The RBH method did not meet the required criteria (0.89 ± 0.09). The HS method showed the highest cardiac output (69.3% above the required CP), followed by the ER technique (33.0% above the required CP). Concerning CPR quality, the HS technique seems to be the most effective for treating cardiac arrest. In environmental conditions where this technique cannot be used, the ER technique is a good alternative because its CPR quality is only slightly lower. Copyright © 2015 Elsevier Inc. All rights reserved.
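
    The comparison metric can be sketched if one assumes CP = compression depth × compression rate, which the abstract implies but does not state explicitly; all numbers below are illustrative.

    ```python
    # Compression product sketch; assumes CP = depth x rate (the abstract does
    # not state its exact formula). All numbers are illustrative.
    def compression_product(depth_mm, rate_per_min):
        return depth_mm * rate_per_min

    required = compression_product(50, 100)     # illustrative guideline floor
    measured = compression_product(53, 105)     # hypothetical technique
    print(f"CP = {measured} ({measured / required - 1:+.1%} vs required)")
    ```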

  11. Strengthening three-leaf masonry with basalt fibre: Experimental and numerical data

    NASA Astrophysics Data System (ADS)

    Monni, Francesco; Quagliarini, Enrico; Lenci, Stefano; Maracchini, Gianluca

    2017-07-01

    This paper presents the first results of a study aimed at evaluate the effectiveness of a strengthening technique able to connect masonry elements, stitching them, based on the use of basalt fibre ropes. To assess the effectiveness of proposed technique, experimental and FEM analysis has been performed. The reproduced masonry is the "three-leaf wall", where an inner core of rubble material is included between two outer brick shell, a masonry typology often found in Italian historical building heritage. The results indicate the efficacy of this dry retrofitting system, increasing the performance of masonry wall specimens.

  12. New technique for simulation of optical fiber amplifiers control schemes in dynamic WDM systems

    NASA Astrophysics Data System (ADS)

    Freitas, Marcio; Klein, Jackson; Givigi, Sidney, Jr.; Calmon, Luiz C.

    2005-04-01

    One topic that has attracted attention is the behavior of optical amplifiers under dynamic conditions, specifically because amplifiers working in a saturated condition produce power transients in all-optical reconfigurable WDM networks, e.g., when channels are added or dropped. The goal of this work is to introduce the multiwavelength time-driven simulation technique, which is capable of simulating and analyzing transient effects in all-optical WDM networks with optical amplifiers and allows the use of control schemes to avoid or minimize the impact of transient effects on system performance.
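
    A time-driven simulation of this kind steps channel powers and amplifier gain on a fixed clock; the sketch below uses a crude first-order gain-recovery surrogate with hypothetical parameters, not a full rate-equation amplifier model.

    ```python
    # Time-driven sketch of surviving-channel power transients after a channel
    # drop, using a first-order gain-recovery surrogate (hypothetical values).
    import numpy as np

    dt, T = 1e-6, 4e-3                       # 1 us steps over a 4 ms window
    t = np.arange(0.0, T, dt)
    n_ch = np.where(t < 2e-3, 8, 2)          # 6 of 8 channels dropped at 2 ms
    p_in = n_ch * 0.1e-3                     # 0.1 mW per channel (W)

    gain = np.empty_like(t)
    gain[0], tau, g0, p_sat = 100.0, 0.5e-3, 100.0, 0.5e-3
    for i in range(1, len(t)):
        g_ss = g0 / (1.0 + p_in[i] / p_sat)  # saturated steady-state gain
        gain[i] = gain[i - 1] + dt * (g_ss - gain[i - 1]) / tau

    surviving_power = gain * 0.1e-3          # per-channel output shows the spike
    print(f"gain before/after drop: {gain[int(1.9e-3/dt)]:.1f} / {gain[-1]:.1f}")
    ```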

  13. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    NASA Technical Reports Server (NTRS)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake induced surface effects of liquefaction against a traditional pixel based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damages from high-resolution aerial/satellite imagery.
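
    Object metrics such as convexity and rectangularity have simple geometric definitions; the sketch below computes them with shapely on toy polygons and is not the authors' software.

    ```python
    # Shape metrics of the kind used to distinguish liquefaction features from
    # buildings; toy polygons, computed with shapely.
    from shapely.geometry import Polygon

    def shape_metrics(poly):
        hull = poly.convex_hull
        mrr = poly.minimum_rotated_rectangle     # minimum rotated bounding box
        return {
            "convexity": poly.area / hull.area,       # 1.0 for convex shapes
            "rectangularity": poly.area / mrr.area,   # 1.0 for rectangles
        }

    building = Polygon([(0, 0), (10, 0), (10, 6), (0, 6)])        # regular
    sand_blow = Polygon([(0, 0), (6, 0), (6, 5), (3, 2), (0, 5)]) # irregular
    print(shape_metrics(building))
    print(shape_metrics(sand_blow))
    ```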

  14. Fully-automated, high-throughput micro-computed tomography analysis of body composition enables therapeutic efficacy monitoring in preclinical models.

    PubMed

    Wyatt, S K; Barck, K H; Kates, L; Zavala-Solorio, J; Ross, J; Kolumam, G; Sonoda, J; Carano, R A D

    2015-11-01

    The ability to non-invasively measure body composition in mouse models of obesity and obesity-related disorders is essential for elucidating mechanisms of metabolic regulation and monitoring the effects of novel treatments. These studies aimed to develop a fully automated, high-throughput micro-computed tomography (micro-CT)-based image analysis technique for longitudinal quantitation of adipose, non-adipose and lean tissue as well as bone, and to demonstrate its utility for assessing the effects of two distinct treatments. An initial validation study was performed in diet-induced obesity (DIO) and control mice on a vivaCT 75 micro-CT system. Subsequently, four groups of DIO mice were imaged pre- and post-treatment with an experimental agonistic anti-fibroblast growth factor receptor 1 antibody (anti-FGFR1, R1MAb1), a control immunoglobulin G antibody, a known anorectic antiobesity drug (rimonabant, SR141716), or a solvent control. The body composition analysis technique was then ported to a faster micro-CT system (CT120) to markedly increase throughput, as well as to evaluate the use of micro-CT image intensity for hepatic lipid content in DIO and control mice. Ex vivo chemical analysis and colorimetric analysis of the liver triglycerides were performed as the standard metrics for correlation with body composition and hepatic lipid status, respectively. Micro-CT-based body composition measures correlate with ex vivo chemical analysis metrics and enable distinction between DIO and control mice. R1MAb1 and rimonabant have differing effects on body composition as assessed by micro-CT. High-throughput body composition imaging is possible using a modified CT120 system. Micro-CT also provides a non-invasive assessment of hepatic lipid content. This work describes, validates, and demonstrates the utility of a fully automated image analysis technique to quantify in vivo micro-CT-derived measures of adipose, non-adipose and lean tissue, as well as bone. These body composition metrics correlate highly with standard ex vivo chemical analysis and enable longitudinal evaluation of body composition and therapeutic efficacy monitoring.
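
    The core of such a body-composition analysis is voxel classification by intensity; the sketch below thresholds a synthetic volume with hypothetical Hounsfield-unit windows, whereas a real system would calibrate these ranges and add artifact correction.

    ```python
    # Voxel-classification sketch for micro-CT body composition. The HU windows
    # are hypothetical; a real pipeline calibrates them and corrects artifacts.
    import numpy as np

    ADIPOSE_HU = (-190, -30)     # assumed adipose window
    LEAN_HU    = (-30, 150)      # assumed lean-tissue window
    BONE_HU    = 150             # assumed bone threshold

    def body_composition(volume_hu, voxel_mm3):
        masks = {
            "adipose": (volume_hu >= ADIPOSE_HU[0]) & (volume_hu < ADIPOSE_HU[1]),
            "lean":    (volume_hu >= LEAN_HU[0]) & (volume_hu < LEAN_HU[1]),
            "bone":    volume_hu >= BONE_HU,
        }
        return {name: float(m.sum()) * voxel_mm3 for name, m in masks.items()}

    volume = np.random.default_rng(3).integers(-300, 500, size=(50, 50, 50))
    print(body_composition(volume, voxel_mm3=0.08 ** 3))   # volumes in mm^3
    ```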

  15. Predicting neuropathic ulceration: analysis of static temperature distributions in thermal images

    NASA Astrophysics Data System (ADS)

    Kaabouch, Naima; Hu, Wen-Chen; Chen, Yi; Anderson, Julie W.; Ames, Forrest; Paulson, Rolf

    2010-11-01

    Foot ulcers affect millions of Americans annually. Conventional methods used to assess skin integrity, including inspection and palpation, may be valuable approaches, but they usually do not detect changes in skin integrity until an ulcer has already developed. We analyze the feasibility of thermal imaging as a technique to assess the integrity of the skin and its many layers. Thermal images are analyzed using an asymmetry analysis, combined with a genetic algorithm, to examine the infrared images for early detection of foot ulcers. Preliminary results show that the proposed technique can reliably and efficiently detect inflammation and hence effectively predict potential ulceration.
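
    The asymmetry analysis can be sketched as a mirrored left-right comparison of registered thermograms; the arrays, threshold, and flagged region below are synthetic stand-ins (a 2.2 °C asymmetry is a commonly cited clinical rule of thumb, and the genetic-algorithm stage is omitted).

    ```python
    # Sketch of the asymmetry step: compare a plantar thermogram against the
    # mirrored contralateral foot; large |dT| regions flag possible inflammation.
    # Arrays are synthetic stand-ins for registered thermal images (deg C).
    import numpy as np

    rng = np.random.default_rng(4)
    left = 30.0 + rng.normal(0, 0.2, (64, 64))
    right = 30.0 + rng.normal(0, 0.2, (64, 64))
    right[20:30, 20:30] += 2.5               # simulated inflamed region

    delta = np.abs(left - np.fliplr(right))  # mirror right foot, then compare
    hotspots = delta > 2.2                   # 2.2 C: common clinical threshold
    print("suspicious pixels:", int(hotspots.sum()))
    ```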

  16. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms on nonconservative loading.

  17. Sounding rocket thermal analysis techniques applied to GAS payloads. [Get Away Special payloads (STS)

    NASA Technical Reports Server (NTRS)

    Wing, L. D.

    1979-01-01

    Simplified analytical techniques from sounding rocket programs are suggested as a means of bringing the cost of thermal analysis of the Get Away Special (GAS) payloads within acceptable bounds. Particular attention is given to two methods adapted from sounding rocket technology: a method in which the container and payload are assumed to be divided in half vertically by a thermal plane of symmetry, and a method which considers the container and its payload to be an analogous one-dimensional unit having the real or correct container top surface area for radiative heat transfer and a fictitious mass and geometry which model the average thermal effects.
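
    The second method lends itself to a few lines of lumped-parameter modeling; the sketch below integrates a single-mass energy balance with radiative exchange through the top surface, using hypothetical parameter values.

    ```python
    # Lumped 1-D thermal sketch: one mass, radiative exchange through the top
    # surface only. All parameter values are hypothetical.
    import numpy as np

    SIGMA = 5.670e-8                    # Stefan-Boltzmann constant, W m^-2 K^-4

    def simulate(T0=290.0, q_in=40.0, area=0.17, emissivity=0.8,
                 mass=50.0, cp=900.0, dt=10.0, steps=30000):
        """Explicit Euler on m*cp*dT/dt = q_in - eps*sigma*A*T^4."""
        T = np.empty(steps)
        T[0] = T0
        for i in range(1, steps):
            net = q_in - emissivity * SIGMA * area * T[i - 1] ** 4
            T[i] = T[i - 1] + dt * net / (mass * cp)
        return T

    T = simulate()
    print(f"temperature after {30000 * 10:.0f} s: {T[-1]:.1f} K")  # nears ~268 K
    ```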

  18. All-digital precision processing of ERTS images

    NASA Technical Reports Server (NTRS)

    Bernstein, R. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Digital techniques have been developed and used to apply precision-grade radiometric and geometric corrections to ERTS MSS and RBV scenes. Geometric accuracies sufficient for mapping at 1:250,000 scale have been demonstrated. Radiometric quality has been superior to ERTS NDPF precision products. A configuration analysis has shown that feasible, cost-effective all-digital systems for correcting ERTS data are easily obtainable. This report contains a summary of all results obtained during this study and includes: (1) radiometric and geometric correction techniques, (2) reseau detection, (3) GCP location, (4) resampling, (5) alternative configuration evaluations, and (6) error analysis.

  19. Stochastic methods for analysis of power flow in electric networks

    NASA Astrophysics Data System (ADS)

    1982-09-01

    The modeling and effects of probabilistic behavior on steady state power system operation were analyzed. A solution to the steady state network flow equations that adheres both to Kirchhoff's laws and to probabilistic laws was obtained, using either combinatorial or functional approximation techniques. The development of sound techniques for producing meaningful data to serve as input is examined. Electric demand modeling, equipment failure analysis, and algorithm development are investigated. Two major development areas are described: a decomposition of stochastic processes which gives stationarity, ergodicity, and even normality; and a powerful surrogate probability approach using proportions of time which allows the calculation of joint events from one-dimensional probability spaces.

  20. Preliminary Analysis of Photoreading

    NASA Technical Reports Server (NTRS)

    McNamara, Danielle S.

    2000-01-01

    The purpose of this project was to provide a preliminary analysis of a reading strategy called PhotoReading. PhotoReading is a technique developed by Paul Scheele that claims to increase reading rate to 25,000 words per minute (Scheele, 1993). PhotoReading itself involves entering a "relaxed state" and looking at, but not reading, each page of a text for a brief moment (about 1 to 2 seconds). While this technique has received attention in the popular press, there had been no objective examinations of the technique's validity. To examine the effectiveness of PhotoReading, the principal investigator (i.e., trainee) participated in a PhotoReading workshop to learn the technique. Parallel versions of two standardized and three experimenter-created reading comprehension tests were administered to the trainee and an expert user of the PhotoReading technique to compare the use of normal reading strategies and the PhotoReading technique by both readers. The results for all measures yielded no benefits of using the PhotoReading technique. The extremely rapid reading rates claimed by PhotoReaders were not observed; indeed, the reading rates were generally comparable to those for normal reading. Moreover, the PhotoReading expert generally showed an increase in reading time when using the PhotoReading technique in comparison to when using normal reading strategies to process text. This increase in reading time when using PhotoReading was accompanied by a decrease in text comprehension.
