Sample records for effects analysis technique

  1. General methodology: Costing, budgeting, and techniques for benefit-cost and cost-effectiveness analysis

    NASA Technical Reports Server (NTRS)

    Stretchberry, D. M.; Hein, G. F.

    1972-01-01

    The general concepts of costing, budgeting, and benefit-cost and cost-effectiveness analysis are discussed. The three common methods of costing are presented. Budgeting distributions are discussed. The use of discounting procedures is outlined. Benefit-cost ratio and cost-effectiveness analysis are defined, and their current application to NASA planning is pointed out. Specific practices and techniques are discussed, and actual costing and budgeting procedures are outlined. The recommended method of calculating benefit-cost ratios is described. A standardized method of cost-effectiveness analysis and long-range planning are also discussed.

  2. Identifying configurations of behavior change techniques in effective medication adherence interventions: a qualitative comparative analysis.

    PubMed

    Kahwati, Leila; Viswanathan, Meera; Golin, Carol E; Kane, Heather; Lewis, Megan; Jacobs, Sara

    2016-05-04

    Interventions to improve medication adherence are diverse and complex. Consequently, synthesizing this evidence is challenging. We aimed to extend the results from an existing systematic review of interventions to improve medication adherence by using qualitative comparative analysis (QCA) to identify necessary or sufficient configurations of behavior change techniques among effective interventions. We used data from 60 studies in a completed systematic review to examine the combinations of nine behavior change techniques (increasing knowledge, increasing awareness, changing attitude, increasing self-efficacy, increasing intention formation, increasing action control, facilitation, increasing maintenance support, and motivational interviewing) among studies demonstrating improvements in adherence. Among the 60 studies, 34 demonstrated improved medication adherence. Among effective studies, increasing patient knowledge was a necessary but not sufficient technique. We identified seven configurations of behavior change techniques sufficient for improving adherence, which together accounted for 26 (76%) of the effective studies. The intervention configuration that included increasing knowledge and self-efficacy was the most empirically relevant, accounting for 17 studies (50%) and uniquely accounting for 15 (44%). This analysis extends the completed review findings by identifying multiple combinations of behavior change techniques that improve adherence. Our findings offer direction for policy makers, practitioners, and future comparative effectiveness research on improving adherence.

  3. Data analysis techniques

    NASA Technical Reports Server (NTRS)

    Park, Steve

    1990-01-01

    A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
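
    As a minimal illustration of one technique from this list, principal component analysis of a multiband image patch can be carried out as an eigendecomposition of the band-to-band covariance matrix. The 6-pixel, 4-band patch below is invented for illustration; real remotely sensed scenes have far more pixels and bands.

```python
import numpy as np

# Hypothetical 6-pixel, 4-band image patch (rows = pixels, cols = spectral bands).
X = np.array([
    [0.12, 0.30, 0.45, 0.52],
    [0.10, 0.28, 0.43, 0.50],
    [0.40, 0.38, 0.20, 0.15],
    [0.42, 0.36, 0.22, 0.14],
    [0.11, 0.29, 0.44, 0.51],
    [0.41, 0.37, 0.21, 0.16],
])

# PCA via eigendecomposition of the band-to-band covariance matrix.
Xc = X - X.mean(axis=0)              # center each band
cov = np.cov(Xc, rowvar=False)       # 4x4 covariance between bands
evals, evecs = np.linalg.eigh(cov)   # eigh returns ascending eigenvalues
order = evals.argsort()[::-1]        # reorder to descending
evals, evecs = evals[order], evecs[:, order]

scores = Xc @ evecs                  # pixel coordinates in component space
explained = evals / evals.sum()      # fraction of variance per component
```

    For a patch like this with two distinct spectral signatures, nearly all the variance lands on the first component, which is why PCA is routinely used to compress multiband data before classification.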

  4. Comparative analysis of techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Hitt, E. F.; Bridgman, M. S.; Robinson, A. C.

    1981-01-01

    Performability analysis is a technique developed for evaluating the effectiveness of fault-tolerant computing systems in multiphase missions. Performability was evaluated for its accuracy, practical usefulness, and relative cost. The evaluation was performed by applying performability and the fault tree method to a set of sample problems ranging from simple to moderately complex. The problems involved as many as five outcomes, two to five mission phases, permanent faults, and some functional dependencies. Transient faults and software errors were not considered. A different analyst was responsible for each technique. Significantly more time and effort were required to learn performability analysis than the fault tree method. Performability is inherently as accurate as fault tree analysis. For the sample problems, fault trees were more practical and less time consuming to apply, while performability required less ingenuity and was more checkable. Performability offers some advantages for evaluating very complex problems.

  5. Combinations of techniques that effectively change health behavior: evidence from Meta-CART analysis.

    PubMed

    Dusseldorp, Elise; van Genugten, Lenneke; van Buuren, Stef; Verheijden, Marieke W; van Empelen, Pepijn

    2014-12-01

    Many health-promoting interventions combine multiple behavior change techniques (BCTs) to maximize effectiveness. Although, in theory, BCTs can amplify each other, the available meta-analyses have not been able to identify specific combinations of techniques that provide synergistic effects. This study overcomes some of the shortcomings in the current methodology by applying classification and regression trees (CART) to meta-analytic data in a special way, referred to as Meta-CART. The aim was to identify particular combinations of BCTs that explain intervention success. A reanalysis of data from Michie, Abraham, Whittington, McAteer, and Gupta (2009) was performed. These data included effect sizes from 122 interventions targeted at physical activity and healthy eating, and the coding of the interventions into 26 BCTs. A CART analysis was performed using the BCTs as predictors and treatment success (i.e., effect size) as outcome. A subgroup meta-analysis using a mixed effects model was performed to compare the treatment effect in the subgroups found by CART. Meta-CART identified the following most effective combinations: Provide information about behavior-health link with Prompt intention formation (mean effect size ḡ = 0.46), and Provide information about behavior-health link with Provide information on consequences and Use of follow-up prompts (ḡ = 0.44). Least effective interventions were those using Provide feedback on performance without using Provide instruction (ḡ = 0.05). Specific combinations of BCTs increase the likelihood of achieving change in health behavior, whereas other combinations decrease this likelihood. Meta-CART successfully identified these combinations and thus provides a viable methodology in the context of meta-analysis.
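
    The core of the Meta-CART idea, a regression tree grown on BCT indicators with effect size as the outcome, can be sketched as follows. This is not the authors' implementation (which operates on meta-analytic data with appropriate weighting); the BCT names, effect sizes, and synergy below are all simulated for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 122  # same count as the reanalysed interventions; all values below are simulated

# Binary presence/absence indicators for three BCTs (columns:
# info_on_behavior_health_link, prompt_intention_formation, feedback_on_performance).
bct = rng.integers(0, 2, size=(n, 3))

# Simulated effect sizes in which the first two techniques act synergistically.
g = 0.10 + 0.35 * (bct[:, 0] & bct[:, 1]) + rng.normal(0.0, 0.05, n)

# Grow a shallow regression tree: BCT indicators as predictors, effect size as outcome.
tree = DecisionTreeRegressor(max_depth=2, min_samples_leaf=10)
tree.fit(bct, g)

pred_both = tree.predict([[1, 1, 0]])[0]  # subgroup with the synergistic pair
pred_none = tree.predict([[0, 0, 0]])[0]  # subgroup without either technique
```

    The tree recovers the combination as a high-effect subgroup; a subgroup meta-analysis would then compare the leaves found this way.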

  6. Towards Effective Clustering Techniques for the Analysis of Electric Power Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hogan, Emilie A.; Cotilla Sanchez, Jose E.; Halappanavar, Mahantesh

    2013-11-30

    Clustering is an important data analysis technique with numerous applications in the analysis of electric power grids. Standard clustering techniques are oblivious to the rich structural and dynamic information available for power grids. Therefore, by exploiting the inherent topological and electrical structure in the power grid data, we propose new methods for clustering with applications to model reduction, locational marginal pricing, phasor measurement unit (PMU or synchrophasor) placement, and power system protection. We focus our attention on model reduction for analysis based on time-series information from synchrophasor measurement devices, and spectral techniques for clustering. By comparing different clustering techniques on two instances of realistic power grids we show that the solutions are related and therefore one could leverage that relationship for a computational advantage. Thus, by contrasting different clustering techniques we make a case for exploiting structure inherent in the data with implications for several domains including power systems.

  7. Evaluating the application of failure mode and effects analysis technique in hospital wards: a systematic review

    PubMed Central

    Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat

    2017-01-01

    Abstract: Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their root causes. Failure Mode and Effects Analysis (FMEA) technique is a prospective risk analysis method. This study is a review of risk analysis using the FMEA technique in different hospital wards and departments. Methods: This paper systematically investigated the available databases. After selecting inclusion and exclusion criteria, the related studies were found. This selection was made in two steps. First, the abstracts and titles were investigated by the researchers; after omitting papers which did not meet the inclusion criteria, 22 papers were finally selected and their full texts were thoroughly examined. At the end, the results were obtained. Results: The examined papers had focused mostly on processes, had been conducted chiefly in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to describe almost all the steps of model implementation; after implementing the strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique's effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The review found that only a small number of studies failed to show effects of the FMEA technique. In general, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality. PMID:28039688
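
    The RPN calculation at the heart of the FMEA technique multiplies severity, occurrence, and detection scores for each failure mode, then ranks modes by the product. A minimal sketch, with failure modes and 1-10 scores invented purely for illustration:

```python
# Risk Priority Number for each failure mode: RPN = severity x occurrence x detection,
# each conventionally scored 1-10. Modes and scores below are invented for illustration.
failure_modes = {
    "wrong drug dispensed":     (9, 3, 4),
    "mislabelled sample":       (7, 4, 3),
    "delayed radiology report": (5, 5, 2),
}

rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}

# Rank failure modes so the highest-risk ones are addressed first.
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
```

    After interventions, the scores are re-estimated and the RPN recomputed; the drop in RPN is the effect measure most of the reviewed papers report.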

  8. Language Sample Analysis and Elicitation Technique Effects in Bilingual Children With and Without Language Impairment.

    PubMed

    Kapantzoglou, Maria; Fergadiotis, Gerasimos; Restrepo, M Adelaida

    2017-10-17

    This study examined whether the language sample elicitation technique (i.e., storytelling and story-retelling tasks with pictorial support) affects lexical diversity (D), grammaticality (grammatical errors per communication unit [GE/CU]), sentence length (mean length of utterance in words [MLUw]), and sentence complexity (subordination index [SI]), which are commonly used indices for diagnosing primary language impairment in Spanish-English-speaking children in the United States. Twenty bilingual Spanish-English-speaking children with typical language development and 20 with primary language impairment participated in the study. Four analyses of variance were conducted to evaluate the effect of language elicitation technique and group on D, GE/CU, MLUw, and SI. Also, 2 discriminant analyses were conducted to assess which indices were more effective for story retelling and storytelling and their classification accuracy across elicitation techniques. D, MLUw, and SI were influenced by the type of elicitation technique, but GE/CU was not. The classification accuracy of language sample analysis was greater in story retelling than in storytelling, with GE/CU and D being useful indicators of language abilities in story retelling and GE/CU and SI in storytelling. Two indices in language sample analysis may be sufficient for diagnosis in 4- to 5-year-old bilingual Spanish-English-speaking children.

  9. Effective self-regulation change techniques to promote mental wellbeing among adolescents: a meta-analysis.

    PubMed

    van Genugten, Lenneke; Dusseldorp, Elise; Massey, Emma K; van Empelen, Pepijn

    2017-03-01

    Mental wellbeing is influenced by self-regulation processes. However, little is known about the efficacy of change techniques based on self-regulation to promote mental wellbeing. The aim of this meta-analysis is to identify effective self-regulation techniques (SRTs) in primary and secondary prevention interventions on mental wellbeing in adolescents. Forty interventions were included in the analyses. Techniques were coded into nine categories of SRTs. Meta-analyses were conducted to identify the effectiveness of SRTs, examining three different outcomes: internalising behaviour, externalising behaviour, and self-esteem. Primary interventions had a small-to-medium effect ([Formula: see text] = 0.16-0.29) on self-esteem and internalising behaviour. Secondary interventions had a medium-to-large short-term effect (average [Formula: see text] = 0.56) on internalising behaviour and self-esteem. In secondary interventions, interventions including asking for social support ([Formula: see text], 95% confidence interval [CI] = 1.11-1.98) had a greater effect on internalising behaviour. Interventions including monitoring and evaluation had a greater effect on self-esteem ([Formula: see text], 95% CI = 0.21-0.57). For primary interventions, no single SRT was associated with a greater intervention effect on internalising behaviour or self-esteem. No effects were found for externalising behaviours. Self-regulation interventions are moderately effective at improving mental wellbeing among adolescents. Secondary interventions promoting 'asking for social support' and promoting 'monitoring and evaluation' were associated with improved outcomes. More research is needed to identify other SRTs, or combinations of SRTs, that could improve understanding or optimise mental wellbeing interventions.

  10. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
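
    The fitting of data by linear, quadratic, and exponential models described above can be sketched with ordinary least squares. The monthly counts below are invented, and the exponential fit uses a log-linear transform that assumes strictly positive data.

```python
import numpy as np

# Hypothetical monthly anomaly counts showing a downward trend (values invented).
t = np.arange(12, dtype=float)
y = np.array([30., 28., 27., 25., 24., 22., 21., 19., 18., 16., 15., 13.])

# Linear and quadratic models by ordinary least squares.
lin = np.polyfit(t, y, 1)    # [slope, intercept]
quad = np.polyfit(t, y, 2)   # [a, b, c] for a*t^2 + b*t + c

# Exponential model y = a * exp(b*t) via a log-linear transform (requires y > 0).
b_exp, log_a = np.polyfit(t, np.log(y), 1)
a_exp = np.exp(log_a)
```

    A negative fitted slope here would be read as an improving trend; comparing residuals across the three model forms is the usual basis for choosing one.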

  11. Effective behaviour change techniques in smoking cessation interventions for people with chronic obstructive pulmonary disease: A meta-analysis

    PubMed Central

    Bartlett, Yvonne K; Sheeran, Paschal; Hawley, Mark S

    2014-01-01

    Purpose The purpose of this study was to identify the behaviour change techniques (BCTs) that are associated with greater effectiveness in smoking cessation interventions for people with chronic obstructive pulmonary disease (COPD). Methods A systematic review and meta-analysis was conducted. Web of Knowledge, CINAHL, EMBASE, PsycINFO, and MEDLINE were searched from the earliest date available to December 2012. Data were extracted and weighted average effect sizes calculated; BCTs used were coded according to an existing smoking cessation-specific BCT taxonomy. Results Seventeen randomized controlled trials (RCTs) were identified that involved a total sample of 7446 people with COPD. The sample-weighted mean quit rate for all RCTs was 13.19%, and the overall sample-weighted effect size was d+ = 0.33. Thirty-seven BCTs were each used in at least three interventions. Four techniques were associated with significantly larger effect sizes: Facilitate action planning/develop treatment plan, Prompt self-recording, Advise on methods of weight control, and Advise on/facilitate use of social support. Three new COPD-specific BCTs were identified, and Linking COPD and smoking was found to result in significantly larger effect sizes. Conclusions Smoking cessation interventions aimed at people with COPD appear to benefit from using techniques focussed on forming detailed plans and self-monitoring. Additional RCTs that use standardized reporting of intervention components and BCTs would be valuable to corroborate findings from the present meta-analysis. Statement of contribution What is already known on this subject? Chronic obstructive pulmonary disease (COPD) is responsible for considerable health and economic burden worldwide, and smoking cessation (SC) is the only known treatment that can slow the decline in lung function experienced. Previous reviews of smoking cessation interventions for this population have established that a combination of pharmacological support and

  12. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  13. Analysis techniques for residual acceleration data

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Alexander, J. Iwan D.; Snyder, Robert S.

    1990-01-01

    Various aspects of residual acceleration data are of interest to low-gravity experimenters. Maximum and mean values and various other statistics can be obtained from data as collected in the time domain. Additional information may be obtained through manipulation of the data. Fourier analysis is discussed as a means of obtaining information about the dominant frequency components of a given data window. Transformation of data into different coordinate axes is useful in the analysis of experiments with different orientations and can be achieved by use of a transformation matrix. Application of such analysis techniques to residual acceleration data provides information beyond that contained in a time history and increases the effectiveness of post-flight analysis of low-gravity experiments.
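
    A minimal sketch of the two manipulations described above, Fourier analysis of a data window and transformation into a different coordinate system, is given below. The sample rate, jitter frequency, and rotation are chosen purely for illustration.

```python
import numpy as np

fs = 100.0                       # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Synthetic residual acceleration: a 3 Hz g-jitter component on the x axis plus noise.
rng = np.random.default_rng(1)
ax = 1e-4 * np.sin(2 * np.pi * 3.0 * t) + 1e-6 * rng.standard_normal(t.size)
ay = 1e-6 * rng.standard_normal(t.size)
az = 1e-6 * rng.standard_normal(t.size)
accel = np.vstack([ax, ay, az])  # 3 x N array in spacecraft body axes

# Fourier analysis of the window: dominant frequency of the x component.
spec = np.abs(np.fft.rfft(accel[0]))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[spec[1:].argmax() + 1]   # skip the DC bin

# Transformation into an experiment-fixed frame: a 90-degree rotation about z
# moves the jitter from the body x axis onto the experiment y axis.
c, s = np.cos(np.pi / 2), np.sin(np.pi / 2)
R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
accel_exp = R @ accel
```

    The same transformation matrix, built from the experiment's mounting orientation instead of a fixed rotation, lets one data set serve experiments mounted along different axes.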

  14. Application of dermoscopy image analysis technique in diagnosing urethral condylomata acuminata.

    PubMed

    Zhang, Yunjie; Jiang, Shuang; Lin, Hui; Guo, Xiaojuan; Zou, Xianbiao

    2018-01-01

    In this study, cases with suspected urethral condylomata acuminata were examined by dermoscopy in order to explore an effective method for clinical diagnosis. The objective was to study the application of dermoscopy image analysis technique in the clinical diagnosis of urethral condylomata acuminata. A total of 220 cases of suspected urethral condylomata acuminata were first diagnosed clinically with the naked eye and then by using dermoscopy image analysis technique, and a comparative analysis was made of the two diagnostic methods. Among the 220 suspected cases, dermoscopy examination yielded a higher positive rate than visual observation. Dermoscopy examination is still restricted by its inapplicability in the deep urethral orifice and in skin wrinkles, and concordance between different clinicians may also vary. Dermoscopy image analysis technique features high sensitivity and quick, accurate, non-invasive diagnosis, and we recommend its use.

  15. LOFT Debriefings: An Analysis of Instructor Techniques and Crew Participation

    NASA Technical Reports Server (NTRS)

    Dismukes, R. Key; Jobe, Kimberly K.; McDonnell, Lori K.

    1997-01-01

    This study analyzes techniques instructors use to facilitate crew analysis and evaluation of their Line-Oriented Flight Training (LOFT) performance. A rating instrument called the Debriefing Assessment Battery (DAB) was developed which enables raters to reliably assess instructor facilitation techniques and characterize crew participation. Thirty-six debriefing sessions conducted at five U.S. airlines were analyzed to determine the nature of instructor facilitation and crew participation. Ratings obtained using the DAB corresponded closely with descriptive measures of instructor and crew performance. The data provide empirical evidence that facilitation can be an effective tool for increasing the depth of crew participation and self-analysis of CRM performance. Instructor facilitation skill varied dramatically, suggesting a need for more concrete hands-on training in facilitation techniques. Crews were responsive but fell short of actively leading their own debriefings. Ways to improve debriefing effectiveness are suggested.

  16. Modular techniques for dynamic fault-tree analysis

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.
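
    The modularization idea, solving a Markov module separately and feeding its result into a combinatoric tree as a basic-event probability, can be sketched for a two-unit redundant module. The failure rates and mission time below are assumed, and the Markov module is integrated numerically rather than solved in closed form.

```python
import numpy as np

lam, t_mission = 1e-4, 1000.0   # failure rate (per hour) and mission time (assumed)

# Markov module: a two-unit redundant pair with states (2 up) -> (1 up) -> (failed).
# Integrated with simple Euler steps; a closed form exists for constant rates.
dt = 0.1
p = np.array([1.0, 0.0, 0.0])   # P(2 up), P(1 up), P(failed)
for _ in range(int(t_mission / dt)):
    dp = np.array([-2 * lam * p[0],
                   2 * lam * p[0] - lam * p[1],
                   lam * p[1]])
    p = p + dp * dt
q_module = p[2]                 # module failure probability at mission end

# Combinatoric level: the module's probability enters the fault tree as a basic
# event, OR-ed with an independent single-point failure of rate 1e-5 per hour.
q_basic = 1.0 - np.exp(-1e-5 * t_mission)
q_system = 1.0 - (1.0 - q_module) * (1.0 - q_basic)
```

    Only the module needs a Markov solution; the rest of the tree stays combinatoric, which is the size reduction the record describes.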

  17. A Unifying Review of Bioassay-Guided Fractionation, Effect-Directed Analysis and Related Techniques

    PubMed Central

    Weller, Michael G.

    2012-01-01

    The success of modern methods in analytical chemistry sometimes obscures the problem that the ever increasing amount of analytical data does not necessarily give more insight of practical relevance. As alternative approaches, toxicity- and bioactivity-based assays can deliver valuable information about biological effects of complex materials in humans, other species or even ecosystems. However, the observed effects often cannot be clearly assigned to specific chemical compounds. In these cases, the establishment of an unambiguous cause-effect relationship is not possible. Effect-directed analysis tries to interconnect instrumental analytical techniques with a biological/biochemical entity, which identifies or isolates substances of biological relevance. Successful application has been demonstrated in many fields, either as proof-of-principle studies or even for complex samples. This review discusses the different approaches, advantages and limitations and finally shows some practical examples. The broad emergence of effect-directed analytical concepts might lead to a true paradigm shift in analytical chemistry, away from ever growing lists of chemical compounds. The connection of biological effects with the identification and quantification of molecular entities leads to relevant answers to many real life questions. PMID:23012539

  18. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  19. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.

  20. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
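
    The preconditioning idea, using the orthotropic part of the stiffness matrix as the preconditioner in a conjugate gradient solve, can be sketched on a toy symmetric positive-definite system. The matrix split below is illustrative only, not a real finite-element stiffness matrix.

```python
import numpy as np
from scipy.sparse.linalg import cg, LinearOperator

rng = np.random.default_rng(3)
n = 50

# Toy SPD "stiffness" matrix split into an orthotropic (here, diagonal) part
# plus a small symmetric anisotropic coupling; the split is purely illustrative.
K_ortho = np.diag(rng.uniform(2.0, 4.0, n))
C = 0.05 * rng.standard_normal((n, n))
K = K_ortho + (C + C.T) / 2.0
f = rng.standard_normal(n)

# Precondition with the orthotropic part only, mirroring the proposed technique:
# applying its inverse is cheap because that part is easy to factor (here, diagonal).
d = np.diag(K_ortho)
M = LinearOperator((n, n), matvec=lambda x: x / d)

u, info = cg(K, f, M=M)
residual = np.linalg.norm(K @ u - f)
```

    When the anisotropic terms are small, the preconditioned system is close to the identity and CG converges in few iterations, which is the effect the record exploits.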

  1. A Meta-Analysis of the Effectiveness of Alternative Assessment Techniques

    ERIC Educational Resources Information Center

    Gozuyesil, Eda; Tanriseven, Isil

    2017-01-01

    Purpose: Recent trends have encouraged the use of alternative assessment tools in class in line with the recommendations made by the updated curricula. It is of great importance to understand how alternative assessment affects students' academic outcomes and which techniques are most effective in which contexts. This study aims to examine the…

  2. The Effects of Music on Microsurgical Technique and Performance: A Motion Analysis Study.

    PubMed

    Shakir, Afaaf; Chattopadhyay, Arhana; Paek, Laurence S; McGoldrick, Rory B; Chetta, Matthew D; Hui, Kenneth; Lee, Gordon K

    2017-05-01

    Music is commonly played in operating rooms (ORs) throughout the country. If a preferred genre of music is played, surgeons have been shown to perform surgical tasks more quickly and with greater accuracy. However, there are currently no studies investigating the effects of music on microsurgical technique. Motion analysis technology has recently been validated in the objective assessment of plastic surgery trainees' performance of microanastomoses. Here, we aimed to examine the effects of music on microsurgical skills using motion analysis technology as a primary objective assessment tool. Residents and fellows in the Plastic and Reconstructive Surgery program were recruited to complete a demographic survey and participate in microsurgical tasks. Each participant completed 2 arterial microanastomoses on a chicken foot model, one with music playing, and the other without music playing. Participants were blinded to the study objectives and encouraged to perform their best. The order of music and no music was randomized. Microanastomoses were video recorded using a digitalized S-video system and deidentified. Video segments were analyzed using ProAnalyst motion analysis software for automatic noncontact markerless video tracking of the needle driver tip. Nine residents and 3 plastic surgery fellows were tested. Reported microsurgical experience ranged from 1 to 10 arterial anastomoses performed (n = 2), 11 to 100 anastomoses (n = 9), and 101 to 500 anastomoses (n = 1). Mean age was 33 years (range, 29-36 years), with 11 participants right-handed and 1 ambidextrous. Of the 12 subjects tested, 11 (92%) preferred music in the OR. Composite instrument motion analysis scores significantly improved with playing preferred music during testing versus no music (paired t test, P < 0.001). Improvement with music was significant even after stratifying scores by order in which variables were tested (music first vs no music first), postgraduate year, and number of anastomoses (analysis

  3. Digital techniques for ULF wave polarization analysis

    NASA Technical Reports Server (NTRS)

    Arthur, C. W.

    1979-01-01

    Digital power spectral and wave polarization analysis are powerful techniques for studying ULF waves in the earth's magnetosphere. Four different techniques for using the spectral matrix to perform such an analysis have been presented in the literature. Three of these techniques are similar in that they require transformation of the spectral matrix to the principal axis system prior to performing the polarization analysis. The differences among the three techniques lie in the manner in which they determine this transformation. A comparative study of these three techniques using both simulated and real data has shown them to be approximately equal in quality of performance. The fourth technique does not require transformation of the spectral matrix. Rather, it uses the measured spectral matrix and state vectors for a desired wave type to design a polarization detector function in the frequency domain. The design of various detector functions and their application to both simulated and real data will be presented.
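
    A minimal sketch of the principal-axis approach, an eigendecomposition of the spectral matrix estimated from a three-component signal, follows. The synthetic circularly polarized wave, sample rate, and frequency-bin smoothing are assumptions made for illustration.

```python
import numpy as np

fs = 10.0
t = np.arange(0, 200, 1 / fs)

# Synthetic ULF wave: circular polarization in the x-y plane at 0.1 Hz plus noise.
rng = np.random.default_rng(4)
bx = np.cos(2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal(t.size)
by = np.sin(2 * np.pi * 0.1 * t) + 0.1 * rng.standard_normal(t.size)
bz = 0.1 * rng.standard_normal(t.size)

# Spectral matrix near the wave frequency, S_ij = X_i(f) X_j*(f), smoothed over
# adjacent frequency bins so it is not trivially rank one.
X = np.fft.rfft(np.vstack([bx, by, bz]))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
k = int(np.argmin(np.abs(freqs - 0.1)))
S = np.zeros((3, 3), dtype=complex)
for j in (k - 1, k, k + 1):
    S += np.outer(X[:, j], np.conj(X[:, j]))

# Principal-axis analysis: eigendecomposition of the Hermitian spectral matrix.
# The dominant eigenvalue's share of the trace measures how polarized the signal is.
evals, evecs = np.linalg.eigh(S)
polarized_fraction = evals[-1] / evals.sum()
```

    The eigenvector for the dominant eigenvalue gives the wave's principal axes; the detector-function technique in the record works on the measured spectral matrix directly instead.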

  4. Development of a sensitivity analysis technique for multiloop flight control systems

    NASA Technical Reports Server (NTRS)

    Vaillard, A. H.; Paduano, J.; Downing, D. R.

    1985-01-01

    This report presents the development and application of a sensitivity analysis technique for multiloop flight control systems. This analysis yields very useful information on the sensitivity of the relative-stability criteria of the control system, with variations or uncertainties in the system and controller elements. The sensitivity analysis technique developed is based on the computation of the singular values and singular-value gradients of a feedback-control system. The method is applicable to single-input/single-output as well as multiloop continuous-control systems. Application to sampled-data systems is also explored. The sensitivity analysis technique was applied to a continuous yaw/roll damper stability augmentation system of a typical business jet, and the results show that the analysis is very useful in determining the system elements which have the largest effect on the relative stability of the closed-loop system. As a secondary product of the research reported here, the relative stability criteria based on the concept of singular values were explored.

  5. S-192 analysis: Conventional and special data processing techniques. [Michigan

    NASA Technical Reports Server (NTRS)

    Nalepka, R. F. (Principal Investigator); Morganstern, J.; Cicone, R.; Sarno, J.; Lambeck, P.; Malila, W.

    1975-01-01

    The author has identified the following significant results. Multispectral scanner data gathered over test sites in southeast Michigan were analyzed. This analysis showed the data to be somewhat deficient, especially in terms of the limited signal range in most SDOs and also in regard to SDO-SDO misregistration. Further analysis showed that the scan line straightening algorithm increased the misregistration of the data. Data were processed using the conic format. The effects of such misregistration on classification accuracy were analyzed via simulation and found to be significant. Results of employing conventional as well as special, unresolved-object processing techniques were disappointing, due at least in part to the limited signal range and noise content of the data. Application of a second class of special processing techniques, signature extension techniques, yielded better results. Two of the more basic signature extension techniques seemed to be useful in spite of the difficulties.

  6. Application of multivariable statistical techniques in plant-wide WWTP control strategies analysis.

    PubMed

    Flores, X; Comas, J; Roda, I R; Jiménez, L; Gernaey, K V

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques make it possible to i) determine natural groups or clusters of control strategies with similar behaviour, ii) find and interpret hidden, complex and causal relation features in the data set and iii) identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation of complex multicriteria data sets and allows an improved use of information for effective evaluation of control strategies.
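As a hedged illustration of the kind of workflow the abstract describes (not the authors' code, and with an invented strategy-by-criteria matrix rather than BSM2 output), clustering and principal component analysis of an evaluation matrix can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)
# Rows = control strategies, columns = evaluation criteria; the values and
# the two-group structure are invented for illustration.
X = rng.normal(size=(12, 4))
X[:6] += 3.0

# Standardize criteria so no single criterion dominates the analysis.
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

# PCA via SVD: singular vectors give directions of maximal variance.
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * s                     # strategy coordinates in PC space
explained = s**2 / np.sum(s**2)    # variance fraction per component

# Crude two-group clustering: split strategies at the median of PC1.
labels = (scores[:, 0] > np.median(scores[:, 0])).astype(int)
print(explained[0], labels)
```

In practice a dedicated clustering routine (e.g. k-means or hierarchical CA) would replace the median split, but the standardize-project-group pattern is the core of the approach.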

  7. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
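The probability-of-detection versus probability-of-false-alarm quantification mentioned above can be sketched as a simple threshold sweep, under invented Gaussian feature distributions for healthy and faulty machinery (not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)
healthy = rng.normal(0.0, 1.0, 500)   # feature values, healthy machine
faulty = rng.normal(2.0, 1.0, 500)    # feature values, seeded fault

# Sweep a detection threshold and record (Pfa, Pd) pairs.
thresholds = np.linspace(-4.0, 8.0, 200)
pd = np.array([(faulty > t).mean() for t in thresholds])
pfa = np.array([(healthy > t).mean() for t in thresholds])

# Area under the ROC curve via trapezoidal integration; Pfa decreases
# with threshold, so reverse both arrays before integrating.
auc = np.trapz(pd[::-1], pfa[::-1])
print(round(auc, 3))
```

A single number such as the AUC summarizes a technique's detection performance across all operating points, which is how diverse analysis techniques can be compared on equal footing.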

  8. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also discussed, as are other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and contaminants including pesticides and antibiotics. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  9. Flow analysis techniques for phosphorus: an overview.

    PubMed

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review of the implementation and the results obtained with different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizers, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  10. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General Services Administration... price analysis technique in order to establish a fair and reasonable price. DATES: Interested parties....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  11. An analysis of the effect of defect structures on catalytic surfaces by the boundary element technique

    NASA Astrophysics Data System (ADS)

    Peirce, Anthony P.; Rabitz, Herschel

    1988-08-01

    The boundary element (BE) technique is used to analyze the effect of defects on one-dimensional chemically active surfaces. The standard BE algorithm for diffusion is modified to include the effects of bulk desorption by making use of an asymptotic expansion technique to evaluate influences near boundaries and defect sites. An explicit time evolution scheme is proposed to treat the non-linear equations associated with defect sites. The proposed BE algorithm is shown to provide an efficient and convergent algorithm for modelling localized non-linear behavior. Since it exploits the actual Green's function of the linear diffusion-desorption process that takes place on the surface, the BE algorithm is extremely stable. The BE algorithm is applied to a number of interesting physical problems in which non-linear reactions occur at localized defects. The Lotka-Volterra system is considered in which the source, sink and predator-prey interaction terms are distributed at different defect sites in the domain and in which the defects are coupled by diffusion. This example provides a stringent test of the stability of the numerical algorithm. Marginal stability oscillations are analyzed for the Prigogine-Lefever reaction that occurs on a lattice of defects. Dissipative effects are observed for large perturbations to the marginal stability state, and rapid spatial reorganization of uniformly distributed initial perturbations is seen to take place. In another series of examples the effect of defect locations on the balance between desorptive processes on chemically active surfaces is considered. The effect of dynamic pulsing at various time-scales is considered for a one species reactive trapping model. Similar competitive behavior between neighboring defects previously observed for static adsorption levels is shown to persist for dynamic loading of the surface. 
The analysis of a more complex three species reaction process also provides evidence of competitive behavior between

  12. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bihn T. Pham; Jeffrey J. Einerson

    2010-06-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks, and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show that the three statistical analysis techniques provide a complementary capability to warn of thermocouple failures. They also suggest that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e., gas mixture content) to effectively maintain the target quantity (fuel temperature) within a given range.
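Two of the three techniques named above (control charting and regression) can be sketched on synthetic thermocouple-style data; the numbers, limits, and variable names below are illustrative assumptions, not NDMAS internals:

```python
import numpy as np

rng = np.random.default_rng(2)
tc = rng.normal(1000.0, 5.0, 200)   # simulated thermocouple readings
tc[150:] += 40.0                    # step change mimicking sensor drift

# Shewhart-style control limits derived from an in-control baseline window.
base = tc[:100]
center, sigma = base.mean(), base.std(ddof=1)
out_of_control = np.abs(tc - center) > 3 * sigma
print(out_of_control[:150].sum(), out_of_control[150:].sum())

# Linear regression of a calculated fuel temperature on the thermocouple
# reading, the kind of model the paper uses for online regulation; the
# 1.2x + 50 relationship is invented for this sketch.
calc_fuel = 1.2 * tc + 50.0 + rng.normal(0.0, 2.0, 200)
slope, intercept = np.polyfit(tc, calc_fuel, 1)
```

The control chart flags the drifted segment while the fitted slope recovers the assumed linear relation, showing how the two techniques complement each other for failure warning and regulation.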

  13. The Effectiveness of Emotional Freedom Techniques in the Treatment of Posttraumatic Stress Disorder: A Meta-Analysis.

    PubMed

    Sebastian, Brenda; Nelms, Jerrod

    Over the past two decades, growing numbers of clinicians have been utilizing emotional freedom techniques (EFT) in the treatment of posttraumatic stress disorder (PTSD), anxiety, and depression. Randomized controlled trials (RCTs) have shown encouraging outcomes for all three conditions. To assess the efficacy of EFT in treating PTSD by conducting a meta-analysis of existing RCTs. A systematic review of databases was undertaken to identify RCTs investigating EFT in the treatment of PTSD. The RCTs were evaluated for quality using evidence-based standards published by the American Psychological Association Division 12 Task Force on Empirically Validated Therapies. Those meeting the criteria were assessed using a meta-analysis that synthesized the data to determine effect sizes. While uncontrolled outcome studies were excluded, they were examined for clinical implications of treatment that can extend knowledge of this condition. Seven randomized controlled trials were found to meet the criteria and were included in the meta-analysis. A large treatment effect was found, with a weighted Cohen's d = 2.96 (95% CI: 1.96-3.97, P < .001) for the studies that compared EFT to usual care or a waitlist. No treatment effect differences were found in studies comparing EFT to other evidence-based therapies such as eye movement desensitization and reprocessing (EMDR; 1 study) and cognitive behavior therapy (CBT; 1 study). The analysis of existing studies showed that a series of 4-10 EFT sessions is an efficacious treatment for PTSD with a variety of populations. The studies examined reported no adverse effects from EFT interventions and showed that it can be used both on a self-help basis and as a primary evidence-based treatment for PTSD. Copyright © 2017 Elsevier Inc. All rights reserved.
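The inverse-variance weighting behind a pooled Cohen's d like the one reported above can be sketched in a few lines; the three (d, SE) pairs are made-up placeholders, not the seven RCTs the paper analysed:

```python
import math

# Per-study effect sizes: (Cohen's d, standard error). Invented values.
studies = [(2.5, 0.45), (3.1, 0.60), (3.3, 0.75)]

# Fixed-effect weights are the inverse variances of the estimates.
weights = [1.0 / se**2 for _, se in studies]
d_pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

# 95% confidence interval under a normal approximation.
ci_low = d_pooled - 1.96 * se_pooled
ci_high = d_pooled + 1.96 * se_pooled
print(round(d_pooled, 2), round(ci_low, 2), round(ci_high, 2))
```

Precise studies (small SE) pull the pooled estimate toward themselves, and the pooled SE is always smaller than any single study's SE, which is what makes the synthesized interval tighter than the individual trials'.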

  14. Signal analysis techniques for incipient failure detection in turbomachinery

    NASA Technical Reports Server (NTRS)

    Coffin, T.

    1985-01-01

    Signal analysis techniques for the detection and classification of incipient mechanical failures in turbomachinery were developed, implemented and evaluated. Signal analysis techniques available to describe dynamic measurement characteristics are reviewed. Time domain and spectral methods are described, and statistical classification in terms of moments is discussed. Several of these waveform analysis techniques were implemented on a computer and applied to dynamic signals. A laboratory evaluation of the methods with respect to signal detection capability is described. Plans for further technique evaluation and data base development to characterize turbopump incipient failure modes from Space Shuttle main engine (SSME) hot firing measurements are outlined.

  15. Efficient calculation of the polarizability: a simplified effective-energy technique

    NASA Astrophysics Data System (ADS)

    Berger, J. A.; Reining, L.; Sottile, F.

    2012-09-01

    In a recent publication [J.A. Berger, L. Reining, F. Sottile, Phys. Rev. B 82, 041103(R) (2010)] we introduced the effective-energy technique to calculate in an accurate and numerically efficient manner the GW self-energy as well as the polarizability, which is required to evaluate the screened Coulomb interaction W. In this work we show that the effective-energy technique can be used to further simplify the expression for the polarizability without a significant loss of accuracy. In contrast to standard sum-over-state methods where huge summations over empty states are required, our approach only requires summations over occupied states. The three simplest approximations we obtain for the polarizability are explicit functionals of an independent- or quasi-particle one-body reduced density matrix. We provide evidence of the numerical accuracy of this simplified effective-energy technique as well as an analysis of our method.

  16. Automated Sneak Circuit Analysis Technique

    DTIC Science & Technology

    1990-06-01

    The terminals of all in-circuit voltage sources (e.g., batteries) must be labeled using the OrCAD/SDT module Port facility. Automated Sneak Circuit Analysis Technique, Systems Reliability & Engineering Division, Rome Air Development Center, June 1990.

  17. External tissue expansion for difficult wounds using a simple cost effective technique.

    PubMed

    Nandhagopal, Vijayaraghavan; Chittoria, Ravi Kumar; Mohapatra, Devi Prasad; Thiruvoth, Friji Meethale; Sivakumar, Dinesh Kumar; Ashokan, Arjun

    2015-01-01

    To study and discuss the role of the external tissue expansion and wound closure (ETEWC) technique using hooks and rubber bands. The present study is a retrospective analysis of nine cases of wounds of different aetiology in which the ETEWC technique was applied using hooks and rubber bands. All the wounds in the study healed completely without split thickness skin graft (SSG) or flap. The ETEWC technique using hooks and rubber bands is a cost-effective technique which can be used for wound closure without SSG or flap.

  18. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yonggang

    In the implementation of nuclear safeguards, many different techniques are used to monitor the operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers and digital seals to open source searches and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or only loose correlations, it could be beneficial to analyze them together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential value in nuclear safeguards.

  19. The balance sheet technique. Volume I. The balance sheet analysis technique for preconstruction review of airports and highways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaBelle, S.J.; Smith, A.E.; Seymour, D.A.

    1977-02-01

    The technique applies equally well to new or existing airports. The importance of accurate accounting of emissions cannot be overstated. The regional oxidant modelling technique used in conjunction with a balance sheet review must be a proportional reduction technique. This type of emission balancing presumes equality of all sources in the analysis region. The technique can be applied successfully in the highway context, either in planning at the system level or looking only at projects individually. The project-by-project reviews could be used to examine each project in the same way as the airport projects are examined for their impact on regional desired emission levels. The primary limitation of this technique is that it should not be used when simulation models have been used for regional oxidant air quality. In the case of highway projects, the balance sheet technique might appear to be limited; the real limitations are in the transportation planning process. That planning process is not well-suited to the needs of air quality forecasting. If the transportation forecasting techniques are insensitive to changes in the variables that affect HC emissions, then no internal emission trade-offs can be identified, and the initial highway emission forecasts are themselves suspect. In general, the balance sheet technique is limited by the quality of the data used in the review. Additionally, the technique does not point out effective trade-off strategies, nor does it indicate when it might be worthwhile to ignore small amounts of excess emissions. Used in the context of regional air quality plans based on proportional reduction models, the balance sheet analysis technique shows promise as a useful method for state or regional reviewing agencies.

  20. A diagnostic analysis of the VVP single-doppler retrieval technique

    NASA Technical Reports Server (NTRS)

    Boccippio, Dennis J.

    1995-01-01

    A diagnostic analysis of the VVP (volume velocity processing) retrieval method is presented, with emphasis on understanding the technique as a linear, multivariate regression. Similarities and differences to the velocity-azimuth display and extended velocity-azimuth display retrieval techniques are discussed, using this framework. Conventional regression diagnostics are then employed to quantitatively determine situations in which the VVP technique is likely to fail. An algorithm for preparation and analysis of a robust VVP retrieval is developed and applied to synthetic and actual datasets with high temporal and spatial resolution. A fundamental (but quantifiable) limitation to some forms of VVP analysis is inadequate sampling dispersion in the n space of the multivariate regression, manifest as a collinearity between the basis functions of some fitted parameters. Such collinearity may be present either in the definition of these basis functions or in their realization in a given sampling configuration. This nonorthogonality may cause numerical instability, variance inflation (decrease in robustness), and increased sensitivity to bias from neglected wind components. It is shown that these effects prevent the application of VVP to small azimuthal sectors of data. The behavior of the VVP regression is further diagnosed over a wide range of sampling constraints, and reasonable sector limits are established.
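The collinearity failure mode described above can be illustrated with a toy regression basis (a deliberate simplification, not the full VVP basis): over a narrow azimuthal sector, sin(az) is nearly constant and therefore nearly parallel to the intercept column, which inflates the condition number of the design matrix and with it the variance of the fitted parameters:

```python
import numpy as np

def design(az_deg):
    """Toy design matrix with an intercept and two angular basis functions."""
    az = np.radians(az_deg)
    return np.column_stack([np.ones_like(az), np.sin(az), np.cos(az)])

# Full-circle sampling versus a narrow 20-degree sector around 90 degrees.
wide = design(np.linspace(0.0, 360.0, 100))
narrow = design(np.linspace(80.0, 100.0, 100))

cond_wide = np.linalg.cond(wide)
cond_narrow = np.linalg.cond(narrow)
print(cond_wide, cond_narrow)   # narrow sector: far worse conditioned
```

This is the quantitative sense in which small azimuthal sectors break the retrieval: the regression is still solvable, but tiny data perturbations produce large parameter swings.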

  1. Accelerometer Data Analysis and Presentation Techniques

    NASA Technical Reports Server (NTRS)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
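Two of the analyses listed above, interval RMS acceleration versus time and a power spectral density estimate, can be sketched on a synthetic acceleration record; the sampling rate and the 17 Hz tone are invented for illustration, not PIMS data:

```python
import numpy as np

fs = 250.0                        # samples per second (assumed rate)
t = np.arange(0, 60, 1 / fs)      # one minute of synthetic data
rng = np.random.default_rng(3)
accel = 1e-3 * np.sin(2 * np.pi * 17.0 * t) + 1e-4 * rng.normal(size=t.size)

# Interval root-mean-square acceleration: one value per 1-second window.
win = int(fs)
rms = np.sqrt(np.mean(accel[: t.size // win * win].reshape(-1, win) ** 2, axis=1))

# Periodogram-style PSD: the 17 Hz tone should dominate the spectrum.
spec = np.abs(np.fft.rfft(accel)) ** 2
freqs = np.fft.rfftfreq(accel.size, 1 / fs)
peak_hz = freqs[np.argmax(spec[1:]) + 1]   # skip the DC bin
print(rms.mean(), peak_hz)
```

The interval RMS tracks vibratory level over time while the PSD attributes it to frequency, which is why the document pairs the two views when interpreting the microgravity environment.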

  2. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    PubMed Central

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana MF; Silva, Rosângela; de Souza, Sheila Mendonça; Araujo, Adauto

    2013-01-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis. PMID:23579793

  3. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments.

    PubMed

    Camacho, Morgana; Pessanha, Thaíla; Leles, Daniela; Dutra, Juliana M F; Silva, Rosângela; Souza, Sheila Mendonça de; Araujo, Adauto

    2013-04-01

    Parasite findings in sambaquis (shell mounds) are scarce. Although the 121 shell mound samples were previously analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  4. Analysis and correction of ground reflection effects in measured narrowband sound spectra using cepstral techniques

    NASA Technical Reports Server (NTRS)

    Miles, J. H.; Stevens, G. H.; Leininger, G. G.

    1975-01-01

    Ground reflections generate undesirable effects on acoustic measurements such as those conducted outdoors for jet noise research, aircraft certification, and motor vehicle regulation. Cepstral techniques developed in speech processing are adapted to identify echo delay time and to correct for ground reflection effects. A sample result is presented using an actual narrowband sound pressure level spectrum. The technique can readily be adapted to existing fast Fourier transform type spectrum measurement instrumentation to provide field measurements of echo time delays.

  5. Short-term effect of aniline on soil microbial activity: a combined study by isothermal microcalorimetry, glucose analysis, and enzyme assay techniques.

    PubMed

    Chen, Huilun; Zhuang, Rensheng; Yao, Jun; Wang, Fei; Qian, Yiguang; Masakorala, Kanaji; Cai, Minmin; Liu, Haijun

    2014-01-01

    Accidents involving aniline spills and explosions happen almost every year in China, yet the toxic effect of aniline on soil microbial activity remains largely unexplored. In this study, isothermal microcalorimetry, glucose analysis, and soil enzyme assay techniques were employed to investigate the toxic effect of aniline on microbial activity in Chinese soil for the first time. Soil samples were treated with aniline from 0 to 2.5 mg/g soil to match concentrations realistic for spill events. Results from microcalorimetric analysis showed that the introduction of aniline had a significant adverse effect on soil microbial activity at exposure concentrations ≥0.4 mg/g soil (p < 0.05) and ≥0.8 mg/g soil (p < 0.01), and the activity was totally inhibited when the concentration increased to 2.5 mg/g soil. The glucose analysis indicated that aniline significantly decreased the soil microbial respiratory activity at concentrations ≥0.8 mg/g soil (p < 0.05) and ≥1.5 mg/g soil (p < 0.01). Soil enzyme activities for β-glucosidase, urease, acid phosphatase, and dehydrogenase revealed that aniline had a significant effect (p < 0.05) on the nutrient cycling of C, N, and P as well as on the oxidative capacity of soil microorganisms. All of these results show an intensely toxic effect of aniline on soil microbial activity. The proposed methods can provide toxicological information on aniline's effects on soil microbes from metabolic and biochemical points of view, which are consistent with and correlated to each other.

  6. BATMAN: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
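As a toy, hedged illustration of the merging criterion described above (1-D, invented numbers, a drastic simplification of BaTMAn's 2-D tessellation), adjacent elements can be merged while their signals stay statistically consistent within the combined errors:

```python
import numpy as np

# Invented 1-D signal with two distinct plateaus and per-bin errors.
sig = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.9])
err = np.full(sig.size, 0.2)

regions = [[0]]
for i in range(1, sig.size):
    cur = regions[-1]
    mean_cur = sig[cur].mean()
    err_cur = err[cur].mean() / np.sqrt(len(cur))
    # Merge the new bin into the current region if it agrees with the
    # region mean within ~3 sigma of the combined uncertainty.
    if abs(sig[i] - mean_cur) < 3 * np.hypot(err_cur, err[i]):
        cur.append(i)
    else:
        regions.append([i])
print(regions)
```

The sketch recovers the two plateaus as two regions; the real algorithm applies an analogous consistency test in two dimensions with a proper Bayesian criterion.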

  7. A double sealing technique for increasing the precision of headspace-gas chromatographic analysis.

    PubMed

    Xie, Wei-Qi; Yu, Kong-Xian; Gong, Yi-Xian

    2018-01-19

    This paper investigates a new double sealing technique for increasing the precision of the headspace gas chromatographic method. The air leakage caused by the high pressure in the headspace vial during the headspace sampling process has a great impact on measurement precision in conventional headspace analysis (i.e., the single sealing technique). The results (using ethanol solution as the model sample) show that the present technique effectively minimizes this problem. The double sealing technique has an excellent measurement precision (RSD < 0.15%) and accuracy (recovery = 99.1%-100.6%) for ethanol quantification. The detection precision of the present method was 10-20 times higher than that in earlier HS-GC work that used the conventional single sealing technique. The present double sealing technique may open up a new avenue, and also serve as a general strategy, for improving the performance (i.e., accuracy and precision) of headspace analysis of various volatile compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
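The two figures of merit quoted above, relative standard deviation (precision) and recovery (accuracy), are simple arithmetic on replicate measurements; the replicate ethanol values below are invented placeholders, not the paper's data:

```python
import statistics

replicates = [4.998, 5.002, 5.001, 4.999, 5.000, 5.001]  # measured, % v/v
true_value = 5.0                                          # spiked level

mean = statistics.mean(replicates)
rsd = 100.0 * statistics.stdev(replicates) / mean   # percent RSD (precision)
recovery = 100.0 * mean / true_value                # percent recovery (accuracy)
print(round(rsd, 3), round(recovery, 2))
```

An RSD below 0.15% and a recovery near 100%, as reported in the abstract, correspond to replicate scatter and bias of roughly this magnitude.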

  8. Investigating effects of communications modulation technique on targeting performance

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Eusebio, Gerald; Huling, Edward

    2006-05-01

    One of the key challenges facing the global war on terrorism (GWOT) and urban operations is the increased need for rapid and diverse information from distributed sources. For users to get adequate information on target types and movements, they need reliable data. To facilitate reliable computational intelligence, we explore the communication modulation tradeoffs affecting information distribution and accumulation. In this analysis, we examine the modulation techniques of Orthogonal Frequency Division Multiplexing (OFDM), Direct Sequence Spread Spectrum (DSSS), and statistical time-division multiple access (TDMA) as a function of the bit error rate and jitter that affect targeting performance. In the analysis, we simulate a Link 16 with a simple bandpass frequency-shift keying (FSK) technique using different signal-to-noise ratios. The communications transfer delay and accuracy tradeoffs are assessed as to the effects incurred in targeting performance.

  9. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  10. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  11. Application of thermal analysis techniques in activated carbon production

    USGS Publications Warehouse

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. Thermogravimetric (TG) techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. Temperature-programmed desorption (TPD) was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  12. Methodology for assessing the effectiveness of access management techniques : executive summary.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...

  13. Who Should Bear the Cost of Convenience? A Cost-effectiveness Analysis Comparing External Beam and Brachytherapy Radiotherapy Techniques for Early Stage Breast Cancer.

    PubMed

    McGuffin, M; Merino, T; Keller, B; Pignol, J-P

    2017-03-01

    Standard treatment for early breast cancer includes whole breast irradiation (WBI) after breast-conserving surgery. Recently, accelerated partial breast irradiation (APBI) has been proposed for well-selected patients. A cost and cost-effectiveness analysis was carried out comparing WBI with two APBI techniques. An activity-based costing method was used to determine the treatment cost from a societal perspective of WBI, high dose rate brachytherapy (HDR) and permanent breast seed implants (PBSI). A Markov model comparing the three techniques was developed with downstream costs, utilities and probabilities adapted from the literature. Sensitivity analyses were carried out for a wide range of variables, including treatment costs, patient costs, utilities and probability of developing recurrences. Overall, HDR was the most expensive ($14 400), followed by PBSI ($8700), with WBI proving the least expensive ($6200). The least costly method to the health care system was WBI, whereas PBSI and HDR were less costly for the patient. Under cost-effectiveness analyses, downstream costs added about $10 000 to the total societal cost of the treatment. As the outcomes are very similar between techniques, WBI dominated under cost-effectiveness analyses. WBI was found to be the most cost-effective radiotherapy technique for early breast cancer. However, both APBI techniques were less costly to the patient. Although innovation may increase costs for the health care system, it can provide cost savings for the patient in addition to convenience. Copyright © 2016 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
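
    A Markov model of the kind described above can be sketched as a small cohort simulation that accumulates discounted costs and quality-adjusted life years per cycle. The transition probabilities, per-cycle costs, and utilities below are purely illustrative, not the paper's calibrated values; only the $6200 upfront cost echoes the WBI figure quoted in the record:

```python
def markov_cost_utility(trans, costs, utilities, start_cost, cycles=10, disc=0.03):
    """Tiny Markov cohort model. trans[i][j] is the per-cycle probability of
    moving from state i to state j; returns (discounted cost, discounted QALYs)."""
    n = len(trans)
    dist = [1.0] + [0.0] * (n - 1)  # whole cohort starts in the first state
    total_cost, total_qaly = start_cost, 0.0
    for t in range(cycles):
        df = 1.0 / (1.0 + disc) ** t  # discount factor for this cycle
        total_cost += df * sum(p * c for p, c in zip(dist, costs))
        total_qaly += df * sum(p * u for p, u in zip(dist, utilities))
        # Advance the cohort one cycle through the transition matrix
        dist = [sum(dist[i] * trans[i][j] for i in range(n)) for j in range(n)]
    return total_cost, total_qaly
```

    With three states (disease-free, recurrence, dead), the same machinery run per technique yields the cost/QALY pairs compared in a cost-effectiveness analysis.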

  14. The Technique of Special Effects in Television.

    ERIC Educational Resources Information Center

    Wilkie, Bernard

    Television, with its special need for movement and continuous shooting, often demands different special effects techniques from those used in films. This book covers the techniques used to create special effects for television which meet these requirements and which also require less time and money than many film techniques. Included are…

  15. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementations. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real-time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG-welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
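
    A common way to estimate plasma electron temperature from emission-peak intensities, as in the record above, is the two-line Boltzmann relation. A sketch assuming local thermodynamic equilibrium and illustrative line parameters (this stands in for, and does not reproduce, the paper's LPO peak-detection pipeline):

```python
from math import log, exp  # exp is used below to build synthetic intensities

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def two_line_temperature(i1, i2, line1, line2):
    """Electron temperature (K) from the intensity ratio of two emission lines
    of the same species. Each line is (wavelength_nm, A_coeff, g_upper, E_upper_eV).
    Uses I ~ (A*g/lambda) * exp(-E/kT), so the ratio isolates exp((E2-E1)/kT)."""
    lam1, a1, g1, e1 = line1
    lam2, a2, g2, e2 = line2
    ratio = (i1 * a2 * g2 * lam1) / (i2 * a1 * g1 * lam2)
    return (e2 - e1) / (K_B_EV * log(ratio))
```

    Feeding in intensities synthesized at a known temperature recovers that temperature, which is the self-consistency check one would run before trusting real peak data.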

  16. The palisade cartilage tympanoplasty technique: a systematic review and meta-analysis.

    PubMed

    Jeffery, Caroline C; Shillington, Cameron; Andrews, Colin; Ho, Allan

    2017-06-17

    Tympanoplasty is a common procedure performed by Otolaryngologists. Many types of autologous grafts have been used with variations of techniques with varying results. This is the first systematic review of the literature and meta-analysis aiming to evaluate the effectiveness of one of the techniques which is gaining popularity, the palisade cartilage tympanoplasty. PubMed, EMBASE, and Cochrane databases were searched for "palisade", "cartilage", "tympanoplasty", "perforation" and their synonyms. In total, 199 articles reporting results of palisade cartilage tympanoplasty were identified. Five articles satisfied the following inclusion criteria: adult patients, minimum 6 months follow-up, hearing and surgical outcomes reported. Studies with patients undergoing combined mastoidectomy, ossicular chain reconstruction, and/or other middle ear surgery were excluded. Perforation closure, rate of complications, and post-operative pure-tone average change were extracted for pooled analysis. Failure and complication proportions were pooled to generate odds ratios. Fixed effects and random effects weightings were generated. The resulting pooled odds ratios are reported. Palisade cartilage tympanoplasty has an overall take rate of 96% beyond 6 months and has similar odds of complications compared to temporalis fascia (OR 0.89, 95% CI 0.62, 1.30). The air-bone gap closure is statistically similar to reported results from temporalis fascia tympanoplasty. Cartilage palisade tympanoplasty offers excellent graft take rates and good postoperative hearing outcomes for perforations of various sizes and for both primary and revision cases. This technique has predictable, long-term results with low complication rates, similar to temporalis fascia tympanoplasty.
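
    The pooled odds ratios reported above come from inverse-variance weighting of per-study log odds ratios. A minimal fixed-effect sketch with hypothetical 2x2 study counts (the random-effects weighting the review also used would add a between-study variance term):

```python
from math import log, exp, sqrt

def pooled_odds_ratio(studies):
    """Fixed-effect (inverse-variance) pooling of study odds ratios.
    Each study is (events_treat, total_treat, events_ctrl, total_ctrl)."""
    num = den = 0.0
    for a, n1, c, n2 in studies:
        b, d = n1 - a, n2 - c
        log_or = log((a * d) / (b * c))
        var = 1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d  # Woolf variance of log OR
        w = 1.0 / var
        num += w * log_or
        den += w
    pooled_log = num / den
    se = sqrt(1.0 / den)
    return exp(pooled_log), (exp(pooled_log - 1.96 * se), exp(pooled_log + 1.96 * se))
```

    A pooled OR whose 95% confidence interval straddles 1 (as in the record's OR 0.89, CI 0.62-1.30) indicates no detectable difference in complication odds.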

  17. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 3 2011-10-01 2011-10-01 false Proposal analysis... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for proposal analysis. (2) For spare parts or support equipment, perform an analysis of— (i) Those line items...

  18. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  19. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  20. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  1. Determining the Number of Factors in P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…

  2. Analysis of questioning technique during classes in medical education.

    PubMed

    Cho, Young Hye; Lee, Sang Yeoup; Jeong, Dong Wook; Im, Sun Ju; Choi, Eun Jung; Lee, Sun Hee; Baek, Sun Yong; Kim, Yun Jin; Lee, Jeong Gyu; Yi, Yu Hyone; Bae, Mi Jin; Yune, So Jung

    2012-06-12

    Questioning is one of the essential techniques used by lecturers to make lectures more interactive and effective. This study surveyed the perception of questioning techniques by medical school faculty members and analyzed how the questioning technique is used in actual classes. Data on the perceptions of the questioning skills used during lectures were collected using a self-questionnaire for faculty members (N = 33) during the second semester of 2008. The questionnaire consisted of 18 items covering the awareness and characteristics of questioning skills. Recorded video tapes were used to observe the faculty members' questioning skills. Most faculty members regarded the questioning technique during classes as being important and expected positive outcomes in terms of the students' participation in class, concentration in class and understanding of the class contents. In the 99 classes analyzed, the median number of questions per class was 1 (0-29). Among them, 40 classes (40.4%) did not use questioning techniques. The frequency of questioning per lecture was similar regardless of the faculty members' perception. On the other hand, the faculty members perceived that their usual wait time after a question was approximately 10 seconds, compared to only 2.5 seconds measured from video analysis. More lecture-experienced faculty members tended to ask more questions in class. There were some discrepancies regarding the questioning technique between the faculty members' perceptions and reality, even though they had positive opinions of the technique. The questioning skills during a lecture need to be emphasized to faculty members.

  3. Cognitive task analysis: Techniques applied to airborne weapons training

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role of cognitive task analysis in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  4. Predicting Effective Course Conduction Strategy Using Datamining Techniques

    ERIC Educational Resources Information Center

    Parkavi, A.; Lakshmi, K.; Srinivasa, K. G.

    2017-01-01

    Data analysis techniques can be used to analyze patterns of data in different fields. Based on the results of such analyses, suggestions can be provided to decision-making authorities. Data mining techniques can be used in the educational domain to improve the outcomes of educational sectors. The authors carried out this research…

  5. Review and classification of variability analysis techniques with clinical applications.

    PubMed

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and to promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between studies. We conclude with challenges for the evolving science of variability analysis.

  6. Review and classification of variability analysis techniques with clinical applications

    PubMed Central

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature and to promote a shared vocabulary that would improve the exchange of ideas and the comparison of results between studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  7. The Technique of Special-Effects Cinematography.

    ERIC Educational Resources Information Center

    Fielding, Raymond

    The author describes the many techniques used to produce cinematic effects that would be too costly, too difficult, too time-consuming, too dangerous, or simply impossible to achieve with conventional photographic techniques. He points out that these techniques are available not only for 35 millimeter work but also to the 16 mm. photographer who…

  8. Random safety auditing, root cause analysis, failure mode and effects analysis.

    PubMed

    Ursprung, Robert; Gray, James

    2010-03-01

    Improving quality and safety in health care is a major concern for health care providers, the general public, and policy makers. Errors and quality issues are leading causes of morbidity and mortality across the health care industry. There is evidence that patients in the neonatal intensive care unit (NICU) are at high risk for serious medical errors. To facilitate compliance with safe practices, many institutions have established quality-assurance monitoring procedures. Three techniques that have been found useful in the health care setting are failure mode and effects analysis, root cause analysis, and random safety auditing. When used together, these techniques are effective tools for system analysis and redesign focused on providing safe delivery of care in the complex NICU system. Copyright 2010 Elsevier Inc. All rights reserved.

  9. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    PubMed

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  10. A methodological comparison of customer service analysis techniques

    Treesearch

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
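
    Gap score analysis (GA), one of the two protocols compared in the record above, subtracts importance ratings from performance ratings per service attribute; negative gaps flag attributes that under-deliver relative to how much visitors care about them. A minimal sketch with hypothetical survey means:

```python
def gap_scores(importance, performance):
    """Gap score per attribute: mean performance minus mean importance.
    Negative values indicate under-delivery relative to expectations."""
    return {attr: performance[attr] - importance[attr] for attr in importance}

def underperforming(importance, performance):
    """Attributes with a negative gap, sorted for stable reporting."""
    return sorted(a for a, g in gap_scores(importance, performance).items() if g < 0)
```

    Importance-performance analysis (IP) plots the same two ratings on a grid instead of differencing them, which is exactly why the two protocols can lead to different managerial conclusions from the same data.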

  11. Statistical Techniques for Assessing water‐quality effects of BMPs

    USGS Publications Warehouse

    Walker, John F.

    1994-01-01

    Little has been published on the effectiveness of various management practices in small rural lakes and streams at the watershed scale. In this study, statistical techniques were used to test for changes in water‐quality data from watersheds where best management practices (BMPs) were implemented. Reductions in data variability due to climate and seasonality were accomplished through the use of regression methods. This study discusses the merits of using storm‐mass‐transport data as a means of improving the ability to detect BMP effects on stream‐water quality. Statistical techniques were applied to suspended‐sediment records from three rural watersheds in Illinois for the period 1981–84. None of the techniques identified changes in suspended sediment, primarily because of the small degree of BMP implementation and because of potential errors introduced through the estimation of storm‐mass transport. A Monte Carlo sensitivity analysis was used to determine the level of discrete change that could be detected for each watershed. In all cases, the use of regressions improved the ability to detect trends.
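
    The Monte Carlo sensitivity analysis described above asks what size of discrete (step) change in the record could have been detected. A simplified sketch, assuming normally distributed residuals and a two-sample z test standing in for the study's regression-based procedure (sample sizes and thresholds are illustrative):

```python
import random
from statistics import mean, stdev

def detect_step(pre, post):
    """Flag a step change when the two-sample z statistic exceeds 1.96."""
    se = (stdev(pre) ** 2 / len(pre) + stdev(post) ** 2 / len(post)) ** 0.5
    return abs(mean(post) - mean(pre)) / se > 1.96

def minimal_detectable_change(sigma, n=30, trials=500, power=0.8, seed=1):
    """Scan increasing step sizes until the simulated detection rate reaches `power`."""
    rng = random.Random(seed)
    delta = 0.0
    while delta < 10 * sigma:
        hits = 0
        for _ in range(trials):
            pre = [rng.gauss(0.0, sigma) for _ in range(n)]
            post = [rng.gauss(-delta, sigma) for _ in range(n)]  # BMP lowers the mean
            if detect_step(pre, post):
                hits += 1
        if hits / trials >= power:
            return delta
        delta += 0.1 * sigma
    return None
```

    The smaller the residual variability (e.g. after regression removes climate and seasonal effects), the smaller the step change this procedure can detect, which is the study's argument for using regressions.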

  12. Model building techniques for analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM currently is a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that standardize how features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  13. Emotional Freedom Techniques for Anxiety: A Systematic Review With Meta-analysis.

    PubMed

    Clond, Morgan

    2016-05-01

    Emotional Freedom Technique (EFT) combines elements of exposure and cognitive therapies with acupressure for the treatment of psychological distress. Randomized controlled trials retrieved by literature search were assessed for quality using the criteria developed by the American Psychological Association's Division 12 Task Force on Empirically Validated Treatments. As of December 2015, 14 studies (n = 658) met inclusion criteria. Results were analyzed using an inverse variance weighted meta-analysis. The pre-post effect size for the EFT treatment group was 1.23 (95% confidence interval, 0.82-1.64; p < 0.001), whereas the effect size for combined controls was 0.41 (95% confidence interval, 0.17-0.67; p = 0.001). EFT treatment demonstrated a significant decrease in anxiety scores, even when accounting for the effect size of control treatment. However, there were too few data available comparing EFT to standard-of-care treatments such as cognitive behavioral therapy, and further research is needed to establish the relative efficacy of EFT to established protocols.
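
    The pre-post effect sizes pooled in this meta-analysis are standardized mean changes. A minimal Cohen's-d style sketch with hypothetical anxiety scores (the review's exact standardizer may differ; pooled SD of the two time points is one common choice):

```python
from statistics import mean, stdev

def pre_post_effect_size(pre, post):
    """Standardized mean change: mean decrease scaled by the pooled
    standard deviation of the pre and post measurements."""
    sp = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
    return (mean(pre) - mean(post)) / sp  # positive = anxiety decreased
```

    An effect size around 1.2, as reported for the EFT groups, means the average post score sits more than one pooled standard deviation below the average pre score.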

  14. Automated thermal mapping techniques using chromatic image analysis

    NASA Technical Reports Server (NTRS)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  15. Analytical Modelling of the Effects of Different Gas Turbine Cooling Techniques on Engine Performance

    NASA Astrophysics Data System (ADS)

    Uysal, Selcuk Can

    In this research, MATLAB Simulink® was used to develop a cooled engine model for industrial gas turbines and aero-engines. The model consists of uncooled on-design, mean-line turbomachinery design and a cooled off-design analysis in order to evaluate the engine performance parameters by using operating conditions, polytropic efficiencies, material information and cooling system details. The cooling analysis algorithm involves a second-law analysis to calculate losses from the cooling technique applied. The model is used in a sensitivity analysis that evaluates the impacts of variations in metal Biot number, thermal barrier coating Biot number, film cooling effectiveness, internal cooling effectiveness and maximum allowable blade temperature on main engine performance parameters of aero and industrial gas turbine engines. The model is subsequently used to analyze the relative performance impact of employing Anti-Vortex Film Cooling holes (AVH) by means of data obtained for these holes by Detached Eddy Simulation-CFD techniques that are valid for engine-like turbulence intensity conditions. Cooled blade configurations with AVH and other different external cooling techniques were used in a performance comparison study. (Abstract shortened by ProQuest.)

  16. Development of solution techniques for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Andrews, J. S.

    1974-01-01

    Nonlinear structural solution methods in the current research literature are classified according to order of the solution scheme, and it is shown that the analytical tools for these methods are uniformly derivable by perturbation techniques. A new perturbation formulation is developed for treating an arbitrary nonlinear material, in terms of a finite-difference generated stress-strain expansion. Nonlinear geometric effects are included in an explicit manner by appropriate definition of an applicable strain tensor. A new finite-element pilot computer program PANES (Program for Analysis of Nonlinear Equilibrium and Stability) is presented for treatment of problems involving material and geometric nonlinearities, as well as certain forms of nonconservative loading.

  17. Research to Develop Effective Teaching and Management Techniques for Severely Disturbed and Retarded Children. Final Report.

    ERIC Educational Resources Information Center

    Kauffman, James M.; Birnbrauer, Jay S.

    The final report of a project on teaching and management techniques with severely disturbed and/or retarded children presents an analysis of single-subject research using contingent imitation of the child as an intervention technique. The effects of this technique were examined on the following behaviors: toy play and reciprocal imitation, self…

  18. Fit Analysis of Different Framework Fabrication Techniques for Implant-Supported Partial Prostheses.

    PubMed

    Spazzin, Aloísio Oro; Bacchi, Atais; Trevisani, Alexandre; Farina, Ana Paula; Dos Santos, Mateus Bertolini

    2016-01-01

    This study evaluated the vertical misfit of implant-supported frameworks made using different techniques to obtain passive fit. Thirty three-unit fixed partial dentures were fabricated in cobalt-chromium alloy (n = 10) using three fabrication methods: one-piece casting, framework cemented on prepared abutments, and laser welding. The vertical misfit between the frameworks and the abutments was evaluated with an optical microscope using the single-screw test. Data were analyzed using one-way analysis of variance and Tukey test (α = .05). The one-piece cast frameworks presented significantly higher vertical misfit values than those found for the framework cemented on prepared abutments and laser welding techniques (P < .001 and P < .003, respectively). Laser welding and framework cemented on prepared abutments are effective techniques to improve the adaptation of three-unit implant-supported prostheses. These techniques presented similar fit.
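
    The misfit comparison above rests on a one-way analysis of variance. A minimal sketch of the F statistic it computes, with hypothetical vertical-misfit values in micrometers (the Tukey post-hoc step that localizes the differences is omitted):

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across independent groups:
    between-group mean square over within-group mean square."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

    A large F (one group's mean far from the others relative to within-group scatter) is what drives the significant P values reported for the one-piece cast group.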

  19. Fourier Spectroscopy: A Simple Analysis Technique

    ERIC Educational Resources Information Center

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
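
    The point-by-point integration the article teaches amounts to evaluating the cosine transform of the interferogram one wavenumber at a time, exactly as a student would by hand. A numerical sketch (sampling interval and wavenumbers are illustrative):

```python
from math import cos, pi

def spectrum_from_interferogram(interferogram, dx, wavenumbers):
    """Point-by-point cosine transform of a sampled interferogram:
    B(k) = sum over samples of I(x) * cos(2*pi*k*x) * dx,
    evaluated at each requested wavenumber k."""
    out = []
    for k in wavenumbers:
        s = sum(i_x * cos(2 * pi * k * n * dx) for n, i_x in enumerate(interferogram))
        out.append(s * dx)
    return out
```

    For a monochromatic source the interferogram is itself a cosine, and the transform peaks only at the source's wavenumber, which is the result the manual exercise is meant to reveal.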

  20. Photomorphic analysis techniques: An interim spatial analysis using satellite remote sensor imagery and historical data

    NASA Technical Reports Server (NTRS)

    Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.

    1977-01-01

    The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.

  1. Chromatographic Techniques for Rare Earth Elements Analysis

    NASA Astrophysics Data System (ADS)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    Present-day capability in rare earth element (REE) analysis rests on the development of two kinds of instrumental techniques. The efficiency of spectroscopic methods has improved extraordinarily for the detection and determination of trace REEs in various materials. On the other hand, the determination of REEs very often depends on their preconcentration and separation, and chromatographic techniques are very powerful tools for the separation of REEs; coupled with sensitive detectors, they can fulfill many ambitious analytical tasks. Liquid chromatography is the most widely used technique, and different combinations of stationary and mobile phases can be used in ion-exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  2. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  3. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

    Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, sit at the top of the evidence-based practice hierarchy. They are important tools for drug approval, for formulating clinical protocols and guidelines, and for decision-making. However, this traditional technique yields only part of the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. On the market, regardless of the clinical condition under evaluation, many interventions are usually available and few of them have been studied head-to-head. This scenario precludes conclusions from being drawn about the full profile (e.g. efficacy and safety) of all interventions. The recent development of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over recent years this statistical tool has matured, with models available for all types of raw data, producing different pooled effect measures, using both frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how to conduct a network meta-analysis, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, assumptions and steps for performing the analysis. PMID:28503228
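
    The indirect-evidence idea can be illustrated with the simplest building block of such a network, a Bucher-style adjusted indirect comparison; the pooled effects and standard errors below are invented for illustration:

```python
# Trials compared B vs A and C vs A; an indirect estimate of C vs B follows
# by subtraction, and the variances of the two direct estimates add.
d_AB, se_AB = 0.50, 0.12   # pooled effect of B vs A (e.g., a log odds ratio)
d_AC, se_AC = 0.20, 0.15   # pooled effect of C vs A

d_BC = d_AC - d_AB                       # indirect effect of C vs B
se_BC = (se_AB**2 + se_AC**2) ** 0.5     # standard errors combine in quadrature
ci_BC = (d_BC - 1.96 * se_BC, d_BC + 1.96 * se_BC)   # 95% confidence interval
```

A full network meta-analysis generalizes this subtraction to every loop of the network within one model, combining such indirect estimates with any available direct evidence.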

  4. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    Progress in the development of system models and techniques for the formulation and evaluation of aircraft computer system effectiveness is reported. Topics covered include: analysis of functional dependence: a prototype software package, METAPHOR, developed to aid the evaluation of performability; and a comprehensive performability modeling and evaluation exercise involving the SIFT computer.

  5. Diffraction analysis of customized illumination technique

    NASA Astrophysics Data System (ADS)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to lead the production k1 factor towards below 0.35. Among them, a layer specific optimization of illumination mode, so-called customized illumination technique receives deep attentions from lithographers recently. A new approach for illumination customization based on diffraction spectrum analysis is suggested in this paper. Illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. Singular imaging property of individual diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of image in each domain, it was possible to achieve the customized shape of illumination. With the help from this technique, it was found that the layout change would not gives the change in the shape of customized illumination mode.

  6. Sex-based differences in lifting technique under increasing load conditions: A principal component analysis.

    PubMed

    Sheppard, P S; Stevenson, J M; Graham, R B

    2016-05-01

    The objective of the present study was to determine if there is a sex-based difference in lifting technique across increasing-load conditions. Eleven male and 14 female participants (n = 25) with no previous history of low back disorder participated in the study. Participants completed freestyle, symmetric lifts of a box with handles from the floor to a table positioned at 50% of their height for five trials under three load conditions (10%, 20%, and 30% of their individual maximum isometric back strength). Joint kinematic data for the ankle, knee, hip, and lumbar and thoracic spine were collected using a two-camera Optotrak motion capture system. Joint angles were calculated using a three-dimensional Euler rotation sequence. Principal component analysis (PCA) and single component reconstruction were applied to assess differences in lifting technique across the entire waveforms. Thirty-two PCs were retained from the five joints and three axes in accordance with the 90% trace criterion. Repeated-measures ANOVA with a mixed design revealed no significant effect of sex for any of the PCs. This is contrary to previous research that used discrete points on the lifting curve to analyze sex-based differences, but agrees with more recent research using more complex analysis techniques. There was a significant effect of load on lifting technique for five PCs of the lower limb (PC1 of ankle flexion, knee flexion, and knee adduction, as well as PC2 and PC3 of hip flexion) (p < 0.005). However, there was no significant effect of load on the thoracic and lumbar spine. It was concluded that when load is standardized to individual back strength characteristics, males and females adopted a similar lifting technique. In addition, as load increased male and female participants changed their lifting technique in a similar manner. Copyright © 2016. Published by Elsevier Ltd.
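
    The waveform-level PCA used in the study (mean-centering time-normalized joint-angle waveforms, then retaining principal components up to 90% of the trace) can be sketched as follows; the synthetic waveforms below merely stand in for the study's motion-capture data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in data: 25 "participants" x 101 time-normalized joint-angle samples
waveforms = np.linspace(0.0, 30.0, 101) + rng.normal(scale=2.0, size=(25, 101))

X = waveforms - waveforms.mean(axis=0)          # mean-center each time point
U, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / (s**2).sum()                 # variance fraction per PC
k = int(np.searchsorted(np.cumsum(explained), 0.90)) + 1   # 90% trace criterion
scores = X @ Vt[:k].T   # one row of PC scores per participant, ready for ANOVA
```

The retained `scores` matrix is what a repeated-measures ANOVA would then test for sex and load effects, as in the study.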

  7. Error analysis of multi-needle Langmuir probe measurement technique.

    PubMed

    Barjatya, Aroh; Merritt, William

    2018-04-01

    The multi-needle Langmuir probe (m-NLP) is a fairly new instrument technique that has flown on several recent sounding rockets and is slated to fly on a subset of the QB50 CubeSat constellation. This paper takes a fundamental look at the data analysis procedures used for this instrument to derive absolute electron density. Our calculations suggest that while the technique remains promising, the current data analysis procedures can easily produce errors of 50% or more. We present a simple data analysis adjustment that can reduce errors by at least a factor of five in typical operation.
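
    The density retrieval under scrutiny rests on the orbital-motion-limited (OML) relation for a cylindrical probe, in which the square of the collected electron current is linear in bias voltage, so density follows from the slope of I² versus V across the needles. A self-consistent sketch (the needle geometry and plasma parameters are hypothetical, not taken from the paper):

```python
import numpy as np

e, me = 1.602e-19, 9.109e-31           # electron charge (C) and mass (kg)
radius, length = 0.25e-3, 25e-3        # hypothetical needle dimensions (m)
A = 2 * np.pi * radius * length        # cylindrical collection area

def oml_current(V, ne, Te=0.1):
    """Electron saturation current of a cylinder in the OML regime.
    V is bias relative to plasma potential (volts); Te in eV."""
    I_th = ne * e * A * np.sqrt(e * Te / (2 * np.pi * me))   # thermal current
    return I_th * (2 / np.sqrt(np.pi)) * np.sqrt(1 + V / Te)

V = np.array([2.0, 3.0, 4.0, 5.0])     # four fixed needle biases (V)
I = oml_current(V, ne=1.0e11)          # synthetic measurements; ne in m^-3

# I^2 is linear in V; the density follows from the slope alone
slope = np.polyfit(V, I**2, 1)[0]
ne_est = np.sqrt(slope * np.pi**2 * me / (2 * e**3 * A**2))
```

Because the slope, not the absolute current, carries the density, the estimate is unaffected by a plasma-potential offset common to all needles; the paper's error analysis concerns how real probes depart from this idealized relation.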

  9. Statistical model to perform error analysis of curve fits of wind tunnel test data using the techniques of analysis of variance and regression analysis

    NASA Technical Reports Server (NTRS)

    Alston, D. W.

    1981-01-01

    The objective of this research was to design a statistical model that could perform an error analysis of curve fits of wind tunnel test data using analysis of variance and regression analysis techniques. Four related subproblems were defined, and by solving each of these a solution to the general research problem was obtained. The capabilities of the resulting statistical model are considered. A least-squares fit is used to determine the nature of the force, moment, and pressure data. The order of the curve fit is increased in order to remove the quadratic effect in the residuals. The analysis of variance is then used to determine the magnitude and effect of the error factor associated with the experimental data.
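
    Raising the fit order and checking whether the quadratic effect leaves the residuals amounts to an extra-sum-of-squares F-test; a sketch on synthetic data (the polynomial, noise level, and sweep below are invented, not from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 50)                  # stand-in for a tunnel sweep
y = 2.0 + 0.5 * x + 1.5 * x**2 + rng.normal(scale=0.05, size=x.size)

def rss(order):
    """Residual sum of squares of a least-squares polynomial fit."""
    resid = y - np.polyval(np.polyfit(x, y, order), x)
    return resid @ resid

# Extra-sum-of-squares F-test: does adding the quadratic term help?
rss1, rss2 = rss(1), rss(2)
f_stat = (rss1 - rss2) / (rss2 / (x.size - 3))  # 1 extra parameter, n-3 dof
```

A large `f_stat` says the linear fit leaves a systematic quadratic pattern in the residuals, i.e. the order must be raised, which is the decision rule the abstract describes.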

  10. Developing techniques for cause-responsibility analysis of occupational accidents.

    PubMed

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. The study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel to determine the responsible groups and their responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are well suited to occupational accident investigation and analysis, especially for producing a detailed list of tasks, responsibilities, and their rates, and are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Applications Of Binary Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) the human view direction is measured at TV frame rate while the subject's head remains freely movable; (2) industrial parts hanging on a moving conveyor are classified prior to spray painting by robot; and (3) in automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  12. Efficient geometric rectification techniques for spectral analysis algorithm

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented on iso-range and iso-Doppler lines, a curved grid format; this is known as the fan-shape effect. Resampling is therefore required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape geometry of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
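
    The two one-dimensional resampling passes can be illustrated on a toy grid; the curved-grid coordinates below are invented, whereas the actual algorithm derives them from the iso-range/iso-Doppler geometry:

```python
import numpy as np

# Stand-in "range-Doppler" image on a curved grid
src = np.outer(np.linspace(0.0, 1.0, 8), np.linspace(1.0, 2.0, 8))

# Pass 1: resample each row onto uniform cross-track sample positions.
# row_coords is a hypothetical mapping of pixel index to curved-grid coordinate.
row_coords = np.linspace(0.0, 7.0, 8) + 0.3 * np.linspace(0.0, 1.0, 8)
tmp = np.array([np.interp(np.arange(8), row_coords, row) for row in src])

# Pass 2: resample each column onto uniform along-track sample positions.
col_coords = np.linspace(0.0, 7.0, 8) + 0.2
out = np.array([np.interp(np.arange(8), col_coords, tmp[:, j])
                for j in range(8)]).T
```

Splitting the 2-D warp into two 1-D interpolation sweeps is what makes the rectification cheap enough for near-real-time processing.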

  13. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-Polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to undisturbed product. The applicability and cost effectiveness of both methods were considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Sensitivity analysis of hybrid thermoelastic techniques

    Treesearch

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  15. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-06-01

    In this research, we examine the Naval Sea Logistics Command's Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and the feasibility of program re-implementation. We conduct an analysis of this allowancing method's effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  16. Estimating the settling velocity of bioclastic sediment using common grain-size analysis techniques

    USGS Publications Warehouse

    Cuttler, Michael V. W.; Lowe, Ryan J.; Falter, James L.; Buscombe, Daniel D.

    2017-01-01

    Most techniques for estimating settling velocities of natural particles have been developed for siliciclastic sediments. Therefore, to understand how these techniques apply to bioclastic environments, measured settling velocities of bioclastic sedimentary deposits sampled from a nearshore fringing reef in Western Australia were compared with settling velocities calculated using results from several common grain-size analysis techniques (sieve, laser diffraction and image analysis) and established models. The effects of sediment density and shape were also examined using a range of density values and three different models of settling velocity. Sediment density was found to have a significant effect on calculated settling velocity, causing a range in normalized root-mean-square error of up to 28%, depending upon settling velocity model and grain-size method. Accounting for particle shape reduced errors in predicted settling velocity by 3% to 6% and removed any velocity-dependent bias, which is particularly important for the fastest settling fractions. When shape was accounted for and measured density was used, normalized root-mean-square errors were 4%, 10% and 18% for laser diffraction, sieve and image analysis, respectively. The results of this study show that established models of settling velocity that account for particle shape can be used to estimate settling velocity of irregularly shaped, sand-sized bioclastic sediments from sieve, laser diffraction, or image analysis-derived measures of grain size with a limited amount of error. Collectively, these findings will allow for grain-size data measured with different methods to be accurately converted to settling velocity for comparison. This will facilitate greater understanding of the hydraulic properties of bioclastic sediment which can help to increase our general knowledge of sediment dynamics in these environments.
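
    One published settling-velocity model of the kind compared in such work is the explicit formula of Ferguson and Church (2004), whose two constants absorb particle shape; it is shown here as an illustrative sketch (the density values are typical figures, not the study's measurements):

```python
import math

def settling_velocity(d, rho_s=2650.0, rho_f=1025.0, nu=1.0e-6,
                      C1=18.0, C2=1.0, g=9.81):
    """Ferguson & Church (2004) explicit settling-velocity model.

    d: grain diameter (m). C1, C2 absorb shape effects: roughly (18, 0.4)
    for smooth spheres, (18, 1.0) for natural sand, (24, 1.2) for very
    angular grains."""
    R = (rho_s - rho_f) / rho_f          # submerged specific gravity
    return R * g * d**2 / (C1 * nu + math.sqrt(0.75 * C2 * R * g * d**3))

# Lower density and greater angularity (typical of bioclastic carbonate grains)
# both slow settling relative to a quartz grain of the same sieve diameter.
w_quartz = settling_velocity(0.5e-3)
w_carbonate = settling_velocity(0.5e-3, rho_s=2400.0, C2=1.2)
```

This kind of shape-aware model is what the study found removes the velocity-dependent bias when converting grain-size measurements to settling velocity.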

  17. The effect of technique change on knee loads during sidestep cutting.

    PubMed

    Dempsey, Alasdair R; Lloyd, David G; Elliott, Bruce C; Steele, Julie R; Munro, Bridget J; Russo, Kylie A

    2007-10-01

    To identify the effect of modifying sidestep cutting technique on knee loads and to predict what impact such change would have on the risk of noncontact anterior cruciate ligament injury. A force platform and motion-analysis system were used to record ground-reaction forces and track the trajectories of markers on 15 healthy males performing sidestep cutting tasks using their normal technique and nine different imposed techniques. A kinematic and inverse-dynamic model was used to calculate the three-dimensional knee postures and moments. The imposed techniques of foot wide and torso leaning in the opposite direction to the cut resulted in increased peak valgus moments experienced in weight acceptance. Higher peak internal rotation moments were found for the foot wide and torso rotation in the opposite direction to the cut techniques. The foot rotated in technique resulted in lower mean flexion/extension moments, whereas the foot wide condition resulted in higher mean flexion/extension moments. The flexed knee, torso rotated in the opposite direction to the cut, and torso leaning in the same direction as the cut techniques had significantly more knee flexion at heel strike. Sidestep cutting technique had a significant effect on loads experienced at the knee. The techniques that produced higher valgus and internal rotation moments at the knee, such as foot wide, torso leaning in the opposite direction to the cut, and torso rotating in the opposite direction to the cut, may place an athlete at higher risk of injury because these knee loads have been shown to increase the strain on the anterior cruciate ligament. Training athletes to avoid such body positions may result in a reduced risk of noncontact anterior cruciate ligament injuries.

  18. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    NASA Technical Reports Server (NTRS)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  19. Statistical Evaluation of Time Series Analysis Techniques

    NASA Technical Reports Server (NTRS)

    Benignus, V. A.

    1973-01-01

    The performance of a modified version of NASA's multivariate spectrum analysis program is discussed. A multiple regression model was used to make the revisions. Performance improvements were documented and compared to the standard fast Fourier transform by Monte Carlo techniques.

  20. An Effective Technique for Enhancing an Intrauterine Catheter Fetal Electrocardiogram

    NASA Astrophysics Data System (ADS)

    Horner, Steven L.; Holls, William M.

    2003-12-01

    Physicians can obtain fetal heart rate, electrophysiological information, and uterine contraction activity for determining fetal status from an intrauterine catheter's electrocardiogram once the maternal electrocardiogram has been canceled. In addition, the intrauterine catheter would allow physicians to acquire fetal status with a single biosensor that is non-invasive to the fetus, as compared with the fetal scalp electrode and intrauterine pressure catheter currently used, which are invasive to the fetus. A real-time maternal electrocardiogram cancellation technique for the intrauterine catheter's electrocardiogram is discussed, along with an analysis of the method's effectiveness on synthesized and clinical data. The positive results from an original, detailed subjective and objective analysis of synthesized and clinical data clearly indicate that the maternal electrocardiogram cancellation method is effective. The resulting intrauterine catheter electrocardiogram, with the maternal electrocardiogram effectively canceled, could be used for determining fetal heart rate, fetal electrocardiographic electrophysiological information, and uterine contraction activity.
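
    The abstract does not spell out its cancellation algorithm, but the classic approach to maternal ECG removal is adaptive noise cancellation against a maternal reference lead, for example with an LMS filter; the sinusoids below are crude stand-ins for real ECG morphologies, purely for illustration:

```python
import numpy as np

n = 2000
t = np.arange(n)
maternal = np.sin(2 * np.pi * t / 80)       # stand-in maternal ECG component
fetal = 0.3 * np.sin(2 * np.pi * t / 35)    # weaker, faster fetal component
catheter = maternal + fetal                  # composite intrauterine signal
reference = np.roll(maternal, 3)             # maternal-only reference (e.g., a chest lead)

# LMS adaptive canceller: predict the maternal contribution from the reference,
# subtract it, and keep the prediction error as the fetal estimate.
mu, taps = 0.01, 8
w = np.zeros(taps)
out = np.zeros(n)
for i in range(taps, n):
    x = reference[i - taps:i][::-1]          # most recent reference samples
    e = catheter[i] - w @ x                  # error = catheter minus maternal estimate
    w += 2 * mu * e * x                      # LMS weight update
    out[i] = e                               # residual ~ fetal signal after convergence
```

After the filter converges, the residual tracks the fetal component rather than the maternal one, which is the behavior any effective cancellation scheme for this application must exhibit.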

  1. Recommended techniques for effective maintainability. A continuous improvement initiative of the NASA Reliability and Maintainability Steering Committee

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This manual presents a series of recommended techniques that can increase overall operational effectiveness of both flight and ground based NASA systems. It provides a set of tools that minimizes risk associated with: (1) restoring failed functions (both ground and flight based); (2) conducting complex and highly visible maintenance operations; and (3) sustaining a technical capability to support the NASA mission using aging equipment or facilities. It considers (1) program management - key elements of an effective maintainability effort; (2) design and development - techniques that have benefited previous programs; (3) analysis and test - quantitative and qualitative analysis processes and testing techniques; and (4) operations and operational design techniques that address NASA field experience. This document is a valuable resource for continuous improvement ideas in executing the systems development process in accordance with the NASA 'better, faster, smaller, and cheaper' goal without compromising safety.

  2. Application of pattern recognition techniques to crime analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  3. Maintenance Audit through Value Analysis Technique: A Case Study

    NASA Astrophysics Data System (ADS)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological change, and rising requirements for quality and service have forced a change in the design and application of maintenance, as well as in the way it is considered within managerial strategy. There are numerous maintenance activities that must be developed in a service company, and as a result the maintenance function as a whole may be outsourced. Nevertheless, delegating this work to specialized personnel does not exempt the company from responsibility; rather, it leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology is based on expert systems: by means of rules, the expert system uses the SMART weighting technique and value analysis to obtain the weightings between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper relate to the development of a maintenance audit in a service enterprise, where maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, within rule-based expert systems.
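
    The SMART step reduces to a normalized weighted sum of scores; the decision functions, weights, and section scores below are invented purely for illustration:

```python
# SMART weighting (illustrative): grade maintenance sections against weighted
# decision functions. All names and numbers are made up for this sketch.
weights = {"planning": 40, "execution": 30, "documentation": 20, "safety": 10}
total = sum(weights.values())
norm = {k: v / total for k, v in weights.items()}   # weights sum to 1

sections = {
    "electrical": {"planning": 7, "execution": 6, "documentation": 4, "safety": 8},
    "hvac":       {"planning": 5, "execution": 8, "documentation": 6, "safety": 7},
}
# Maintenance state per section = weighted sum of its function scores
state = {s: sum(norm[f] * score for f, score in fs.items())
         for s, fs in sections.items()}
```

In the audit described above, rules in the expert system would then aggregate these per-section states into the general maintenance state of the enterprise.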

  4. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples such as biological and environmental samples. In past decades, owing to their superparamagnetic properties, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various sample preparation processes for biological analysis. In this paper, recent advances in magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized, and the synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail, as is the characterization of magnetic materials. Moreover, the applications of magnetic separation techniques to the enrichment of proteins, nucleic acids, cells and bioactive compounds, and to the immobilization of enzymes, are described. Finally, existing problems and likely future trends of magnetic separation techniques for biological analysis are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. A comparison of solute-transport solution techniques and their effect on sensitivity analysis and inverse modeling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2001-01-01

    Five common numerical techniques for solving the advection-dispersion equation (finite difference, predictor corrector, total variation diminishing, method of characteristics, and modified method of characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using discrete, randomly distributed, homogeneous blocks of five sand types. This experimental model provides an opportunity to compare the solution techniques: the heterogeneous hydraulic-conductivity distribution of known structure can be accurately represented by a numerical model, and detailed measurements can be compared with simulated concentrations and total flow through the tank. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation given the different methods of simulating solute transport. The breakthrough curves show that simulated peak concentrations, even at very fine grid spacings, varied between the techniques because of different amounts of numerical dispersion. Sensitivity-analysis results revealed: (1) a high correlation between hydraulic conductivity and porosity given the concentration and flow observations used, so that both could not be estimated; and (2) that the breakthrough curve data did not provide enough information to estimate individual values of dispersivity for the five sands. This study demonstrates that the choice of assigned dispersivity and the amount of numerical dispersion present in the solution technique influence estimated hydraulic conductivity values to a surprising degree.
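
    The numerical dispersion at the heart of this comparison is easy to reproduce with the simplest of the five schemes, an explicit first-order upwind finite-difference solver for the 1-D advection-dispersion equation (the grid and parameters below are illustrative, not the sand-tank values):

```python
import numpy as np

nx = 201
dx = 1.0 / (nx - 1)                      # 1 m domain
v, D = 1.0e-2, 1.0e-5                    # velocity (m/s), dispersion coeff. (m^2/s)
dt = 0.4 * min(dx / v, dx**2 / (2 * D))  # respect CFL and diffusion limits

xg = np.linspace(0.0, 1.0, nx)
c = np.exp(-((xg - 0.2) ** 2) / (2 * 0.02 ** 2))   # initial tracer pulse
peak0 = c.max()

for _ in range(300):                     # advect the pulse ~0.6 m downstream
    adv = -v * (c[1:-1] - c[:-2]) / dx                   # first-order upwind
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2    # central dispersion
    c[1:-1] = c[1:-1] + dt * (adv + disp)

# Upwind differencing adds numerical dispersion of order v*dx*(1 - v*dt/dx)/2,
# here comparable to D itself, so the simulated peak is visibly flattened.
peak_ratio = c.max() / peak0
pos = xg[c.argmax()]
```

The peak decays far more than physical dispersion alone would predict, which is exactly the scheme-dependent effect the study shows can bias calibrated hydraulic-conductivity and dispersivity values.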

  6. Business Case Analysis: Continuous Integrated Logistics Support-Targeted Allowance Technique (CILS-TAT)

    DTIC Science & Technology

    2013-05-30

    In this research, we examine the Naval Sea Logistics Command’s Continuous Integrated Logistics Support-Targeted Allowancing Technique (CILS-TAT) and the feasibility of program re-implementation. We conduct an analysis of this allowancing method’s effectiveness onboard U.S. Navy Ballistic Missile Defense (BMD) ships, measure the costs associated with performing a CILS-TAT, and provide recommendations concerning possible improvements to the

  7. Performance analysis of clustering techniques over microarray data: A case study

    NASA Astrophysics Data System (ADS)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in statistical data analysis, and cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster-analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem, a grading approach is introduced over many clustering techniques to identify a stable technique. Because the grading depends on the characteristics of the dataset as well as on the validity indices, a two-stage grading approach is implemented. In this study the grading approach is applied to five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
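
    The rank-based grading across datasets can be sketched as follows; the validity-index values below are invented, and the resulting mean ranks are exactly the summary on which a Friedman/Nemenyi procedure operates:

```python
import numpy as np

# Illustrative grading: validity-index values (higher = better) for 5
# clustering techniques (rows) on 5 datasets (columns). Numbers are made up.
scores = np.array([
    [0.71, 0.64, 0.80, 0.58, 0.69],   # HSC
    [0.62, 0.60, 0.74, 0.55, 0.61],   # k-means
    [0.65, 0.58, 0.72, 0.57, 0.63],   # PAM
    [0.55, 0.52, 0.69, 0.50, 0.59],   # VQ
    [0.60, 0.55, 0.70, 0.53, 0.60],   # AGNES
])

# Rank techniques within each dataset (1 = best), then average ranks
# across datasets to grade overall stability.
ranks = scores.shape[0] - scores.argsort(axis=0).argsort(axis=0).astype(float)
mean_rank = ranks.mean(axis=1)
```

A Nemenyi post-hoc test then asks whether the gaps between these mean ranks exceed a critical difference, which is how the study certifies one technique as significantly better.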

  8. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    NASA Technical Reports Server (NTRS)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. It involves irradiating samples in an external neutron beam from a nuclear reactor while simultaneously counting the gamma rays produced in the sample by neutron capture. Neutron capture leaves nuclei in excited states that decay immediately to the ground state, emitting energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) it is nondestructive; (2) it can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron-capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) it is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  9. Metabolomic analysis using porcine skin: a pilot study of analytical techniques.

    PubMed

    Wu, Julie; Fiehn, Oliver; Armstrong, April W

    2014-06-15

    Metabolic byproducts serve as indicators of underlying chemical processes, and measuring this amplified output can provide valuable information on pathogenesis. Standardized techniques for metabolome extraction of skin samples are a critical foundation for this field but have not been developed. We sought to determine the optimal cell-lysis techniques for skin sample preparation and to compare GC-TOF-MS and UHPLC-QTOF-MS for metabolomic analysis. Using porcine skin samples, we pulverized the skin via various combinations of mechanical techniques for cell lysis. After extraction, the samples were subjected to GC-TOF-MS and/or UHPLC-QTOF-MS. Signal intensities from GC-TOF-MS analysis showed that ultrasonication (2.7×10⁷) was most effective for cell lysis when compared to mortar-and-pestle (2.6×10⁷), ball mill followed by ultrasonication (1.6×10⁷), mortar-and-pestle followed by ultrasonication (1.4×10⁷), and homogenization (trial 1: 8.4×10⁶; trial 2: 1.6×10⁷). Because of the similar signal intensities, ultrasonication and mortar-and-pestle were applied to additional samples and subjected to GC-TOF-MS and UHPLC-QTOF-MS. Ultrasonication yielded greater signal intensities than mortar-and-pestle for 92% of detected metabolites following GC-TOF-MS and for 68% of detected metabolites following UHPLC-QTOF-MS. Overall, ultrasonication is the preferred method for efficient cell lysis of skin tissue for both metabolomic platforms. With standardized sample preparation, metabolomic analysis of skin can serve as a powerful tool in elucidating underlying biological processes in dermatological conditions.

  10. Analysis technique for controlling system wavefront error with active/adaptive optics

    NASA Astrophysics Data System (ADS)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
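
The system-level WFE control step can be sketched as an ordinary least-squares problem on a hypothetical linear optics model (the influence matrix and disturbance below are random stand-ins, not SigFit output):

```python
import numpy as np

# Hypothetical linear optics model: the residual system WFE after
# commanding the actuators is w - A @ c, where the columns of A are
# actuator influence functions sampled at the wavefront grid points.
rng = np.random.default_rng(1)
n_pts, n_act = 200, 8
A = rng.normal(size=(n_pts, n_act))                  # influence functions
w = A @ rng.normal(size=n_act) + 0.01 * rng.normal(size=n_pts)  # disturbance

# least-squares actuator commands minimizing RMS residual WFE
c, *_ = np.linalg.lstsq(A, w, rcond=None)
rms_before = np.sqrt(np.mean(w ** 2))
rms_after = np.sqrt(np.mean((w - A @ c) ** 2))
print(rms_before, rms_after)   # residual limited by the unfittable noise
```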

  11. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault trees and event trees (used to model rare events), suffer from a number of weaknesses, including the static structure of the event causation, the lack of event-occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
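
The forward and backward quantitative reasoning mentioned above can be illustrated with a single (non-hierarchical) Bayes step on assumed numbers; the paper's hierarchical model extends this by placing priors on such probabilities and learning them across data sources.

```python
# Assumed numbers for illustration (not from the paper)
p_pre = 0.05          # P(precursor event)
p_acc_pre = 0.3       # P(accident | precursor)
p_acc_nopre = 0.001   # P(accident | no precursor)

# forward analysis: predicted accident probability
p_acc = p_pre * p_acc_pre + (1 - p_pre) * p_acc_nopre

# backward analysis: P(precursor | accident) by Bayes' rule
p_pre_acc = p_pre * p_acc_pre / p_acc
print(round(p_acc, 5), round(p_pre_acc, 2))   # -> 0.01595 0.94
```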

  12. Energy resolution improvement of CdTe detectors by using the principal component analysis technique

    NASA Astrophysics Data System (ADS)

    Alharbi, T.

    2018-02-01

    In this paper, we report on the application of the principal component analysis (PCA) technique for improving the γ-ray energy resolution of CdTe detectors. The PCA technique is used to estimate the amount of charge trapping reflected in the shape of each detector pulse, thereby correcting for the charge-trapping effect. The details of the method are described and the results obtained with a CdTe detector are shown. We achieved an energy resolution of 1.8% (FWHM) at 662 keV with full detection efficiency from a 1 mm thick CdTe detector, which gives an energy resolution of 4.5% (FWHM) with the standard pulse-processing method.
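
A sketch of the PCA idea on synthetic pulses, assuming a toy exponential pulse model in which a per-pulse "trapping" parameter modulates the decay (not the actual detector physics): the leading principal component then tracks that parameter, which is the quantity one would use to correct the energy estimate.

```python
import numpy as np

# synthetic pulses: exponential decay whose rate varies with a per-pulse
# "trapping" parameter (an assumed toy model)
rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 100)
trap = rng.uniform(0.5, 1.5, size=300)                 # trapping strength
pulses = np.exp(-np.outer(trap, t)) + 0.01 * rng.normal(size=(300, 100))

# PCA via SVD of the centered pulse matrix
X = pulses - pulses.mean(axis=0)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
score = X @ Vt[0]            # projection onto the leading component

# the leading-component score tracks the trapping parameter, so it can
# drive a per-pulse energy correction
r = np.corrcoef(score, trap)[0, 1]
print(abs(r))                # close to 1
```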

  13. Computer-delivered interventions for reducing alcohol consumption: meta-analysis and meta-regression using behaviour change techniques and theory.

    PubMed

    Black, Nicola; Mullan, Barbara; Sharpe, Louise

    2016-09-01

    The aim was to examine how behaviour change techniques (BCTs), theory and other characteristics affect the effectiveness of computer-delivered interventions (CDIs) to reduce alcohol consumption. Included were randomised studies with a primary aim of reducing alcohol consumption, which compared self-directed CDIs to assessment-only control groups. CDIs were coded for the use of 42 BCTs from an alcohol-specific taxonomy, the use of theory according to a theory coding scheme, and general characteristics such as length of the CDI. Effectiveness of CDIs was assessed using random-effects meta-analysis, and the association between the moderators and effect size was assessed using univariate and multivariate meta-regression. Ninety-three CDIs were included in at least one analysis and produced small, significant effects on five outcomes (d+ = 0.07-0.15). Larger effects occurred with some personal contact, provision of normative information or feedback on performance, prompting commitment or goal review, the social norms approach, and in samples with more women. Smaller effects occurred when information on the consequences of alcohol consumption was provided. These findings can be used to inform both intervention and theory development. Intervention developers should focus on including specific, effective techniques rather than on many techniques or more elaborate approaches.
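
The random-effects pooling behind effect sizes such as d+ = 0.07-0.15 can be sketched with the DerSimonian-Laird estimator (the per-study effects and variances below are assumed numbers for illustration):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooled effect via the DerSimonian-Laird tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# assumed per-study effects (Cohen's d) and variances, for illustration
d, se, tau2 = dersimonian_laird([0.30, 0.05, 0.50, 0.10],
                                [0.01, 0.01, 0.01, 0.01])
print(round(d, 3), round(se, 3), round(tau2, 3))
```

With heterogeneous study effects, tau² > 0 widens the pooled standard error relative to a fixed-effect analysis, which is why the random-effects model is the conservative default in reviews like this one.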

  14. Electrical Load Profile Analysis Using Clustering Techniques

    NASA Astrophysics Data System (ADS)

    Damayanti, R.; Abdullah, A. G.; Purnama, W.; Nandiyanto, A. B. D.

    2017-03-01

    Data mining is one of the data-processing techniques used to extract information from a set of stored data. Electricity consumption is recorded every day by the electrical company, usually at intervals of 15 or 30 minutes. This paper uses clustering, one of the data mining techniques, to analyse electrical load profiles during 2014. Three clustering methods were compared: K-Means (KM), Fuzzy C-Means (FCM), and K-Harmonic Means (KHM). The results show that KHM is the most appropriate method to classify the electrical load profiles. The optimum number of clusters is determined using the Davies-Bouldin index. By grouping the load profiles, demand-variation analysis and estimation of energy losses for groups of load profiles with similar patterns can be performed. From the groups of electrical load profiles, the cluster load factor and a range of cluster loss factors can be obtained, which helps to find the range of coefficient values for estimating energy losses without performing load-flow studies.
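
The load factor and loss factor mentioned above can be sketched for a single cluster-representative day; the loss-factor expression below is a common empirical approximation (0.3·LF + 0.7·LF²), not a formula taken from the paper.

```python
# hourly demand for one cluster-representative day (illustrative kW values)
profile = [30, 28, 27, 29, 35, 50, 70, 85, 90, 88, 86, 84,
           82, 80, 83, 87, 95, 100, 97, 88, 70, 55, 42, 34]

lf = sum(profile) / len(profile) / max(profile)   # load factor = avg / peak
lsf = 0.3 * lf + 0.7 * lf ** 2                    # empirical loss factor
print(round(lf, 3), round(lsf, 3))                # -> 0.673 0.519
```

The loss factor always falls between LF² (flat-load limit) and LF (peaked-load limit), which is what bounds the energy-loss coefficients without a load-flow study.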

  15. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    PubMed

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker for evaluating the effect of taxane- versus anthracycline-based chemotherapies on disease-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply, and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  16. Methodology for assessing the effectiveness of access management techniques : final report, September 14, 1998.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis (2) establish the mea...

  17. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real-time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation affords a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain if the linear temperature gradient provided additional discriminatory information. All the paint samples were able to be discriminated based on the distinctive thermal desorption plots afforded from this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool to the forensic paint examiner.

  18. Flash Infrared Thermography Contrast Data Analysis Technique

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
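
The link between contrast evolution and anomaly depth can be sketched with a textbook 1-D image-source model (a crude stand-in for the paper's calibrated flat-bottom-hole simulation): the time at which normalized contrast rises above a threshold scales roughly as depth squared over thermal diffusivity.

```python
import math

def contrast(t, d, alpha=1e-6, nterms=20):
    """Normalized contrast over a void at depth d (1-D image sources):
    C(t) = 2 * sum_n exp(-(n*d)^2 / (alpha * t))."""
    return 2.0 * sum(math.exp(-(n * d) ** 2 / (alpha * t))
                     for n in range(1, nterms + 1))

def onset_time(d, level=0.1):
    """First time the contrast exceeds `level` (geometric time scan)."""
    t = 1e-3
    while contrast(t, d) < level:
        t *= 1.01
    return t

# onset times for voids at 1 mm and 2 mm depth differ by roughly a
# factor of 4, i.e. t_onset ~ d^2 / alpha -- the depth-sizing relation
t1, t2 = onset_time(0.001), onset_time(0.002)
print(t2 / t1)   # close to 4
```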

  19. Low energy analysis techniques for CUORE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alduino, C.; Alfonso, K.; Artusa, D. R.

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. Here in this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  20. Low energy analysis techniques for CUORE

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2017-12-12

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. Here in this article, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0.

  1. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
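
A minimal sketch of the Modal Assurance Criterion step, with random mode shapes standing in for Finite Element Model output; tracking then amounts to matching each mode of one model to its highest-MAC partner in the other.

```python
import numpy as np

def mac(phi_a, phi_b):
    """Modal Assurance Criterion matrix between two mode-shape sets.

    Entry (i, j) is near 1 when mode i of model A and mode j of
    model B have the same shape, and near 0 when they are unrelated.
    """
    num = np.abs(phi_a.T @ phi_b) ** 2
    den = np.outer(np.einsum('ij,ij->j', phi_a, phi_a),
                   np.einsum('ij,ij->j', phi_b, phi_b))
    return num / den

# two "models": same shapes, but reordered and rescaled in the second
rng = np.random.default_rng(3)
phi_a = rng.normal(size=(20, 3))        # 20 DOFs, 3 modes
phi_b = 2.0 * phi_a[:, [1, 0, 2]]       # modes 0 and 1 swapped

m = mac(phi_a, phi_b)
tracked = m.argmax(axis=1)              # track each A-mode to a B-mode
print(tracked)                          # -> [1 0 2]
```

MAC is scale-invariant, which is why the rescaled second model still matches perfectly; the strain- and kinetic-energy indicators mentioned above supplement it when MAC alone is ambiguous.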

  2. [Development of sample pretreatment techniques-rapid detection coupling methods for food security analysis].

    PubMed

    Huang, Yichun; Ding, Weiwei; Zhang, Zhuomin; Li, Gongke

    2013-07-01

    This paper summarizes recent developments in rapid detection methods for food security, such as sensors, optical techniques, portable spectral analysis, enzyme-linked immunosorbent assays, and portable gas chromatographs. Additionally, the applications of these rapid detection methods coupled with sample pretreatment techniques in real food security analysis are reviewed. The coupling technique has the potential to provide a reference for establishing selective, precise, and quantitative rapid detection methods in food security analysis.

  3. Fusing modeling techniques to support domain analysis for reuse opportunities identification

    NASA Technical Reports Server (NTRS)

    Hall, Susan Main; Mcguire, Eileen

    1993-01-01

    Which are more useful to someone trying to understand the general design or high-level requirements of a system: functional modeling techniques or object-oriented graphical representations? For a recent domain analysis effort, the answer was a fusion of popular modeling techniques of both types. By using both functional and object-oriented techniques, the analysts involved were able to lean on their experience in function-oriented software development while taking advantage of the descriptive power available in object-oriented models. In addition, a base of familiar modeling methods permitted the group of mostly new domain analysts to learn the details of the domain analysis process while producing a quality product. This paper describes the background of this project and then provides a high-level definition of domain analysis. The majority of this paper focuses on the modeling method developed and utilized during this analysis effort.

  4. Meta-analysis of studies assessing the efficacy of projective techniques in discriminating child sexual abuse.

    PubMed

    West, M M

    1998-11-01

    This meta-analysis of 12 studies assesses the efficacy of projective techniques in discriminating between sexually abused and nonsexually abused children. A literature search was conducted to identify published studies that used projective instruments with sexually abused children; those reporting statistics that allowed an effect size to be calculated were included in the meta-analysis. Twelve studies fit the criteria. The projectives reviewed include the Rorschach, the Hand Test, the Thematic Apperception Test (TAT), the Kinetic Family Drawings, Human Figure Drawings, Draw Your Favorite Kind of Day, The Rosebush: A Visualization Strategy, and the House-Tree-Person. The results of this analysis gave an overall effect size of d = .81, which is a large effect. Six studies included only a norm group of nondistressed, nonabused children along with the sexual abuse group; the average effect size was d = .87, which is impressive. Six studies did include a clinical group of distressed, nonsexually abused subjects, and the effect size lowered to d = .76, a medium to large effect. This indicates that projective instruments can discriminate distressed children from nondistressed subjects quite well. In the studies that included a clinical group of distressed children who were not sexually abused, the lower effect size indicates that the instruments were less able to discriminate the type of distress. This meta-analysis gives evidence that projective techniques can discriminate between children who have been sexually abused and those who were not. However, further research designed to include clinical groups of distressed children is needed to determine how well projectives can discriminate the type of distress.
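
The effect sizes quoted above are standardized mean differences; a minimal Cohen's d computation with pooled standard deviation (the group statistics are assumed numbers, not data from the reviewed studies):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference with pooled standard deviation."""
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                   / (n1 + n2 - 2))
    return (mean1 - mean2) / sp

# assumed group statistics: abused vs. comparison group on some scale
d = cohens_d(12.0, 4.0, 30, 8.8, 4.0, 30)
print(round(d, 3))   # -> 0.8, a "large" effect by Cohen's convention
```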

  5. BATSE analysis techniques for probing the GRB spatial and luminosity distributions

    NASA Technical Reports Server (NTRS)

    Hakkila, Jon; Meegan, Charles A.

    1992-01-01

    The Burst And Transient Source Experiment (BATSE) has measured homogeneity and isotropy parameters from an increasingly large sample of observed gamma-ray bursts (GRBs), while also maintaining a summary of the way in which the sky has been sampled. Measurement of both is necessary for any statistical study of the BATSE data, as they take into account the most serious observational selection effects known in the study of GRBs: beam-smearing and inhomogeneous, anisotropic sky sampling. Knowledge of these effects is important to the analysis of GRB angular and intensity distributions. In addition to determining whether the bursts are local, it is hoped that analysis of such distributions will allow boundaries to be placed on the true GRB spatial distribution and luminosity function. The technique for studying GRB spatial and luminosity distributions is direct: results of BATSE analyses are compared to Monte Carlo models parameterized by a variety of spatial and luminosity characteristics.
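
One classic statistic behind such homogeneity measurements is ⟨V/Vmax⟩, which equals 0.5 for a homogeneous population; a toy check with a synthetic homogeneous burst sample (not BATSE data):

```python
import random

# V/Vmax homogeneity test: for each burst, V/Vmax = (Cmin/Cp)**1.5,
# with Cp the peak count rate and Cmin the detection threshold.
# A homogeneous (uniform-in-Euclidean-volume) population gives
# <V/Vmax> = 0.5; deviations signal inhomogeneity.
random.seed(0)
cmin = 1.0
peaks = []
for _ in range(20000):
    u = 1.0 - random.random()               # u in (0, 1]
    peaks.append(cmin * u ** (-2.0 / 3.0))  # P(>C) ~ C^-1.5 (homogeneous)
v_vmax = sum((cmin / cp) ** 1.5 for cp in peaks) / len(peaks)
print(round(v_vmax, 2))   # close to 0.5 for a homogeneous population
```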

  6. Evaluating the Effect of Three Water Management Techniques on Tomato Crop.

    PubMed

    Elnesr, Mohammad Nabil; Alazba, Abdurrahman Ali; Zein El-Abedein, Assem Ibrahim; El-Adl, Mahmoud Maher

    2015-01-01

    The effects of three water management techniques were evaluated on subsurface drip-irrigated tomatoes. The three techniques were intermittent flow (3 pulses), the dual-lateral drip system (two lateral lines per row, at 15 and 25 cm below the soil surface), and a physical barrier (buried 30 cm below the soil surface). Field experiments were established for two successive seasons. Water movement in soil was monitored using continuously logging capacitance probes up to 60 cm depth. The results showed that the dual-lateral technique increased the yield by up to 50% and water use efficiency by up to 54%, while the intermittent application improved some of the quality measures (fruit size, TSS, and vitamin C) but not the quantity of the yield, which decreased in one season and was unaffected in the other. The physical barrier had no significant effect on any of the important growth measures. The soil-water patterns showed that the dual-lateral method led to a uniform wetting pattern with depth up to 45 cm, the physical barrier appeared to increase lateral and upward water movement, and the intermittent application kept the wetting pattern at a higher moisture level for a longer time. The cost analysis also showed that the economical treatments were the dual-lateral followed by the intermittent technique, while the physical barrier is not economical. The study recommends researching the effect of the dual-lateral method on root growth and performance. The intermittent application may be recommended to improve tomato quality but not quantity. The physical barrier is not recommended except in highly permeable soils.

  7. Optimization Techniques for Analysis of Biological and Social Networks

    DTIC Science & Technology

    2012-03-28

    analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine...alternative mathematical programming formulations, their theoretical analysis, the development of exact algorithms, and heuristics. Originally, clusters...systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational

  8. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
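
The probabilistic idea can be sketched by Monte Carlo propagation of input uncertainties through a toy power-capability model (the model structure and all numbers below are assumptions, not the SPACE model):

```python
import random

# Assumed toy model: capability = area * solar_flux * efficiency - losses
random.seed(1)
samples = []
for _ in range(10000):
    area = random.gauss(100.0, 2.0)      # array area, m^2
    eff = random.gauss(0.14, 0.005)      # conversion efficiency
    losses = random.gauss(1.0, 0.2)      # distribution losses, kW
    samples.append(area * 1.36 * eff - losses)   # 1.36 kW/m^2 solar flux

samples.sort()
mean = sum(samples) / len(samples)
p05 = samples[int(0.05 * len(samples))]  # 5th-percentile capability
print(round(mean, 1), round(p05, 1))
```

A deterministic run would report only the single mean-input value; the percentile is the extra information a probabilistic analysis provides.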

  9. The Recoverability of P-Technique Factor Analysis

    ERIC Educational Resources Information Center

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  10. Techniques for the Analysis of Human Movement.

    ERIC Educational Resources Information Center

    Grieve, D. W.; And Others

    This book presents the major analytical techniques that may be used in the appraisal of human movement. Chapter 1 is devoted to the photographic analysis of movement with particular emphasis on cine filming. Cine film may be taken with little or no restriction on the performer's range of movement; information on the film is permanent and…

  11. Numerical modeling techniques for flood analysis

    NASA Astrophysics Data System (ADS)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river-modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grid cells, were found, which can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models have recently been developed for open-channel flows but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed by considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings of the causes and effects of flooding.

  12. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    NASA Astrophysics Data System (ADS)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.

  13. Optical Kerr effect in graphene: Theoretical analysis of the optical heterodyne detection technique

    NASA Astrophysics Data System (ADS)

    Savostianova, N. A.; Mikhailov, S. A.

    2018-04-01

    Graphene is an atomically thin two-dimensional material demonstrating strong optical nonlinearities, including harmonics generation, four-wave mixing, Kerr, and other nonlinear effects. In this paper we theoretically analyze the optical heterodyne detection (OHD) technique of measuring the optical Kerr effect (OKE) in two-dimensional crystals and show how to relate the quantities measured in such experiments with components of the third-order conductivity tensor σ^(3)_{αβγδ}(ω1, ω2, ω3) of the two-dimensional crystal. Using results of a recently developed quantum theory of the third-order nonlinear electrodynamic response of graphene, we analyze the frequency, charge carrier density, temperature, and other dependencies of the OHD-OKE response of this material. We compare our results with a recent OHD-OKE experiment in graphene and find good agreement between the theory and experiment.

  14. A Sensitivity Analysis of Circular Error Probable Approximation Techniques

    DTIC Science & Technology

    1992-03-01

    SENSITIVITY ANALYSIS OF CIRCULAR ERROR PROBABLE APPROXIMATION TECHNIQUES THESIS Presented to the Faculty of the School of Engineering of the Air Force...programming skills. Major Paul Auclair patiently advised me in this endeavor, and Major Andy Howell added numerous insightful contributions. I thank my...techniques. The two most accurate techniques require numerical integration and can take several hours to run on a personal computer [2:1-2,4-6]. Some
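    The trade-off the thesis examines, cheap closed-form CEP approximations versus expensive numerical integration, can be illustrated with a Monte Carlo reference value. The dispersions and the 0.5887(σx + σy) formula below are a common textbook approximation, not the thesis's own techniques:

```python
import numpy as np

rng = np.random.default_rng(42)
sigma_x, sigma_y = 3.0, 1.5   # hypothetical downrange / crossrange dispersions

# Monte Carlo reference: the median miss distance IS the CEP by definition.
samples = rng.normal(0.0, [sigma_x, sigma_y], size=(200_000, 2))
cep_mc = np.median(np.hypot(samples[:, 0], samples[:, 1]))

# A common closed-form approximation for moderately elliptical dispersion:
cep_approx = 0.5887 * (sigma_x + sigma_y)

print(round(cep_mc, 2), round(cep_approx, 2))
```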

  15. Analysis of Hospital Processes with Process Mining Techniques.

    PubMed

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze the execution of processes. Moreover, we illustrate its application for making clinical and administrative decisions in the management of hospital activities.
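    A first step shared by most process mining analyses of such event logs is grouping events into per-case traces and counting trace variants. The sketch below uses a hypothetical hospital log, not data from the HIS described:

```python
from collections import Counter

# Hypothetical hospital event log: (case_id, activity), already time-ordered.
event_log = [
    (1, "admit"), (1, "triage"), (1, "treat"), (1, "discharge"),
    (2, "admit"), (2, "triage"), (2, "discharge"),
    (3, "admit"), (3, "triage"), (3, "treat"), (3, "discharge"),
]

# Group events into traces: the ordered activity sequence of each case.
traces = {}
for case_id, activity in event_log:
    traces.setdefault(case_id, []).append(activity)

# Count trace variants -- the starting point of most process discovery methods.
variants = Counter(tuple(t) for t in traces.values())
for variant, count in variants.most_common():
    print(" -> ".join(variant), count)
```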

  16. Some failure modes and analysis techniques for terrestrial solar cell modules

    NASA Technical Reports Server (NTRS)

    Shumka, A.; Stern, K. H.

    1978-01-01

    Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance measurements, laser scanning techniques, and the SEM.

  17. Reliability analysis of a robotic system using hybridized technique

    NASA Astrophysics Data System (ADS)

    Kumar, Naveen; Komal; Lather, J. S.

    2017-09-01

    In this manuscript, the reliability of a robotic system is analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the uncertainties involved is done through data fuzzification using triangular fuzzy numbers with known spreads, as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, the computed reliability parameters have wide prediction ranges. The decision-maker therefore cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique in which fuzzy set theory quantifies the uncertainties, a fault tree models the system, the lambda-tau method formulates mathematical expressions for the failure/repair rates of the system, and a genetic algorithm solves the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., they have constant failure rates. Sensitivity analysis is also performed, and the impact on the system mean time between failures (MTBF) of varying other reliability parameters is addressed. Based on the analysis, some influential suggestions are given to improve the system performance.
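    The data fuzzification step described above can be illustrated with triangular fuzzy numbers and alpha-cuts. The failure rate and spread below are hypothetical, and the sketch covers only the fuzzification of a single exponential component, not the full fault-tree/genetic-algorithm hybrid:

```python
# Minimal sketch of fuzzified reliability data: a triangular fuzzy failure
# rate (per hour) with a +/-15% spread, as a system expert might specify.
def alpha_cut(tfn, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tfn
    return (low + alpha * (mode - low), high - alpha * (high - mode))

lam = (0.85e-4, 1.0e-4, 1.15e-4)   # hypothetical failure rate, exponential model

# MTBF = 1/lambda is monotone decreasing, so the interval endpoints swap.
for alpha in (0.0, 0.5, 1.0):
    lo, hi = alpha_cut(lam, alpha)
    print(alpha, round(1 / hi), round(1 / lo))   # MTBF interval in hours
```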

  18. Effect of implant angulation and impression technique on impressions of NobelActive implants.

    PubMed

    Alexander Hazboun, Gillian Brewer; Masri, Radi; Romberg, Elaine; Kempler, Joanna; Driscoll, Carl F

    2015-05-01

    How the configuration of the NobelActive internal conical connection affects implant impressions is uncertain. The purpose of this study was to measure the effect in vitro of closed and open tray impression techniques for NobelActive implants placed at various angulations. Six NobelActive implants were placed in a master maxillary cast as follows: 0 degrees of angulation to a line drawn perpendicular to the occlusal plane in the first molar area, 15 degrees of angulation to a line drawn perpendicular to the occlusal plane in the first premolar area, and 30 degrees of angulation to a line drawn perpendicular to the occlusal plane in the lateral incisor area. Twelve open tray and 12 closed tray impressions were made. Occlusal, lateral, and frontal view photographs of the resulting casts were used to measure the linear and angular displacement of implant analogs. Statistical analysis was performed with a factorial analysis of variance (ANOVA), followed by the Tukey HSD test (α=.05). No significant difference was found in the impressions made of NobelActive implants with the open or closed tray technique (linear displacement: F=0.93, P=.34; angular displacement: F=2.09, P=.15). In addition, implant angulation (0, 15, or 30 degrees) had no effect on the linear or angular displacement of impressions (linear displacement: F=2.72, P=.07; angular displacement: F=0.86, P=.43). Finally, no significant interaction was found between impression technique and implant angulation on NobelActive implants (F=0.25, P=.77; F=1.60, P=.20). Within the limitations of this study, impression technique (open vs closed tray) and implant angulation (0, 15, and 30 degrees) had no significant effect on in vitro impressions of NobelActive implants. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  19. Analysis techniques for tracer studies of oxidation. M. S. Thesis Final Report

    NASA Technical Reports Server (NTRS)

    Basu, S. N.

    1984-01-01

    Analysis techniques to obtain quantitative diffusion data from tracer concentration profiles were developed. Mass balance ideas were applied to determine the mechanism of oxide growth and to separate the fraction of inward and outward growth of oxide scales. The process of inward oxygen diffusion with exchange was theoretically modelled and the effect of lattice diffusivity, grain boundary diffusivity and grain size on the tracer concentration profile was studied. The development of the tracer concentration profile in a growing oxide scale was simulated. The double oxidation technique was applied to a FeCrAl-Zr alloy using O-18 as a tracer. SIMS was used to obtain the tracer concentration profile. The formation of lacey oxide on the alloy was discussed. Careful consideration was given to the quality of data required to obtain quantitative information.
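    A simplified version of the tracer concentration profile (pure lattice diffusion into a semi-infinite scale, ignoring the grain-boundary exchange the thesis models) follows the complementary error function; the diffusivity and anneal time below are hypothetical:

```python
import math

# Semi-infinite tracer profile for inward diffusion with a constant surface
# source, normalized so the surface concentration is 1:
#   C(x, t) = erfc( x / (2 * sqrt(D * t)) )
def tracer_profile(x_m, D_m2s, t_s):
    return math.erfc(x_m / (2.0 * math.sqrt(D_m2s * t_s)))

D = 1e-19          # hypothetical effective oxygen diffusivity, m^2/s
t = 24 * 3600.0    # hypothetical 24 h second-stage (O-18) oxidation
for x_nm in (0, 50, 100, 200):
    print(x_nm, round(tracer_profile(x_nm * 1e-9, D, t), 3))
```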

  20. Three Techniques for Task Analysis: Examples from the Nuclear Utilities.

    ERIC Educational Resources Information Center

    Carlisle, Kenneth E.

    1984-01-01

    Discusses three task analysis techniques utilized at the Palo Verde Nuclear Generating Station to review training programs: analysis of (1) job positions, (2) procedures, and (3) instructional presentations. All of these include task breakdown, relationship determination, and task restructuring. (MBR)

  1. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    USGS Publications Warehouse

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.

  2. Development of techniques for the analysis of isoflavones in soy foods and nutraceuticals.

    PubMed

    Dentith, Susan; Lockwood, Brian

    2008-05-01

    For over 20 years, soy isoflavones have been investigated for their ability to prevent a wide range of cancers, cardiovascular problems, and numerous other disease states. This research is underpinned by the ability of researchers to analyse isoflavones in various forms in a range of raw materials and biological fluids. This review summarizes the techniques recently used in their analysis. The speed of high-performance liquid chromatography analysis has been improved, allowing analysis of more samples, and the increasing sensitivity of detection techniques allows quantification of isoflavones down to nanomoles-per-litre levels in biological fluids. The combination of high-performance liquid chromatography with immunoassay has allowed identification and estimation of low-level soy isoflavones. The use of soy isoflavone supplements has been shown to increase their circulating levels in plasma and urine, aiding investigation of their biological effects. The significance of the metabolite equol has spurred research into new areas, and recently the specific enantiomers have been studied. High-performance liquid chromatography, capillary electrophoresis and gas chromatography are widely used with a range of detection systems. Increasingly, immunoassay is being used because of its high sensitivity and low cost.

  3. NUMERICAL ANALYSIS TECHNIQUE USING THE STATISTICAL ENERGY ANALYSIS METHOD CONCERNING THE BLASTING NOISE REDUCTION BY THE SOUND INSULATION DOOR USED IN TUNNEL CONSTRUCTIONS

    NASA Astrophysics Data System (ADS)

    Ishida, Shigeki; Mori, Atsuo; Shinji, Masato

    The main method of reducing the blasting noise which occurs in a tunnel under construction is to install a sound insulation door in the tunnel. However, a numerical analysis technique that accurately predicts the transmission loss of the sound insulation door has not been established. In this study, we measured the blasting noise and the vibration of the sound insulation door in a tunnel during blasting, and analyzed the door's acoustic characteristics from the measurements. In addition, we reproduced the noise reduction effect of the sound insulation door with the statistical energy analysis method and confirmed that numerical simulation is possible by this procedure.

  4. Micropowder collecting technique for stable isotope analysis of carbonates.

    PubMed

    Sakai, Saburo; Kodan, Tsuyoshi

    2011-05-15

    Micromilling is a conventional technique used in the analysis of the isotopic composition of geological materials, which improves the spatial resolution of sample collection for analysis. However, a problem still remains concerning the recovery ratio of the milled sample. We constructed a simple apparatus consisting of a vacuum pump, a sintered metal filter, an electrically conductive rubber stopper, and a stainless steel tube for transferring the milled powder into a reaction vial. In our preliminary experiments on carbonate powder, we achieved a rapid recovery of 5 to 100 µg of carbonate with a high recovery ratio (>90%). This technique shortens the sample preparation time, improves the recovery ratio, and homogenizes the sample quantity, which, in turn, improves the analytical reproducibility. Copyright © 2011 John Wiley & Sons, Ltd.

  5. Numerical analysis of thermal drilling technique on titanium sheet metal

    NASA Astrophysics Data System (ADS)

    Kumar, R.; Hynes, N. Rajesh Jesudoss

    2018-05-01

    Thermal drilling is a technique used for drilling sheet metal in various applications. It involves rotating a conical tool at high speed to drill the sheet metal, forming a hole with a bush below the surface of the sheet metal. This article investigates finite element analysis of thermal drilling of Ti6Al4V alloy sheet metal. The analysis was carried out using DEFORM-3D simulation software to simulate the performance characteristics of the thermal drilling technique. Because high-temperature deformation contributes strongly to this technique, output performances that are difficult to measure experimentally can be successfully obtained by the finite element method. Therefore, modeling and simulation of thermal drilling is an essential tool to predict the strain rate, stress distribution, and temperature of the workpiece.

  6. Flow Injection Technique for Biochemical Analysis with Chemiluminescence Detection in Acidic Media

    PubMed Central

    Chen, Jing; Fang, Yanjun

    2007-01-01

    A review with 90 references is presented to show the development of acidic chemiluminescence methods for biochemical analysis using the flow injection technique over the last 10 years. A brief discussion of both chemiluminescence and the flow injection technique is given. The proposed methods for biochemical analysis are described and compared according to the chemiluminescence system used.

  7. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of the induced seismicity of geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. For the design of a modern geothermal system to obtain the largest efficiency while remaining acceptable from the social point of view, it must be possible to manage the system so that potential impacts are reduced in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground motion levels over different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate and the propagation medium properties are not constant with time. We use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average the peak ground motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced in the hazard integral.
We applied the proposed
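    The non-homogeneous recurrence model described above can be sketched, under a Poisson assumption with window-by-window rate and b-value estimates; all numerical values below are hypothetical:

```python
import math

# Poisson exceedance probability with a time-varying Gutenberg-Richter model:
# rate(t) and b(t) are re-estimated for each monitoring window.
def p_exceed(rate_per_day, b_value, mag, mag_min, window_days):
    """P(at least one event with M >= mag during the window)."""
    rate_m = rate_per_day * 10.0 ** (-b_value * (mag - mag_min))
    return 1.0 - math.exp(-rate_m * window_days)

# Stimulation raises the rate and lowers b (relatively more large events),
# so the exceedance probability for the same magnitude threshold grows.
print(round(p_exceed(rate_per_day=2.0, b_value=1.2, mag=2.5, mag_min=0.5, window_days=7), 3))
print(round(p_exceed(rate_per_day=6.0, b_value=0.9, mag=2.5, mag_min=0.5, window_days=7), 3))
```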

  8. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    EPA Science Inventory

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (puc 19 plasmid from E. coli). DN...

  9. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V.

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations, and gas extraction complexes belong to the energy-intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place at pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development, and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of a failure's occurrence and development, to assess its consequences and to give recommendations to prevent it. Besides investigation of failure cases, numerical simulation techniques play an important role in the treatment of diagnostic results for the objects and in further construction of mathematical prognostic simulations of object behavior in the period between two inspections. In solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, the mechanics of continuous media, chemical macro-kinetics and optimization techniques are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated against calculations of simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that in the long years of work there has been established a fruitful and

  10. Optical Design And Analysis Of Carbon Dioxide Laser Fusion Systems Using Interferometry And Fast Fourier Transform Techniques

    NASA Astrophysics Data System (ADS)

    Viswanathan, V. K.

    1980-11-01

    The optical design and analysis of the LASL carbon dioxide laser fusion systems required the use of techniques that are quite different from the methods currently used in conventional optical design problems. The necessity for this is explored, and the method that has been successfully used at Los Alamos to understand these systems is discussed with examples. This method involves characterizing the various optical components in their mounts by a Zernike polynomial set and using fast Fourier transform techniques to propagate the beam, taking into account diffraction and other nonlinear effects that occur in these types of systems. The various programs used for analysis are briefly discussed.
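    The core FFT propagation step described above can be sketched as an angular-spectrum method: transform the field, apply the free-space transfer function, and transform back. Grid size, wavelength, and beam parameters below are illustrative, not those of the LASL systems, and the nonlinear effects mentioned in the abstract are omitted:

```python
import numpy as np

# Angular-spectrum (FFT) propagation of a scalar beam over distance z.
n = 256
dx = 50e-6                 # grid spacing, m (hypothetical)
wavelength = 10.6e-6       # CO2 laser wavelength
z = 1.0                    # propagation distance, m

x = (np.arange(n) - n // 2) * dx
X, Y = np.meshgrid(x, x)
field = np.exp(-(X**2 + Y**2) / (2e-3) ** 2)   # Gaussian beam, 2 mm waist

# Free-space transfer function H = exp(i * kz * z) in the frequency domain.
fx = np.fft.fftfreq(n, dx)
FX, FY = np.meshgrid(fx, fx)
kz = 2 * np.pi * np.sqrt(np.maximum(0.0, wavelength**-2 - FX**2 - FY**2))
transfer = np.exp(1j * kz * z)

propagated = np.fft.ifft2(np.fft.fft2(field) * transfer)

# The propagating-wave transfer function is unitary, so power is conserved.
print(round(np.sum(np.abs(propagated) ** 2) / np.sum(np.abs(field) ** 2), 3))
```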

  11. Using Job Analysis Techniques to Understand Training Needs for Promotores de Salud.

    PubMed

    Ospina, Javier H; Langford, Toshiko A; Henry, Kimberly L; Nelson, Tristan Q

    2018-04-01

    Despite the value of community health worker programs, such as Promotores de Salud, for addressing health disparities in the Latino community, little consensus has been reached to formally define the unique roles and duties associated with the job, thereby creating unique job training challenges. Understanding the job tasks and worker attributes central to this work is a critical first step for developing the training and evaluation systems of promotores programs. Here, we present the process and findings of a job analysis conducted for promotores working for Planned Parenthood. We employed a systematic approach, the combination job analysis method, to define the job in terms of its work and worker requirements, identifying key job tasks, as well as the worker attributes necessary to effectively perform them. Our results suggest that the promotores' job encompasses a broad range of activities and requires an equally broad range of personal characteristics to perform. These results played an important role in the development of our training and evaluation protocols. In this article, we introduce the technique of job analysis, provide an overview of the results from our own application of this technique, and discuss how these findings can be used to inform a training and performance evaluation system. This article provides a template for other organizations implementing similar community health worker programs and illustrates the value of conducting a job analysis for clarifying job roles, developing and evaluating job training materials, and selecting qualified job candidates.

  12. 48 CFR 215.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... reliability of its estimating and accounting systems. [63 FR 55040, Oct. 14, 1998, as amended at 71 FR 69494... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION...

  13. [Surgical renal biopsies: technique, effectiveness and complications].

    PubMed

    Pinsach Elías, L; Blasco Casares, F J; Ibarz Servió, L; Valero Milián, J; Areal Calama, J; Bucar Terrades, S; Saladié Roig, J M

    1991-01-01

    Retrospective study of 140 renal surgical biopsies (RSB) performed over the past 4 years in our unit. The technique's effectiveness and morbidity are emphasized, and the surgical technique and type of anaesthesia are described. The sample obtained was sufficient for analysis in 100% of cases, and a diagnosis was reached in 98.5%. Thirty-nine patients (27.8%) presented complications, 13 (9.2%) of which were directly related to the surgical technique. No case required blood transfusion and no deaths were reported. The type of anaesthesia used was: local plus sedation in 104 (74.2%) cases, rachianaesthesia in 10 (7.1%) and general in 26 (18.5%). The same approach was used in all patients: minimal subcostal lumbotomy, using Wilde's forceps to obtain the samples. It is believed that RSB is a highly effective, low-mortality procedure, easy and quick to perform, and suitable for selected patients.

  14. A Cost-Effectiveness/Benefit Analysis Model for Postsecondary Vocational Programs. Technical Report.

    ERIC Educational Resources Information Center

    Kim, Jin Eun

    A cost-effectiveness/benefit analysis is defined as a technique for measuring the outputs of existing and new programs in relation to their specified program objectives, against the costs of those programs. In terms of its specific use, the technique is conceptualized as a systems analysis method, an evaluation method, and a planning tool for…

  15. Thermal Analysis of Brazing Seal and Sterilizing Technique to Break Contamination Chain for Mars Sample Return

    NASA Technical Reports Server (NTRS)

    Bao, Xiaoqi; Badescu, Mircea; Bar-Cohen, Yoseph

    2015-01-01

    The potential return of Martian samples to Earth for extensive analysis is of great interest to the planetary science community. It is important to ensure that the mission would securely contain any microbes that may possibly exist on Mars, so that they could not cause any adverse effects on Earth's environment. A brazing sealing and sterilizing technique has been proposed to break the Mars-to-Earth contamination chain. Thermal analysis of the brazing process was conducted for several conceptual designs that apply the technique. Controlling the temperature rise of the Martian samples is a challenge. The temperature profiles of the Martian samples being sealed in the container were predicted by finite element thermal models. The results show that the sealing and sterilization process can be controlled such that the samples' temperature is maintained below the potentially required level, and that the brazing technique is a feasible approach to breaking the contamination chain.

  16. An assessment of finite-element modeling techniques for thick-solid/thin-shell joints analysis

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Androlake, S. G.

    1993-01-01

    The subject of finite-element modeling has long been of critical importance to the practicing designer/analyst, who is often faced with obtaining an accurate and cost-effective structural analysis of a particular design; typically, these two goals are in conflict. The purpose here is to discuss finite-element modeling of solid/shell connections (joints), which are significant for the practicing modeler. Several approaches are currently in use, but various assumptions frequently restrict their use. Techniques currently used in practical applications were tested, especially to determine which technique is best suited for the computer-aided design (CAD) environment. Some basic thoughts regarding each technique are also discussed. Based on the results, suggestions are given for obtaining reliable results in geometrically complex joints where the deformation and stress behavior are complicated.

  17. Effect of Instrumentation Techniques and Preparation Taper on Apical Extrusion of Bacteria.

    PubMed

    Aksel, Hacer; Küçükkaya Eren, Selen; Çakar, Aslı; Serper, Ahmet; Özkuyumcu, Cumhur; Azim, Adham A

    2017-06-01

    The aim of this in vitro study was to evaluate the effects of different root canal instrumentation techniques and preparation tapers on the amount of apically extruded bacteria. The root canals of 98 extracted human mandibular incisors were contaminated with Enterococcus faecalis suspension. After incubation at 37°C for 24 hours, the root canals were instrumented with K3 rotary files in a crown-down (CD) or full-length linear instrumentation technique (FL) by using 3 different root canal tapers (0.02, 0.04, and 0.06). During instrumentation, apically extruded bacteria were collected into vials containing saline solution. The microbiological samples were taken from the vials and incubated in brain-heart agar medium for 24 hours, and the numbers of colony-forming units (CFUs) were determined. The obtained results were analyzed with t test and one-way analysis of variance for the comparisons between the instrumentation techniques (CD and FL) and the preparation tapers (0.02, 0.04, and 0.06), respectively. Tukey honestly significant difference test was used for pairwise comparisons. The preparation taper had no effect on the number of CFUs when a FL instrumentation technique was used (P > .05). There was a statistically significant difference in the CFUs between FL and CD techniques when the preparation taper was 0.02 (P < .05). There was no statistically significant difference between the 0.04 and 0.06 preparation tapers in any of the instrumentation techniques (P > .05). Using a 0.02 taper in a CD manner results in the least amount of bacterial extrusion. The instrumentation technique did not seem to affect the amount of bacterial extrusion when 0.04 and 0.06 taper instruments were used for cleaning and shaping the root canal space. Published by Elsevier Inc.

  18. The effect of two mobilization techniques on dorsiflexion in people with chronic ankle instability.

    PubMed

    Marrón-Gómez, David; Rodríguez-Fernández, Ángel L; Martín-Urrialde, José A

    2015-02-01

    To compare the effect of two manual therapy techniques, mobilization with movement (WB-MWM) and talocrural manipulation (HVLA), on the improvement of ankle dorsiflexion in people with chronic ankle instability (CAI) over 48 h. Randomized controlled clinical trial. University research laboratory. Fifty-two participants (mean ± SD age, 20.7 ± 3.4 years) with CAI were randomized to a WB-MWM (n = 18), HVLA (n = 19) or placebo group (n = 15). Weight-bearing ankle dorsiflexion was measured with the weight-bearing lunge. Measurements were obtained prior to intervention, immediately after intervention, and 10 min, 24 h and 48 h post-intervention. There was a significant effect of time (F4,192 = 20.65; P < 0.001) and a significant time × group interaction (F8,192 = 6.34; P < 0.001). Post hoc analysis showed a significant increase of ankle dorsiflexion in both the WB-MWM and HVLA groups with respect to the placebo group, with no differences between the two active treatment groups. A single application of the WB-MWM or HVLA manual technique improves ankle dorsiflexion in people with CAI, and the effects persist for at least two days. Both techniques have similar effectiveness for improving ankle dorsiflexion, although WB-MWM demonstrated greater effect sizes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Quadrant Analysis as a Strategic Planning Technique in Curriculum Development and Program Marketing.

    ERIC Educational Resources Information Center

    Lynch, James; And Others

    1996-01-01

    Quadrant analysis, a widely-used research technique, is suggested as useful in college or university strategic planning. The technique uses consumer preference data and produces information suitable for a wide variety of curriculum and marketing decisions. Basic quadrant analysis design is described, and advanced variations are discussed, with…

  20. Impact during equine locomotion: techniques for measurement and analysis.

    PubMed

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short-duration impacts is difficult. The measurement system must be able to record transient peaks and high frequencies accurately, and the analysis technique must be able to characterise the impact signal in time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred these data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies less than 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique for characterising impact data, and the use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
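    The wavelet decomposition used to localize impact energy in time can be illustrated with a single Haar analysis step on a synthetic acceleration trace; the signal and the impact spike below are hypothetical (the paper used a proprietary statistics package):

```python
import numpy as np

# One level of a Haar wavelet decomposition: split a signal into a smooth
# approximation band and a detail band that localizes transients in time.
def haar_step(signal):
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    return approx, detail

# Hypothetical hoof acceleration trace: a slow oscillation plus a sharp spike.
t = np.arange(256)
trace = 0.1 * np.sin(2 * np.pi * t / 64.0)
trace[128] += 10.0                       # impact transient at sample 128

approx, detail = haar_step(trace)
peak_bin = int(np.argmax(np.abs(detail)))
print(peak_bin)                          # detail band localizes the impact
```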

  1. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    PubMed Central

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for measuring the deformation field, with the goal of obtaining information about crack propagation in existing structures in order to monitor their state of health. The measurement technique is based on the capture and analysis of a set of digital images. Special markers, which can be removed without damaging existing structures such as historical masonry, were applied to the surfaces. The digital image analysis was performed using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) validated the procedure by comparing its results with those derived from traditional measuring techniques. PMID:28773129

  2. Image Analysis Technique for Material Behavior Evaluation in Civil Structures.

    PubMed

    Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca

    2017-07-08

    The article presents a hybrid monitoring technique for measuring the deformation field, with the goal of obtaining information about crack propagation in existing structures in order to monitor their state of health. The measurement technique is based on the capture and analysis of a set of digital images. Special markers, which can be removed without damaging existing structures such as historical masonry, were applied to the surfaces. The digital image analysis was performed using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) validated the procedure by comparing its results with those derived from traditional measuring techniques.

  3. An algol program for dissimilarity analysis: a divisive-omnithetic clustering technique

    USGS Publications Warehouse

    Tipper, J.C.

    1979-01-01

    Clustering techniques are properly used to generate hypotheses about patterns in data. Of the hierarchical techniques, those which are divisive and omnithetic possess many theoretically optimal properties. One such method, dissimilarity analysis, is implemented here in ALGOL 60 and is found to be computationally competitive with most other methods. © 1979.
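A single divisive step of the kind such top-down methods build on can be sketched in a few lines of Python (a simplified illustration using Euclidean distances, not Tipper's ALGOL 60 algorithm):

```python
import numpy as np

def divisive_split(points):
    """One divisive step: seed two groups with the most dissimilar pair
    of points, then assign every point to the nearer seed."""
    # Pairwise Euclidean distance matrix
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    i, j = np.unravel_index(np.argmax(d), d.shape)
    to_i = d[:, i] <= d[:, j]
    return points[to_i], points[~to_i]

pts = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.1], [5.2, 4.9]])
a, b = divisive_split(pts)
print(len(a), len(b))  # → 2 2
```

Applied recursively to each resulting group, this yields a divisive hierarchy; a full omnithetic method would choose the split using all attributes jointly.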

  4. Cost Effective Repair Techniques for Turbine Airfoils. Volume 2

    DTIC Science & Technology

    1979-04-01

    Descriptors: blades, guide vanes, repair, turbofan engines, diffusion bonding, cost effectiveness. Identifiers: (U) turbine vanes, TF-39 engines, Activated… Authors: J. A. Wein and W. R. Young, General Electric Company, Aircraft Engine Group (Aircraft Engine Business Group), Cincinnati, Ohio 45215, April 1979. Unclassified. Title: (U) Cost Effective Repair Techniques for Turbine Airfoils.

  5. Application of a sensitivity analysis technique to high-order digital flight control systems

    NASA Technical Reports Server (NTRS)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
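The core quantity of this technique, the singular values of the return difference matrix I + L, is straightforward to compute numerically. A minimal sketch with a hypothetical 2x2 loop transfer matrix at one frequency (illustrative numbers, not the X-29 control laws):

```python
import numpy as np

# Hypothetical loop transfer matrix L evaluated at one frequency point.
L = np.array([[1.2, 0.3],
              [0.1, 0.8]])

# Return difference matrix I + L; its smallest singular value is a
# standard multiloop relative-stability measure.
return_diff = np.eye(2) + L
sigma = np.linalg.svd(return_diff, compute_uv=False)
print("smallest singular value of I + L:", sigma.min())
```

In the full technique one would also differentiate these singular values with respect to system and controller parameters to judge sensitivity; this sketch shows only the stability measure itself.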

  6. Effects of implant system, impression technique, and impression material on accuracy of the working cast.

    PubMed

    Wegner, Kerstin; Weskott, Katharina; Zenginel, Martha; Rehmann, Peter; Wöstmann, Bernd

    2013-01-01

    This in vitro study aimed to identify the effects of the implant system, impression technique, and impression material on the transfer accuracy of implant impressions. The null hypothesis tested was that, in vitro and within the parameters of the experiment, the spatial relationship of a working cast to the placement of implants is not related to (1) the implant system, (2) the impression technique, or (3) the impression material. A steel maxilla was used as a reference model. Six implants of two different implant systems (Standard Plus, Straumann; Semados, Bego) were fixed in the reference model. The target variables were: three-dimensional (3D) shift in all directions, implant axis direction, and rotation. The target variables were assessed using a 3D coordinate measuring machine, and the respective deviations of the plaster models from the nominal values of the reference model were calculated. Two different impression techniques (reposition/pickup) and four impression materials (Aquasil Ultra, Flexitime, Impregum Penta, P2 Magnum 360) were investigated. In all, 80 implant impressions for each implant system were taken. Statistical analysis was performed using multivariate analysis of variance. The implant system significantly influenced the transfer accuracy for most spatial dimensions, including the overall 3D shift and implant axis direction. There was no significant difference between the two implant systems with regard to rotation. Multivariate analysis of variance showed a significant effect on transfer accuracy only for the implant system. Within the limits of the present study, it can be concluded that the transfer accuracy of the intraoral implant position on the working cast is far more dependent on the implant system than on the selection of a specific impression technique or material.

  7. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    DTIC Science & Technology

    2018-04-30

    Title: Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques. Subject: Monthly Progress Report. Period of… Resources: N/A. TOTAL: $18,687. Abstract: The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition (KMD) techniques. The work in the program's first month consisted of improvements to data processing code and inclusion of additional Arctic sea ice…

  8. Biosignals Analysis for Kidney Function Effect Analysis of Fennel Aromatherapy

    PubMed Central

    Kim, Bong-Hyun; Cho, Dong-Uk; Seo, Ssang-Hee

    2015-01-01

    Human efforts to enjoy a healthy life are diverse, and IT technology has been applied to analyze the results of these efforts; health care is thus shifting toward diagnostic health management and prevention rather than treatment. In particular, aromatherapy is easy to use, has no side effects or irritation, and is widely used in modern society. In this paper, we measured the aroma effect by applying biosignal analysis techniques, and an experiment was performed to analyze the results. In particular, we designed the research methods and processes based on the theory that aroma affects renal function. We measured biosignals before and after fennel aromatherapy treatment and, through mutual comparison and analysis, studied the effect of fennel aromatherapy on kidney function. PMID:25977696

  9. Effective Management Selection: The Analysis of Behavior by Simulation Techniques.

    ERIC Educational Resources Information Center

    Jaffee, Cabot L.

    This book presents a system by which feedback might be generated and used as a basis for organizational change. The major areas covered consist of the development of a rationale for the use of simulation in the selection of supervisors, a description of actual techniques, and a method for training individuals in the use of the material. The…

  10. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    ERIC Educational Resources Information Center

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  11. Effective techniques in healthy eating and physical activity interventions: a meta-regression.

    PubMed

    Michie, Susan; Abraham, Charles; Whittington, Craig; McAteer, John; Gupta, Sunjai

    2009-11-01

    Meta-analyses of behavior change (BC) interventions typically find large heterogeneity in effectiveness and small effects. This study aimed to assess the effectiveness of active BC interventions designed to promote physical activity and healthy eating and to investigate whether theoretically specified BC techniques improve outcomes. Interventions, evaluated in experimental or quasi-experimental studies, using behavioral and/or cognitive techniques to increase physical activity and healthy eating in adults, were systematically reviewed. Intervention content was reliably classified into 26 BC techniques, and the effects of individual techniques, and of a theoretically derived combination of self-regulation techniques, were assessed using meta-regression. Outcomes were validated measures of physical activity and healthy eating. The 122 evaluations (N = 44,747) produced an overall pooled effect size of 0.31 (95% confidence interval = 0.26 to 0.36, I(2) = 69%). The technique "self-monitoring" explained the greatest amount of among-study heterogeneity (13%). Interventions that combined self-monitoring with at least one other technique derived from control theory were significantly more effective than the other interventions (0.42 vs. 0.26). Classifying interventions according to component techniques and theoretically derived technique combinations and conducting meta-regression enabled identification of effective components of interventions designed to increase physical activity and healthy eating. PsycINFO Database Record (c) 2009 APA, all rights reserved.
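The pooled effect size at the heart of such a meta-analysis is an inverse-variance weighted mean of the study-level effects. A minimal sketch with illustrative numbers (not the review's data):

```python
import numpy as np

# Study-level standardized effect sizes and their variances (synthetic).
effects = np.array([0.42, 0.26, 0.31, 0.35])
variances = np.array([0.010, 0.020, 0.015, 0.012])

# Fixed-effect pooling: weight each study by the inverse of its variance.
weights = 1.0 / variances
pooled = np.sum(weights * effects) / np.sum(weights)
se = np.sqrt(1.0 / np.sum(weights))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled d = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```

A meta-regression additionally regresses the study effects on coded moderators (here, the presence of each BC technique), using the same inverse-variance weights.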

  12. A CHARTING TECHNIQUE FOR THE ANALYSIS OF BUSINESS SYSTEMS,

    DTIC Science & Technology

    This paper describes a charting technique useful in the analysis of business systems and in studies of the information economics of the firm. The...planning advanced systems. It is not restricted to any particular kind of business or information system. (Author)

  13. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed by more than a hundredfold at the same error level. In computing the model equations of erbium-doped fiber amplifiers, the numerical results show that the method improves the solution accuracy by over two orders of magnitude for the same number of amplifying sections. The proposed method can also rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
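The general idea of automatic step adjustment, enlarge the step when a local error estimate is small and shrink and retry when it is large, can be sketched with generic step-doubling RK4 integration (an illustration of the ASA principle on a toy ODE, not the paper's APA formulation for fiber amplifiers):

```python
import math

def integrate_adaptive(f, y0, z0, z1, tol=1e-8, h=0.1):
    """Adaptive-step RK4 via step doubling: compare one full step with
    two half steps; accept and double h when they agree within tol,
    otherwise halve h and retry."""
    def rk4(y, z, h):
        k1 = f(z, y)
        k2 = f(z + h / 2, y + h / 2 * k1)
        k3 = f(z + h / 2, y + h / 2 * k2)
        k4 = f(z + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    y, z = y0, z0
    while z < z1:
        h = min(h, z1 - z)
        big = rk4(y, z, h)
        small = rk4(rk4(y, z, h / 2), z + h / 2, h / 2)
        if abs(big - small) < tol:
            y, z = small, z + h   # accept the more accurate result
            h *= 2                # error small: enlarge the step
        else:
            h /= 2                # error large: shrink and retry
    return y

# dy/dz = y with y(0) = 1 has the exact solution e^z.
approx = integrate_adaptive(lambda z, y: y, 1.0, 0.0, 1.0)
print(approx, math.e)
```

The amplifier model equations are a coupled system rather than a scalar ODE, but the accept/double and reject/halve logic is the same.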

  14. Performance analysis of multiple PRF technique for ambiguity resolution

    NASA Technical Reports Server (NTRS)

    Chang, C. Y.; Curlander, J. C.

    1992-01-01

    For short wavelength spaceborne synthetic aperture radar (SAR), ambiguity in Doppler centroid estimation occurs when the azimuth squint angle uncertainty is larger than the azimuth antenna beamwidth. Multiple pulse recurrence frequency (PRF) hopping is a technique developed to resolve the ambiguity by operating the radar in different PRF's in the pre-imaging sequence. Performance analysis results of the multiple PRF technique are presented, given the constraints of the attitude bound, the drift rate uncertainty, and the arbitrary numerical values of PRF's. The algorithm performance is derived in terms of the probability of correct ambiguity resolution. Examples, using the Shuttle Imaging Radar-C (SIR-C) and X-SAR parameters, demonstrate that the probability of correct ambiguity resolution obtained by the multiple PRF technique is greater than 95 percent and 80 percent for the SIR-C and X-SAR applications, respectively. The success rate is significantly higher than that achieved by the range cross correlation technique.

  15. Data analysis techniques used at the Oak Ridge Y-12 plant flywheel evaluation laboratory

    NASA Astrophysics Data System (ADS)

    Steels, R. S., Jr.; Babelay, E. F., Jr.

    1980-07-01

    Some of the more advanced data analysis techniques applied to the problem of experimentally evaluating the performance of high-performance composite flywheels are presented. Real-time applications include polar plots of runout with interruptions relating to balance and relative motions between parts, radial growth measurements, and temperature of the spinning part. The technique used to measure the torque applied to a containment housing during flywheel failure is also presented. The discussion of pre- and post-test analysis techniques includes resonant frequency determination with modal analysis, waterfall charts, and runout signals at failure.

  16. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
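Part (2) of the composite technique, generalized K-means refinement of initial clusters, can be sketched as follows (synthetic two-cloud data; the initial centers stand in for the output of the sequential stage):

```python
import numpy as np

def kmeans(points, centers, iters=20):
    """Generalized K-means: assign each point to its nearest center,
    then move each center to its cluster mean, iteratively."""
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=-1)
        labels = d.argmin(axis=1)
        centers = np.array([points[labels == k].mean(axis=0)
                            for k in range(len(centers))])
    return labels, centers

rng = np.random.default_rng(0)
cloud = np.vstack([rng.normal(0, 0.2, (20, 2)),    # cluster near (0, 0)
                   rng.normal(3, 0.2, (20, 2))])   # cluster near (3, 3)
labels, centers = kmeans(cloud, cloud[[0, -1]].copy())
print(np.bincount(labels))  # two clusters of 20 points each
```

In the composite scheme the initial centers come from the sequential variance analysis rather than, as here, from arbitrarily chosen data points.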

  17. Evaluation of analysis techniques for low frequency interior noise and vibration of commercial aircraft

    NASA Technical Reports Server (NTRS)

    Landmann, A. E.; Tillema, H. F.; Marshall, S. E.

    1989-01-01

    The application of selected analysis techniques to low frequency cabin noise associated with advanced propeller engine installations is evaluated. Three design analysis techniques were chosen for evaluation: finite element analysis, statistical energy analysis (SEA), and a power flow method using elements of SEA (the computer program Propeller Aircraft Interior Noise). An overview of the three procedures is provided. Data from tests of a 727 airplane (modified to accept a propeller engine) were compared with predictions. Comparisons of predicted and measured levels at the end of the first year's effort showed reasonable agreement, leading to the conclusion that each technique had value for propeller engine noise predictions on large commercial transports. However, variations in agreement were large enough to warrant caution and to motivate further work with each technique. Assessment of the second year's results leads to the conclusion that the selected techniques can accurately predict trends and can be useful to a designer, but that absolute level predictions remain unreliable due to the complexity of the aircraft structure and low modal densities.

  18. Effective separation technique for small diameter whiskers.

    NASA Technical Reports Server (NTRS)

    Westfall, L. J.

    1972-01-01

    Description of a technique for separating small-diameter whiskers from the as-grown mat by gently agitating the whisker mats in deionized or distilled water for six to eight hours. High-strength Al2O3 whiskers were effectively separated by this technique, yielding an average of 48% of the original weight of the whisker mat. It is estimated that more than 90% of the separated whiskers had diameters between 0.7 and 2.0 microns.

  19. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    ERIC Educational Resources Information Center

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  20. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  1. Recent advances in capillary electrophoretic migration techniques for pharmaceutical analysis.

    PubMed

    Deeb, Sami El; Wätzig, Hermann; El-Hady, Deia Abd; Albishri, Hassan M; de Griend, Cari Sänger-van; Scriba, Gerhard K E

    2014-01-01

    Since their introduction about 30 years ago, CE techniques have had a significant impact on pharmaceutical analysis. The present review covers recent advances and applications of CE for the analysis of pharmaceuticals. Both small molecules and biomolecules such as proteins are considered. The applications range from the determination of drug-related substances to the analysis of counterions and the determination of physicochemical parameters. Furthermore, general considerations of CE methods in pharmaceutical analysis are described. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Mixed Models and Reduction Techniques for Large-Rotation, Nonlinear Analysis of Shells of Revolution with Application to Tires

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Andersen, C. M.; Tanner, J. A.

    1984-01-01

    An effective computational strategy is presented for the large-rotation, nonlinear axisymmetric analysis of shells of revolution. The three key elements of the computational strategy are: (1) use of mixed finite-element models with discontinuous stress resultants at the element interfaces; (2) substantial reduction in the total number of degrees of freedom through the use of a multiple-parameter reduction technique; and (3) reduction in the size of the analysis model through the decomposition of asymmetric loads into symmetric and antisymmetric components coupled with the use of the multiple-parameter reduction technique. The potential of the proposed computational strategy is discussed. Numerical results are presented to demonstrate the high accuracy of the mixed models developed and to show the potential of using the proposed computational strategy for the analysis of tires.

  3. Simultaneous Comparison of Two Roller Compaction Techniques and Two Particle Size Analysis Methods.

    PubMed

    Saarinen, Tuomas; Antikainen, Osmo; Yliruusi, Jouko

    2017-11-01

    A new dry granulation technique, gas-assisted roller compaction (GARC), was compared with conventional roller compaction (CRC) by manufacturing 34 granulation batches. The process variables studied were roll pressure, roll speed, and the sieve size of the conical mill. The main quality attributes measured were granule size and flow characteristics. Within the granulations, the practical applicability of two particle size analysis techniques, sieve analysis (SA) and a fast imaging technique (Flashsizer, FS), was also tested. All granules obtained were acceptable. In general, the particle size of GARC granules was slightly larger than that of CRC granules, and the GARC granules had better flowability: for example, their tablet weight variation was close to 2%, indicating good flowing and packing characteristics. The comparison of the two particle size analysis techniques showed that SA was more accurate in determining wide and bimodal size distributions, while FS showed narrower, mono-modal distributions. However, both techniques gave good estimates of mean granule size. Overall, SA was a time-consuming but accurate technique that provided reliable information on the entire granule size distribution. By contrast, FS oversimplified the shape of the size distribution but nevertheless yielded acceptable estimates of mean particle size, and it was two to three orders of magnitude faster than SA.

  4. A look-ahead probabilistic contingency analysis framework incorporating smart sampling techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    2016-07-18

    This paper describes a framework for incorporating smart sampling techniques into a probabilistic look-ahead contingency analysis application. The predictive probabilistic contingency analysis helps reflect the impact of uncertainties, caused by variable generation and load, on potential violations of transmission limits.

  5. Effects of must concentration techniques on wine isotopic parameters.

    PubMed

    Guyon, Francois; Douet, Christine; Colas, Sebastien; Salagoïty, Marie-Hélène; Medina, Bernard

    2006-12-27

    Despite the robustness of isotopic methods applied in the field of wine control, isotopic values can be slightly influenced by enological practices. For this reason, must concentration technique effects on wine isotopic parameters were studied. The two studied concentration techniques were reverse osmosis (RO) and high-vacuum evaporation (HVE). Samples (must and extracted water) have been collected in various French vineyards. Musts were microfermented at the laboratory, and isotope parameters were determined on the obtained wine. Deuterium and carbon-13 isotope ratios were studied on distilled ethanol by nuclear magnetic resonance (NMR) and isotope ratio mass spectrometry (IRMS), respectively. The oxygen-18 ratio was determined on extracted and wine water using IRMS apparatus. The study showed that the RO technique has a very low effect on isotopic parameters, indicating that this concentration technique does not create any isotopic fractionation, neither at sugar level nor at water level. The effect is notable for must submitted to HVE concentration: water evaporation leads to a modification of the oxygen-18 ratio of the must and, as a consequence, ethanol deuterium concentration is also modified.

  6. The combined use of order tracking techniques for enhanced Fourier analysis of order components

    NASA Astrophysics Data System (ADS)

    Wang, K. S.; Heyns, P. S.

    2011-04-01

    Order tracking is one of the most important vibration analysis techniques for diagnosing faults in rotating machinery. It can be performed in many different ways, each with distinct advantages and disadvantages; in the end, however, the analyst will often use Fourier analysis to transform the data from a time series to frequency or order spectra. It is therefore surprising that the Fourier analysis of order-tracked systems seems to have been largely ignored in the literature. This paper considers the frequently used Vold-Kalman filter-based order tracking and computed order tracking techniques. The main pros and cons of each technique for Fourier analysis are discussed, and the sequential use of Vold-Kalman filtering and computed order tracking is proposed as a novel way to enhance the results of Fourier analysis for determining the order components. The advantages of the combined use of these order tracking techniques are demonstrated numerically on an SDOF rotor simulation model. Finally, the approach is also demonstrated on experimental data from a real rotating machine.
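Computed order tracking can be sketched in miniature: resample a time-sampled vibration signal to uniform shaft-angle increments, then apply the FFT to obtain an order spectrum in which speed-proportional components stay in fixed bins. The ramping shaft speed and pure order-2 component below are synthetic assumptions:

```python
import numpy as np

# Shaft speed ramps linearly from 10 to 30 Hz over 4 s; the cumulative
# shaft angle is the time integral of the speed. The vibration is a pure
# order-2 component (two cycles per revolution). All values synthetic.
fs = 2000.0
t = np.arange(0, 4.0, 1 / fs)
angle = 2 * np.pi * (10 * t + 2.5 * t**2)   # rad; integral of (10 + 5 t) Hz
signal = np.sin(2 * angle)

# Computed order tracking: resample from uniform time to uniform shaft
# angle (64 samples per revolution), then FFT for the order spectrum.
revs = angle / (2 * np.pi)
uniform_revs = np.arange(0, revs[-1], 1 / 64)
resampled = np.interp(uniform_revs, revs, signal)

spectrum = np.abs(np.fft.rfft(resampled))
orders = np.fft.rfftfreq(len(resampled), d=1 / 64)
print("dominant order:", orders[spectrum.argmax()])  # ≈ 2
```

A frequency-domain FFT of the raw time signal would smear this component across bins as the speed ramps; in the order domain it remains a single sharp peak.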

  7. Portable X-ray fluorescence spectroscopy as a rapid screening technique for analysis of TiO2 and ZnO in sunscreens.

    PubMed

    Bairi, Venu Gopal; Lim, Jin-Hee; Quevedo, Ivan R; Mudalige, Thilak K; Linder, Sean W

    2016-02-01

    This investigation reports a rapid and simple screening technique for the quantification of titanium and zinc in commercial sunscreens using portable X-ray fluorescence spectroscopy (pXRF). Inductively coupled plasma-mass spectrometry (ICP-MS), a highly evolved technique, was chosen as a comparator to pXRF, and a good correlation (r2 > 0.995) with acceptable variations (≤25%) between the two techniques was observed. Analytical figures of merit such as detection limit, quantitation limit, and linear range of the method are reported for the pXRF technique. The method has good linearity (r2 > 0.995) for the analysis of titanium (Ti) in the range of 0.4-14.23 wt% and zinc (Zn) in the range of 1.0-23.90 wt%. However, most commercial sunscreens contain organic ingredients, which are known to cause matrix effects, and the development of appropriate matrix-matched working standards for the calibration curve was a major challenge for the pXRF measurements. In this study, we overcame the matrix effect by using metal-free commercial sunscreens as the dispersing media for the preparation of working standards; an easy extension of this methodology to working standards in different matrices is also reported. The method is simple, rapid, and cost-effective and, in comparison to conventional techniques (e.g., ICP-MS), did not generate toxic wastes during sample analysis.
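The reported linearity check (r2 > 0.995 over the calibration range) amounts to fitting a straight calibration line to instrument response versus known concentration and computing the coefficient of determination. A sketch with synthetic responses over the reported Ti range:

```python
import numpy as np

# Known Ti concentrations spanning the reported 0.4-14.23 wt% range,
# with synthetic pXRF responses (line + small noise); not measured data.
conc = np.array([0.4, 2.0, 5.0, 8.0, 11.0, 14.23])                 # wt%
response = 120.0 * conc + 15.0 + np.array([3, -4, 5, -2, 1, -3])   # counts

slope, intercept = np.polyfit(conc, response, 1)
pred = slope * conc + intercept
ss_res = ((response - pred) ** 2).sum()
ss_tot = ((response - response.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot
print(f"r^2 = {r2:.5f}")
```

Unknown samples are then quantified by inverting the fitted line, concentration = (response - intercept) / slope, within the validated linear range.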

  8. Portable X-ray fluorescence spectroscopy as a rapid screening technique for analysis of TiO2 and ZnO in sunscreens

    NASA Astrophysics Data System (ADS)

    Bairi, Venu Gopal; Lim, Jin-Hee; Quevedo, Ivan R.; Mudalige, Thilak K.; Linder, Sean W.

    2016-02-01

    This investigation reports a rapid and simple screening technique for the quantification of titanium and zinc in commercial sunscreens using portable X-ray fluorescence spectroscopy (pXRF). Inductively coupled plasma-mass spectrometry (ICP-MS), a highly evolved technique, was chosen as a comparator to pXRF, and a good correlation (r2 > 0.995) with acceptable variations (≤ 25%) between the two techniques was observed. Analytical figures of merit such as detection limit, quantitation limit, and linear range of the method are reported for the pXRF technique. The method has good linearity (r2 > 0.995) for the analysis of titanium (Ti) in the range of 0.4-14.23 wt% and zinc (Zn) in the range of 1.0-23.90 wt%. However, most commercial sunscreens contain organic ingredients, which are known to cause matrix effects, and the development of appropriate matrix-matched working standards for the calibration curve was a major challenge for the pXRF measurements. In this study, we overcame the matrix effect by using metal-free commercial sunscreens as the dispersing media for the preparation of working standards; an easy extension of this methodology to working standards in different matrices is also reported. The method is simple, rapid, and cost-effective and, in comparison to conventional techniques (e.g., ICP-MS), did not generate toxic wastes during sample analysis.

  9. A study of data analysis techniques for the multi-needle Langmuir probe

    NASA Astrophysics Data System (ADS)

    Hoang, H.; Røed, K.; Bekkeng, T. A.; Moen, J. I.; Spicher, A.; Clausen, L. B. N.; Miloch, W. J.; Trondsen, E.; Pedersen, A.

    2018-06-01

    In this paper we evaluate two data analysis techniques for the multi-needle Langmuir probe (m-NLP). The instrument uses several cylindrical Langmuir probes, which are positively biased with respect to the plasma potential in order to operate in the electron saturation region. Since the currents collected by these probes can be sampled at kilohertz rates, the instrument is capable of resolving the ionospheric plasma structure down to the meter scale. The two data analysis techniques, a linear fit and a non-linear least squares fit, are discussed in detail using data from the Investigation of Cusp Irregularities 2 sounding rocket. It is shown that each technique has pros and cons with respect to the m-NLP implementation. Even though the linear fitting technique seems to be better than measurements from incoherent scatter radar and in situ instruments, m-NLPs can be longer and can be cleaned during operation to improve instrument performance. The non-linear least squares fitting technique would be more reliable provided that a higher number of probes are deployed.
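The linear fitting technique for the m-NLP rests on the orbital-motion-limited (OML) result that, for a cylindrical probe in the electron saturation region, the square of the collected current is linear in bias voltage, so a straight-line fit across the probe biases yields a slope from which electron density follows. A minimal sketch with illustrative bias values and coefficients (not flight data):

```python
import numpy as np

# Probe bias voltages for a hypothetical four-needle configuration [V].
bias = np.array([2.5, 4.0, 5.5, 7.0])

# Synthetic I^2 values on an assumed straight line: in OML theory the
# slope is proportional to electron density squared (units are A^2/V
# and A^2; the numbers below are illustrative, not calibrated).
true_slope, true_intercept = 3.0e-12, 1.0e-12
current_sq = true_slope * bias + true_intercept

# The linear technique: straight-line fit of I^2 against V.
slope, intercept = np.polyfit(bias, current_sq, 1)
print("fitted slope:", slope)
```

The non-linear least squares alternative discussed in the paper instead fits the full current-voltage characteristic, trading robustness for more recoverable parameters.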

  10. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
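Commonality analysis partitions a regression model's R2 into components unique to each predictor and common to sets of predictors. For two predictors it needs only the R2 values of the full model and of each single-predictor model; a sketch with synthetic correlated predictors:

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least squares fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

rng = np.random.default_rng(1)
x1 = rng.normal(size=200)
x2 = 0.6 * x1 + 0.8 * rng.normal(size=200)   # correlated with x1
y = x1 + x2 + 0.5 * rng.normal(size=200)

r2_full = r_squared(np.column_stack([x1, x2]), y)
r2_x1 = r_squared(x1[:, None], y)
r2_x2 = r_squared(x2[:, None], y)

unique_x1 = r2_full - r2_x2              # variance only x1 explains
unique_x2 = r2_full - r2_x1              # variance only x2 explains
common = r2_full - unique_x1 - unique_x2 # variance either could explain
print(round(unique_x1, 3), round(unique_x2, 3), round(common, 3))
```

Unlike beta weights, this decomposition makes explicit how much explained variance the correlated predictors share, which is the point the article argues for.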

  11. Analysis techniques for momentum transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, S.D.

    1991-08-01

    This report discusses the following topics on momentum analysis in tokamaks and stellarators: the momentum balance equation; deposition of torque by neutral beams; effects of toroidal rotation; and experimental observations. (LSP)

  12. Effectiveness of Various Methods of Teaching Proper Inhaler Technique.

    PubMed

    Axtell, Samantha; Haines, Seena; Fairclough, Jamie

    2017-04-01

    Objective: To compare the effectiveness of 4 different instructional interventions in training proper inhaler technique. Design: Randomized, noncrossover trial. Setting: Health fair and indigent clinic. Participants: Inhaler-naive adult volunteers who spoke and read English. Interventions: Subjects were assigned to complete the following: (1) read a metered dose inhaler (MDI) package insert pamphlet, (2) watch a Centers for Disease Control and Prevention (CDC) video demonstrating MDI technique, (3) watch a YouTube video demonstrating MDI technique, or (4) receive direct instruction in MDI technique from a pharmacist. Main outcome measure: Inhaler use competency (completion of all 7 prespecified critical steps). Results: Of the 72 subjects, 21 (29.2%) demonstrated competent inhaler technique. A statistically significant difference between pharmacist direct instruction and the remaining interventions, both combined (P < .0001) and individually (P ≤ .03), was evident. No statistically significant difference was detected among the remaining 3 intervention groups. The critical steps most frequently omitted or improperly performed were exhaling before inhalation and holding the breath after inhalation. Conclusion: A 2-minute pharmacist counseling session is more effective than the other interventions in educating patients on proper inhaler technique. Pharmacists can play a pivotal role in reducing the implications of improper inhaler use.

  13. Regional environmental analysis and management: New techniques for current problems

    NASA Technical Reports Server (NTRS)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  14. Measured extent of agricultural expansion depends on analysis technique

    DOE PAGES

    Dunn, Jennifer B.; Merz, Dylan; Copenhaver, Ken L.; ...

    2017-01-31

    Concern is rising that ecologically important, carbon-rich natural lands in the United States are losing ground to agriculture. We investigate how quantitative assessments of historical land use change to address this concern differ in their conclusions depending on the data set used. We examined land use change between 2006 and 2014 in 20 counties in the Prairie Pothole Region using the Cropland Data Layer, a modified Cropland Data Layer, data from the National Agricultural Imagery Program, and in-person ground-truthing. The Cropland Data Layer analyses overwhelmingly returned the largest amount of land use change, with associated error that limits drawing conclusions from it. Analysis with visual imagery estimated a fraction of this land use change. Clearly, analysis technique drives understanding of the measured extent of land use change; different techniques produce vastly different results that would inform land management policy in strikingly different ways. As a result, best practice guidelines are needed.

  16. Fourier transform infrared spectroscopy techniques for the analysis of drugs of abuse

    NASA Astrophysics Data System (ADS)

    Kalasinsky, Kathryn S.; Levine, Barry K.; Smith, Michael L.; Magluilo, Joseph J.; Schaefer, Teresa

    1994-01-01

    Cryogenic deposition techniques for Gas Chromatography/Fourier Transform Infrared (GC/FT-IR) spectroscopy can be successfully employed in urinalysis for drugs of abuse, with detection limits comparable to those of the established Gas Chromatography/Mass Spectrometry (GC/MS) technique. The additional confidence that infrared analysis offers has been helpful in resolving ambiguous results, particularly in the case of amphetamines, where drugs of abuse can be confused with over-the-counter medications or naturally occurring amines. Hair analysis has been important in drug testing when adulteration of urine samples has been in question. Functional group mapping can further assist the analysis and track drug use versus time.

  17. Peptidomics: the integrated approach of MS, hyphenated techniques and bioinformatics for neuropeptide analysis.

    PubMed

    Boonen, Kurt; Landuyt, Bart; Baggerman, Geert; Husson, Steven J; Huybrechts, Jurgen; Schoofs, Liliane

    2008-02-01

    MS is currently one of the most important analytical techniques in biological and medical research. ESI and MALDI launched the field of MS into biology. The performance of mass spectrometers increased tremendously over the past decades. Other technological advances increased the analytical power of biological MS even more. First, the advent of the genome projects allowed an automated analysis of mass spectrometric data. Second, improved separation techniques, like nanoscale HPLC, are essential for MS analysis of biomolecules. The recent progress in bioinformatics is the third factor that accelerated the biochemical analysis of macromolecules. The first part of this review will introduce the basics of these techniques. The field that integrates all these techniques to identify endogenous peptides is called peptidomics and will be discussed in the last section. This integrated approach aims to identify all peptides present in a cell, organ or organism (the peptidome). Today, peptidomics is used by several fields of research. Special emphasis will be given to the identification of neuropeptides, a class of short proteins that fulfil several important intercellular signalling functions in every animal. MS imaging techniques and biomarker discovery will also be discussed briefly.

  18. Acoustic mode measurements in the inlet of a model turbofan using a continuously rotating rake: Data collection/analysis techniques

    NASA Technical Reports Server (NTRS)

    Hall, David G.; Heidelberg, Laurence; Konno, Kevin

    1993-01-01

    The rotating microphone measurement technique and data analysis procedures are documented which are used to determine circumferential and radial acoustic mode content in the inlet of the Advanced Ducted Propeller (ADP) model. Circumferential acoustic mode levels were measured at a series of radial locations using the Doppler frequency shift produced by a rotating inlet microphone probe. Radial mode content was then computed using a least squares curve fit with the measured radial distribution for each circumferential mode. The rotating microphone technique is superior to fixed-probe techniques because it results in minimal interference with the acoustic modes generated by rotor-stator interaction. This effort represents the first experimental implementation of a measuring technique developed by T. G. Sofrin. Testing was performed in the NASA Lewis Low Speed Anechoic Wind Tunnel at a simulated takeoff condition of Mach 0.2. The design of the data analysis software and the performance of the rotating rake apparatus are also described, and the effect of experimental errors is discussed.
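    The radial-mode step described above amounts to an ordinary least-squares fit of the measured radial distribution to the duct's radial mode shapes, which for a circular inlet are Bessel functions evaluated at the extrema of J_m. The mode orders, eigenvalues, measurement radii, and amplitudes below are illustrative, not ADP values.

```python
import math
import numpy as np

def bessel_j(m, x, terms=30):
    """Bessel function J_m(x) via its power series (adequate for x < ~10)."""
    return sum((-1)**k / (math.factorial(k) * math.factorial(k + m))
               * (x / 2)**(2 * k + m) for k in range(terms))

# Circular-duct radial shapes for circumferential order m = 1:
# E_n(r) = J_1(alpha_n * r / R), alpha_n = extrema of J_1 (standard values).
alphas = [1.8412, 5.3314, 8.5363]
R = 1.0
radii = np.linspace(0.05, 1.0, 12) * R   # measurement radii (illustrative)

basis = np.array([[bessel_j(1, a * r / R) for a in alphas] for r in radii])

# Synthetic "measured" circumferential-mode levels from known radial content
amps_true = np.array([1.0, 0.5, 0.2])
measured = basis @ amps_true

# Least-squares solve for the radial mode amplitudes
amps_fit, *_ = np.linalg.lstsq(basis, measured, rcond=None)
```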

  20. Application of Petri net based analysis techniques to signal transduction pathways.

    PubMed

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-11-02

    Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODEs based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the systems behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods concern mainly metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. We apply Petri net theory to model and analyse signal transduction pathways first qualitatively before continuing with quantitative analyses. This paper demonstrates how to build systematically a discrete model, which reflects provably the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as case study. We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically meaningful functional units. The
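    A t-invariant, as used above, is a non-negative integer vector x with C·x = 0, where C is the place-by-transition incidence matrix: firing each transition x_t times returns the net to its initial marking, so the invariant traces a self-contained signal flow. A toy activation/deactivation cycle (place and transition names hypothetical, not the pheromone-pathway model):

```python
import numpy as np

# Places: P_inactive, P_active. Transitions: t_activate, t_deactivate.
# C[p, t] = tokens produced minus tokens consumed on place p by transition t.
C = np.array([
    [-1,  1],   # P_inactive: consumed by t_activate, produced by t_deactivate
    [ 1, -1],   # P_active:   produced by t_activate, consumed by t_deactivate
])

def is_t_invariant(C, x):
    """True if x >= 0, x != 0, and C @ x == 0 (a marking-reproducing firing vector)."""
    x = np.asarray(x)
    return bool((x >= 0).all() and x.any() and not (C @ x).any())

# Firing activation and deactivation once each restores the marking:
print(is_t_invariant(C, [1, 1]))   # True
print(is_t_invariant(C, [1, 0]))   # False: activation alone shifts the marking
```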

  1. Determination of minor and trace elements concentration in kidney stones using elemental analysis techniques

    NASA Astrophysics Data System (ADS)

    Srivastava, Anjali

    The determination of accurate material composition of a kidney stone is crucial for understanding the formation of the kidney stone as well as for preventive therapeutic strategies. Radiation-based instrumental activation analysis techniques are excellent tools for identification of the materials present in the kidney stone. X-ray fluorescence (XRF) and neutron activation analysis (NAA) experiments were performed and different kidney stones were analyzed. The interactions of X-ray photons and neutrons with matter are complementary in nature, resulting in distinctly different materials detection. This is the first approach to utilize combined X-ray fluorescence and neutron activation analysis for a comprehensive analysis of kidney stones. In the present work, experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stone. The open-source program Python Multi-Channel Analyzer was used to unfold the XRF spectrum. A new type of experimental set-up was developed and utilized for XRF and NAA analysis of the kidney stone. To verify the experimental results with analytical calculation, several sets of kidney stones were analyzed using the XRF and NAA techniques. The elements identified by the XRF technique are Br, Cu, Ga, Ge, Mo, Nb, Ni, Rb, Se, Sr, Y, Zr; those identified by neutron activation analysis (NAA) are Au, Br, Ca, Er, Hg, I, K, Na, Pm, Sb, Sc, Sm, Tb, Yb, Zn. This thesis presents a new approach for accurate detection of the material composition of kidney stones using XRF and NAA instrumental activation analysis techniques.
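    The spectrum-unfolding step can be sketched as locating local maxima in the calibrated spectrum and matching their energies against tabulated Kα lines. The spectrum below is synthetic, the calibration is hypothetical, and the line energies are approximate reference values for three of the elements listed.

```python
import numpy as np

# Approximate K-alpha line energies [keV] for a few elements found in the stones
K_ALPHA = {"Cu": 8.05, "Sr": 14.16, "Zr": 15.78}

# Synthetic 1024-channel spectrum with a linear energy calibration
channels = np.arange(1024)
energy = 0.02 * channels            # keV per channel (illustrative calibration)
spectrum = np.zeros_like(energy)
for e0 in K_ALPHA.values():         # Gaussian peaks on a flat background
    spectrum += 1000 * np.exp(-0.5 * ((energy - e0) / 0.08) ** 2)
spectrum += 20.0

# Local maxima above a count threshold serve as candidate peaks
# (>= on the right neighbour breaks ties when a peak straddles two channels)
s = spectrum
peak_idx = np.where((s[1:-1] > s[:-2]) & (s[1:-1] >= s[2:]) & (s[1:-1] > 100))[0] + 1

# Match each peak energy to the nearest tabulated line within 0.1 keV
found = {el for i in peak_idx for el, e0 in K_ALPHA.items()
         if abs(energy[i] - e0) < 0.1}
print(sorted(found))   # ['Cu', 'Sr', 'Zr']
```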

  2. Techniques for the analysis of data from coded-mask X-ray telescopes

    NASA Technical Reports Server (NTRS)

    Skinner, G. K.; Ponman, T. J.; Hammersley, A. P.; Eyles, C. J.

    1987-01-01

    Several techniques useful in the analysis of data from coded-mask telescopes are presented. Methods of handling changes in the instrument pointing direction are reviewed, and ways of using FFT techniques to perform the deconvolution are considered. Emphasis is on techniques for optimally-coded systems, but it is shown that the range of systems included in this class can be extended through the new concept of 'partial cycle averaging'.
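    The FFT deconvolution mentioned above can be illustrated in one dimension: the detector records the sky circularly convolved with the mask's open/closed pattern, and cross-correlating the detector with a balanced decoding array (2m − 1) recovers the source position. The mask pattern, array size, and source position are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 101
mask = (rng.random(N) < 0.5).astype(float)   # cyclic coded mask, ~half open

# Sky with a single point source; detector = circular convolution of sky & mask
sky = np.zeros(N)
sky[37] = 1.0
detector = np.real(np.fft.ifft(np.fft.fft(sky) * np.fft.fft(mask)))

# Balanced decoding array: cross-correlation via FFT peaks at the source position
decoder = 2 * mask - 1
recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(np.fft.fft(decoder))))
print(int(np.argmax(recon)))   # 37
```

For a uniformly redundant array the off-peak sidelobes vanish exactly; for the random mask used here they only average to zero, which is why optimally-coded systems are preferred.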

  3. A Meta-Analysis of Hypnotherapeutic Techniques in the Treatment of PTSD Symptoms.

    PubMed

    O'Toole, Siobhan K; Solomon, Shelby L; Bergdahl, Stephen A

    2016-02-01

    The efficacy of hypnotherapeutic techniques as treatment for symptoms of posttraumatic stress disorder (PTSD) was explored through meta-analytic methods. Studies were selected through a search of 29 databases. Altogether, 81 studies discussing hypnotherapy and PTSD were reviewed for inclusion criteria. The outcomes of 6 studies representing 391 participants were analyzed using meta-analysis. Evaluation of effect sizes related to avoidance and intrusion, in addition to overall PTSD symptoms after hypnotherapy treatment, revealed that all studies showed that hypnotherapy had a positive effect on PTSD symptoms. The overall Cohen's d was large (-1.18) and statistically significant (p < .001). Effect sizes varied based on study quality; however, they were large and statistically significant. Using the classic fail-safe N to assess publication bias, it was determined that it would take 290 nonsignificant studies to nullify these findings. Copyright © 2016 International Society for Traumatic Stress Studies.
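    The classic fail-safe N cited above is Rosenthal's formula N_fs = (ΣZ)²/2.706 − k: the number of unpublished null studies needed to drag the combined one-tailed result back to p = .05. The per-study Z-values below are hypothetical, not those of the six included studies.

```python
# Rosenthal's classic fail-safe N: how many hidden null results would be
# needed before the combined evidence stops being significant at p = .05?
z_scores = [2.5, 2.5, 2.5, 2.5, 2.5, 2.5]   # hypothetical per-study Z-values
k = len(z_scores)
z_sum = sum(z_scores)
fail_safe_n = (z_sum ** 2) / 2.706 - k       # 2.706 = 1.645^2 (one-tailed .05)
print(int(fail_safe_n))   # 77
```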

  4. A simple and inexpensive image analysis technique to study the effect of disintegrants concentration and diluents type on disintegration.

    PubMed

    Berardi, Alberto; Bisharat, Lorina; Blaibleh, Anaheed; Pavoni, Lucia; Cespi, Marco

    2018-06-20

    Tablet disintegration is often the result of a size expansion of the tablet. In this study, we quantified the extent and direction of size expansion of tablets during disintegration, using readily available techniques, i.e. a digital camera and a public domain image analysis software. After validating the method, the influence of disintegrant concentration and diluent type on the kinetics and mechanisms of disintegration was studied. Tablets containing diluent, disintegrant (sodium starch glycolate, SSG; crospovidone, PVPP; or croscarmellose sodium, CCS) and lubricant were prepared by direct compression. Projected area and aspect ratio of the tablets were monitored using image analysis techniques. The developed method could describe the kinetics and mechanisms of disintegration qualitatively and quantitatively. SSG and PVPP acted purely by swelling and shape recovery mechanisms. Instead, CCS worked by a combination of both mechanisms, the extent of which changed depending on its concentration and the diluent type. We anticipate that the method described here could provide a framework for the routine screening of tablets disintegration using readily available equipment. Copyright © 2018. Published by Elsevier Inc.
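    The two quantities tracked in the study, projected area and aspect ratio, reduce to pixel counting and a bounding box once each camera frame is thresholded to a binary mask. A sketch on a synthetic elliptical "tablet" (image size and tablet dimensions are illustrative):

```python
import numpy as np

# Synthetic binary frame: an elliptical tablet seen side-on
h, w = 200, 200
yy, xx = np.mgrid[0:h, 0:w]
a, b = 60, 30                          # semi-axes in pixels (illustrative)
mask = ((xx - w / 2) / a) ** 2 + ((yy - h / 2) / b) ** 2 <= 1.0

# Projected area: foreground pixel count (times pixel size^2 for mm^2)
area_px = int(mask.sum())

# Aspect ratio from the bounding box of the projection
ys, xs = np.where(mask)
height = ys.max() - ys.min() + 1
width = xs.max() - xs.min() + 1
aspect_ratio = width / height
print(area_px, round(aspect_ratio, 2))
```

Tracking these two numbers frame by frame distinguishes isotropic swelling (area grows, aspect ratio constant) from axial shape recovery (aspect ratio falls as the tablet thickens).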

  5. V/STOL and STOL ground effects and testing techniques

    NASA Technical Reports Server (NTRS)

    Kuhn, R. E.

    1987-01-01

    The ground effects associated with V/STOL operation were examined, and an effort was made to develop the equipment and testing techniques needed to understand them. Primary emphasis was on future experimental programs in the 40 x 80 and the 80 x 120 foot test sections and in the outdoor static test stand associated with these facilities. The commonly used experimental techniques are reviewed and data obtained by various techniques are compared with each other and with available estimating methods. These reviews and comparisons provide insight into the limitations of past studies and the testing techniques used and identify areas where additional work is needed. The understanding of the flow mechanics involved in hovering and in transition in and out of ground effect is discussed. The basic flow fields associated with hovering, transition and STOL operation of jet powered V/STOL aircraft are depicted.

  6. An analysis of artificial viscosity effects on reacting flows using a spectral multi-domain technique

    NASA Technical Reports Server (NTRS)

    Macaraeg, M. G.; Streett, C. L.; Hussaini, M. Y.

    1987-01-01

    Standard techniques used to model chemically-reacting flows require an artificial viscosity for stability in the presence of strong shocks. The resulting shock is smeared over at least three computational cells, so that the thickness of the shock is dictated by the structure of the overall mesh and not the shock physics. A gas passing through a strong shock is thrown into a nonequilibrium state and subsequently relaxes down over some finite distance to an equilibrium end state. The artificial smearing of the shock envelops this relaxation zone which causes the chemical kinetics of the flow to be altered. A method is presented which can investigate these issues by following the chemical kinetics and flow kinetics of a gas passing through a fully resolved shock wave at hypersonic Mach numbers. A nonequilibrium chemistry model for air is incorporated into a spectral multidomain Navier-Stokes solution method. Since no artificial viscosity is needed for stability of the multidomain technique, the precise effect of this artifice on the chemical kinetics and relevant flow features can be determined.

  7. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  8. Decision Analysis Techniques for Adult Learners: Application to Leadership

    ERIC Educational Resources Information Center

    Toosi, Farah

    2017-01-01

    Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…

  9. Effects of synthesis techniques on chemical composition, microstructure and dielectric properties of Mg-doped calcium titanate

    NASA Astrophysics Data System (ADS)

    Jongprateep, Oratai; Sato, Nicha

    2018-04-01

    Calcium titanate (CaTiO3) has been recognized as a material for fabrication of dielectric components, owing to its moderate dielectric constant and excellent microwave response. Enhancement of dielectric properties of the material can be achieved through doping, compositional and microstructural control. This study, therefore, aimed at investigating effects of powder synthesis techniques on the composition, microstructure, and dielectric properties of Mg-doped CaTiO3. Solution combustion and solid-state reaction were the powder synthesis techniques employed in the preparation of undoped CaTiO3 and CaTiO3 doped with 5-20 at% Mg. Compositional analysis revealed that powder synthesis techniques did not exhibit a significant effect on formation of secondary phases. When Mg concentration did not exceed 5 at%, the powders prepared by both techniques contained only a single phase. An increase of MgO secondary phase was observed as Mg concentrations increased from 10 to 20 at%. Experimental results, on the contrary, revealed that powder synthesis techniques contributed to significant differences in microstructure. The solution combustion technique produced powders with finer particle sizes, which consequently led to finer grain sizes and density enhancement. High-density specimens with fine microstructure generally exhibit improved dielectric properties. Dielectric measurements revealed that dielectric constants of all samples ranged between 231 and 327 at 1 MHz, and that superior dielectric constants were observed in samples prepared by the solution combustion technique.

  10. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Beside its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Are physical activity interventions for healthy inactive adults effective in promoting behavior change and maintenance, and which behavior change techniques are effective? A systematic review and meta-analysis.

    PubMed

    Howlett, Neil; Trivedi, Daksha; Troop, Nicholas A; Chater, Angel Marie

    2018-02-28

    Physical inactivity and sedentary behavior relate to poor health outcomes independently. Healthy inactive adults are a key target population for prevention. This systematic review and meta-analysis aimed to evaluate the effectiveness of physical activity and/or sedentary behavior interventions, measured postintervention (behavior change) and at follow-up (behavior change maintenance), to identify behavior change techniques (BCT) within, and report on fidelity. Included studies were randomized controlled trials, targeting healthy inactive adults, aiming to change physical activity and/or sedentary behavior, with a minimum postintervention follow-up of 6 months, using 16 databases from 1990. Two reviewers independently coded risk of bias, the "Template for Intervention Description and Replication" (TIDieR) checklist, and BCTs. Twenty-six studies were included; 16 pooled for meta-analysis. Physical activity interventions were effective at changing behavior (d = 0.32, 95% confidence intervals = 0.16-0.48, n = 2,346) and maintaining behavior change after 6 months or more (d = 0.21, 95% confidence intervals = 0.12-0.30, n = 2,190). Sedentary behavior interventions (n = 2) were not effective. At postintervention, physical activity intervention effectiveness was associated with the BCTs "Biofeedback," "Demonstration of the behavior," "Behavior practice/rehearsal," and "Graded tasks." At follow-up, effectiveness was associated with using "Action planning," "Instruction on how to perform the behavior," "Prompts/cues," "Behavior practice/rehearsal," "Graded tasks," and "Self-reward." Fidelity was only documented in one study. Good evidence was found for behavior change maintenance effects in healthy inactive adults, and underlying BCTs. This review provides translational evidence to improve research, intervention design, and service delivery in physical activity interventions, while highlighting the lack of fidelity measurement.

  12. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    PubMed Central

    Wang, Chuji; Sahay, Peeyush

    2009-01-01

    Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high-sensitivity and high-selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc. are commercially available. This review presents an update on the latest developments in laser-based breath analysis. PMID:22408503
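    As a worked example of how one of the listed techniques, cavity ringdown spectroscopy (CRDS), turns a raw measurement into a breath-level concentration: the absorption coefficient follows from the change in ringdown time, α = (1/c)(1/τ − 1/τ₀), and dividing by the line's absorption cross-section and the total gas density gives a mixing ratio. All numbers below are illustrative, not specifications of any instrument.

```python
# Cavity ringdown spectroscopy: from ringdown times to a mixing ratio.
C_LIGHT = 2.998e10        # speed of light [cm/s]

tau_empty = 10.0e-6       # ringdown time, empty cavity [s] (illustrative)
tau_sample = 9.8e-6       # ringdown time with absorbing breath sample [s]

# Absorption coefficient alpha = (1/c) * (1/tau - 1/tau0)  [cm^-1]
alpha = (1.0 / C_LIGHT) * (1.0 / tau_sample - 1.0 / tau_empty)

sigma = 1.0e-18           # assumed peak absorption cross-section [cm^2]
n_absorber = alpha / sigma            # absorber number density [cm^-3]

N_AIR = 2.5e19            # total gas density at ~1 atm, 298 K [cm^-3]
ppb = n_absorber / N_AIR * 1e9        # mixing ratio in parts per billion
print(f"{ppb:.1f} ppb")
```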

  13. Techniques for Analysis of DSN 64-meter Antenna Azimuth Bearing Film Height Records

    NASA Technical Reports Server (NTRS)

    Stevens, R.; Quach, C. T.

    1983-01-01

    The DSN 64-m antennas use oil pad azimuth thrust bearings. Instrumentation on the bearing pads measures the height of the oil film between the pad and the bearing runner. Techniques to analyze the film height record are developed and discussed. The analysis techniques present the unwieldy data in a compact form for assessment of bearing condition. The techniques are illustrated by analysis of a small sample of film height records from each of the three 64-m antennas. The results show the general condition of the bearings of DSS 43 and DSS 63 as good to excellent, and a DSS 14 as marginal.

  14. Comparison of Spares Logistics Analysis Techniques for Long Duration Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Owens, Andrew; de Weck, Olivier; Mattfeld, Bryan; Stromgren, Chel; Cirillo, William

    2015-01-01

    As the durations and distances involved in human exploration missions increase, the logistics associated with repair and maintenance become more challenging. Whereas the operation of the International Space Station (ISS) depends upon regular resupply from the Earth, this paradigm may not be feasible for future missions. Longer mission durations result in higher probabilities of component failures as well as higher uncertainty regarding which components may fail, and longer distances from Earth increase the cost of resupply as well as the time required for the crew to abort to Earth in the event of an emergency. As such, mission development efforts must take into account the logistics requirements associated with maintenance and spares. Accurate prediction of the spare parts demand for a given mission plan and how that demand changes as a result of changes to the system architecture enables full consideration of the lifecycle cost associated with different options. In this paper, we utilize a range of analysis techniques - Monte Carlo, semi-Markov, binomial, and heuristic - to examine the relationship between the mass of spares and probability of loss of function related to the Carbon Dioxide Removal System (CRS) for a notional, simplified mission profile. The Exploration Maintainability Analysis Tool (EMAT), developed at NASA Langley Research Center, is utilized for the Monte Carlo analysis. We discuss the implications of these results and the features and drawbacks of each method. In particular, we identify the limitations of heuristic methods for logistics analysis, and the additional insights provided by more in-depth techniques. We discuss the potential impact of system complexity on each technique, as well as their respective abilities to examine dynamic events. This work is the first step in an effort that will quantitatively examine how well these techniques handle increasingly complex systems by gradually expanding the system boundary.
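    Two of the techniques compared above, a closed-form binomial model and Monte Carlo simulation, can be contrasted on a toy spares problem: given a per-mission failure probability, how many spares give at least a 99% probability of sufficiency? The failure probability and mission count below are illustrative, not CRS parameters.

```python
import math
import numpy as np

n_missions, p_fail = 20, 0.1   # illustrative demand model: demand ~ Binomial(n, p)
target = 0.99                  # required probability of sufficiency

def binom_cdf(s, n, p):
    """P(demand <= s) under a binomial demand model."""
    return sum(math.comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(s + 1))

# Analytic (binomial) answer: smallest spares count meeting the target
spares = next(s for s in range(n_missions + 1)
              if binom_cdf(s, n_missions, p_fail) >= target)

# Monte Carlo cross-check of the same quantity
rng = np.random.default_rng(42)
demand = rng.binomial(n_missions, p_fail, size=200_000)
mc_prob = float(np.mean(demand <= spares))
print(spares, round(binom_cdf(spares, n_missions, p_fail), 4), round(mc_prob, 4))
```

The closed form is exact but only tractable for simple demand models; the Monte Carlo estimate converges to it here, while remaining usable when redundancy, repair, and dynamic events make the analytic model intractable.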

  15. Advanced analysis technique for the evaluation of linear alternators and linear motors

    NASA Technical Reports Server (NTRS)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  16. Early Detection of Severe Apnoea through Voice Analysis and Automatic Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández, Ruben; Blanco, Jose Luis; Díaz, David; Hernández, Luis A.; López, Eduardo; Alcázar, José

    This study is part of an on-going collaborative effort between the medical and the signal processing communities to promote research on applying voice analysis and Automatic Speaker Recognition techniques (ASR) for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based diagnosis could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we present and discuss the possibilities of using generative Gaussian Mixture Models (GMMs), generally used in ASR systems, to model distinctive apnoea voice characteristics (i.e. abnormal nasalization). Finally, we present experimental findings regarding the discriminative power of speaker recognition techniques applied to severe apnoea detection. We have achieved an 81.25 % correct classification rate, which is very promising and underpins the interest in this line of inquiry.
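    A minimal sketch of likelihood-based classification in the spirit of the GMM approach described above: each class is modelled by a single-component, diagonal-covariance Gaussian (a degenerate GMM), and a test vector is assigned to the class with the higher log-likelihood. The features and class separations are synthetic, not taken from the apnoea speech database.

```python
import numpy as np

def fit_gaussian(X):
    # Per-class diagonal Gaussian: a single-component stand-in for a GMM.
    return X.mean(axis=0), X.var(axis=0) + 1e-6

def log_likelihood(x, mean, var):
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, size=(100, 3))   # synthetic "voice features"
apnoea  = rng.normal(2.0, 1.0, size=(100, 3))

models = {label: fit_gaussian(X) for label, X in
          [("healthy", healthy), ("apnoea", apnoea)]}

def classify(x):
    return max(models, key=lambda lbl: log_likelihood(x, *models[lbl]))

print(classify(np.array([2.1, 1.9, 2.2])))
```

A real system would use multi-component mixtures over acoustic features (e.g. cepstral coefficients); the decision rule, however, is the same maximum-likelihood comparison shown here.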

  17. Aggregation factor analysis for protein formulation by a systematic approach using FTIR, SEC and design of experiments techniques.

    PubMed

    Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya

    2012-01-05

    A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effects of single factors and the interaction effects of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high-concentration formulations; (ii) investigating the interaction effects of combined factors as well as the main effects of single factors is effective for improving the conformational stability of proteins; (iii) with the exception of pH, the aggregation factors identified in stress testing can inform a suitable formulation without performing time-consuming accelerated testing; (iv) a suitable pH condition should be determined not in stress testing but in accelerated testing, because of the inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze the aggregation factors and perform a rapid screening for suitable conditions of protein formulation. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Use of different spectroscopic techniques in the analysis of Roman age wall paintings.

    PubMed

    Agnoli, Francesca; Calliari, Irene; Mazzocchin, Gian-Antonio

    2007-01-01

    In this paper the analysis of samples of Roman age wall paintings from Pordenone, Vicenza and Verona is carried out using three different techniques: energy-dispersive x-ray spectroscopy (EDS), x-ray fluorescence (XRF) and proton-induced x-ray emission (PIXE). The features of the three spectroscopic techniques in the analysis of samples of archaeological interest are discussed. The studied pigments were cinnabar, yellow ochre, green earth, Egyptian blue and carbon black.

  19. A multiple technique approach to the analysis of urinary calculi.

    PubMed

    Rodgers, A L; Nassimbeni, L R; Mulder, K J

    1982-01-01

    Ten urinary calculi have been qualitatively and quantitatively analysed using X-ray diffraction, infra-red, scanning electron microscopy, X-ray fluorescence, atomic absorption and density gradient procedures. Constituents and compositional features which often go undetected due to limitations in the particular analytical procedure being used have been identified, and a detailed picture of each stone's composition and structure has been obtained. In all cases at least two components were detected, suggesting that the multiple technique approach might cast some doubt on the existence of "pure" stones. Evidence for a continuous, non-sequential deposition mechanism has been detected. In addition, the usefulness of each technique in the analysis of urinary stones has been assessed and the multiple technique approach has been evaluated as a whole.

  20. Creep-Rupture Data Analysis - Engineering Application of Regression Techniques. Ph.D. Thesis - North Carolina State Univ.

    NASA Technical Reports Server (NTRS)

    Rummler, D. R.

    1976-01-01

    Results are presented from investigations into applying regression techniques to the development of methodology for creep-rupture data analysis. Regression analysis techniques are applied to the explicit description of the creep behavior of materials for space shuttle thermal protection systems. A regression analysis technique is compared with five parametric methods for analyzing three simulated and twenty real data sets, and a computer program for the evaluation of creep-rupture data is presented.

  1. Comparison of the Joel-Cohen-based technique and the transverse Pfannenstiel for caesarean section for safety and effectiveness: A systematic review and meta-analysis

    PubMed Central

    Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba

    2017-01-01

    Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed to assess the safety and effectiveness of the Joel-Cohen-based technique and to compare the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, reliable databases such as PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were searched. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers against the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results showed that the Joel-Cohen-based technique is more effective than the transverse Pfannenstiel incision. Meta-analysis results for the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07 minutes, p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26 ml, p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03 day, p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. In addition, it was recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences.

  2. Comparison of the Joel-Cohen-based technique and the transverse Pfannenstiel for caesarean section for safety and effectiveness: A systematic review and meta-analysis.

    PubMed

    Olyaeemanesh, Alireza; Bavandpour, Elahe; Mobinizadeh, Mohammadreza; Ashrafinia, Mansoor; Bavandpour, Maryam; Nouhi, Mojtaba

    2017-01-01

    Background: Caesarean section (C-section) is the most common surgery among women worldwide, and the global rate of this surgical procedure has been continuously rising. Hence, it is crucial to develop and apply highly effective and safe caesarean section techniques. In this review study, we aimed to assess the safety and effectiveness of the Joel-Cohen-based technique and to compare the results with the transverse Pfannenstiel incision for C-section. Methods: In this study, reliable databases such as PubMed Central, COCHRANE, DARE, and Ovid MEDLINE were searched. Reviews, systematic reviews, and randomized clinical trial studies comparing the Joel-Cohen-based technique and the transverse Pfannenstiel incision were selected based on the inclusion criteria. Selected studies were checked by 2 independent reviewers against the inclusion criteria, and the quality of these studies was assessed. Then, their data were extracted and analyzed. Results: Five randomized clinical trial studies met the inclusion criteria. According to the existing evidence, statistical results showed that the Joel-Cohen-based technique is more effective than the transverse Pfannenstiel incision. Meta-analysis results for the 3 outcomes were as follows: operation time (5 trials, 764 women; WMD -9.78 minutes; 95% CI: -14.49 to -5.07 minutes, p<0.001), blood loss (3 trials, 309 women; WMD -53.23 ml; 95% CI: -90.20 to -16.26 ml, p=0.004), and post-operative hospital stay (3 trials, 453 women; WMD -0.69 day; 95% CI: -1.4 to -0.03 day, p<0.001). Statistical results revealed a significant difference between the 2 techniques. Conclusion: According to the literature, despite having a number of side effects, the Joel-Cohen-based technique is generally more effective than the Pfannenstiel incision technique. In addition, it was recommended that the Joel-Cohen-based technique be used as a replacement for the Pfannenstiel incision technique according to the surgeons' preferences.
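    The weighted mean differences above come from inverse-variance pooling. A hedged sketch with hypothetical trial data (not the trials reviewed here): each trial's standard error is recovered from its 95% CI, and the fixed-effect pooled WMD is the precision-weighted average.

```python
import numpy as np

# Hypothetical per-trial mean differences in operation time (minutes)
# with 95% CIs, illustrating the inverse-variance pooling behind a WMD.
trials = [(-8.0, (-12.0, -4.0)),
          (-11.0, (-16.0, -6.0)),
          (-9.5, (-13.5, -5.5))]

def pooled_wmd(trials):
    """Fixed-effect inverse-variance pooling: weight each trial by 1/SE^2,
    where SE is recovered from the 95% CI half-width divided by 1.96."""
    ests = np.array([d for d, _ in trials])
    ses = np.array([(hi - lo) / (2 * 1.96) for _, (lo, hi) in trials])
    w = 1 / ses**2
    est = np.sum(w * ests) / np.sum(w)
    se = np.sqrt(1 / np.sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)

est, ci = pooled_wmd(trials)
print(round(est, 2), tuple(round(c, 2) for c in ci))
```

A random-effects model would additionally inflate each weight's denominator by the between-trial variance; the fixed-effect version above shows the core calculation.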

  3. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
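    The core merging rule can be sketched in one dimension as follows. This is an illustrative simplification of BaTMAn's tessellation, not the published algorithm: adjacent elements are fused while their signals agree within the combined errors.

```python
import numpy as np

def merge_segments(values, errors, n_sigma=2.0):
    """Greedy 1-D sketch of error-aware merging: neighbouring segments are
    fused while their mean signals agree within n_sigma combined errors."""
    segs = [([v], [e]) for v, e in zip(values, errors)]
    merged = True
    while merged:
        merged = False
        for i in range(len(segs) - 1):
            v1, e1 = np.mean(segs[i][0]), np.mean(segs[i][1])
            v2, e2 = np.mean(segs[i + 1][0]), np.mean(segs[i + 1][1])
            if abs(v1 - v2) <= n_sigma * np.hypot(e1, e2):
                segs[i] = (segs[i][0] + segs[i + 1][0],
                           segs[i][1] + segs[i + 1][1])
                del segs[i + 1]
                merged = True
                break
    return [np.mean(s[0]) for s in segs]

# Two statistically distinct plateaus survive; noise within each is merged.
print(merge_segments([1.0, 1.1, 0.9, 5.0, 5.2], [0.2] * 5))
```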

  4. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    PubMed

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than the error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than those found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.

  5. Application of Petri net based analysis techniques to signal transduction pathways

    PubMed Central

    Sackmann, Andrea; Heiner, Monika; Koch, Ina

    2006-01-01

    Background Signal transduction pathways are usually modelled using classical quantitative methods, which are based on ordinary differential equations (ODEs). However, some difficulties are inherent in this approach. On the one hand, the kinetic parameters involved are often unknown and have to be estimated. With increasing size and complexity of signal transduction pathways, the estimation of missing kinetic data is not possible. On the other hand, ODE-based models do not support any explicit insights into possible (signal-) flows within the network. Moreover, a huge amount of qualitative data is available due to high-throughput techniques. In order to get information on the system's behaviour, qualitative analysis techniques have been developed. Applications of the known qualitative analysis methods mainly concern metabolic networks. Petri net theory provides a variety of established analysis techniques, which are also applicable to signal transduction models. In this context special properties have to be considered and new dedicated techniques have to be designed. Methods We apply Petri net theory to model and analyse signal transduction pathways, first qualitatively before continuing with quantitative analyses. This paper demonstrates how to systematically build a discrete model, which provably reflects the qualitative biological behaviour without any knowledge of kinetic parameters. The mating pheromone response pathway in Saccharomyces cerevisiae serves as a case study. Results We propose an approach for model validation of signal transduction pathways based on the network structure only. For this purpose, we introduce the new notion of feasible t-invariants, which represent minimal self-contained subnets being active under a given input situation. Each of these subnets stands for a signal flow in the system. We define maximal common transition sets (MCT-sets), which can be used for t-invariant examination and net decomposition into smallest biologically
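    A t-invariant is a transition firing-count vector x with C·x = 0, where C is the place-by-transition incidence matrix. A minimal sketch on a hypothetical two-transition cycle (not the pheromone pathway model itself):

```python
import numpy as np

# Incidence matrix C (rows: places, columns: transitions) of a minimal
# two-transition cycle: t1 moves a token from place A to B, t2 moves it back.
C = np.array([[-1.0,  1.0],
              [ 1.0, -1.0]])

def t_invariants(C, tol=1e-10):
    """Null-space basis of C (vectors x with C @ x = 0), via SVD."""
    _, s, vt = np.linalg.svd(C)
    rank = int(np.sum(s > tol))
    basis = vt[rank:]                      # rows spanning the null space
    # Rescale each basis vector so its smallest non-zero entry is 1; the
    # absolute value fixes the sign, since a t-invariant is non-negative
    # (valid here because all entries of the vector share one sign).
    return [np.round(np.abs(v) / np.min(np.abs(v[np.abs(v) > tol])))
            for v in basis]

print(t_invariants(C))   # the cycle t1, t2 reproduces the marking
```

Real Petri net tools compute minimal non-negative integer invariants with dedicated algorithms (e.g. Fourier-Motzkin style elimination); the null-space view above conveys the defining equation.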

  6. The incidence of secondary vertebral fracture of vertebral augmentation techniques versus conservative treatment for painful osteoporotic vertebral fractures: a systematic review and meta-analysis.

    PubMed

    Song, Dawei; Meng, Bin; Gan, Minfeng; Niu, Junjie; Li, Shiyan; Chen, Hao; Yuan, Chenxi; Yang, Huilin

    2015-08-01

    Percutaneous vertebroplasty (PVP) and balloon kyphoplasty (BKP) are minimally invasive and effective vertebral augmentation techniques for managing osteoporotic vertebral compression fractures (OVCFs). Recent meta-analyses have compared the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques or conservative treatment; however, the inclusions were not thorough and rigorous enough, and the effects of each technique on the incidence of secondary vertebral fractures remain unclear. To perform an updated systematic review and meta-analysis of the studies with more rigorous inclusion criteria on the effects of vertebral augmentation techniques and conservative treatment for OVCF on the incidence of secondary vertebral fractures. PubMed, MEDLINE, EMBASE, SpringerLink, Web of Science, and the Cochrane Library database were searched for relevant original articles comparing the incidence of secondary vertebral fractures between vertebral augmentation techniques and conservative treatment for patients with OVCFs. Randomized controlled trials (RCTs) and prospective non-randomized controlled trials (NRCTs) were identified. The methodological qualities of the studies were evaluated, relevant data were extracted and recorded, and an appropriate meta-analysis was conducted. A total of 13 articles were included. The pooled results from included studies showed no statistically significant differences in the incidence of secondary vertebral fractures between patients treated with vertebral augmentation techniques and conservative treatment. Subgroup analysis comparing different study designs, durations of symptoms, follow-up times, races of patients, and techniques were conducted, and no significant differences in the incidence of secondary fractures were identified (P > 0.05). No obvious publication bias was detected by either Begg's test (P = 0.360 > 0.05) or Egger's test (P = 0.373 > 0.05). Despite current thinking in the

  7. Different techniques of multispectral data analysis for vegetation fraction retrieval

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. For farmlands, the assessment of crop condition constitutes the basis of monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, and leaf area index. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
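    Of the listed techniques, linear spectral unmixing is the most direct: a pixel spectrum is modelled as a linear mixture of endmember spectra, and the vegetation fraction is the mixing coefficient. A hedged sketch with hypothetical two-endmember reflectances:

```python
import numpy as np

# Hypothetical endmember reflectances over three bands (red, NIR, SWIR).
veg  = np.array([0.05, 0.50, 0.30])
soil = np.array([0.20, 0.25, 0.35])

def vegetation_fraction(pixel, veg, soil):
    """Two-endmember linear unmixing: pixel ~= f*veg + (1-f)*soil, solved
    for f by least squares on (pixel - soil) = f*(veg - soil)."""
    d = veg - soil
    f = np.dot(pixel - soil, d) / np.dot(d, d)
    return float(np.clip(f, 0.0, 1.0))

mixed = 0.7 * veg + 0.3 * soil          # synthetic 70% vegetated pixel
print(round(vegetation_fraction(mixed, veg, soil), 3))  # -> 0.7
```

With more endmembers (shadow, litter) the same idea becomes a constrained least-squares problem over the full endmember matrix.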

  8. Measuring Response to Intervention: Comparing Three Effect Size Calculation Techniques for Single-Case Design Analysis

    ERIC Educational Resources Information Center

    Ross, Sarah Gwen

    2012-01-01

    Response to intervention (RTI) is increasingly being used in educational settings to make high-stakes, special education decisions. Because of this, the accurate use and analysis of single-case designs to monitor intervention effectiveness has become important to the RTI process. Effect size methods for single-case designs provide a useful way to…
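    One of the simplest single-case effect size calculations is the percentage of non-overlapping data (PND); the abstract does not specify which three techniques the dissertation compares, so the sketch below is an illustrative example with hypothetical session scores:

```python
def pnd(baseline, treatment, increase_expected=True):
    """Percentage of non-overlapping data: share of treatment-phase points
    exceeding the most extreme baseline point (one simple single-case
    effect size)."""
    cut = max(baseline) if increase_expected else min(baseline)
    hits = [t > cut if increase_expected else t < cut for t in treatment]
    return 100.0 * sum(hits) / len(treatment)

# Hypothetical weekly reading-fluency scores before and during intervention:
# 3 of 4 treatment points exceed the baseline maximum of 12.
print(pnd([10, 12, 11], [13, 15, 12, 14]))  # -> 75.0
```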

  9. Techniques of lumbar-sacral spine fusion in spondylosis: systematic literature review and meta-analysis of randomized clinical trials.

    PubMed

    Umeta, Ricardo S G; Avanzi, Osmar

    2011-07-01

    Spine fusions can be performed through different techniques and are used to treat a number of vertebral pathologies. However, there seems to be no consensus regarding which technique of fusion is best suited to treat each distinct spinal disease or group of diseases. To study the effectiveness and complications of the different techniques used for spinal fusion in patients with lumbar spondylosis. Systematic literature review and meta-analysis. Randomized clinical studies comparing the most commonly performed surgical techniques for spine fusion in lumbar-sacral spondylosis, as well as those reporting patient outcome, were selected. Identify which technique, if any, presents the best clinical, functional, and radiographic outcome. Systematic literature review and meta-analysis based on scientific articles published and indexed in the following databases: PubMed (1966-2009), Cochrane Collaboration-CENTRAL, EMBASE (1980-2009), and LILACS (1982-2009). The general search strategy focused on the surgical treatment of patients with lumbar-sacral spondylosis. Eight studies met the inclusion criteria and were selected, with a total of 1,136 patients. Meta-analysis showed that patients who underwent interbody fusion presented a significantly smaller blood loss (p=.001) and a greater rate of bone fusion (p=.02). Patients submitted to fusion using the posterolateral approach had a significantly shorter operative time (p=.007) and fewer perioperative complications (p=.03). No statistically significant difference was found for the other studied variables (pain, functional impairment, and return to work). The most commonly used techniques for lumbar spine fusion in patients with spondylosis were interbody fusion and the posterolateral approach. Both techniques were comparable in final outcome, but the former presented better rates of fusion and the latter fewer complications. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Reduced-Smoke Solid Propellant Combustion Products Analysis. Development of a Micromotor Combustor Technique.

    DTIC Science & Technology

    1976-10-01

    A low-cost micromotor combustor technique has been devised to support the development of reduced-smoke solid propellant formulations. The technique includes a simple, reusable micromotor capable of high chamber pressures, a combustion products collection system, and procedures for analysis of

  11. A Survey of Shape Parameterization Techniques

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1999-01-01

    This paper provides a survey of shape parameterization techniques for multidisciplinary optimization and highlights some emerging ideas. The survey focuses on the suitability of available techniques for complex configurations, with suitability criteria based on the efficiency, effectiveness, ease of implementation, and availability of analytical sensitivities for geometry and grids. The paper also contains a section on field grid regeneration, grid deformation, and sensitivity analysis techniques.

  12. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyer, Matthew W.

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  13. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
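    As a hedged illustration of one of the three reduction methods compared (principal component analysis via eigendecomposition of the covariance matrix), with synthetic stand-in features rather than object-code data:

```python
import numpy as np

def pca_reduce(X, k):
    """Project X onto its top-k principal components (eigenvectors of the
    sample covariance matrix with the largest eigenvalues)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]   # k largest
    return Xc @ top

rng = np.random.default_rng(1)
# 100 samples of 10 hypothetical per-function features; keep 2 dimensions.
X = rng.normal(size=(100, 10))
print(pca_reduce(X, 2).shape)  # -> (100, 2)
```

The reduced matrix would then feed the classifier; sweeping k reproduces the paper's accuracy-versus-dimensions analysis.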

  14. Comparison of immediate effects between two medical stretching techniques on Hamstrings flexibility

    PubMed Central

    Aye, Thanda; Kuramoto-Ahuja, Tsugumi; Han, Heonsoo; Maruyama, Hitoshi

    2017-01-01

    [Purpose] The aim of this study was to compare the immediate effects of new medical stretching (NMS) and conventional medical stretching (CMS) techniques on hamstring flexibility. [Subjects and Methods] Thirteen healthy adult males with finger floor distance (FFD) of less than zero centimeters, without known musculoskeletal or neurological impairment of the spine or lower extremities, were included. The subjects were randomly allocated to two groups. The subjects were instructed to perform NMS and CMS (a 30-second hold, twice for each side of the lower extremity) for both sides (two minutes in total, one session per day). The interval between the two techniques was one week. FFD was measured with a digital standing trunk flexion meter at pre-intervention and post-intervention for both techniques. [Results] The mean values of FFD improved at the post-interventions of both techniques. The tests of within-subject effects indicated that the main effect of treatment was not significant, but the main effect of time was significant, and the interaction of treatment and time was also significant. [Conclusion] The results of this study indicated that both medical stretching techniques were effective on hamstring flexibility immediately after the intervention, and the NMS technique was more effective at improving flexibility. PMID:28931979

  15. Comparison of immediate effects between two medical stretching techniques on Hamstrings flexibility.

    PubMed

    Aye, Thanda; Kuramoto-Ahuja, Tsugumi; Han, Heonsoo; Maruyama, Hitoshi

    2017-09-01

    [Purpose] The aim of this study was to compare the immediate effects of new medical stretching (NMS) and conventional medical stretching (CMS) techniques on hamstring flexibility. [Subjects and Methods] Thirteen healthy adult males with finger floor distance (FFD) of less than zero centimeters, without known musculoskeletal or neurological impairment of the spine or lower extremities, were included. The subjects were randomly allocated to two groups. The subjects were instructed to perform NMS and CMS (a 30-second hold, twice for each side of the lower extremity) for both sides (two minutes in total, one session per day). The interval between the two techniques was one week. FFD was measured with a digital standing trunk flexion meter at pre-intervention and post-intervention for both techniques. [Results] The mean values of FFD improved at the post-interventions of both techniques. The tests of within-subject effects indicated that the main effect of treatment was not significant, but the main effect of time was significant, and the interaction of treatment and time was also significant. [Conclusion] The results of this study indicated that both medical stretching techniques were effective on hamstring flexibility immediately after the intervention, and the NMS technique was more effective at improving flexibility.

  16. Investigating the effects of PDC cutters geometry on ROP using the Taguchi technique

    NASA Astrophysics Data System (ADS)

    Jamaludin, A. A.; Mehat, N. M.; Kamaruddin, S.

    2017-10-01

    At times, the polycrystalline diamond compact (PDC) bit's performance drops, which affects the rate of penetration (ROP). The objective of this project is to investigate the effect of PDC cutter geometry and to optimize it. An intensive study of cutter geometry would further enhance ROP performance. A relatively extended analysis was carried out, and four significant geometry factors were identified that directly affect ROP: cutter size, back rake angle, side rake angle and chamfer angle. An appropriate optimization technique that effectively controls all influential geometry factors during cutter manufacturing is introduced and adopted in this project. By adopting an L9 Taguchi orthogonal array, a simulation experiment is conducted using explicit dynamics finite element analysis. Through a structured Taguchi analysis, ANOVA confirms that the most significant geometry factor for improving ROP is cutter size (99.16% contribution). The optimized cutter is expected to drill with high ROP, which can reduce rig time and, in turn, total drilling cost.
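    The ANOVA percentage contribution reported above can be sketched from an L9 orthogonal array: each factor's between-level sum of squares is divided by the total sum of squares. The responses below are synthetic, constructed so that the first factor dominates, echoing (but not reproducing) the study's 99.16% figure.

```python
import numpy as np

# Standard Taguchi L9 orthogonal array: 9 runs x 4 three-level factors
# (standing in for cutter size, back rake, side rake, chamfer angle).
L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
               [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
               [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])

# Hypothetical ROP responses, constructed so factor 0 dominates.
rop = np.array([10., 11., 10.5, 20., 21., 20.5, 30., 31., 30.5])

def percent_contribution(L, y):
    """ANOVA-style percent contribution of each factor: between-level
    sum of squares divided by the total sum of squares."""
    grand = y.mean()
    ss_total = np.sum((y - grand) ** 2)
    out = []
    for col in L.T:
        ss = sum(np.sum(col == lvl) * (y[col == lvl].mean() - grand) ** 2
                 for lvl in np.unique(col))
        out.append(100 * ss / ss_total)
    return out

print([round(p, 1) for p in percent_contribution(L9, rop)])
```

Because the array is orthogonal and balanced, the per-factor contributions sum to 100% when interactions are negligible.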

  17. Effects of Interpretation as a Counseling Technique

    ERIC Educational Resources Information Center

    Helner, Philip A.; Jessell, John C.

    1974-01-01

    This research was an inquiry into the effects of interpretation in counseling. The feelings of subjects toward interpretation were compared with their feelings toward the techniques of reflection, advice giving, and probing. The implications of the use of interpretation in counseling are discussed. (Author)

  18. Integration of geological remote-sensing techniques in subsurface analysis

    USGS Publications Warehouse

    Taranik, James V.; Trautwein, Charles M.

    1976-01-01

    Geological remote sensing is defined as the study of the Earth utilizing electromagnetic radiation which is either reflected or emitted from its surface in wavelengths ranging from 0.3 micrometre to 3 metres. The natural surface of the Earth is composed of a diversified combination of surface cover types, and geologists must understand the characteristics of surface cover types to successfully evaluate remotely-sensed data. In some areas landscape surface cover changes throughout the year, and analysis of imagery acquired at different times of year can yield additional geological information. Integration of different scales of analysis allows landscape features to be effectively interpreted. Interpretation of the static elements displayed on imagery is referred to as an image interpretation. Image interpretation is dependent upon: (1) the geologist's understanding of the fundamental aspects of image formation, and (2) his ability to detect, delineate, and classify image radiometric data; recognize radiometric patterns; and identify landscape surface characteristics as expressed on imagery. A geologic interpretation integrates surface characteristics of the landscape with subsurface geologic relationships. Development of a geologic interpretation from imagery is dependent upon: (1) the geologist's ability to interpret geomorphic processes from their static surface expression as landscape characteristics on imagery, and (2) his ability to conceptualize the dynamic processes responsible for the evolution of interpreted geologic relationships (his ability to develop geologic models). The integration of geologic remote-sensing techniques in subsurface analysis is illustrated by the development of an exploration model for ground water in the Tucson area of Arizona, and by the development of an exploration model for mineralization in southwest Idaho.

  19. Analysis of filter tuning techniques for sequential orbit determination

    NASA Technical Reports Server (NTRS)

    Lee, T.; Yee, C.; Oza, D.

    1995-01-01

    This paper examines filter tuning techniques for a sequential orbit determination (OD) covariance analysis. Recently, there has been renewed interest in sequential OD, primarily due to the successful flight qualification of the Tracking and Data Relay Satellite System (TDRSS) Onboard Navigation System (TONS) using Doppler data extracted onboard the Extreme Ultraviolet Explorer (EUVE) spacecraft. TONS computes highly accurate orbit solutions onboard the spacecraft in real time using a sequential filter. As a result of the successful TONS-EUVE flight qualification experiment, the Earth Observing System (EOS) AM-1 Project has selected TONS as the prime navigation system. In addition, sequential OD methods can be used successfully for ground OD. Whether data are processed onboard or on the ground, a sequential OD procedure is generally favored over a batch technique when a real-time automated OD system is desired. Recently, OD covariance analyses were performed for the TONS-EUVE and TONS-EOS missions using the sequential processing options of the Orbit Determination Error Analysis System (ODEAS). ODEAS is the primary covariance analysis system used by the Goddard Space Flight Center (GSFC) Flight Dynamics Division (FDD). The results of these analyses revealed a high sensitivity of the OD solutions to the state process noise filter tuning parameters. The covariance analysis results show that the state estimate error contributions from measurement-related error sources, especially those due to random noise and satellite-to-satellite ionospheric refraction correction errors, increase rapidly as the state process noise increases. These results prompted an in-depth investigation of the role of the filter tuning parameters in sequential OD covariance analysis. This paper analyzes how the spacecraft state estimate errors due to dynamic and measurement-related error sources are affected by the process noise level used. This information is then used to establish

  20. Recent development in mass spectrometry and its hyphenated techniques for the analysis of medicinal plants.

    PubMed

    Zhu, Ming-Zhi; Chen, Gui-Lin; Wu, Jian-Lin; Li, Na; Liu, Zhong-Hua; Guo, Ming-Quan

    2018-04-23

    Medicinal plants are gaining increasing attention worldwide due to their empirical therapeutic efficacy and their role as a huge natural compound pool for new drug discovery and development. The efficacy, safety and quality of medicinal plants are the main concerns, and these depend heavily on comprehensive analysis of the chemical components in the medicinal plants. With the advances in mass spectrometry (MS) techniques, comprehensive analysis and fast identification of complex phytochemical components have become feasible and may meet the needs of medicinal plant analysis. Our aim is to provide an overview of the latest developments in MS and its hyphenated techniques and their applications for the comprehensive analysis of medicinal plants. The application of various MS and hyphenated techniques for the analysis of medicinal plants, including but not limited to one-dimensional chromatography, multiple-dimensional chromatography coupled to MS, ambient ionisation MS, and mass spectral databases, has been reviewed and compared in this work. Recent advances in MS and its hyphenated techniques have made MS one of the most powerful tools for the analysis of complex extracts from medicinal plants due to its excellent separation and identification ability, high sensitivity and resolution, and wide detection dynamic range. To achieve high-throughput or multi-dimensional analysis of medicinal plants, the state-of-the-art MS and its hyphenated techniques have played, and will continue to play, a great role as the major platform for their further research, in order to obtain insight into both their empirical therapeutic efficacy and quality control. Copyright © 2018 John Wiley & Sons, Ltd.

  1. The Efficacy of Movement Representation Techniques for Treatment of Limb Pain--A Systematic Review and Meta-Analysis.

    PubMed

    Thieme, Holm; Morkisch, Nadine; Rietz, Christian; Dohle, Christian; Borgetto, Bernhard

    2016-02-01

    Relatively new evidence suggests that movement representation techniques (ie, therapies that use the observation and/or imagination of normal pain-free movements, such as mirror therapy, motor imagery, or movement and/or action observation) might be effective in reducing some types of limb pain. To summarize the evidence regarding the efficacy of these techniques, a systematic review with meta-analysis was performed. We searched the Cochrane Central Register of Controlled Trials, MEDLINE, EMBASE, CINAHL, AMED, PsycINFO, Physiotherapy Evidence Database, and OTseeker up to August 2014 and hand-searched further relevant resources for randomized controlled trials that studied the efficacy of movement representation techniques in reducing limb pain. The outcomes of interest were pain, disability, and quality of life. Study selection and data extraction were performed by 2 reviewers independently. We included 15 trials on the effects of mirror therapy, (graded) motor imagery, and action observation in patients with complex regional pain syndrome, phantom limb pain, poststroke pain, and nonpathological (acute) pain. Overall, movement representation techniques were found to be effective in reducing pain (standardized mean difference [SMD] = -.82, 95% confidence interval [CI], -1.32 to -.31, P = .001) and disability (SMD = .72, 95% CI, .22-1.22, P = .004) and showed a positive but nonsignificant effect on quality of life (SMD = 2.61, 95% CI, -3.32 to 8.54, P = .39). Mirror therapy and graded motor imagery especially should be considered for the treatment of patients with complex regional pain syndrome. Furthermore, the results indicate that motor imagery could be considered as a potentially effective treatment in patients with acute pain after trauma and surgery. To date, there is no evidence for a pain-reducing effect of movement representation techniques in patients with phantom limb pain and poststroke pain other than complex regional pain syndrome.
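
    The pooled SMDs above come from a random-effects meta-analysis. As an illustration of how such pooling works, here is a minimal DerSimonian-Laird sketch in Python; the per-study effect sizes and standard errors below are hypothetical, not the review's data:

```python
import numpy as np

def pool_random_effects(smd, se):
    """DerSimonian-Laird random-effects pooling of standardized mean differences."""
    smd, se = np.asarray(smd, float), np.asarray(se, float)
    w = 1.0 / se**2                          # inverse-variance (fixed-effect) weights
    k = len(smd)
    mu_fe = np.sum(w * smd) / np.sum(w)
    Q = np.sum(w * (smd - mu_fe) ** 2)       # Cochran's Q heterogeneity statistic
    C = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (k - 1)) / C)       # between-study variance estimate
    w_re = 1.0 / (se**2 + tau2)              # random-effects weights
    mu = np.sum(w_re * smd) / np.sum(w_re)
    se_mu = np.sqrt(1.0 / np.sum(w_re))
    return mu, (mu - 1.96 * se_mu, mu + 1.96 * se_mu)

# Hypothetical per-study SMDs and standard errors (not the review's data)
pooled, ci = pool_random_effects([-1.1, -0.6, -0.9, -0.4], [0.30, 0.25, 0.40, 0.20])
```

    The pooled estimate lands between the smallest and largest study effects, weighted toward the more precise studies, with the between-study variance widening the confidence interval.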

  2. Flame analysis using image processing techniques

    NASA Astrophysics Data System (ADS)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques, with the use of fuzzy logic and a neural network approach, to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect flame stability. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
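
    The PSD-based stability check described above can be sketched in Python. The intensity signal, sampling rate, and 120 Hz oscillation frequency below are illustrative assumptions, not values from the experiment:

```python
import numpy as np
from scipy.signal import welch

# Synthetic luminous-intensity time series standing in for a flame image
# sequence: a 120 Hz thermoacoustic oscillation buried in noise.
fs = 2000.0                              # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 120.0 * t) + 0.5 * rng.standard_normal(t.size)

# Welch-averaged power spectral density; the dominant peak approximates the
# oscillation frequency used to judge flame stability.
freqs, psd = welch(signal, fs=fs, nperseg=1024)
dominant = freqs[np.argmax(psd)]
```

    A sharp, dominant PSD peak suggests a strong periodic oscillation; a flat or broadband spectrum would indicate a more stable flame.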

  3. The effect of mark enhancement techniques on the subsequent detection of saliva.

    PubMed

    McAllister, Patricia; Graham, Eleanor; Deacon, Paul; Farrugia, Kevin J

    2016-09-01

    There appears to be a limited but growing body of research on the sequential analysis/treatment of multiple types of evidence. The development of an integrated forensic approach is necessary to maximise evidence recovery and to ensure that a particular treatment is not detrimental to other types of evidence. This study aims to assess the effect of latent and blood mark enhancement techniques (e.g. fluorescence, ninhydrin, acid violet 17, black iron-oxide powder suspension) on the subsequent detection of saliva. Saliva detection was performed by means of a presumptive test (Phadebas®) in addition to analysis by a rapid stain identification (RSID) kit test and confirmatory DNA testing. Additional variables included a saliva depletion series and a number of different substrates with varying porosities, as well as different ageing periods. Examination and photography under white light and fluorescence were carried out prior to and after chemical enhancement. All enhancement techniques (except Bluestar® Forensic Magnum luminol) employed in this study resulted in improved visualisation of the saliva stains, although the inherent fluorescence of saliva was sometimes blocked after chemical treatment. The use of protein stains was, in general, detrimental to the detection of saliva. Positive results were less pronounced after the use of black iron-oxide powder suspension, cyanoacrylate fuming followed by BY40, and ninhydrin when compared to the respective positive controls. The application of Bluestar® Forensic Magnum luminol and black magnetic powder proved to be the least detrimental, with no significant difference between the test results and the positive controls. The use of non-destructive fluorescence examination provided good visualisation; however, only the first few marks in the depletion were observed. Of the samples selected for DNA analysis, only depletion 1 samples contained sufficient DNA quantity for further processing using standard methodology. The 28-day

  4. Plasma spectroscopy analysis technique based on optimization algorithms and spectral synthesis for arc-welding quality assurance.

    PubMed

    Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M

    2007-02-19

    A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
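
    As a rough illustration of generating a synthetic spectrum and optimizing its parameters to match a measurement, the following sketch fits a two-line Gaussian model by least squares. The line centers, amplitudes, width, and noise level are hypothetical, not taken from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def synthetic_spectrum(wl, amp1, amp2, width):
    """Two Gaussian emission lines at assumed (hypothetical) center wavelengths."""
    return (amp1 * np.exp(-((wl - 500.0) ** 2) / (2.0 * width ** 2))
            + amp2 * np.exp(-((wl - 520.0) ** 2) / (2.0 * width ** 2)))

# A simulated "measured" plasma spectrum: known parameters plus noise.
wl = np.linspace(480.0, 540.0, 600)
rng = np.random.default_rng(1)
measured = synthetic_spectrum(wl, 3.0, 1.5, 2.0) + 0.05 * rng.standard_normal(wl.size)

# Optimize the synthetic-spectrum parameters until they reproduce the measurement.
popt, _ = curve_fit(synthetic_spectrum, wl, measured, p0=[1.0, 1.0, 1.0])
```

    The recovered parameters characterize the plasma emission; in the paper's application, comparing such fitted parameters across welds supports the quality-assurance decision.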

  5. Effective techniques for changing physical activity and healthy eating intentions and behaviour: A systematic review and meta-analysis.

    PubMed

    McDermott, Máirtín S; Oliver, Madalyn; Iverson, Don; Sharma, Rajeev

    2016-11-01

    The primary aim of this study was to review the evidence on the impact of a change in intention on behaviour and to identify (1) behaviour change techniques (BCTs) associated with changes in intention and (2) whether the same BCTs are also associated with changes in behaviour. A systematic review was conducted to identify interventions that produced a significant change in intention and assessed the impact of this change on behaviour at a subsequent time point. Each intervention was coded using a taxonomy of BCTs targeting healthy eating and physical activity. A series of meta-regression analyses were conducted to identify effective BCTs. In total, 25 reports were included. Interventions had a medium-to-large effect on intentions (d+ = 0.64) and a small-to-medium effect (d+ = 0.41) on behaviour. One BCT, 'provide information on the consequences of behaviour in general', was significantly associated with a positive change in intention. One BCT, 'relapse prevention/coping planning', was associated with a negative change in intention. No BCTs were found to have significant positive effects on behaviour. However, one BCT, 'provide feedback on performance', was found to have a significant negative effect. BCTs aligned with social cognitive theory were found to have significantly greater positive effects on intention (d+ = 0.83 vs. 0.56, p < .05), but not behaviour (d+ = 0.35 vs. 0.23, ns), than those aligned with the theory of planned behaviour. Although the included studies support the notion that a change in intention is associated with a change in behaviour, this review failed to produce evidence on how to facilitate behaviour change through a change in intention. Larger meta-analyses incorporating interventions targeting a broader range of behaviours may be warranted. Statement of contribution What is already known on this subject? Prior research on the causal relationship between intention and behaviour has produced mixed findings. Further

  6. A comparison of autonomous techniques for multispectral image analysis and classification

    NASA Astrophysics Data System (ADS)

    Valdiviezo-N., Juan C.; Urcid, Gonzalo; Toxqui-Quitl, Carina; Padilla-Vivanco, Alfonso

    2012-10-01

    Multispectral imaging has given rise to important applications related to the classification and identification of objects in a scene. Because multispectral instruments can be used to estimate the reflectance of materials in the scene, these techniques constitute fundamental tools for materials analysis and quality control. Over the last years, a variety of algorithms have been developed to work with multispectral data, whose main purpose has been to perform the correct classification of the objects in the scene. The present study presents a brief review of some classical techniques, as well as a novel one, that have been used for such purposes. The use of principal component analysis and K-means clustering as important classification algorithms is discussed here. Moreover, a recent method based on the min-W and max-M lattice auto-associative memories, originally proposed for endmember determination in hyperspectral imagery, is introduced as a classification method. Besides a discussion of their mathematical foundation, we emphasize their main characteristics and the results achieved for two exemplar images composed of objects similar in appearance but spectrally different. The classification results show that the first components computed from principal component analysis can be used to highlight areas with different spectral characteristics. In addition, the use of lattice auto-associative memories provides good results for materials classification even in cases where some similarities appear in their spectral responses.
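
    A minimal sketch of the PCA-plus-K-means classification workflow discussed above, using toy "pixels" from two hypothetical materials (the band means and noise level are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy multispectral "pixels": two materials alike in appearance but spectrally
# distinct across 6 hypothetical bands.
mat_a = rng.normal([0.2, 0.3, 0.5, 0.7, 0.6, 0.4], 0.02, size=(100, 6))
mat_b = rng.normal([0.2, 0.3, 0.4, 0.3, 0.2, 0.1], 0.02, size=(100, 6))
pixels = np.vstack([mat_a, mat_b])

# PCA via SVD of the mean-centred data: the first components highlight
# the spectrally distinct regions.
centred = pixels - pixels.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
scores = centred @ vt[:2].T              # project onto the first two components

# Plain K-means (k=2) on the PCA scores, seeded with one pixel from each end.
centroids = scores[[0, len(scores) - 1]]
for _ in range(20):
    labels = np.argmin(((scores[:, None, :] - centroids) ** 2).sum(-1), axis=1)
    centroids = np.array([scores[labels == k].mean(axis=0) for k in range(2)])
```

    With well-separated spectra, the PCA projection concentrates the between-material variance in the first component, and K-means recovers the two material classes.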

  7. A web-based overview, systematic review and meta-analysis of pancreatic anastomosis techniques following pancreatoduodenectomy.

    PubMed

    Daamen, Lois A; Smits, F Jasmijn; Besselink, Marc G; Busch, Olivier R; Borel Rinkes, Inne H; van Santvoort, Hjalmar C; Molenaar, I Quintus

    2018-05-14

    Many pancreatic anastomoses have been proposed to reduce the incidence of postoperative pancreatic fistula (POPF) after pancreatoduodenectomy, but a complete overview is lacking. This systematic review and meta-analysis aims to provide an online overview of all pancreatic anastomosis techniques and to evaluate the incidence of clinically relevant POPF in randomized controlled trials (RCTs). A literature search was performed through December 2017. Included were studies giving a detailed description of the pancreatic anastomosis after open pancreatoduodenectomy and RCTs comparing techniques for the incidence of POPF (International Study Group of Pancreatic Surgery [ISGPS] Grade B/C). Meta-analyses were performed using a random-effects model. A total of 61 different anastomoses were found and summarized in 19 subgroups (www.pancreatic-anastomosis.com). In 6 RCTs, the POPF rate was 12% after pancreaticogastrostomy (n = 69/555) versus 20% after pancreaticojejunostomy (n = 106/531) (RR 0.59; 95% CI 0.35-1.01, P = 0.05). Six RCTs comparing subtypes of pancreaticojejunostomy showed a pooled POPF rate of 10% (n = 109/1057). Duct-to-mucosa and invagination pancreaticojejunostomy showed similar results, respectively 14% (n = 39/278) versus 10% (n = 27/278) (RR 1.40, 95% CI 0.47-4.15, P = 0.54). The proposed online overview can be used as an interactive platform for uniformity in reporting anastomotic techniques and for educational purposes. The meta-analysis showed no significant difference in POPF rate between pancreatic anastomosis techniques. Copyright © 2018 International Hepato-Pancreato-Biliary Association Inc. Published by Elsevier Ltd. All rights reserved.

  8. Technique Feature Analysis or Involvement Load Hypothesis: Estimating Their Predictive Power in Vocabulary Learning.

    PubMed

    Gohar, Manoochehr Jafari; Rahmanian, Mahboubeh; Soleimani, Hassan

    2018-02-05

    Vocabulary learning has always been a great concern and has attracted the attention of many researchers. Among the vocabulary learning hypotheses, the involvement load hypothesis and technique feature analysis have been proposed, which attempt to bring concepts such as noticing, motivation, and generation into focus. In the current study, 90 high-proficiency EFL students were assigned to three vocabulary tasks (sentence making, composition, and reading comprehension) in order to examine the power of the involvement load hypothesis and technique feature analysis frameworks in predicting vocabulary learning. The results revealed that the involvement load hypothesis was not a good predictor, whereas technique feature analysis was a good predictor of pretest-to-posttest score change, though not of during-task activity. The implications of these results are discussed in light of vocabulary task preparation.

  9. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    PubMed

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This cross-sectional study used panel data over the 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices, with an input-orientation approach, were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale, and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity rate of hospitals generally had an increasing trend. However, the total average productivity decreased across hospitals. Moreover, among the several components of total productivity, variation in technological efficiency had the greatest impact on the reduction of the total average productivity.
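
    The DEA scores discussed above come from input-oriented efficiency models solved as linear programs. Below is a minimal sketch of the input-oriented CCR envelopment formulation; the one-input/one-output "hospital" data are hypothetical, chosen so the expected efficiencies are easy to verify by hand:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of decision-making unit j0 (envelopment form).

    Minimize theta subject to  X.T @ lam <= theta * X[j0]  and  Y.T @ lam >= Y[j0],
    with decision variables [theta, lam_1, ..., lam_n] and lam >= 0.
    """
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                      # minimize theta
    A_in = np.hstack([-X[j0].reshape(m, 1), X.T])   # X.T@lam - theta*x0 <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])     # -Y.T@lam <= -y0
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[j0]]),
                  bounds=[(0, None)] * (n + 1))
    return res.x[0]

# Hypothetical data: one input (e.g. beds), one output (e.g. admissions).
X = np.array([[2.0], [4.0]])   # the second unit uses twice the input...
Y = np.array([[2.0], [2.0]])   # ...for the same output, so its efficiency is 0.5
scores = [ccr_efficiency(X, Y, j) for j in range(2)]
```

    A Malmquist index then compares such efficiencies across periods; this sketch covers only the single-period efficiency score that underlies it.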

  10. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes net and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. By using Bayes net, nine syndromes of GAD were abstracted based on the symptoms. Eight syndromes were abstracted by cluster analysis. After screening for duplicate syndromes and combining the experts' experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on the results, the draft of Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes net and cluster analysis have certain future potential for establishing syndrome models and analyzing syndrome discipline, thus they are suitable for the research of syndrome differentiation.

  11. Analysis of hairy root culture of Rauvolfia serpentina using direct analysis in real time mass spectrometric technique.

    PubMed

    Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K

    2008-06-01

    The applicability of a new mass spectrometric technique, DART (direct analysis in real time) has been studied in the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. The confirmation of the structures of the identified compounds was made through their accurate molecular formula determinations. This is the first report of the application of DART technique for the characterization of compounds that are expressed in the hairy root cultures of Rauvolfia serpentina. Moreover, this also constitutes the first report of expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.

  12. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    DOE PAGES

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  13. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    PubMed Central

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632

  14. Enhanced Analysis Techniques for an Imaging Neutron and Gamma Ray Spectrometer

    NASA Astrophysics Data System (ADS)

    Madden, Amanda C.

    The presence of gamma rays and neutrons is a strong indicator of the presence of Special Nuclear Material (SNM). The imaging Neutron and gamma ray SPECTrometer (NSPECT), developed by the University of New Hampshire and Michigan Aerospace Corporation, detects the fast neutrons and prompt gamma rays from fissile material, and the gamma rays from radioactive material. The instrument operates as a double-scatter device, requiring a neutron or a gamma ray to interact twice in the instrument. While this detection requirement decreases the efficiency of the instrument, it offers superior background rejection and the ability to measure the energy and momentum of the incident particle. These measurements create energy spectra and images of the emitting source for source identification and localization. The dual-species instrument provides better detection than either species alone. In realistic detection scenarios, few particles are detected from a potential threat due to source shielding, detection at a distance, high background, and weak sources. This contributes to a small signal-to-noise ratio, and threat detection becomes difficult. To address these difficulties, several enhanced data analysis tools were developed. A receiver operating characteristic (ROC) curve helps set instrumental alarm thresholds as well as identify the presence of a source. Analysis of a dual-species ROC curve provides superior detection capabilities. Bayesian analysis helps to detect and identify the presence of a source through model comparisons, and helps create a background-corrected count spectrum for enhanced spectroscopy. Development of an instrument response using simulations and numerical analyses will help perform spectral and image deconvolution. This thesis will outline the principles of operation of the NSPECT instrument using the double-scatter technology, traditional analysis techniques, and enhanced analysis techniques as applied to data from the NSPECT instrument, and an
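
    As an illustration of how an ROC curve sets alarm thresholds for a counting instrument, the following sketch sweeps a counts-per-interval threshold over simulated Poisson background and source-plus-background data. The rates and sample sizes are invented, not NSPECT measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated counts per observation interval (illustrative Poisson rates):
# background alone vs. background plus a weak source.
background = rng.poisson(lam=10.0, size=5000)
with_source = rng.poisson(lam=14.0, size=5000)

# Sweep the alarm threshold and record true/false positive rates.
thresholds = np.arange(0, 40)
tpr = np.array([(with_source >= t).mean() for t in thresholds])
fpr = np.array([(background >= t).mean() for t in thresholds])

# Area under the ROC curve by the trapezoid rule (fpr and tpr fall as t rises).
auc = np.sum((fpr[:-1] - fpr[1:]) * (tpr[:-1] + tpr[1:]) / 2.0)
```

    An operator picks the threshold whose false-positive rate is tolerable; the AUC summarizes how separable the source is from background at the given rates.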

  15. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research Project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota, was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
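
    The univariate image-differencing technique mentioned above can be sketched as follows. The two "scenes", noise level, and threshold rule (flag pixels beyond k standard deviations of the difference image) are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Two co-registered single-band scenes (hypothetical reflectance values);
# a block of pixels brightens in the later scene to mimic new vegetation.
scene_1987 = rng.uniform(0.1, 0.3, size=(100, 100))
scene_1996 = scene_1987 + rng.normal(0.0, 0.01, size=(100, 100))
scene_1996[40:60, 40:60] += 0.2          # the changed area

# Univariate image differencing: subtract the bands, then flag pixels whose
# difference lies beyond k standard deviations of the mean difference.
diff = scene_1996 - scene_1987
k = 3.0
changed = np.abs(diff - diff.mean()) > k * diff.std()
```

    The choice of k trades omission against commission errors; the "enhanced" differencing and change-vector methods in the study refine this basic thresholding idea.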

  16. An analysis of short pulse and dual frequency radar techniques for measuring ocean wave spectra from satellites

    NASA Technical Reports Server (NTRS)

    Jackson, F. C.

    1980-01-01

    Scanning-beam microwave radars were used to measure ocean wave directional spectra from satellites. In principle, surface wave spectral resolution in wavenumber can be obtained using either short pulse (SP) or dual frequency (DF) techniques; in either case, directional resolution is obtained naturally as a consequence of Bragg-like wavefront matching. A four-frequency moment characterization of backscatter from the near vertical, using physical optics in the high-frequency limit, was applied to an analysis of the SP and DF measurement techniques. The intrinsic electromagnetic modulation spectrum is, to first order in wave steepness, proportional to the large-wave directional slope spectrum. Harmonic distortion is small and is a minimum near 10 deg incidence. Non-Gaussian wave statistics can have an effect comparable to that in the second order of scattering from a normally distributed sea surface. The SP technique is superior to the DF technique in terms of measurement signal-to-noise ratio and contrast ratio.

  17. An in Situ Technique for Elemental Analysis of Lunar Surfaces

    NASA Technical Reports Server (NTRS)

    Kane, K. Y.; Cremers, D. A.

    1992-01-01

    An in situ analytical technique that can remotely determine the elemental constituents of solids has been demonstrated. Laser-Induced Breakdown Spectroscopy (LIBS) is a form of atomic emission spectroscopy in which a powerful laser pulse is focused on a solid to generate a laser spark, or microplasma. Material in the plasma is vaporized, and the resulting atoms are excited to emit light. The light is spectrally resolved to identify the emitting species. LIBS is a simple technique that can be automated for inclusion aboard a remotely operated vehicle. Since only optical access to a sample is required, areas inaccessible to a rover can be analyzed remotely. A single laser spark both vaporizes and excites the sample, so near real-time analysis (a few minutes) is possible. This technique provides simultaneous multielement detection and has good sensitivity for many elements. LIBS also eliminates the need for sample retrieval and preparation, preventing possible sample contamination. These qualities make the LIBS technique uniquely suited for use in the lunar environment.

  18. The evaluation of meta-analysis techniques for quantifying prescribed fire effects on fuel loadings.

    Treesearch

    Karen E. Kopper; Donald McKenzie; David L. Peterson

    2009-01-01

    Models and effect-size metrics for meta-analysis were compared in four separate meta-analyses quantifying surface fuels after prescribed fires in ponderosa pine (Pinus ponderosa Dougl. ex Laws.) forests of the Western United States. An aggregated data set was compiled from eight published reports that contained data from 65 fire treatment units....

  19. A novel pulse height analysis technique for nuclear spectroscopic and imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, H. H.; Wang, C. Y.; Chou, H. P.

    2005-08-01

    The proposed pulse height analysis technique is based on the constant, linear relationship between the pulse width and pulse height generated by the front-end electronics of nuclear spectroscopic and imaging systems. The technique has been successfully implemented in the sump water radiation monitoring system of a nuclear power plant. The radiation monitoring system uses a NaI(Tl) scintillator to detect radioactive radon daughter nuclides brought down by rain. The technique is also used in a nuclear medical imaging system, which uses a position-sensitive photomultiplier tube coupled with a scintillator. The proposed technique has greatly simplified the electronic design and made the system feasible for portable applications.

  20. Performance of dental impression materials: Benchmarking of materials and techniques by three-dimensional analysis.

    PubMed

    Rudolph, Heike; Graf, Michael R S; Kuhn, Katharina; Rupf-Köhler, Stephanie; Eirich, Alfred; Edelmann, Cornelia; Quaas, Sebastian; Luthardt, Ralph G

    2015-01-01

    Among other factors, the precision of dental impressions is an important and determining factor for the fit of dental restorations. The aim of this study was to examine the three-dimensional (3D) precision of gypsum dies made using a range of impression techniques and materials. Ten impressions of a steel canine were fabricated for each of the 24 material-method combinations and poured with type 4 die stone. The dies were optically digitized, aligned to the CAD model of the steel canine, and 3D differences were calculated. The results were statistically analyzed using one-way analysis of variance. Depending on material and impression technique, the mean values ranged between +10.9/-10.0 µm (SD 2.8/2.3) and +16.5/-23.5 µm (SD 11.8/18.8). Qualitative analysis using color-coded graphs showed a characteristic location of deviations for different impression techniques. Three-dimensional analysis provided a comprehensive picture of the achievable precision. Processing aspects and impression technique were of significant influence.

  1. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    NASA Technical Reports Server (NTRS)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
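
    The PCA step used above can be sketched with plain linear algebra: center the composition matrix and project onto the leading principal components. The 4-sample, 3-variable matrix below is synthetic (two artificial "composition clusters"), not the meteorite data.

```python
import numpy as np

# Sketch of PCA as used to compare amino-acid compositions: center the
# composition matrix and project onto the principal components via SVD.
# The data matrix is synthetic, for illustration only.
X = np.array([
    [0.30, 0.50, 0.20],
    [0.32, 0.48, 0.20],
    [0.10, 0.20, 0.70],
    [0.12, 0.18, 0.70],
])

Xc = X - X.mean(axis=0)                  # center each variable
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T                       # samples in principal-component space
explained = s**2 / np.sum(s**2)          # fraction of variance per component
```

    Samples that are statistically distinct (here, the two synthetic clusters) separate along the first principal component, which carries almost all of the variance.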

  2. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    NASA Astrophysics Data System (ADS)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high energy-based techniques.

  3. Analysis of Complex Intervention Effects in Time-Series Experiments.

    ERIC Educational Resources Information Center

    Bower, Cathleen

    An iterative least squares procedure for analyzing the effect of various kinds of intervention in time-series data is described. There are numerous applications of this design in economics, education, and psychology, although until recently, no appropriate analysis techniques had been developed to deal with the model adequately. This paper…
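
    The intervention model described above can be sketched with a simple interrupted time-series regression: y = b0 + b1*t + b2*step, where the step dummy switches on at the intervention. This is an ordinary least-squares illustration (the paper's procedure is iterative); the series below is synthetic, with a true level shift of +5 at t = 10.

```python
import numpy as np

# Sketch of intervention (interrupted time-series) analysis by least squares.
# Synthetic series: linear trend plus a level shift of +5 at the intervention.
n, t0 = 20, 10
t = np.arange(n)
step = (t >= t0).astype(float)            # 0 before intervention, 1 after
rng = np.random.default_rng(0)
y = 2.0 + 0.3 * t + 5.0 * step + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), t, step])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
level_shift = coef[2]                     # estimated intervention effect
```

    The coefficient on the step dummy recovers the size of the intervention effect; richer models add a post-intervention slope change or autocorrelated errors.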

  4. Association of behaviour change techniques with effectiveness of dietary interventions among adults of retirement age: a systematic review and meta-analysis of randomised controlled trials.

    PubMed

    Lara, Jose; Evans, Elizabeth H; O'Brien, Nicola; Moynihan, Paula J; Meyer, Thomas D; Adamson, Ashley J; Errington, Linda; Sniehotta, Falko F; White, Martin; Mathers, John C

    2014-10-07

    There is a need for development of more effective interventions to achieve healthy eating, enhance healthy ageing, and reduce the risk of age-related diseases. The aim of this study was to identify the behaviour change techniques (BCTs) used in complex dietary behaviour change interventions and to explore the association between the BCTs utilised and intervention effectiveness. We undertook a secondary analysis of data from a previous systematic review with meta-analysis of the effectiveness of dietary interventions among people of retirement age. BCTs were identified using the reliable CALO-RE taxonomy in studies reporting fruit and vegetable (F and V) consumption as outcomes. The mean difference in F and V intake between active and control arms was compared between studies in which the BCTs were identified versus those not using the BCTs. Random-effects meta-regression models were used to assess the association of intervention BCTs with F and V intakes. Twenty-eight of the 40 BCTs listed in the CALO-RE taxonomy were identified in the 22 papers reviewed. Studies using the techniques 'barrier identification/problem solving' (93 g, 95% confidence interval (CI) 48 to 137 greater F and V intake), 'plan social support/social change' (78 g, 95% CI 24 to 132 greater F and V intake), 'goal setting (outcome)' (55 g, 95% CI 7 to 103 greater F and V intake), 'use of follow-up prompts' (66 g, 95% CI 10 to 123 greater F and V intake) and 'provide feedback on performance' (39 g, 95% CI -2 to 81 greater F and V intake) were associated with greater effects of interventions on F and V consumption compared with studies not using these BCTs. The number of BCTs per study ranged from 2 to 16 (median = 6). Meta-regression showed that one additional BCT led to an 8.3 g (95% CI 0.006 to 16.6 g) increase in F and V intake. Overall, this study has identified BCTs associated with effectiveness, suggesting that these might be active ingredients of dietary interventions which will be effective in…
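
    The meta-regression idea above (effect size regressed on number of BCTs, weighting each study by its precision) can be sketched as a weighted least-squares fit. The eight "studies" below are synthetic, generated with a true slope of 8 g per BCT to echo the scale of the reported estimate; they are not the review's data.

```python
import numpy as np

# Sketch of inverse-variance-weighted meta-regression: regress each study's
# effect size on its number of behaviour change techniques (BCTs).
# Studies are synthetic (true slope 8 g per BCT), for illustration only.
n_bcts = np.array([2, 4, 6, 8, 10, 12, 14, 16], dtype=float)
effect_g = 8.0 * n_bcts + np.array([5.0, -4.0, 3.0, -2.0, 4.0, -3.0, 2.0, -5.0])
se = np.array([10.0, 8.0, 9.0, 7.0, 8.0, 9.0, 10.0, 8.0])  # standard errors

w = 1.0 / se**2                       # inverse-variance weights
X = np.column_stack([np.ones_like(n_bcts), n_bcts])
WX = X * w[:, None]
# Weighted least squares: solve (X'WX) b = X'Wy
coef = np.linalg.solve(X.T @ WX, WX.T @ effect_g)
slope = coef[1]                       # grams of F and V per additional BCT
```

    A full random-effects meta-regression also estimates between-study heterogeneity and adds it to each study's variance before weighting; the fixed-weight version here shows only the core regression step.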

  5. Effective techniques for the identification and accommodation of disturbances

    NASA Technical Reports Server (NTRS)

    Johnson, C. D.

    1989-01-01

    The successful control of dynamic systems such as space stations or launch vehicles requires a controller design methodology that acknowledges and addresses the disruptive effects caused by external and internal disturbances that inevitably act on such systems. These disturbances, technically defined as uncontrollable inputs, typically vary with time in an uncertain manner and usually cannot be directly measured in real time. A relatively new non-statistical technique for modeling, and (on-line) identification, of those complex uncertain disturbances that are not as erratic and capricious as random noise is described. This technique applies to multi-input cases and to many of the practical disturbances associated with the control of space stations or launch vehicles. Then, a collection of smart controller design techniques is described that allows controlled dynamic systems, with possible multi-input controls, to accommodate (cope with) such disturbances with extraordinary effectiveness. These new smart controllers are designed by non-statistical techniques and typically turn out to be unconventional forms of dynamic linear controllers (compensators) with constant coefficients. The simplicity and reliability of linear, constant-coefficient controllers is well known in the aerospace field.

  6. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.

  7. SEM Analysis Techniques for LSI Microcircuits. Volume 2

    DTIC Science & Technology

    1980-08-01

    RADC-TR-80-250, Vol II (of two), Final Technical Report, August 1980: SEM Analysis Techniques for LSI Microcircuits. Volume II covers a 1024-bit static RAM, a 4096-bit dynamic RAM (SiGate MOS), a 4096-bit dynamic RAM (I2L bipolar), and a summary. The remainder of the record is an export-control notice.

  8. Advanced analysis techniques for uranium assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.

    2001-01-01

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.

  9. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    NASA Technical Reports Server (NTRS)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high in frequency an accurate estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates. The pre-filtering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable…
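
    The slotting technique mentioned above can be sketched as follows: products of all sample pairs are accumulated into discrete lag bins ("slots"), giving an autocorrelation estimate for unevenly sampled data, from which a spectrum can then be obtained. The example below is a minimal illustration on a synthetic, randomly sampled sine wave, not the authors' implementation.

```python
import numpy as np

# Sketch of the "slotting" technique: accumulate products of all sample
# pairs into lag bins to estimate the autocorrelation of a randomly
# sampled signal. Synthetic example: a sine observed at irregular times.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0.0, 50.0, 2000))   # irregular sample times
u = np.sin(2 * np.pi * 0.2 * t)             # zero-mean signal, period 5

def slotted_autocorr(t, u, dtau, n_slots):
    """Slotted autocorrelation estimate at lags k*dtau, k = 0..n_slots-1."""
    sums = np.zeros(n_slots)
    counts = np.zeros(n_slots)
    for i in range(len(t)):
        lags = t[i:] - t[i]                  # non-negative lags from sample i
        k = np.floor(lags / dtau + 0.5).astype(int)   # nearest slot index
        ok = k < n_slots
        np.add.at(sums, k[ok], u[i] * u[i:][ok])
        np.add.at(counts, k[ok], 1)
    r = sums / np.maximum(counts, 1)         # mean product per slot
    return r / r[0]                          # normalise to R(0) = 1

r = slotted_autocorr(t, u, dtau=0.25, n_slots=40)
```

    For this periodic test signal the estimate recovers the expected cosine shape: near -1 at half a period (slot 10) and near +1 at one full period (slot 20).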

  10. The Review of Nuclear Microscopy Techniques: An Approach for Nondestructive Trace Elemental Analysis and Mapping of Biological Materials.

    PubMed

    Mulware, Stephen Juma

    2015-01-01

    The properties of many biological materials often depend on the spatial distribution and concentration of the trace elements present in a matrix. Over the years, scientists have tried various techniques, including classical physical and chemical analysis techniques, each with a relative level of accuracy. However, with the development of spatially sensitive submicron beams, the nuclear microprobe techniques using focused proton beams for the elemental analysis of biological materials have yielded significant success. In this paper, the basic principles of the commonly used microprobe techniques of STIM, RBS, and PIXE for trace elemental analysis are discussed, along with the details of sample preparation, detection, and data collection and analysis. Finally, an application of the techniques to the analysis of corn roots for elemental distribution and concentration is presented.

  11. The effectiveness of the bone bridge transtibial amputation technique: A systematic review of high-quality evidence.

    PubMed

    Kahle, Jason T; Highsmith, M Jason; Kenney, John; Ruth, Tim; Lunseth, Paul A; Ertl, Janos

    2017-06-01

    This literature review was undertaken to determine if commonly held views about the benefits of a bone bridge technique are supported by the literature. Four databases were searched for articles pertaining to surgical strategies specific to a bone bridge technique for the transtibial amputee. A total of 35 potentially relevant articles were identified, and a sorting methodology was applied to assign them to separate topics. Following identification, articles were excluded if they were determined to be low-quality evidence or not pertinent. Nine articles were identified as pertinent to one of the topics: Perioperative Care, Acute Care, Subjective Analysis, and Function. Two articles sorted into multiple topics: 2 articles were sorted into the Perioperative Care topic, 4 into the Acute Care topic, 2 into the Subjective Analysis topic, and 5 into the Function topic. There are no high-quality (level one or two) clinical trials reporting comparisons of the bone bridge technique to traditional methods. There is limited evidence supporting the clinical outcomes of the bone bridge technique, and no agreement supporting or discouraging its perioperative and acute care aspects. There is no evidence defining an interventional comparison of the bone bridge technique. Current level III evidence supports a bone bridge technique as an equivalent option to the non-bone bridge transtibial amputation technique. Formal level I and II clinical trials will need to be considered in the future to guide clinical practice. Clinical relevance: Clinical practice guidelines are evidence based. This systematic literature review identifies the highest-quality evidence to date, which reports a consensus of outcomes agreeing that the bone bridge technique is as safe and effective as alternatives. The clinical relevance is understanding that bone bridge could additionally provide a mechanistic advantage for the transtibial amputee.

  12. The Shock and Vibration Bulletin. Part 3. Dynamic Analysis, Design Techniques

    DTIC Science & Technology

    1980-09-01

    The Shock and Vibration Bulletin, Bulletin 50, Part 3: Dynamic Analysis, Design Techniques, September 1980. Only fragments of this record are recoverable; they concern a technique for dynamic analysis of response at certain discrete frequencies, not over a random-frequency spectrum, pioneered by Myklestad ("Fundamentals of Vibration Analysis," McGraw-Hill, New York, 1956) and later extended by E.C. Pestel and F.A. Leckie ("Matrix...").

  13. An iterative forward analysis technique to determine the equation of state of dynamically compressed materials

    DOE PAGES

    Ali, S. J.; Kraus, R. G.; Fratanduono, D. E.; ...

    2017-05-18

    Here, we developed an iterative forward analysis (IFA) technique with the ability to use hydrocode simulations as a fitting function for analysis of dynamic compression experiments. The IFA method optimizes over parameterized quantities in the hydrocode simulations, breaking the degeneracy of contributions to the measured material response. Velocity profiles from synthetic data generated using a hydrocode simulation are analyzed as a first-order validation of the technique. We also analyze multiple magnetically driven ramp compression experiments on copper and compare with more conventional techniques. Excellent agreement is obtained in both cases.
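
    The IFA idea above, using a forward simulation as the fitting function, can be sketched with a generic nonlinear least-squares loop. The exponential "forward model" below is a trivial stand-in for a hydrocode, and the target data are synthetic; this illustrates only the structure of the method, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import least_squares

# Sketch of iterative forward analysis (IFA): wrap a forward model (standing
# in for a hydrocode simulation) in a residual function and iterate its
# parameters until the simulated velocity profile matches the measurement.
t = np.linspace(0.0, 1.0, 50)

def forward_model(params, t):
    """Toy forward model: velocity ramp v(t) = v_max * (1 - exp(-t / tau))."""
    v_max, tau = params
    return v_max * (1.0 - np.exp(-t / tau))

true_params = (2.0, 0.3)
measured = forward_model(true_params, t)   # noise-free synthetic "experiment"

def residuals(params):
    return forward_model(params, t) - measured

fit = least_squares(residuals, x0=[1.0, 0.5])   # iterate from a rough guess
v_max_fit, tau_fit = fit.x
```

    In the real method each residual evaluation is a full hydrocode run, so the optimizer's parameter updates are what break the degeneracy between the contributions to the measured response.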

  14. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    NASA Astrophysics Data System (ADS)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  15. Meta-Analysis of Interactive Video Instruction: A 10 Year Review of Achievement Effects.

    ERIC Educational Resources Information Center

    McNeil, Barbara J.; Nelson, Karyn R.

    Sixty-three studies which investigated cognitive achievement effects following interactive video (IV) instruction were integrated through a meta-analysis technique. Overall, mean achievement effect for IV was .530 (corrected for outliers), indicating that IV is an effective form of instruction. The effect is similar to that of computer-assisted…

  16. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    NASA Astrophysics Data System (ADS)

    Dahing, Lahasen@Normanshah; Yahya, Redzuan; Yahya, Roslan; Hassan, Hearie

    2014-09-01

    In this study, the principle of prompt gamma neutron activation analysis has been used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source, Cf-252, with an HPGe detector and a multichannel analyzer (MCA). Concrete samples of 10×10×10 cm³ and 15×15×15 cm³ were analysed. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study demonstrate that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, can be determined by analysing the respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis to identify the elements present in a concrete sample are discussed.

  17. TECHNIQUES FOR EFFECTIVE TEACHING.

    ERIC Educational Resources Information Center

    HASTINGS, GERALDINE; AND OTHERS

    A COMPENDIUM OF WORKABLE AND REASONABLE TECHNIQUES TO PROVIDE TEACHERS WITH ALTERNATIVES IN SELECTING LEARNING EXPERIENCES IS PRESENTED. MATERIALS ARE DESIGNED TO AID TEACHERS AND LEARNERS IN ALL SUBJECT MATTER AREAS. TEACHING TECHNIQUES DESCRIBED ARE (1) THE CASE STUDY, (2) DISCUSSIONS SUCH AS SYMPOSIUM, COLLOQUIUM, BUZZ SESSIONS, AND…

  18. Determining the effects of routine fingermark detection techniques on the subsequent recovery and analysis of explosive residues on various substrates.

    PubMed

    King, Sam; Benson, Sarah; Kelly, Tamsin; Lennard, Chris

    2013-12-10

    An offender who has recently handled bulk explosives would be expected to deposit latent fingermarks that are contaminated with explosive residues. However, fingermark detection techniques need to be applied in order for these fingermarks to be detected and recorded. Little information is available in terms of how routine fingermark detection methods impact on the subsequent recovery and analysis of any explosive residues that may be present. If an identifiable fingermark is obtained and that fingermark is found to be contaminated with a particular explosive then that may be crucial evidence in a criminal investigation (including acts of terrorism involving improvised explosive devices). The principal aims of this project were to investigate: (i) the typical quantities of explosive material deposited in fingermarks by someone who has recently handled bulk explosives; and (ii) the effects of routine fingermark detection methods on the subsequent recovery and analysis of explosive residues in such fingermarks. Four common substrates were studied: paper, glass, plastic (polyethylene plastic bags), and metal (aluminium foil). The target explosive compounds were 2,4,6-trinitrotoluene (TNT), pentaerythritol tetranitrate (PETN), and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX), as well as chlorate and nitrate ions. Recommendations are provided in terms of the application of fingermark detection methods on surfaces that may contain explosive residues. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Terahertz imaging systems: a non-invasive technique for the analysis of paintings

    NASA Astrophysics Data System (ADS)

    Fukunaga, K.; Hosako, I.; Duling, I. N., III; Picollo, M.

    2009-07-01

    Terahertz (THz) imaging is an emerging technique for non-invasive analysis. Since THz waves can penetrate opaque materials, various imaging systems that use THz waves have been developed to detect, for instance, concealed weapons, illegal drugs, and defects in polymer products. The absorption of THz waves by water is extremely strong, and hence, THz waves can be used to monitor the water content in various objects. THz imaging can be performed either by transmission or by reflection of THz waves. In particular, time-domain reflection imaging uses THz pulses that propagate in specimens, and in this technique, pulses reflected from the surface and from the internal boundaries of the specimen are detected. In general, the internal structure is observed in cross-sectional images obtained using micro-specimens taken from the work that is being analysed. On the other hand, in THz time-domain imaging, a map of the layer of interest can be easily obtained without collecting any samples. When real-time imaging is required, for example, in the investigation of the effect of a solvent or during the monitoring of water content, a THz camera can be used. The first application of THz time-domain imaging in the analysis of a historical tempera masterpiece was performed on the panel painting Polittico di Badia by Giotto, from the permanent collection of the Uffizi Gallery. The results of that analysis revealed that the work is composed of two layers of gypsum, with a canvas between these layers. In the paint layer, gold foils covered by paint were clearly observed, and the consumption or ageing of the gold could be estimated by noting the amount of reflection. These results prove that THz imaging can yield useful information for conservation and restoration purposes.

  20. A technique for conducting point pattern analysis of cluster plot stem-maps

    Treesearch

    C.W. Woodall; J.M. Graham

    2004-01-01

    Point pattern analysis of forest inventory stem-maps may aid interpretation and inventory estimation of forest attributes. To evaluate the techniques and benefits of conducting point pattern analysis of forest inventory stem-maps, Ripley's K(t) was calculated for simulated tree spatial distributions and for over 600 USDA Forest Service Forest...
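
    The statistic above can be sketched directly from its definition: K(t) is the mean number of neighbours within distance t of a point, divided by the point density. This minimal version omits edge correction (which production analyses include) and checks a random pattern against the complete-spatial-randomness expectation K(t) = pi*t^2.

```python
import numpy as np

# Sketch of Ripley's K for a 2-D point pattern, without edge correction.
# Under complete spatial randomness (CSR), K(t) is approximately pi * t^2.
def ripley_k(points, t, area):
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    within = (d <= t).sum() - n          # ordered pairs within t, minus self-pairs
    density = n / area
    return within / (n * density)

rng = np.random.default_rng(2)
csr = rng.uniform(0.0, 1.0, size=(500, 2))   # random pattern in unit square
k_csr = ripley_k(csr, t=0.05, area=1.0)
expected = np.pi * 0.05**2                   # CSR expectation
```

    Clustered stem-maps give K(t) above the CSR curve and regular (inhibited) patterns fall below it, which is how the statistic aids interpretation of tree spatial distributions.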

  1. A Meta-Analysis of the Effects of Desegregation on Academic Achievement.

    ERIC Educational Resources Information Center

    Krol, Ronald A.

    Using meta-analysis techniques from a 1977 article by G. V. Glass, this study sought to determine the effects of desegregation on academic achievement when students were grouped by academic subject, grade level, and length of desegregation. Data were obtained from 71 studies (conducted between 1955 and 1977) concerned with the effects of…

  2. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques for this purpose. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a nontrivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used for the validation of the applied imaging and image analysis protocols. An SR μ-CT image analysis approach is reported here that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions as well as the connectivity of both AM and SCPL scaffolds.
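
    One common way to extract a pore-size measure from a binarised tomography slice, sketched below, uses the Euclidean distance transform: at each pore pixel it gives the distance to the nearest material pixel, and its per-pore maximum approximates the pore radius. The image is synthetic (two circular pores of known radius), and this is an illustration of the general approach, not the paper's specific protocol.

```python
import numpy as np
from scipy import ndimage

# Sketch: pore radii from a binary micro-CT slice via distance transform.
# Synthetic image: two circular pores of radius 10 and 5 pixels.
img = np.zeros((100, 100), dtype=bool)
yy, xx = np.mgrid[0:100, 0:100]
img |= (yy - 30)**2 + (xx - 30)**2 <= 10**2   # pore 1, radius 10
img |= (yy - 70)**2 + (xx - 70)**2 <= 5**2    # pore 2, radius 5

dist = ndimage.distance_transform_edt(img)    # 0 in material, >0 inside pores
labels, n_pores = ndimage.label(img)          # connected pore regions
radii = ndimage.maximum(dist, labels, index=range(1, n_pores + 1))
```

    A full pore-size distribution follows by histogramming such per-pore measures; throat sizes and connectivity need additional steps (e.g. watershed segmentation of the pore space).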

  3. Effective Techniques for the Promotion of Library Services and Resources

    ERIC Educational Resources Information Center

    Yi, Zhixian

    2016-01-01

    Introduction: This study examines how Australian academic librarians perceive techniques for promoting services and resources, and the factors affecting the perceptions regarding effectiveness of techniques used. Method: Data were collected from an online survey that was sent to 400 academic librarians in thirty-seven Australian universities. The…

  4. Adverse effects of lingual and buccal orthodontic techniques: A systematic review and meta-analysis.

    PubMed

    Ata-Ali, Fadi; Ata-Ali, Javier; Ferrer-Molina, Marcela; Cobo, Teresa; De Carlos, Felix; Cobo, Juan

    2016-06-01

    The aim of this systematic review was to assess the prevalence of adverse effects associated with lingual and buccal fixed orthodontic techniques. Two authors searched the PubMed, EMBASE, Cochrane Library, and LILACS databases up to October 2014. Agreement between the authors was quantified by the Cohen kappa statistic. The following variables were analyzed: pain, caries, eating and speech difficulties, and oral hygiene. The Newcastle-Ottawa scale was used to assess risk of bias in nonrandomized studies, and the Cochrane Collaboration's tool for assessing risk of bias was used for randomized controlled trials. Eight articles were included in this systematic review. Meta-analysis showed a statistically greater risk of pain of the tongue (odds ratio [OR], 28.32; 95% confidence interval [95% CI], 8.60-93.28; P <0.001), cheeks (OR, 0.087; 95% CI, 0.036-0.213; P <0.0010), and lips (OR, 0.13; 95% CI, 0.04-0.39; P <0.001), as well as for the variables of speech difficulties (OR, 9.39; 95% CI, 3.78-23.33; P <0.001) and oral hygiene (OR, 3.49; 95% CI, 1.02-11.95; P = 0.047) with lingual orthodontics. However, no statistical difference was found with respect to eating difficulties (OR, 3.74; 95% CI, 0.86-16.28; P = 0.079) and caries (OR, 1.15; 95% CI, 0.17-7.69; P = 0.814 [Streptococcus mutans] and OR, 0.67; 95% CI, 0.20-2.23; P = 0.515 [Lactobacillus]). This systematic review suggests that patients wearing lingual appliances have more pain, speech difficulties, and problems in maintaining adequate oral hygiene, although no differences for eating and caries risk were identified. Further prospective studies involving larger sample sizes and longer follow-up periods are needed to confirm these results. Copyright © 2016 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  5. Single cell and single molecule techniques for the analysis of the epigenome

    NASA Astrophysics Data System (ADS)

    Wallin, Christopher Benjamin

    materialized. Reasons for this, including poor signal to background, are explained in detail. Third, development of mobility-SCAN, an analytical technique for measuring and analyzing single molecules based on their fluorescent signature and their electrophoretic mobility in nanochannels is described. We use the technique to differentiate biomolecules from complex mixtures and derive parameters such as diffusion coefficients and effective charges. Finally, the device is used to detect binding interactions of various complexes similar to affinity capillary electrophoresis, but on a single molecule level. Fourth, we conclude by briefly discussing SCAN-sort, a technique to sort individual chromatin molecules based on their fluorescent emissions for further downstream analysis such as DNA sequencing. We demonstrate a 2-fold enrichment of chromatin from sorting and discuss possible system modifications for better performance in the future.

  6. Failure Modes and Effects Analysis (FMEA): A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Failure modes and effects analysis (FMEA) is a bottom-up analytical process that identifies process hazards, helping managers understand the vulnerabilities of systems as well as assess and mitigate risk. It is one of several engineering tools and techniques available to program and project managers aimed at increasing the likelihood of safe and successful NASA programs and missions. This bibliography references 465 documents in the NASA STI Database that contain the major concepts, failure modes or failure analysis, in either the basic index or the major subject terms.

  7. Mathematical analysis techniques for modeling the space network activities

    NASA Technical Reports Server (NTRS)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, in particular the use of linear programming. These techniques were then applied to the Tracking and Data Relay Satellite System (TDRSS) to better understand the space network. Finally, a small-scale version of the system was modeled, variables were identified, data were gathered, and comparisons were made between actual and theoretical data.
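
    A sketch of the kind of small-scale linear program such a model might reduce to. The scenario and coefficients below are invented for illustration (they are not the TDRSS model), and the two-variable problem is solved by brute-force enumeration of constraint intersections rather than a production LP solver:

```python
from itertools import combinations

# Toy LP: maximize x + y (contact-hours serviced by two relay satellites)
# subject to a*x + b*y <= c capacity rows. All numbers are invented.
constraints = [
    (1.0, 0.0, 8.0),   # relay 1 available at most 8 h
    (0.0, 1.0, 6.0),   # relay 2 available at most 6 h
    (1.0, 2.0, 16.0),  # shared ground-terminal limit
    (-1.0, 0.0, 0.0),  # x >= 0
    (0.0, -1.0, 0.0),  # y >= 0
]

def feasible(x, y, eps=1e-9):
    return all(a*x + b*y <= c + eps for a, b, c in constraints)

# An LP optimum lies at a vertex: intersect every pair of constraint
# boundaries, keep feasible points, and take the best objective value.
best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1*b2 - a2*b1
    if abs(det) < 1e-12:
        continue  # parallel boundaries: no unique intersection
    x = (c1*b2 - c2*b1) / det
    y = (a1*c2 - a2*c1) / det
    if feasible(x, y):
        val = x + y
        if best is None or val > best[0]:
            best = (val, x, y)

print(best)  # (objective, x, y)
```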

  8. Recent Advance in Liquid Chromatography/Mass Spectrometry Techniques for Environmental Analysis in Japan

    PubMed Central

    Suzuki, Shigeru

    2014-01-01

    The techniques and measurement methods developed in the Environmental Survey and Monitoring of Chemicals by Japan’s Ministry of the Environment, as well as the large amount of knowledge archived in the survey, have advanced environmental analysis. Recently, technologies such as non-target liquid chromatography/high-resolution mass spectrometry and micro-bore-column liquid chromatography have further developed the field. Here, the general strategy of a method developed for the liquid chromatography/mass spectrometry (LC/MS) analysis of environmental chemicals is presented with a brief description. Also presented is a non-target analysis for the identification of environmental pollutants using a provisional fragment database and “MsMsFilter,” an elemental composition elucidation tool. This analytical method is shown to be highly effective in the identification of a model chemical, the pesticide Bendiocarb. Our improved micro-liquid chromatography injection system showed substantially enhanced sensitivity to perfluoroalkyl substances, with peak areas 32–71 times larger than those observed in conventional LC/MS. PMID:26819891

  9. Signal-to-noise ratio analysis and evaluation of the Hadamard imaging technique

    NASA Technical Reports Server (NTRS)

    Jobson, D. J.; Katzberg, S. J.; Spiers, R. B., Jr.

    1977-01-01

    The signal-to-noise ratio performance of the Hadamard imaging technique is analyzed, and an experimental evaluation of a laboratory Hadamard imager is presented. A comparison between the performances of Hadamard and conventional imaging techniques shows that the Hadamard technique is superior only when the imaging objective lens is required to have an effective F-number (focal ratio) of about 2 or slower.

  10. Cost-effectiveness of modern radiotherapy techniques in locally advanced pancreatic cancer.

    PubMed

    Murphy, James D; Chang, Daniel T; Abelson, Jon; Daly, Megan E; Yeung, Heidi N; Nelson, Lorene M; Koong, Albert C

    2012-02-15

    Radiotherapy may improve the outcome of patients with pancreatic cancer, but at an increased cost. In this study, the authors evaluated the cost-effectiveness of modern radiotherapy techniques in the treatment of locally advanced pancreatic cancer. A Markov decision-analytic model was constructed to compare the cost-effectiveness of 4 treatment regimens: gemcitabine alone, gemcitabine plus conventional radiotherapy, gemcitabine plus intensity-modulated radiotherapy (IMRT), and gemcitabine with stereotactic body radiotherapy (SBRT). Patients transitioned between the following 5 health states: stable disease, local progression, distant failure, local and distant failure, and death. Health utility tolls were assessed for radiotherapy and chemotherapy treatments and for radiation toxicity. SBRT increased life expectancy by 0.20 quality-adjusted life years (QALY) at an increased cost of $13,700 compared with gemcitabine alone (incremental cost-effectiveness ratio [ICER] = $69,500 per QALY). SBRT was more effective and less costly than conventional radiotherapy and IMRT. An analysis that excluded SBRT demonstrated that conventional radiotherapy had an ICER of $126,800 per QALY compared with gemcitabine alone, and IMRT had an ICER of $1,584,100 per QALY compared with conventional radiotherapy. A probabilistic sensitivity analysis demonstrated that the probability of cost-effectiveness at a willingness to pay of $50,000 per QALY was 78% for gemcitabine alone, 21% for SBRT, 1.4% for conventional radiotherapy, and 0.01% for IMRT. At a willingness to pay of $200,000 per QALY, the probability of cost-effectiveness was 73% for SBRT, 20% for conventional radiotherapy, 7% for gemcitabine alone, and 0.7% for IMRT. The current results indicated that IMRT in locally advanced pancreatic cancer exceeds what society considers cost-effective. In contrast, combining gemcitabine with SBRT increased clinical effectiveness beyond that of gemcitabine alone at a cost potentially acceptable by societal standards.
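
    The ICER quoted above is simply incremental cost divided by incremental effectiveness. A sketch using the rounded figures from this abstract (with these rounded inputs the ratio comes out near, but not exactly at, the reported $69,500/QALY, which derives from the unrounded model outputs):

```python
# ICER sketch: SBRT vs gemcitabine alone, rounded figures from the abstract.
delta_cost = 13700.0   # incremental cost, USD
delta_qaly = 0.20      # incremental effectiveness, QALY
icer = delta_cost / delta_qaly
print(f"ICER = ${icer:,.0f} per QALY")  # -> ICER = $68,500 per QALY
```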

  11. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP Techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    The author identified significant preliminary results: analysis of the Ouachita portion of the Texoma frame of data indicates many potentials in the analysis and interpretation of ERTS data. It is believed that one of the more significant aspects of this analysis sequence has been the investigation of a technique to relate ERTS analysis and surface observation analysis. At present, a sequence involving (1) preliminary analysis based solely upon the spectral characteristics of the data, followed by (2) a surface observation mission to obtain visual information and oblique photography of particular points of interest in the test site area, appears to provide an extremely efficient technique for obtaining particularly meaningful surface observation data. Following such a procedure permits concentration on particular points of interest in the entire ERTS frame and thereby makes the surface observation data obtained particularly significant and meaningful. The analysis of the Texoma frame has also been significant in demonstrating a fast-turnaround analysis capability. Additionally, the analysis has shown the potential accuracy and degree of complexity of features that can be identified and mapped using ERTS data.

  12. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    NASA Astrophysics Data System (ADS)

    Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.

    2016-04-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Early data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of SVR as a method of filtering is investigated first, comparing it with the most popular alternative techniques. SVR is then applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
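
    As a minimal stand-in for the non-parametric regression methods compared in the paper (SVR itself requires a quadratic-programming solver), a Nadaraya-Watson kernel smoother illustrates the filtering role such regressors play; all signals below are synthetic:

```python
import numpy as np

def kernel_smooth(x, y, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel:
    a simple non-parametric filter of the class SVR is compared against."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 200)
clean = np.sin(x)                          # underlying signal
noisy = clean + rng.normal(0, 0.3, x.size) # measurement with noise
smoothed = kernel_smooth(x, noisy, bandwidth=0.3)

# The smoothed trace should sit much closer to the clean signal.
print(np.mean((noisy - clean) ** 2), np.mean((smoothed - clean) ** 2))
```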

  13. Stability analysis of nonlinear Roesser-type two-dimensional systems via a homogenous polynomial technique

    NASA Astrophysics Data System (ADS)

    Zhang, Tie-Yan; Zhao, Yan; Xie, Xiang-Peng

    2012-12-01

    This paper is concerned with the problem of stability analysis of nonlinear Roesser-type two-dimensional (2D) systems. Firstly, the fuzzy modeling method for the usual one-dimensional (1D) systems is extended to the 2D case so that the underlying nonlinear 2D system can be represented by a 2D Takagi-Sugeno (TS) fuzzy model, which is convenient for implementing the stability analysis. Secondly, a new kind of fuzzy Lyapunov function, homogeneously polynomially parameter-dependent on the fuzzy membership functions, is developed to derive less conservative stability conditions for the TS Roesser-type 2D system. In the process of stability analysis, the obtained stability conditions approach exactness in the sense of convergence by applying novel relaxation techniques. Moreover, the obtained result is formulated in the form of linear matrix inequalities, which can be easily solved via standard numerical software. Finally, a numerical example is given to demonstrate the effectiveness of the proposed approach.

  14. Dimension reduction techniques for the integrative analysis of multi-omics data

    PubMed Central

    Zeleznik, Oana A.; Thallinger, Gerhard G.; Kuster, Bernhard; Gholami, Amin M.

    2016-01-01

    State-of-the-art next-generation sequencing, transcriptomics, proteomics and other high-throughput ‘omics' technologies enable the efficient generation of large experimental data sets. These data may yield unprecedented knowledge about molecular pathways in cells and their role in disease. Dimension reduction approaches have been widely used in exploratory analysis of single omics data sets. This review will focus on dimension reduction approaches for simultaneous exploratory analyses of multiple data sets. These methods extract the linear relationships that best explain the correlated structure across data sets, the variability both within and between variables (or observations) and may highlight data issues such as batch effects or outliers. We explore dimension reduction techniques as one of the emerging approaches for data integration, and how these can be applied to increase our understanding of biological systems in normal physiological function and disease. PMID:26969681
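
    A common baseline among the dimension reduction approaches surveyed is principal component analysis; a minimal sketch via the singular value decomposition, run on a random "omics-like" matrix (synthetic data, not from any study):

```python
import numpy as np

# PCA sketch: project 6 samples x 4 features onto the first two
# principal components and report variance explained per component.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 4))
Xc = X - X.mean(axis=0)            # center each feature
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T             # sample coordinates on PC1, PC2
explained = s**2 / np.sum(s**2)    # fraction of variance per component

print(scores.shape, explained[:2])
```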

  15. Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique

    NASA Technical Reports Server (NTRS)

    Maise, G.; Rossi, M. J.

    1974-01-01

    A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models in the data reduction of the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one time of melt is also possible on leading edges. A novel iterative numerical scheme was used, with discretized spatial coordinates but analytic integration in time, to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided that tells the test engineer when various corrections are large enough that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.

  16. Fractographic ceramic failure analysis using the replica technique

    PubMed Central

    Scherrer, Susanne S.; Quinn, Janet B.; Quinn, George D.; Anselm Wiskott, H. W.

    2007-01-01

    Objectives To demonstrate the effectiveness of in vivo replicas of fractured ceramic surfaces for descriptive fractography as applied to the analysis of clinical failures. Methods The fracture surface topography of partially failed veneering ceramic of a Procera Alumina molar and an In Ceram Zirconia premolar were examined utilizing gold-coated epoxy poured replicas viewed using scanning electron microscopy. The replicas were inspected for fractographic features such as hackle, wake hackle, twist hackle, compression curl and arrest lines for determination of the direction of crack propagation and location of the origin. Results For both veneering ceramics, replicas provided an excellent reproduction of the fractured surfaces. Fine details including all characteristic fracture features produced by the interaction of the advancing crack with the material's microstructure could be recognized. The observed features are indicators of the local direction of crack propagation and were used to trace the crack's progression back to its initial starting zone (the origin). Drawbacks of replicas such as artifacts (air bubbles) or imperfections resulting from inadequate epoxy pouring were noted but not critical for the overall analysis of the fractured surfaces. Significance The replica technique proved to be easy to use and allowed an excellent reproduction of failed ceramic surfaces. It should be applied before attempting to remove any failed part remaining in situ as the fracture surface may be damaged during this procedure. These two case studies are intended as an introduction for the clinical researcher in using qualitative (descriptive) fractography as a tool for understanding fracture processes in brittle restorative materials and, secondarily, to draw conclusions as to possible design inadequacies in failed restorations. PMID:17270267

  17. Preliminary assessment of aerial photography techniques for canvasback population analysis

    USGS Publications Warehouse

    Munro, R.E.; Trauger, D.L.

    1976-01-01

    Recent intensive research on the canvasback has focused attention on the need for more precise estimates of population parameters. During the 1972-75 period, various types of aerial photographing equipment were evaluated to determine the problems and potentials for employing these techniques in appraisals of canvasback populations. The equipment and procedures available for automated analysis of aerial photographic imagery were also investigated. Serious technical problems remain to be resolved, but some promising results were obtained. Final conclusions about the feasibility of operational implementation await a more rigorous analysis of the data collected.

  18. Data and techniques for studying the urban heat island effect in Johannesburg

    NASA Astrophysics Data System (ADS)

    Hardy, C. H.; Nel, A. L.

    2015-04-01

    The city of Johannesburg contains over 10 million trees and is often referred to as an urban forest. The urban heat island effect describes a phenomenon whereby some urban areas exhibit temperatures warmer than those of surrounding areas; influencing factors include the high density of people and buildings and the low levels of vegetative cover within populated urban areas. The intra-urban spatial variability of vegetation levels across Johannesburg's residential regions influences the urban heat island effect within the city: residential areas with high levels of vegetation benefit from evapotranspirative cooling and thus exhibit weaker heat island effects, while their impoverished counterparts are not so fortunate. This paper describes the remote sensing data sets and the processing techniques employed to study the heat island effect within Johannesburg. In particular, we consider the use of multi-sensorial, multi-temporal remote sensing data towards a predictive model based on the analysis of influencing factors.

  19. Discrete ordinates-Monte Carlo coupling: A comparison of techniques in NERVA radiation analysis

    NASA Technical Reports Server (NTRS)

    Lindstrom, D. G.; Normand, E.; Wilcox, A. D.

    1972-01-01

    In the radiation analysis of the NERVA nuclear rocket system, two-dimensional discrete ordinates calculations are sufficient to provide detail in the pressure vessel and reactor assembly. Other parts of the system, however, require three-dimensional Monte Carlo analyses. To use these two methods in a single analysis, a means of coupling was developed whereby the results of a discrete ordinates calculation can be used to produce source data for a Monte Carlo calculation. Several techniques for producing source detail were investigated. Results of calculations on the NERVA system are compared and limitations and advantages of the coupling techniques discussed.

  20. Laser fiber cleaving techniques: effects on tip morphology and power output.

    PubMed

    Vassantachart, Janna M; Lightfoot, Michelle; Yeo, Alexander; Maldonado, Jonathan; Li, Roger; Alsyouf, Muhannad; Martin, Jacob; Lee, Michael; Olgin, Gaudencio; Baldwin, D Duane

    2015-01-01

    Proper cleaving of reusable laser fibers is needed to maintain optimal functionality. This study quantifies the effect of different cleaving tools on power output of the holmium laser fiber and demonstrates morphologic changes using microscopy. The uncleaved tips of new 272 μm reusable laser fibers were used to obtain baseline power transmission values at 3 W (0.6 J, 5 Hz). Power output for each of four cleaving techniques (11-blade scalpel, scribe pen cleaving tool, diamond cleaving wheel, and suture scissors) was measured in a single-blinded fashion. Dispersion of light from the fibers was compared with manufacturer specifications and rated as "ideal," "acceptable," or "unacceptable" by blinded reviewers. The fiber tips were also imaged using confocal and scanning electron microscopy. The independent-samples Kruskal-Wallis test and chi-square test were used for statistical analysis (α < 0.05). New uncleaved fiber tips transmitted 3.04 W of power and were used as a reference (100%). The scribe pen cleaving tool produced the next highest output (97.1%), followed by the scalpel (83.4%), diamond cleaving wheel (77.1%), and suture scissors (61.7%), a trend that was highly significant (P < 0.001). On pairwise comparison, no difference in power output was seen between the uncleaved fiber tips and those cleaved with the scribe pen (P = 1.0). The rating of the light dispersion patterns from the different cleaving methods followed the same trend as the power output results (P < 0.001). Microscopy showed that the scribe pen produced small defects along the fiber cladding but maintained a smooth, flat core surface. The other cleaving techniques produced defects on both the core and cladding. Cleaving technique has a significant effect on the initial power transmitted by reusable laser fibers. The scribe pen cleaving tool produced the most consistent and highest average power output.

  1. Differences in head impulse test results due to analysis techniques.

    PubMed

    Cleworth, Taylor W; Carpenter, Mark G; Honegger, Flurin; Allum, John H J

    2017-01-01

    Different analysis techniques are used to define vestibulo-ocular reflex (VOR) gain between eye and head angular velocity during the video head impulse test (vHIT). Comparisons would aid selection of gain techniques best related to head impulse characteristics and promote standardisation. Compare and contrast known methods of calculating vHIT VOR gain. We examined lateral canal vHIT responses recorded from 20 patients twice within 13 weeks of acute unilateral peripheral vestibular deficit onset. Ten patients were tested with an ICS Impulse system (GN Otometrics) and 10 with an EyeSeeCam (ESC) system (Interacoustics). Mean gain and variance were computed with area, average sample gain, and regression techniques over specific head angular velocity (HV) and acceleration (HA) intervals. Results for the same gain technique were not different between measurement systems. Area and average sample gain yielded equally lower variances than regression techniques. Gains computed over the whole impulse duration were larger than those computed for increasing HV. Gain over decreasing HV was associated with larger variances. Gains computed around peak HV were smaller than those computed around peak HA. The median gain over 50-70 ms was not different from gain around peak HV. However, depending on technique used, the gain over increasing HV was different from gain around peak HA. Conversion equations between gains obtained with standard ICS and ESC methods were computed. For low gains, the conversion was dominated by a constant that needed to be added to ESC gains to equal ICS gains. We recommend manufacturers standardize vHIT gain calculations using 2 techniques: area gain around peak HA and peak HV.
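
    Two of the gain definitions compared (area gain and regression gain) can be sketched on synthetic velocity traces. The trace shapes, sampling, and windows below are assumptions for illustration, not the ICS or ESC algorithms:

```python
import numpy as np

# Synthetic vHIT traces: head velocity as a half-sine pulse, eye
# velocity as a scaled copy with true VOR gain 0.8 (noise-free).
t = np.linspace(0, 0.15, 151)          # 150 ms impulse, 1 kHz sampling
head = 250 * np.sin(np.pi * t / 0.15)  # head angular velocity, deg/s
eye = 0.8 * head                       # eye angular velocity, deg/s

# Area gain: ratio of areas under the eye and head velocity traces
# (uniform sampling, so the sum ratio equals the area ratio).
area_gain = eye.sum() / head.sum()

# Regression gain: least-squares slope of eye vs head velocity,
# with the fit forced through the origin.
reg_gain = np.dot(head, eye) / np.dot(head, head)

print(area_gain, reg_gain)
```

    On clean proportional traces both definitions agree exactly; the paper's point is that on real, noisy, saccade-contaminated data they diverge, which is why standardisation matters.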

  2. Qualitative and quantitative analysis of lignocellulosic biomass using infrared techniques: A mini-review

    USDA-ARS's Scientific Manuscript database

    Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising re...

  3. Is There a Cosmetic Advantage to Single-Incision Laparoscopic Surgical Techniques Over Standard Laparoscopic Surgery? A Systematic Review and Meta-analysis.

    PubMed

    Evans, Luke; Manley, Kate

    2016-06-01

    Single-incision laparoscopic surgery represents an evolution of minimally invasive techniques but has been a controversial development. A cosmetic advantage is claimed by many authors, but it has not been found to be universally present, or even of considerable importance to patients. This systematic review and meta-analysis demonstrates that there is a cosmetic advantage to the technique regardless of the operation type, with a treatment effect in terms of cosmetic improvement on the order of 0.63.

  4. Improved Sectional Image Analysis Technique for Evaluating Fiber Orientations in Fiber-Reinforced Cement-Based Materials.

    PubMed

    Lee, Bang Yeon; Kang, Su-Tae; Yun, Hae-Bum; Kim, Yun Yong

    2016-01-12

    The distribution of fiber orientation is an important factor in determining the mechanical properties of fiber-reinforced concrete. This study proposes a new image analysis technique for improving the evaluation accuracy of fiber orientation distribution in the sectional image of fiber-reinforced concrete. A series of tests on the accuracy of fiber detection and the estimation performance of fiber orientation was performed on artificial fiber images to assess the validity of the proposed technique. The validation test results showed that the proposed technique estimates the distribution of fiber orientation more accurately than the direct measurement of fiber orientation by image analysis.
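
    A conventional sectional-image estimate, which this paper's technique improves upon, infers a fiber's inclination from the aspect ratio of its elliptical cut trace: a cylindrical fiber of diameter d cut at inclination θ leaves an ellipse with minor axis d and major axis d / cos θ. A sketch with hypothetical axis lengths:

```python
import math

def fiber_inclination_deg(major_axis, minor_axis):
    """Conventional estimate: theta = arccos(minor / major) for the
    elliptical trace left by a cylindrical fiber on a cut plane."""
    return math.degrees(math.acos(minor_axis / major_axis))

# Hypothetical elliptical traces (axes in pixels):
print(fiber_inclination_deg(10.0, 10.0))  # circular trace: fiber normal to the plane
print(fiber_inclination_deg(20.0, 10.0))  # -> 60 degrees
```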

  6. Noncontact techniques for diesel engine diagnostics using exhaust waveform analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, D.A.; Cooke, G.J.

    1987-01-01

    RCA Corporation's continuing efforts to develop noncontact test techniques for diesel engines have led to recent advancements in deep engine diagnostics. The U.S. Army Tank-Automotive Command (TACOM) has been working with RCA for the development of new noncontact sensors and test techniques which use these sensors in conjunction with their family of Simplified Test Equipment (STE) to perform vehicle diagnostics. The STE systems are microprocessor-based maintenance tools that assist the Army mechanic in diagnosing malfunctions in both tactical and combat vehicles. The test systems support the mechanic by providing the sophisticated signal processing capabilities necessary for a wide range of diagnostic testing, including exhaust waveform analysis.

  7. Sampling and analysis techniques for monitoring serum for trace elements.

    PubMed

    Ericson, S P; McHalsky, M L; Rabinow, B E; Kronholm, K G; Arceo, C S; Weltzer, J A; Ayd, S W

    1986-07-01

    We describe techniques for controlling contamination in the sampling and analysis of human serum for trace metals. The relatively simple procedures do not require clean-room conditions. The atomic absorption and atomic emission methods used have been applied in studying zinc, copper, chromium, manganese, molybdenum, selenium, and aluminum concentrations. Values obtained for a group of 16 normal subjects agree with the most reliable values reported in the literature, obtained by much more elaborate techniques. All of these metals can be measured in 3 to 4 mL of serum. The methods may prove especially useful in monitoring concentrations of essential trace elements in blood of patients being maintained on total parenteral nutrition.

  8. Effect of various putty-wash impression techniques on marginal fit of cast crowns.

    PubMed

    Nissan, Joseph; Rosner, Ofir; Bukhari, Mohammed Amin; Ghelfan, Oded; Pilo, Raphael

    2013-01-01

    Marginal fit is an important clinical factor that affects restoration longevity. The accuracy of three polyvinyl siloxane putty-wash impression techniques was compared by marginal fit assessment using the nondestructive method. A stainless steel master cast containing three abutments with three metal crowns matching the three preparations was used to make 45 impressions: group A = single-step technique (putty and wash impression materials used simultaneously), group B = two-step technique with a 2-mm relief (putty as a preliminary impression to create a 2-mm wash space followed by the wash stage), and group C = two-step technique with a polyethylene spacer (plastic spacer used with the putty impression followed by the wash stage). Accuracy was assessed using a toolmaker microscope to measure and compare the marginal gaps between each crown and finish line on the duplicated stone casts. Each abutment was further measured at the mesial, buccal, and distal aspects. One-way analysis of variance was used for statistical analysis. P values and Scheffe post hoc contrasts were calculated. Significance was determined at .05. One-way analysis of variance showed significant differences among the three impression techniques in all three abutments and at all three locations (P < .001). Group B yielded dies with minimal gaps compared to groups A and C. The two-step impression technique with 2-mm relief was the most accurate regarding the crucial clinical factor of marginal fit.
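
    The one-way ANOVA used here reduces to comparing between-group and within-group variance. A from-scratch sketch on invented marginal-gap values (not the study's data), assuming hypothetical group labels matching the three techniques:

```python
# One-way ANOVA F statistic on hypothetical marginal-gap data (micrometers).
groups = {
    "A (single-step)":      [90.0, 95.0, 100.0, 92.0, 98.0],
    "B (two-step, relief)": [60.0, 65.0, 62.0, 58.0, 64.0],
    "C (two-step, spacer)": [88.0, 91.0, 96.0, 85.0, 94.0],
}

all_vals = [v for vals in groups.values() for v in vals]
grand = sum(all_vals) / len(all_vals)

# Between-group sum of squares: spread of group means around the grand mean.
ss_between = sum(len(v) * (sum(v)/len(v) - grand) ** 2 for v in groups.values())
# Within-group sum of squares: spread of values around their own group mean.
ss_within = sum((x - sum(v)/len(v)) ** 2 for v in groups.values() for x in v)

df_between = len(groups) - 1
df_within = len(all_vals) - len(groups)
F = (ss_between / df_between) / (ss_within / df_within)
print(round(F, 1))
```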

  9. Tools and techniques for estimating high intensity RF effects

    NASA Astrophysics Data System (ADS)

    Zacharias, Richard L.; Pennock, Steve T.; Poggio, Andrew J.; Ray, Scott L.

    1992-01-01

    Tools and techniques for estimating and measuring coupling and component disturbance for avionics and electronic controls are described. A finite-difference time-domain (FD-TD) modeling code, TSAR, used to predict coupling, is described. This code can quickly generate a mesh model to represent the test object. Some recent applications, as well as the advantages and limitations of using such a code, are described. Facilities and techniques for making low-power coupling measurements and for making direct-injection test measurements of device disturbance are also described. Some scaling laws for coupling and device effects are presented. A method for extrapolating these low-power test results to high-power full-system effects is presented.
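
    The FD-TD scheme underlying codes like TSAR advances electric and magnetic fields in a leapfrog loop over a spatial mesh. A textbook one-dimensional sketch in normalized units (a generic illustration of the method, not TSAR itself):

```python
import numpy as np

# Minimal 1-D FDTD sketch, free space, normalized units, Courant number 1:
# alternate H and E updates on a staggered grid, with a soft Gaussian
# source and field-zero (reflecting) end cells.
n, steps = 200, 300
ez = np.zeros(n)        # E field at integer grid points
hy = np.zeros(n - 1)    # H field at half-integer grid points

for t in range(steps):
    hy += np.diff(ez)                        # H update from curl of E
    ez[1:-1] += np.diff(hy)                  # E update from curl of H
    ez[50] += np.exp(-((t - 30) / 10) ** 2)  # Gaussian source pulse

print(float(np.max(np.abs(ez))))
```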

  10. Counseling about turbuhaler technique: needs assessment and effective strategies for community pharmacists.

    PubMed

    Basheti, Iman A; Reddel, Helen K; Armour, Carol L; Bosnic-Anticevich, Sinthia Z

    2005-05-01

    Optimal effects of asthma medications are dependent on correct inhaler technique. In a telephone survey, 77/87 patients reported that their Turbuhaler technique had not been checked by a health care professional. In a subsequent pilot study, 26 patients were randomized to receive one of 3 Turbuhaler counseling techniques, administered in the community pharmacy. Turbuhaler technique was scored before and 2 weeks after counseling (optimal technique = score 9/9). At baseline, 0/26 patients had optimal technique. After 2 weeks, optimal technique was achieved by 0/7 patients receiving standard verbal counseling (A), 2/8 receiving verbal counseling augmented with emphasis on Turbuhaler position during priming (B), and 7/9 receiving augmented verbal counseling plus physical demonstration (C) (Fisher's exact test for A vs C, p = 0.006). Satisfactory technique (4 essential steps correct) also improved (A: 3/8 to 4/7; B: 2/9 to 5/8; and C: 1/9 to 9/9 patients) (A vs C, p = 0.1). Counseling in Turbuhaler use represents an important opportunity for community pharmacists to improve asthma management, but physical demonstration appears to be an important component to effective Turbuhaler training for educating patients toward optimal Turbuhaler technique.

  11. SU-F-T-248: FMEA Risk Analysis Implementation (AAPM TG-100) in Total Skin Electron Irradiation Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ibanez-Rosello, B; Bautista-Ballesteros, J; Bonaque, J

    2016-06-15

    Purpose: Total Skin Electron Irradiation (TSEI) is a radiotherapy treatment which involves irradiating the entire body surface as homogeneously as possible. It is an extensive multi-step technique in which quality management requires a high consumption of resources and fluid communication between the staff involved, necessary to improve the safety of treatment. TG-100 proposes a new perspective on quality management in radiotherapy, presenting a systematic method of risk analysis throughout the global flow of stages through which the patient passes. The purpose of this work was to apply the TG-100 approach to the TSEI procedure in our institution. Methods: A multidisciplinary team specifically targeting the TSEI procedure was formed; it met regularly and jointly developed the process map (PM), following the TG-100 guidelines of the AAPM. This PM is a visual representation of the temporal flow of steps through which the patient passes from the start to the end of the stay in the radiotherapy service. Results: This is the first stage of the full risk analysis being carried out in the center. The PM provides an overview of the process and facilitates the understanding of the team members who will participate in the subsequent analysis. Currently, the team is implementing the failure modes and effects analysis (FMEA). The failure modes of each of the steps have been identified, and assessors are individually assigning values for severity (S), frequency of occurrence (O), and lack of detectability (D). To our knowledge, this is the first PM made for the TSEI. The developed PM can be useful for centers that intend to implement the TSEI technique. Conclusion: The PM of the TSEI technique has been established as the first stage of a full risk analysis performed at a reference center for this treatment.
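
    The S, O, and D scores described above are conventionally combined into a risk priority number, RPN = S x O x D, to rank failure modes for mitigation. A sketch with invented failure modes and scores (placeholders, not those of this TSEI analysis):

```python
# FMEA risk ranking sketch: each failure mode scored 1-10 for severity (S),
# occurrence (O), and lack of detectability (D); rank by RPN = S * O * D.
# The failure modes and scores below are invented for illustration.
failure_modes = [
    ("wrong patient stance at treatment position", 7, 4, 5),
    ("incorrect monitor units transcribed",        9, 2, 3),
    ("dosimeter misplaced during in-vivo check",   4, 5, 6),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"RPN {rpn:3d}  {name}")
```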

  12. ANALYSIS OF METHODS FOR DETECTING THE PROXIMITY EFFECT IN QUASAR SPECTRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall'Aglio, Aldo; Gnedin, Nickolay Y., E-mail: adaglio@aip.d

    Using numerical simulations of structure formation, we investigate several methods for determining the strength of the proximity effect in the H I Lyα forest. We analyze three high-resolution (≈10 kpc) redshift snapshots (z̄ = 4, 3, and 2.25) of a Hydro-Particle-Mesh simulation to obtain realistic absorption spectra of the H I Lyα forest. We model the proximity effect along the simulated sight lines with a simple analytical prescription based on the assumed quasar luminosity and the intensity of the cosmic UV background (UVB). We begin our analysis by investigating the intrinsic biases thought to arise in the widely adopted standard technique of combining multiple lines of sight when searching for the proximity effect. We confirm the existence of these biases, albeit smaller than previously predicted with simple Monte Carlo simulations. We then concentrate on the analysis of the proximity effect along individual lines of sight. After determining its strength with a fiducial value of the UVB intensity, we construct the proximity effect strength distribution (PESD). We confirm that the PESD inferred from the simple averaging technique accurately recovers the input strength of the proximity effect at all redshifts. Moreover, the PESD closely follows the behaviors found in observed samples of quasar spectra. However, the PESD obtained from our new simulated sight lines shows some differences from that of simple Monte Carlo simulations. At all redshifts, we find a smaller dispersion of the strength parameters, which is the source of the correspondingly smaller biases found when combining multiple lines of sight. After developing three new theoretical methods for recovering the strength of the proximity effect on individual lines of sight, we compare their accuracy to the PESD from the simple averaging technique. All our new approaches are based on the maximization of the likelihood function, albeit invoking some modifications. The new techniques presented

  13. "Soft" or "hard" ionisation? Investigation of metastable gas temperature effect on direct analysis in real-time analysis of Voriconazole.

    PubMed

    Lapthorn, Cris; Pullen, Frank

    2009-01-01

    The performance of the direct analysis in real-time (DART) technique was evaluated across a range of metastable gas temperatures for a pharmaceutical compound, Voriconazole, in order to investigate the effect of metastable gas temperature on molecular ion intensity and fragmentation. The DART source has been used to analyse a range of analytes and from a range of matrices including drugs in solid tablet form and preparations, active ingredients in ointment, naturally occurring plant alkaloids, flavours and fragrances, from thin layer chromatography (TLC) plates, melting point tubes and biological matrices including hair, urine and blood. The advantages of this technique include rapid analysis time (as little as 5 s), a reduction in sample preparation requirements, elimination of mobile phase requirement and analysis of samples not typically amenable to atmospheric pressure ionisation (API) techniques. This technology has therefore been proposed as an everyday tool for identification of components in crude organic reaction mixtures.

  14. Behavior Change Techniques in Apps for Medication Adherence: A Content Analysis.

    PubMed

    Morrissey, Eimear C; Corbett, Teresa K; Walsh, Jane C; Molloy, Gerard J

    2016-05-01

    There are a vast number of smartphone applications (apps) aimed at promoting medication adherence on the market; however, the theory and evidence base in terms of applying established health behavior change techniques underpinning these apps remains unclear. This study aimed to code these apps using the Behavior Change Technique Taxonomy (v1) for the presence or absence of established behavior change techniques. The sample of apps was identified through systematic searches in both the Google Play Store and Apple App Store in February 2015. All apps that fell into the search categories were downloaded for analysis. The downloaded apps were screened with exclusion criteria, and suitable apps were reviewed and coded for behavior change techniques in March 2015. Two researchers performed coding independently. In total, 166 medication adherence apps were identified and coded. The number of behavior change techniques contained in an app ranged from zero to seven (mean=2.77). A total of 12 of a possible 96 behavior change techniques were found to be present across apps. The most commonly included behavior change techniques were "action planning" and "prompt/cues," which were included in 96% of apps, followed by "self-monitoring" (37%) and "feedback on behavior" (36%). The current extent to which established behavior change techniques are used in medication adherence apps is limited. The development of medication adherence apps may not have benefited from advances in the theory and practice of health behavior change. Copyright © 2016 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  15. Effects of Enhancement Techniques on L2 Incidental Vocabulary Learning

    ERIC Educational Resources Information Center

    Duan, Shiping

    2018-01-01

    Enhancement techniques are conducive to incidental vocabulary learning. This study investigated the effects of two types of enhancement techniques, multiple-choice glosses (MC) and L1 single-gloss (SG), on L2 incidental learning of new words and their retention. A total of 89 university learners of English as a Foreign Language (EFL) were asked to…

  16. Speckle noise reduction technique for Lidar echo signal based on self-adaptive pulse-matching independent component analysis

    NASA Astrophysics Data System (ADS)

    Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi

    2018-04-01

    Speckle noise has long been a particularly difficult obstacle to improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Effective speckle de-noising techniques are currently scarce and need further development. In this study, a speckle noise reduction technique based on independent component analysis (ICA) is proposed. Since the shape of the laser pulse itself normally changes little, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. To make the algorithm self-adaptive, a local Mean Square Error (MSE) was defined as the criterion for evaluating the iteration results. The experimental results demonstrate that the self-adaptive pulse-matching ICA (PM-ICA) method can effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a signal-to-noise ratio (SNR) improvement 4 dB greater than that of a traditional homomorphic wavelet method.
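
    The 4 dB figure refers to a gain in signal-to-noise ratio. As a rough illustration of how such a comparison is bookkept, the sketch below computes SNR in decibels for a toy pulse; all waveform values are invented, and this shows only the SNR arithmetic, not the paper's PM-ICA algorithm.

```python
import math

# SNR(dB) = 10 * log10(signal power / residual noise power).
# The pulse and noise samples below are made-up illustrations,
# not Lidar data from the study.
def snr_db(clean, estimate):
    signal_power = sum(c * c for c in clean) / len(clean)
    noise_power = sum((c - e) ** 2 for c, e in zip(clean, estimate)) / len(clean)
    return 10 * math.log10(signal_power / noise_power)

clean = [0.0, 0.2, 1.0, 0.2, 0.0]           # idealized echo pulse
noisy = [0.1, 0.3, 0.8, 0.1, -0.1]          # speckle-corrupted measurement
denoised = [0.05, 0.25, 0.95, 0.15, -0.02]  # after some hypothetical de-noising step

improvement = snr_db(clean, denoised) - snr_db(clean, noisy)
print(f"SNR improvement: {improvement:.1f} dB")
```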

  17. Figure analysis: A teaching technique to promote visual literacy and active learning.

    PubMed

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom in content-heavy courses, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. There is no technology requirement for this technique; therefore, it may be utilized both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 by The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  18. Evaluation of SLAR and thematic mapper MSS data for forest cover mapping using computer-aided analysis techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1980-01-01

    The column normalizing technique was used to adjust the data for variations in the amplitude of the signal due to look angle effects with respect to solar zenith angle along the scan lines (i.e., across columns). Evaluation of the data set containing the geometric and radiometric adjustments indicates that the data set should be satisfactory for further processing and analysis. Software was developed for degrading the spatial resolution of the aircraft data to produce a total of four data sets for further analysis. The quality of LANDSAT 2 CCT data for the test site is good for channels four, five, and six. Channel seven was not present on the tape. The data received were reformatted and analysis of the test site area was initiated.

  19. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    NASA Astrophysics Data System (ADS)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system (HVS). Because visual attention is spatially non-uniform, different locations in an image carry different importance for perceived quality. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest (ROI). The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM build on image structural information, VIF on the information the human brain can ideally gain from the reference image, and FSIM on low-level features that assign different importance to each location in the image. Still, none of these objective metrics incorporates the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper, the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed, reconstructing the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis, and new criteria are needed.
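
    For readers unfamiliar with the metrics named above, the sketch below evaluates a simplified, single-window version of the SSIM formula. Production SSIM implementations average the index over local sliding windows; this global variant only illustrates the structural-similarity arithmetic, and the pixel values are invented.

```python
import statistics

# Simplified, single-window SSIM:
# SSIM = ((2*mx*my + C1) * (2*cov + C2)) / ((mx^2 + my^2 + C1) * (vx + vy + C2))
# with the usual stabilizing constants C1 = (0.01*L)^2, C2 = (0.03*L)^2.
def ssim_global(x, y, dynamic_range=255.0):
    c1 = (0.01 * dynamic_range) ** 2
    c2 = (0.03 * dynamic_range) ** 2
    mx, my = statistics.fmean(x), statistics.fmean(y)
    vx, vy = statistics.pvariance(x), statistics.pvariance(y)
    cov = statistics.fmean((a - mx) * (b - my) for a, b in zip(x, y))
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

reference = [52, 55, 61, 59, 79, 61, 76, 41]  # invented grayscale values
distorted = [50, 60, 58, 65, 70, 66, 70, 50]

print(round(ssim_global(reference, reference), 4))  # identical signals give 1.0
print(round(ssim_global(reference, distorted), 4))  # degraded similarity, below 1.0
```

    The paper's point is that even a well-behaved index like this weighs all locations equally, whereas ROI-based reconstruction deliberately does not.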

  20. Revealing the beneficial effect of protease supplementation to high gravity beer fermentations using "-omics" techniques

    PubMed Central

    2011-01-01

    Background Addition of sugar syrups to the basic wort is a popular technique to achieve higher gravity in beer fermentations, but it results in dilution of the free amino nitrogen (FAN) content in the medium. The multicomponent protease enzyme Flavourzyme has a beneficial effect on the brewer's yeast fermentation performance during high gravity fermentations, as it increases the initial FAN value and results in higher FAN uptake, higher specific growth rate, higher ethanol yield and improved flavour profile. Results In the present study, transcriptome and metabolome analysis were used to elucidate the effect of the addition of the multicomponent protease enzyme Flavourzyme and its influence on the metabolism of the brewer's yeast strain Weihenstephan 34/70. The study underlines the importance of sufficient nitrogen availability during the course of beer fermentation. The applied metabolome and transcriptome analysis made it possible to map the effect of the wort sugar composition on nitrogen uptake. Conclusion Both the transcriptome and the metabolome analysis revealed a significantly higher impact of protease addition in maltose syrup supplemented fermentations, while addition of glucose syrup to increase the gravity in the wort resulted in increased glucose repression that led to inhibition of amino acid uptake and thereby inhibited the effect of the protease addition. PMID:21513553

  1. Crystallization kinetics of a lithia-silica glass - Effect of sample characteristics and thermal analysis measurement techniques

    NASA Technical Reports Server (NTRS)

    Ray, Chandra S.; Huang, Wenhai; Day, Delbert E.

    1991-01-01

    DTA and both isothermal and nonisothermal DSC techniques are presently used to investigate the crystallization kinetics of a 40 (mol) percent Li2O-60 percent SiO2 glass as a function of glass powder particle size, the use of either alumina or Pt as the crucible material, the use of N, O, or Ar atmospheres, and surface pretreatments of the glass powder with deionized water, HCl, or HF. Neither the furnace atmosphere nor the crucible material had a significant effect on activation energy, frequency factor, or Avrami exponent. Washings of the glass with the three different fluids decreased the crystallization temperature by 25 to 30 C.

  2. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    NASA Astrophysics Data System (ADS)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.

  3. The Immersive Virtual Reality Experience: A Typology of Users Revealed Through Multiple Correspondence Analysis Combined with Cluster Analysis Technique.

    PubMed

    Rosa, Pedro J; Morais, Diogo; Gamito, Pedro; Oliveira, Jorge; Saraiva, Tomaz

    2016-03-01

    Immersive virtual reality is thought to be advantageous by leading to higher levels of presence. However, and despite users getting actively involved in immersive three-dimensional virtual environments that incorporate sound and motion, there are individual factors, such as age, video game knowledge, and the predisposition to immersion, that may be associated with the quality of virtual reality experience. Moreover, one particular concern for users engaged in immersive virtual reality environments (VREs) is the possibility of side effects, such as cybersickness. The literature suggests that at least 60% of virtual reality users report having felt symptoms of cybersickness, which reduces the quality of the virtual reality experience. The aim of this study was thus to profile the right user to be involved in a VRE through head-mounted display. To examine which user characteristics are associated with the most effective virtual reality experience (lower cybersickness), a multiple correspondence analysis combined with cluster analysis technique was performed. Results revealed three distinct profiles, showing that the PC gamer profile is more associated with higher levels of virtual reality effectiveness, that is, higher predisposition to be immersed and reduced cybersickness symptoms in the VRE than console gamer and nongamer. These findings can be a useful orientation in clinical practice and future research as they help identify which users are more predisposed to benefit from immersive VREs.

  4. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1982-01-01

    Models, measures, and techniques for evaluating the effectiveness of aircraft computing systems were developed. By "effectiveness" in this context we mean the extent to which the user, i.e., a commercial air carrier, may expect to benefit from the computational tasks accomplished by a computing system in the environment of an advanced commercial aircraft. Thus, the concept of effectiveness involves aspects of system performance, reliability, and worth (value, benefit) which are appropriately integrated in the process of evaluating system effectiveness. Specifically, the primary objectives are: the development of system models that provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer.

  5. Piezoelectric Versus Conventional Rotary Techniques for Impacted Third Molar Extraction: A Meta-analysis of Randomized Controlled Trials.

    PubMed

    Jiang, Qian; Qiu, Yating; Yang, Chi; Yang, Jingyun; Chen, Minjie; Zhang, Zhiyuan

    2015-10-01

    Impacted third molars are frequently encountered in clinical work. Surgical removal of impacted third molars is often required to prevent clinical symptoms. Traditional rotary cutting instruments are potentially injurious, and piezosurgery, as a new osteotomy technique, has been introduced in oral and maxillofacial surgery. No consistent conclusion has been reached regarding whether this new technique is associated with fewer or less severe postoperative sequelae after third molar extraction. The aim of this study was to compare piezosurgery with rotary osteotomy techniques, with regard to surgery time and the severity of postoperative sequelae, including pain, swelling, and trismus. We conducted a systematic literature search in the Cochrane Library, PubMed, Embase, and Google Scholar. The eligibility criteria of this study included the following: the patients were clearly diagnosed as having impacted mandibular third molars; the patients underwent piezosurgery osteotomy, and in the control group rotary osteotomy techniques, for removing impacted third molars; the outcomes of interest include surgery time, trismus, swelling or pain; the studies are randomized controlled trials. We used random-effects models to calculate the difference in the outcomes, and the corresponding 95% confidence interval. We calculated the weighted mean difference if the trials used the same measurement, and a standardized mean difference if otherwise. A total of seven studies met the eligibility criteria and were included in our analysis. Compared with rotary osteotomy, patients undergoing piezosurgery experienced longer surgery time (mean difference 4.13 minutes, 95% confidence interval 2.75-5.52, P < 0.0001). Patients receiving the piezoelectric technique had less swelling at postoperative days 1, 3, 5, and 7 (all Ps ≤0.023). Additionally, there was a trend of less postoperative pain and trismus in the piezosurgery groups. The number of included randomized controlled trials and the
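
    As a hedged sketch of the pooling arithmetic behind such meta-analytic summaries, the code below combines per-trial mean differences by inverse-variance weighting. For brevity it uses a fixed-effect pooled estimate rather than the random-effects models the review applied, and the (mean difference, standard error) pairs are invented.

```python
import math

# Inverse-variance pooling: each trial is weighted by 1/SE^2, and the
# pooled estimate is the weighted average of the trial mean differences.
# Hypothetical (mean difference in minutes, standard error) per trial:
trials = [(4.5, 1.1), (3.2, 0.9), (5.0, 1.4)]

weights = [1 / se ** 2 for _, se in trials]
pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))

ci_low = pooled - 1.96 * se_pooled
ci_high = pooled + 1.96 * se_pooled
print(f"pooled MD = {pooled:.2f} min, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```

    A random-effects model, as used in the review, would additionally widen each weight by a between-trial variance term.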

  6. Effectiveness of rotary or manual techniques for removing a 6-year-old filling material.

    PubMed

    Duarte, Marco Antônio Hungaro; Só, Marcus Vinícius Reis; Cimadon, Vanessa Buffon; Zucatto, Cristiane; Vier-Pelisser, Fabiana Vieira; Kuga, Milton Carlos

    2010-01-01

    The aim of this study was to evaluate the effectiveness of manual and rotary instrumentation techniques for removing root fillings after different storage times. Twenty-four canals from palatal roots of human maxillary molars were instrumented, filled with gutta-percha and a zinc-oxide eugenol-based sealer (Endofill), and stored in saline for 6 years. Non-aged control specimens were treated in the same manner and stored for 1 week. All canals were retreated using hand files or the ProTaper Universal NiTi rotary system. Radiographs were taken to determine the amount of remaining material in the canals. The roots were vertically split, the halves were examined with a clinical microscope and the obtained images were digitized. The images were evaluated with AutoCAD software and the percentage of residual material was calculated. Data were analyzed with two-way ANOVA and Tukey's test at a 5% significance level. There were no statistically significant differences (p>0.05) between the manual and rotary techniques for filling material removal, regardless of the ageing effect on endodontic sealers. When only the age of the filling material was analyzed microscopically, non-aged fillings that remained on the middle third of the canals presented a higher percentage of remaining material (p<0.05) compared to the aged sealers and to the other thirds of the roots. The apical third showed a higher percentage of residual filling material in both radiographic and microscopic analysis when compared to the other root thirds. In conclusion, all canals presented residual filling material after endodontic retreatment procedures. Microscopic analysis was more effective than radiographs for detecting residual filling material.

  7. Sounding rocket thermal analysis techniques applied to GAS payloads. [Get Away Special payloads (STS)

    NASA Technical Reports Server (NTRS)

    Wing, L. D.

    1979-01-01

    Simplified analytical techniques of sounding rocket programs are suggested as a means of bringing the cost of thermal analysis of the Get Away Special (GAS) payloads within acceptable bounds. Particular attention is given to two methods adapted from sounding rocket technology - a method in which the container and payload are assumed to be divided in half vertically by a thermal plane of symmetry, and a method which considers the container and its payload to be an analogous one-dimensional unit having the real or correct container top surface area for radiative heat transfer and a fictitious mass and geometry which model the average thermal effects.

  8. A Cost Analysis of Colonoscopy using Microcosting and Time-and-motion Techniques

    PubMed Central

    Ness, Reid M.; Stiles, Renée A.; Shintani, Ayumi K.; Dittus, Robert S.

    2007-01-01

    Background The cost of an individual colonoscopy is an important determinant of the overall cost and cost-effectiveness of colorectal cancer screening. Published cost estimates vary widely and typically report institutional costs derived from gross-costing methods. Objective Perform a cost analysis of colonoscopy using micro-costing and time-and-motion techniques to determine the total societal cost of colonoscopy, which includes direct health care costs as well as direct non-health care costs and costs related to patients’ time. The design is prospective cohort. The participants were 276 contacted, eligible patients who underwent colonoscopy between July 2001 and June 2002, at either a Veterans’ Affairs Medical Center or a University Hospital in the Southeastern United States. Major results The median direct health care cost for colonoscopy was $379 (25%, 75%; $343, $433). The median direct non-health care and patient time costs were $226 (25%, 75%; $187, $323) and $274 (25%, 75%; $186, $368), respectively. The median total societal cost of colonoscopy was $923 (25%, 75%; $805, $1047). The median direct health care, direct non-health care, patient time costs, and total costs at the VA were $391, $288, $274, and $958, respectively; analogous costs at the University Hospital were $376, $189, $368, and $905, respectively. Conclusion Microcosting techniques and time-and-motion studies can produce accurate, detailed cost estimates for complex medical interventions. Cost estimates that inform health policy decisions or cost-effectiveness analyses should use total costs from the societal perspective. Societal cost estimates, which include patient and caregiver time costs, may affect colonoscopy screening rates. PMID:17665271
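
    The median (25%, 75%) reporting style used above can be reproduced with the standard library. In the sketch below, per-patient component costs are summed into total societal costs and then summarized by quartiles; all dollar figures are fabricated for illustration.

```python
import statistics

# Total societal cost per patient = direct health care cost
# + direct non-health care cost + patient time cost.
# Hypothetical per-patient dollar amounts (7 patients):
direct_health = [350, 379, 420, 401, 365, 433, 390]
non_health    = [190, 226, 310, 240, 205, 323, 260]
patient_time  = [186, 274, 368, 300, 250, 310, 280]

totals = [sum(t) for t in zip(direct_health, non_health, patient_time)]

# statistics.quantiles with n=4 returns the three quartile cut points.
q1, median, q3 = statistics.quantiles(totals, n=4)
print(f"median total societal cost ${median:.0f} (25%, 75%: ${q1:.0f}, ${q3:.0f})")
```

    Note that quartiles are computed on the per-patient totals, not by summing the component medians, which do not add.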

  9. Comparison between ultrasound guided technique and digital palpation technique for radial artery cannulation in adult patients: An updated meta-analysis of randomized controlled trials.

    PubMed

    Bhattacharjee, Sulagna; Maitra, Souvik; Baidya, Dalim K

    2018-06-01

    Possible advantages and risks associated with ultrasound guided radial artery cannulation in comparison to the digital palpation guided method in adult patients are not fully known. We have compared ultrasound guided radial artery cannulation with the digital palpation technique in this meta-analysis. Meta-analysis of randomized controlled trials. Trials conducted in the operating room, emergency department, and cardiac catheterization laboratory. PubMed and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched (from 1946 to 20th November 2017) to identify prospective randomized controlled trials in adult patients. Two-dimensional ultrasound guided radial artery catheterization versus digital palpation guided radial artery cannulation. Overall cannulation success rate, first attempt success rate, time to cannulation and mean number of attempts to successful cannulation. Odds ratio (OR) and standardized mean difference (SMD) or mean difference (MD) with 95% confidence interval (CI) were calculated for categorical and continuous variables respectively. Data of 1895 patients from 10 studies have been included in this meta-analysis. Overall cannulation success rate was similar between the ultrasound guided technique and digital palpation [OR (95% CI) 2.01 (1.00, 4.06); p = 0.05]. Ultrasound guided radial artery cannulation is associated with a higher first attempt success rate in comparison to digital palpation [OR (95% CI) 2.76 (1.86, 4.10); p < 0.001]. No difference was seen in time to cannulation [SMD (95% CI) -0.31 (-0.65, 0.04); p = 0.30] or mean number of attempts [MD (95% CI) -0.65 (-1.32, 0.02); p = 0.06] between the ultrasound guided and palpation techniques. Radial artery cannulation by ultrasound guidance may increase the first attempt success rate but not the overall cannulation success when compared to the digital palpation technique. However, results of this meta-analysis should be interpreted with caution due to the presence of
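
    As a hedged illustration of where figures like OR (95% CI) come from, the sketch below derives an odds ratio and its Wald-type confidence interval from a single 2x2 table. The counts are invented; the meta-analysis itself pooled study-level estimates rather than raw counts.

```python
import math

# 2x2 table for first-attempt success (hypothetical counts):
a, b = 130, 70   # ultrasound group: successes, failures
c, d = 90, 110   # palpation group: successes, failures

# Odds ratio and Wald CI on the log-odds scale:
# OR = (a*d)/(b*c), SE(ln OR) = sqrt(1/a + 1/b + 1/c + 1/d).
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
log_or = math.log(odds_ratio)

ci_low = math.exp(log_or - 1.96 * se_log_or)
ci_high = math.exp(log_or + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
```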

  10. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to the current situation that the calibration of all model parameters or the estimation of all of their uncertainty is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, might as well become computationally expensive in case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate the method's independence of the convergence testing method, we applied it to two widely used, global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an
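
    A minimal sketch of the Elementary Effects screening idea mentioned above (the Morris method): perturb one parameter at a time and average the absolute effects over random base points. The three-parameter test function below is a stand-in for an environmental model, not one of the benchmark functions from the paper.

```python
import random

# Elementary effect of parameter i at base point x:
# EE_i = (f(x + delta * e_i) - f(x)) / delta.
# Averaging |EE_i| over many base points gives the Morris mu* screening
# measure; large mu* flags influential parameters.
def model(x):
    # Stand-in model: x[1] dominates, x[2] is nearly inert.
    return 2.0 * x[0] + 10.0 * x[1] ** 2 + 0.1 * x[2]

def elementary_effects(f, x, delta=0.1):
    base = f(x)
    effects = []
    for i in range(len(x)):
        xp = list(x)
        xp[i] += delta
        effects.append((f(xp) - base) / delta)
    return effects

random.seed(0)
n_points = 200
sums = [0.0, 0.0, 0.0]
for _ in range(n_points):
    x = [random.random() for _ in range(3)]
    for i, ee in enumerate(elementary_effects(model, x)):
        sums[i] += abs(ee)

mu_star = [s / n_points for s in sums]  # Morris mu* per parameter
print("mu* per parameter:", [round(m, 2) for m in mu_star])
```

    The full Morris design samples structured trajectories rather than independent base points, but the per-parameter finite difference is the same.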

  11. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For each method, fewer than half of its hazards were also identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used, and data from different sources should be integrated to give a comprehensive view of risk in a system.

  12. Upper limb kinetic analysis of three sitting pivot wheelchair transfer techniques.

    PubMed

    Koontz, Alicia M; Kankipati, Padmaja; Lin, Yen-Sheng; Cooper, Rory A; Boninger, Michael L

    2011-11-01

    The objective of this study was to investigate differences in shoulder, elbow and hand kinetics while performing three different sitting pivot transfers (SPTs) that varied in terms of hand and trunk positioning. Fourteen unimpaired individuals (8 male and 6 female) performed three variations of sitting pivot transfers in a random order from a wheelchair to a level tub bench. Two transfers involved a forward flexed trunk (head-hips technique) and the third was performed with the trunk remaining upright. The two transfers involving a head-hips technique were performed with two different leading hand initial positions. Motion analysis equipment recorded upper body movements and force sensors recorded hand reaction forces. Shoulder and elbow joint and hand kinetics were computed for the lift phase of the transfer. Transferring using either of the head-hips techniques compared to the trunk-upright style of transferring resulted in reduced superior forces at the shoulder (P<0.002), elbow (P<0.004) and hand (P<0.013). There was a significant increase in the medial forces in the leading elbow (P=0.049) for both head-hips transfers and in the trailing hand for the head-hips technique with the arm further away from the body (P<0.028). The head-hips techniques resulted in higher shoulder external rotation, flexion and extension moments compared to the trunk-upright technique (P<0.021). Varying the hand placement and trunk positioning during transfers changes the load distribution across all upper limb joints. The results of this study may be useful for determining a technique that helps preserve upper limb function over time. Published by Elsevier Ltd.

  13. Long-term effect of the insoluble thread-lifting technique.

    PubMed

    Fukaya, Mototsugu

    2017-01-01

    Although the thread-lifting technique for sagging faces has become more common and popular, medical literature evaluating its effects is scarce. Studies on its long-term prognosis are particularly uncommon. One hundred individuals who had previously undergone insoluble thread-lifting were retrospectively investigated. Photos in frontal and oblique views from the first and last visits were evaluated by six female raters, who estimated the patients' ages. The mean guessed age was defined as the apparent age, and the difference between the real and apparent ages was defined as the youth value. The difference between the youth values before and after the thread-lift was defined as the rejuvenation effect and analyzed in relation to the time since the operation, the number of threads used and the number of thread-lift operations performed. The rejuvenation effect decreased over the first year after the operation, but showed an increasing trend thereafter. The rejuvenation effect increased with the number of threads used and the number of thread-lift operations performed. The insoluble thread-lifting technique appears to be associated with both early and late effects. The rejuvenation effect appeared to decrease during the first year, but increased thereafter. A multicenter trial is necessary to confirm these findings.
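    The age-based quantities defined in this abstract reduce to simple arithmetic. A minimal sketch (Python; the function names and the patient's ages are ours, invented for illustration, not taken from the paper):

```python
def youth_value(real_age, apparent_age):
    """Youth value as defined in the abstract: real age minus mean guessed (apparent) age."""
    return real_age - apparent_age

def rejuvenation_effect(real_before, apparent_before, real_after, apparent_after):
    """Difference between the youth values after and before the thread-lift."""
    return youth_value(real_after, apparent_after) - youth_value(real_before, apparent_before)

# Hypothetical patient: looked 52 at age 50 before the lift, looks 51 at age 53 afterwards.
print(rejuvenation_effect(50, 52, 53, 51))  # → 4
```

    A positive value means the patient looks younger relative to their real age than before the operation.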

  14. Effect of monitoring technique on quality of conservation science.

    PubMed

    Jewell, Zoe

    2013-06-01

    Monitoring free-ranging animals in their natural habitat is a keystone of ecosystem conservation and increasingly important in the context of current rates of loss of biological diversity. Data collected from individuals of endangered species inform conservation policies. Conservation professionals assume that these data are reliable: that the animals from whom data are collected are representative of the species in their physiology, ecology, and behavior and of the populations from which they are drawn. In the last few decades, there has been an enthusiastic adoption of invasive techniques for gathering ecological and conservation data. Although these can provide impressive quantities of data, and apparent insights into animal ranges and distributions, there is increasing evidence that these techniques can result in animal welfare problems, through the wide-ranging physiological effects of acute and chronic stress and through direct or indirect injuries or compromised movement. Much less commonly, however, do conservation scientists consider the issue of how these effects may alter the behavior of individuals to the extent that the data they collect could be unreliable. The emerging literature on the immediate and longer-term effects of capture and handling indicates that it can no longer be assumed that a wild animal's survival of the process implies the safety of the procedure, the ethics of its use, or the scientific validity of the resulting data. I argue that conservation professionals should routinely assess study populations for negative effects of their monitoring techniques and adopt noninvasive approaches for the best outcomes not only for the animals, but also for conservation science. © 2013 Society for Conservation Biology.

  15. Shielding Effectiveness in a Two-Dimensional Reverberation Chamber Using Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.

    2006-01-01

    Reverberation chambers are attaining an increased importance in the determination of electromagnetic susceptibility of avionics equipment. Given the nature of the variable boundary condition, the ability of a given source to couple energy into certain modes, and the passband characteristic due to the chamber Q, the fields are typically characterized by statistical means. The emphasis of this work is to apply finite-element techniques at cutoff to the analysis of a two-dimensional structure to examine shielding-effectiveness issues in a reverberating environment. Simulated mechanical stirring will be used to obtain the appropriate statistical field distribution. The shielding effectiveness (SE) in a simulated reverberating environment is compared to measurements in a reverberation chamber. A log-normal distribution for the SE is observed, with implications for system designers. The work is intended to provide further refinement in the consideration of SE in a complex electromagnetic environment.

  16. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    PubMed

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined and highlights the advantages in using dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is not an ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  17. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    NASA Astrophysics Data System (ADS)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for the quantitative analysis of pigments embedded in solids. Paper samples (filter paper, Whatman No. 1) dyed with the pigment Direct Fast Turquoise Blue GL were used for this study. This pigment is a blue dye commonly used in industry to dye paper and other fabrics. The optical absorption coefficient, at a wavelength of 660 nm, was measured for this pigment at various concentrations in the paper substrate. It was shown that the Beer-Lambert model for light absorption applies well to pigments in solid substrates, and optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
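    The Beer-Lambert model invoked here is I = I0·exp(-α·l). A minimal sketch of the forward model and its inversion (Python; the 180 µm path length and unit incident intensity are hypothetical, and 220 cm^-1 is simply the upper limit quoted in the abstract):

```python
import math

def transmitted_intensity(i0, alpha_per_cm, path_cm):
    """Beer-Lambert law: I = I0 * exp(-alpha * l)."""
    return i0 * math.exp(-alpha_per_cm * path_cm)

def absorption_coefficient(i0, i, path_cm):
    """Invert Beer-Lambert to recover alpha from measured intensities."""
    return math.log(i0 / i) / path_cm

# Hypothetical sample: 180 µm effective path, alpha = 220 cm^-1.
i = transmitted_intensity(1.0, 220.0, 0.018)
print(round(absorption_coefficient(1.0, i, 0.018)))  # → 220
```

    In the photoacoustic measurement the intensities are not read out directly, but the same exponential model underlies the fitted signal.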

  18. Paper simulation techniques in user requirements analysis for interactive computer systems

    NASA Technical Reports Server (NTRS)

    Ramsey, H. R.; Atwood, M. E.; Willoughby, J. K.

    1979-01-01

    This paper describes the use of a technique called 'paper simulation' in the analysis of user requirements for interactive computer systems. In a paper simulation, the user solves problems with the aid of a 'computer', as in normal man-in-the-loop simulation. In this procedure, though, the computer does not exist, but is simulated by the experimenters. This allows simulated problem solving early in the design effort, and allows the properties and degree of structure of the system and its dialogue to be varied. The technique, and a method of analyzing the results, are illustrated with examples from a recent paper simulation exercise involving a Space Shuttle flight design task.

  19. Analysis of a risk prevention document using dependability techniques: a first step towards an effectiveness model

    NASA Astrophysics Data System (ADS)

    Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc

    2018-04-01

    Major hazard prevention is a main challenge given that it is specifically based on information communicated to the public. In France, preventive information is notably provided by way of local regulatory documents. Unfortunately, the law imposes only a few requirements concerning their content; one can therefore question how the way the document is actually produced affects the general population. The purpose of our work is thus to propose an analytical methodology to evaluate the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). DICRIM has to be produced by mayors and addressed to the public to provide information on major hazards affecting their municipalities. An analysis of the document's compliance with the law is carried out through the identification of regulatory detection elements. These are applied to a database of 30 DICRIMs. This analysis leads to a discussion on points such as the usefulness of the missing elements. External and internal function analysis permits the identification of the form and content requirements and the service and technical functions of the document and its components (here its sections). These results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define the failures and to identify detection elements. This permits the evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. These results will be used to build, in future work, a decision support model for the municipality (or specialised consulting firms) in charge of drawing up such documents.
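    Classic FMEA, which this study adapts to document sections, ranks failure modes by a risk priority number (RPN = severity x occurrence x detection). A minimal sketch of that scoring, assuming the conventional 1-10 scales; the section names, failure descriptions and scores below are invented for illustration and are not taken from the study:

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    component: str   # document section under analysis
    failure: str
    severity: int    # 1 (minor) .. 10 (critical)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (easily detected) .. 10 (hard to detect)

    @property
    def rpn(self) -> int:
        """Risk priority number: severity x occurrence x detection."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("alert section", "missing evacuation instructions", 9, 4, 6),
    FailureMode("hazard map", "outdated flood zones", 7, 5, 3),
]
# Rank failure modes from highest to lowest priority.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(m.component, m.rpn)
```

    Modes with the highest RPN are addressed first; the paper's detection elements play the role of the detection score here.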

  20. Investigation of safety analysis methods using computer vision techniques

    NASA Astrophysics Data System (ADS)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and post-encroachment time (PET), two important safety measures. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1-h monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values of two different intersections are estimated for 1 day from 8:00 a.m. to 6:00 p.m.
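    The two measures named here have standard definitions: TTC is the remaining gap divided by the closing speed (defined only while the road users are converging), and PET is the time between the first road user leaving the conflict area and the second arriving at it. A minimal sketch under those textbook definitions (Python; the numbers are hypothetical, and the abstract's own algorithms operate on tracked trajectories rather than scalar gaps):

```python
def time_to_collision(gap_m, closing_speed_mps):
    """TTC: time until collision if current speeds are maintained.
    Defined only while the two road users are closing in on each other."""
    if closing_speed_mps <= 0:
        return float("inf")  # not on a collision course
    return gap_m / closing_speed_mps

def post_encroachment_time(t_first_leaves_s, t_second_arrives_s):
    """PET: time between the first road user leaving the conflict
    area and the second one arriving at it."""
    return t_second_arrives_s - t_first_leaves_s

print(time_to_collision(30.0, 12.0))                     # → 2.5
print(round(post_encroachment_time(14.2, 15.0), 1))      # → 0.8
```

    Small TTC or PET values flag near-miss conflict events; thresholds on these values are what a safety map aggregates.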

  1. Reduction and analysis techniques for infrared imaging data

    NASA Technical Reports Server (NTRS)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages: acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near infrared imaging.

  2. Singular value decomposition based feature extraction technique for physiological signal analysis.

    PubMed

    Chang, Cheng-Ding; Wang, Chien-Chih; Jiang, Bernard C

    2012-06-01

    Multiscale entropy (MSE) is one of the popular techniques used to calculate and describe the complexity of a physiological signal. Many studies use this approach to detect changes in physiological conditions in the human body. However, MSE results are easily affected by noise and trends, leading to incorrect estimation of MSE values. In this paper, singular value decomposition (SVD) is adopted in place of MSE to extract the features of physiological signals, and a support vector machine (SVM) is used to classify the different physiological states. A test data set based on the PhysioNet website was used, and the classification results showed that using SVD to extract features of the physiological signal could attain a classification accuracy rate of 89.157%, which is higher than that obtained using the MSE value (71.084%). The results show the proposed analysis procedure is effective and appropriate for distinguishing different physiological states. This promising result could be used as a reference for doctors in the diagnosis of congestive heart failure (CHF) disease.
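    The SVD-plus-SVM idea can be illustrated on synthetic signals: embed each 1-D signal in a trajectory (Hankel) matrix, keep the leading singular values as a feature vector, and classify with an SVM. This is a sketch of the general approach, not the paper's exact pipeline; it assumes scikit-learn is available, and the window size, number of features, and the two synthetic "states" are our own choices:

```python
import numpy as np
from sklearn.svm import SVC

def svd_features(signal, window=32, k=5):
    """Embed a 1-D signal into a trajectory (Hankel) matrix and
    return its k largest singular values as a feature vector."""
    n = len(signal) - window + 1
    traj = np.array([signal[i:i + window] for i in range(n)])
    s = np.linalg.svd(traj, compute_uv=False)  # sorted descending
    return s[:k]

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 256)
# Two hypothetical "physiological states": noisy sine vs. pure noise.
signals = [np.sin(t) + 0.1 * rng.standard_normal(256) for _ in range(20)] + \
          [rng.standard_normal(256) for _ in range(20)]
labels = [0] * 20 + [1] * 20

X = np.array([svd_features(s) for s in signals])
clf = SVC().fit(X[::2], labels[::2])     # train on every other sample
print(clf.score(X[1::2], labels[1::2]))  # accuracy on the held-out half
```

    Structured signals concentrate their energy in a few large singular values, while noise spreads it evenly, which is what makes the two states separable in feature space.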

  3. Microplastics in sediments: A review of techniques, occurrence and effects.

    PubMed

    Van Cauwenberghe, Lisbeth; Devriese, Lisa; Galgani, François; Robbens, Johan; Janssen, Colin R

    2015-10-01

    Microplastics are omnipresent in the marine environment and sediments are hypothesized to be major sinks of these plastics. Here, over 100 articles spanning the last 50 years are reviewed with the following objectives: (i) to evaluate current microplastic extraction techniques, (ii) to discuss the occurrence and worldwide distribution of microplastics in sediments, and (iii) to make a comprehensive assessment of the possible adverse effects of this type of pollution to marine organisms. Based on this review we propose future research needs and conclude that there is a clear need for standardized techniques, unified reporting units and more realistic effect assessments. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Proprioceptive Neuromuscular Facilitation Flexibility Techniques: Acute Effects on Arterial Blood Pressure.

    ERIC Educational Resources Information Center

    Cornelius, William L.; Craft-Hamm, Kelley

    1988-01-01

    The effects of stretching techniques on arterial blood pressure (ABP) were studied in three groups of 20 men each. Each group performed one of three proprioceptive neuromuscular facilitation (PNF) techniques. Results are presented. The study indicates that the benefits of stretching may outweigh the risk of elevated ABP. (JL)

  5. Glyphosate analysis using sensors and electromigration separation techniques as alternatives to gas or liquid chromatography.

    PubMed

    Gauglitz, Günter; Wimmer, Benedikt; Melzer, Tanja; Huhn, Carolin

    2018-01-01

    Since its introduction in 1974, the herbicide glyphosate has experienced a tremendous increase in use, with about one million tons used annually today. This review focuses on sensors and electromigration separation techniques as alternatives to chromatographic methods for the analysis of glyphosate and its metabolite aminomethylphosphonic acid. Even with the large number of studies published, glyphosate analysis remains challenging. With its polar (and, depending on pH, even ionic) functional groups and its lack of a chromophore, it is difficult to analyze with chromatographic techniques. Its analysis is mostly achieved after derivatization. Its purification from food and environmental samples inevitably results in coextraction of ionic matrix components, with a further impact on analysis and also derivatization reactions. Its ability to form chelates with metal cations is another obstacle for precise quantification. Lastly, the low limits of detection required by legislation have to be met. These challenges preclude glyphosate from being analyzed together with many other pesticides in common multiresidue (chromatographic) methods. For better monitoring of glyphosate in environmental and food samples, further fast and robust methods are required. In this review, analytical methods are summarized and discussed from the perspective of biosensors and various formats of electromigration separation techniques, including modes such as capillary electrophoresis and micellar electrokinetic chromatography, combined with various detection techniques. These methods are critically discussed with regard to matrix tolerance, limits of detection reached, and selectivity.

  6. What are the most effective techniques in changing obese individuals' physical activity self-efficacy and behaviour: a systematic review and meta-analysis.

    PubMed

    Olander, Ellinor K; Fletcher, Helen; Williams, Stefanie; Atkinson, Lou; Turner, Andrew; French, David P

    2013-03-03

    Increasing self-efficacy is generally considered to be an important mediator of the effects of physical activity interventions. A previous review identified which behaviour change techniques (BCTs) were associated with increases in self-efficacy and physical activity for healthy non-obese adults. The aim of the current review was to identify which BCTs increase the self-efficacy and physical activity behaviour of obese adults. A systematic search identified 61 comparisons with obese adults reporting changes in self-efficacy towards engaging in physical activity following interventions. Of those comparisons, 42 also reported changes in physical activity behaviour. All intervention descriptions were coded using Michie et al's (2011) 40 item CALO-RE taxonomy of BCTs. Meta-analysis was conducted with moderator analyses to examine the association between whether or not each BCT was included in interventions, and size of changes in both self-efficacy and physical activity behaviour. Overall, a small effect of the interventions was found on self-efficacy (d = 0.23, 95% confidence interval (CI): 0.16-0.29, p < 0.001) and a medium sized effect on physical activity behaviour (d = 0.50, 95% CI 0.38-0.63, p < 0.001). Four BCTs were significantly associated with positive changes in self-efficacy; 'action planning', 'time management', 'prompt self-monitoring of behavioural outcome' and 'plan social support/social change'. These latter two BCTs were also associated with positive changes in physical activity. An additional 19 BCTs were associated with positive changes in physical activity. The largest effects for physical activity were found where interventions contained 'teach to use prompts/cues', 'prompt practice' or 'prompt rewards contingent on effort or progress towards behaviour'. Overall, a non-significant relationship was found between change in self-efficacy and change in physical activity (Spearman's Rho = -0.18 p = 0.72). In summary, the
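    The pooled effect sizes reported in this review (e.g. d = 0.23, 95% CI 0.16-0.29) are standardized mean differences. A minimal sketch of how a single study's Cohen's d and an approximate large-sample CI are computed (Python; the group means, SDs and sample sizes are hypothetical, and real meta-analyses pool such study-level estimates with inverse-variance weighting):

```python
import math

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

def d_confidence_interval(d, n1, n2, z=1.96):
    """Approximate 95% CI from the large-sample variance of d."""
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d - z * se, d + z * se

# Hypothetical study: intervention vs. control, 50 participants each.
d = cohens_d(5.2, 1.0, 50, 4.8, 1.0, 50)
print(round(d, 2))  # → 0.4
print(tuple(round(x, 2) for x in d_confidence_interval(d, 50, 50)))
```

    By the usual conventions, d around 0.2 is a small effect and around 0.5 a medium one, which is how the review characterizes its self-efficacy and physical activity results.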

  7. What are the most effective techniques in changing obese individuals’ physical activity self-efficacy and behaviour: a systematic review and meta-analysis

    PubMed Central

    2013-01-01

    Increasing self-efficacy is generally considered to be an important mediator of the effects of physical activity interventions. A previous review identified which behaviour change techniques (BCTs) were associated with increases in self-efficacy and physical activity for healthy non-obese adults. The aim of the current review was to identify which BCTs increase the self-efficacy and physical activity behaviour of obese adults. A systematic search identified 61 comparisons with obese adults reporting changes in self-efficacy towards engaging in physical activity following interventions. Of those comparisons, 42 also reported changes in physical activity behaviour. All intervention descriptions were coded using Michie et al’s (2011) 40 item CALO-RE taxonomy of BCTs. Meta-analysis was conducted with moderator analyses to examine the association between whether or not each BCT was included in interventions, and size of changes in both self-efficacy and physical activity behaviour. Overall, a small effect of the interventions was found on self-efficacy (d = 0.23, 95% confidence interval (CI): 0.16-0.29, p < 0.001) and a medium sized effect on physical activity behaviour (d = 0.50, 95% CI 0.38-0.63, p < 0.001). Four BCTs were significantly associated with positive changes in self-efficacy; ‘action planning’, ‘time management’, ‘prompt self-monitoring of behavioural outcome’ and ‘plan social support/social change’. These latter two BCTs were also associated with positive changes in physical activity. An additional 19 BCTs were associated with positive changes in physical activity. The largest effects for physical activity were found where interventions contained ‘teach to use prompts/cues’, ‘prompt practice’ or ‘prompt rewards contingent on effort or progress towards behaviour’. Overall, a non-significant relationship was found between change in self-efficacy and change in physical activity (Spearman’s Rho = −0.18 p

  8. Blood volume analysis: a new technique and new clinical interest reinvigorate a classic study.

    PubMed

    Manzone, Timothy A; Dam, Hung Q; Soltis, Daniel; Sagar, Vidya V

    2007-06-01

    Blood volume studies using the indicator dilution technique and radioactive tracers have been performed in nuclear medicine departments for over 50 y. A nuclear medicine study is the gold standard for blood volume measurement, but the classic dual-isotope blood volume study is time-consuming and can be prone to technical errors. Moreover, a lack of normal values and a rubric for interpretation made volume status measurement of limited interest to most clinicians other than some hematologists. A new semiautomated system for blood volume analysis is now available and provides highly accurate results for blood volume analysis within only 90 min. The availability of rapid, accurate blood volume analysis has brought about a surge of clinical interest in using blood volume data for clinical management. Blood volume analysis, long a low-volume nuclear medicine study all but abandoned in some laboratories, is poised to enter the clinical mainstream. This article will first present the fundamental principles of fluid balance and the clinical means of volume status assessment. We will then review the indicator dilution technique and how it is used in nuclear medicine blood volume studies. We will present an overview of the new semiautomated blood volume analysis technique, showing how the study is done, how it works, what results are provided, and how those results are interpreted. Finally, we will look at some of the emerging areas in which data from blood volume analysis can improve patient care. The reader will gain an understanding of the principles underlying blood volume assessment, know how current nuclear medicine blood volume analysis studies are performed, and appreciate their potential clinical impact.
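    The indicator dilution principle underlying these studies is a one-line calculation: the distribution volume equals the amount of tracer injected divided by its concentration after complete mixing. A minimal sketch (Python; the counts are hypothetical, and clinical systems additionally correct for sampling time, hematocrit, and tracer loss):

```python
def blood_volume_ml(injected_tracer_counts, equilibrium_counts_per_ml):
    """Indicator dilution principle: V = injected amount / equilibrium concentration."""
    return injected_tracer_counts / equilibrium_counts_per_ml

# Hypothetical tracer study: 1,000,000 counts injected,
# 200 counts/mL measured after mixing.
print(blood_volume_ml(1_000_000, 200))  # → 5000.0
```

    The semiautomated system described in the article applies this same principle, extrapolating serial post-injection samples back to the mixing time.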

  9. Wheeze sound analysis using computer-based techniques: a systematic review.

    PubMed

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched continuous respiratory acoustic sounds which are produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that (1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, (2) further research is required to achieve acceptable rates of identification on the degree of airway obstruction with normal breathing, and (3) analysis using combinations of features and on subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.

  10. Modelling, analysis and validation of microwave techniques for the characterisation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Sulaimalebbe, Aslam

    In the last decade, the study of nanoparticle (NP) systems has become a large and interesting research area due to their novel properties and functionalities, which differ from those of the bulk materials, and also their potential applications in different fields. It is vital to understand the behaviour and properties of nano-materials when aiming to implement nanotechnology, control their behaviour and design new material systems with superior performance. Physical characterisation of NPs falls into two main categories, property and structure analysis, where the properties of the NPs cannot be studied without knowledge of their size and structure. The direct measurement of the electrical properties of metal NPs presents a key challenge and necessitates the use of innovative experimental techniques. There have been numerous reports of two/four point resistance measurements of NP films and also of the electrical conductivity of NP films using the interdigitated microarray (IDA) electrode. However, microwave techniques such as the open-ended coaxial probe (OCP) and the microwave dielectric resonator (DR) are much more accurate and effective for the electrical characterisation of metallic NPs than other traditional techniques, because they are inexpensive, convenient, non-destructive, contactless, hazardless (i.e. at low power) and require no special sample preparation. This research is the first attempt to determine the microwave properties of Pt and Au NP films, which are appealing materials for nano-scale electronics, using the aforementioned microwave techniques. Ease of synthesis, relatively low cost, unique catalytic activities and control over size and shape were the main considerations in choosing Pt and Au NPs for the present study.
The initial phase of this research was to implement and validate the aperture admittance model for the OCP measurement through experiments and 3D full wave simulation using the commercially available Ansoft

  11. Quantitative kinetic analysis of lung nodules by temporal subtraction technique in dynamic chest radiography with a flat panel detector

    NASA Astrophysics Data System (ADS)

    Tsuchiya, Yuichiro; Kodera, Yoshie; Tanaka, Rie; Sanada, Shigeru

    2007-03-01

    Early detection and treatment of lung cancer is one of the most effective means to reduce cancer mortality; chest X-ray radiography has been widely used as a screening examination or health checkup. A new examination method and the development of a computer analysis system make it possible to capture respiratory kinetics with a flat-panel detector (FPD), an extension of conventional chest X-ray radiography. Through such advances, functional evaluation of respiratory kinetics in the chest has become available, and its introduction into clinical practice is expected in the future. In this study, we developed a computer analysis algorithm for detecting lung nodules and evaluating their quantitative kinetics. Breathing chest radiographs obtained with the modified FPD were converted into four static feature images by sequential temporal subtraction processing, morphologic enhancement processing, kinetic visualization processing, and lung region detection processing, after breath synchronization based on diaphragmatic analysis of the vector movement. An artificial neural network used to analyze the density patterns detected the true nodules in these static images and drew their kinetic tracks. In an evaluation of algorithm performance and clinical effectiveness with 7 normal patients and simulated nodules, the method showed sufficient detection capability and kinetic imaging function without statistically significant difference. Our technique can quantitatively evaluate the kinetic range of nodules and is effective in detecting nodules on breathing chest radiographs. Moreover, the application of this technique is expected to extend computer-aided diagnosis systems and facilitate the development of an automatic planning system for radiation therapy.

  12. Effect of different impression materials and techniques on the dimensional accuracy of implant definitive casts.

    PubMed

    Ebadian, Behnaz; Rismanchian, Mansor; Dastgheib, Badrosadat; Bajoghli, Farshad

    2015-01-01

    Different factors such as impression techniques and materials can affect the passive fit between the superstructure and implant. The aim of this study was to determine the effect of different impression materials and techniques on the dimensional accuracy of implant definitive casts. Four internal hex implants (Biohorizons Ø4 mm) were placed on a metal maxillary model perpendicular to the horizontal plane in maxillary lateral incisors, right canine and left first premolar areas. Three impression techniques including open tray, closed tray using ball top screw abutments and closed tray using short impression copings and two impression materials (polyether and polyvinyl siloxane) were evaluated (n = 60). The changes in distances between implant analogues in mediolateral (x) and anteroposterior (y) directions and analogue angles in x/z and y/z directions in the horizontal plane on the definitive casts were measured by coordinate measuring machine. The data were analyzed by multivariate two-way analysis of variance and one sample t-test (α = 0.05). No statistically significant differences were observed between different impression techniques and materials. However, deviation and distortion of definitive casts had a significant difference with the master model when short impression copings and polyvinyl siloxane impression material were used (P < 0.05). In open tray technique, there was a significant difference in the rotation of analogs compared with the master model with both impression materials (P < 0.05). There was no difference between open and closed tray impression techniques; however, less distortion and deviation were observed in the open tray technique. In the closed tray impression technique, ball top screw was more accurate than short impression copings.

  13. Effect of different impression materials and techniques on the dimensional accuracy of implant definitive casts

    PubMed Central

    Ebadian, Behnaz; Rismanchian, Mansor; Dastgheib, Badrosadat; Bajoghli, Farshad

    2015-01-01

    Background: Different factors such as impression techniques and materials can affect the passive fit between the superstructure and implant. The aim of this study was to determine the effect of different impression materials and techniques on the dimensional accuracy of implant definitive casts. Materials and Methods: Four internal hex implants (Biohorizons Ø4 mm) were placed on a metal maxillary model perpendicular to the horizontal plane in the maxillary lateral incisor, right canine, and left first premolar areas. Three impression techniques (open tray, closed tray using ball top screw abutments, and closed tray using short impression copings) and two impression materials (polyether and polyvinyl siloxane) were evaluated (n = 60). The changes in distances between implant analogues in the mediolateral (x) and anteroposterior (y) directions, and in analogue angles in the x/z and y/z directions in the horizontal plane, were measured on the definitive casts with a coordinate measuring machine. The data were analyzed by multivariate two-way analysis of variance and one-sample t-test (α = 0.05). Results: No statistically significant differences were observed between the different impression techniques and materials. However, the deviation and distortion of the definitive casts differed significantly from the master model when short impression copings and polyvinyl siloxane impression material were used (P < 0.05). In the open tray technique, there was a significant difference in the rotation of analogues compared with the master model with both impression materials (P < 0.05). Conclusion: There was no difference between the open and closed tray impression techniques; however, less distortion and deviation were observed with the open tray technique. In the closed tray impression technique, the ball top screw abutment was more accurate than short impression copings. PMID:25878678

  14. A Cross-Cultural Examination of Music Instruction Analysis and Evaluation Techniques.

    ERIC Educational Resources Information Center

    Price, Harry E.; Ogawa, Yoko; Arizumi, Koji

    1997-01-01

    Examines whether analysis techniques of student/teacher interactions widely used throughout the United States could be applied to a music instruction setting in Japan by analyzing two videotaped music lessons of different teachers. Finds that teacher A, who used five times more feedback, was rated higher overall by both Japanese and U.S. students.…

  15. A novel CT acquisition and analysis technique for breathing motion modeling

    NASA Astrophysics Data System (ADS)

    Low, Daniel A.; White, Benjamin M.; Lee, Percy P.; Thomas, David H.; Gaudio, Sergio; Jani, Shyam S.; Wu, Xiao; Lamb, James M.

    2013-06-01

    To report on a novel technique for providing artifact-free quantitative four-dimensional computed tomography (4DCT) image datasets for breathing motion modeling. Commercial clinical 4DCT methods have difficulty managing irregular breathing. The resulting images contain motion-induced artifacts that can distort structures and inaccurately characterize breathing motion. We have developed a novel scanning and analysis method for motion-correlated CT that utilizes standard repeated fast helical acquisitions, a simultaneous breathing surrogate measurement, deformable image registration, and a published breathing motion model. The motion model differs from the CT-measured motion by an average of 0.65 mm, indicating the precision of the motion model. The integral of the divergence of one of the motion model parameters is predicted to be a constant 1.11 and is found in this case to be 1.09, indicating the accuracy of the motion model. The proposed technique shows promise for providing motion-artifact free images at user-selected breathing phases, accurate Hounsfield units, and noise characteristics similar to non-4D CT techniques, at a patient dose similar to or less than current 4DCT techniques.

  16. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species. Some of the key advantages of liquid extraction techniques include

  17. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    the CFD load to be able to be readily applied, along with analytical and experimental variations in both the temporal and spatial Fourier components of the excitation. In addition, this model is a first step in identifying response differences between transient and frequency forced response analysis techniques. The second phase assesses this difference for a much more realistic solid model of a bladed disk in order to evaluate the effect of the spatial variation in loading on blade-dominated modes. Neither research on the accuracy of the frequency response method when used in this context nor a comprehensive study of the effect of test-observed variation on blade forced response has been found in the literature, so this research is a new contribution to practical structural dynamic analysis of gas turbines. The primary excitation of the upstream nozzles interacting with the blades of the J2X fuel pump causes the 5th nodal diameter modes to be excited, as explained by Tyler and Sofrin1, so a modal analysis was first performed on the beam/plate model and the 5ND bladed-disk mode at 40,167 Hz was identified and chosen to be the one excited at resonance (see figure 1). The first forced response analysis with this model focuses on identifying differences between frequency and transient response analyses. A hypothesis going into the analysis was that perhaps the frequency response was enforcing a temporal periodicity that did not really exist, and therefore it would overestimate the response. As high dynamic response was a considerable source of stress in the J2X, examining this concept could potentially be beneficial for the program.
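
    The Tyler-Sofrin interaction cited above determines which nodal diameter family a vane-count excitation folds onto. A minimal sketch of the aliasing rule (the 17-nozzle/12-blade counts below are hypothetical illustrations, not the actual J2X hardware):

```python
def excited_nodal_diameter(engine_order: int, n_blades: int) -> int:
    """Fold an engine-order excitation onto a bladed disk's nodal
    diameters (Tyler-Sofrin aliasing rule): take engine_order mod
    n_blades, then reflect into the range 0 .. n_blades // 2."""
    nd = engine_order % n_blades
    return min(nd, n_blades - nd)

# Hypothetical counts: 17 upstream nozzles driving a 12-blade disk
# excite the 5th nodal diameter family.
print(excited_nodal_diameter(17, 12))  # → 5
```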

  18. Data Analysis Techniques for Ligo Detector Characterization

    NASA Astrophysics Data System (ADS)

    Valdes Sanchez, Guillermo A.

    Gravitational-wave astronomy is a branch of astronomy which aims to use gravitational waves to collect observational data about astronomical objects and events such as black holes, neutron stars, supernovae, and processes including those of the early universe shortly after the Big Bang. Einstein first predicted gravitational waves in the early 20th century, but it was not until September 14, 2015, that the Laser Interferometer Gravitational-Wave Observatory (LIGO) directly observed the first gravitational waves in history. LIGO consists of two twin detectors, one in Livingston, Louisiana and another in Hanford, Washington. Instrumental and sporadic noise limits the sensitivity of the detectors. Scientists conduct Data Quality studies to distinguish a gravitational-wave signal from the noise, and new techniques are continuously developed to identify, mitigate, and veto unwanted noise. This work presents the application of data analysis techniques, such as the Hilbert-Huang transform (HHT) and Kalman filtering (KF), in LIGO detector characterization. We investigated the application of the HHT to characterize the gravitational-wave signal of the first detection, demonstrated the functionality of the HHT in identifying noise originating from light scattered by perturbed surfaces, and estimated thermo-optical aberration using KF. We paid particular attention to the scattering application, for which a tool was developed to identify the disturbed surfaces that originate scattering noise. The results considerably reduced the time needed to locate the scattering surface and helped LIGO commissioners mitigate the noise.
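
    At the core of the HHT is the analytic signal, whose phase derivative gives instantaneous frequency. A minimal sketch using `scipy.signal.hilbert` on a synthetic chirp (illustrative only, not LIGO data or the dissertation's pipeline):

```python
import numpy as np
from scipy.signal import hilbert

fs = 1024.0                            # sample rate, Hz
t = np.arange(0, 1.0, 1 / fs)
f0, f1 = 50.0, 150.0                   # linear chirp, 50 -> 150 Hz
x = np.cos(2 * np.pi * (f0 * t + 0.5 * (f1 - f0) * t ** 2))

# Analytic signal -> instantaneous amplitude and frequency
analytic = hilbert(x)
amplitude = np.abs(analytic)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs

# At mid-signal (t = 0.5 s) the instantaneous frequency is near
# f0 + (f1 - f0) * t = 100 Hz
```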

  19. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    PubMed

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

    Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD), MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.

  20. Analysis of biochemical phase shift oscillators by a harmonic balancing technique.

    PubMed

    Rapp, P

    1976-11-25

    The use of harmonic balancing techniques for theoretically investigating a large class of biochemical phase shift oscillators is outlined, and the accuracy of this approximate technique for large-dimension nonlinear chemical systems is considered. It is concluded that for the equations under study these techniques can be successfully employed both to find periodic solutions and to identify those cases which cannot oscillate. The technique is a general one, and it is possible to state a step-by-step procedure for its application. It has a substantial advantage in producing results which are immediately valid for arbitrary dimension. As the accuracy of the method increases with dimension, it complements classical small-dimension methods. The results obtained by harmonic balancing analysis are compared with those obtained by studying the local stability properties of the singular points of the differential equation. A general theorem is derived which identifies those special cases where the results of first-order harmonic balancing are identical to those of local stability analysis, and a necessary condition for this equivalence is derived. As a concrete example, the n-dimensional Goodwin oscillator is considered where p, the Hill coefficient of the feedback metabolite, is equal to three or four. It is shown that for p = 3 or 4 and n less than or equal to 4 the approximation indicates that it is impossible to construct a set of physically permissible reaction constants such that the system possesses a periodic solution. However, for n greater than or equal to 5 it is always possible to find a large domain in the reaction-constant space giving stable oscillations. A means of constructing such a parameter set is given. The results obtained here are compared with previously derived results for p = 1 and p = 2.
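
    The n-dependence described above can be reproduced with the classical secant condition for a symmetric negative-feedback loop with equal first-order decay rates, which permits a Hopf bifurcation only when p > sec(pi/n)^n. A sketch (this is the standard local-stability criterion, not the harmonic-balance calculation itself):

```python
import math

def hill_threshold(n: int) -> float:
    """Secant condition: a symmetric n-stage negative-feedback loop
    with equal decay rates can oscillate only if the Hill
    coefficient p exceeds sec(pi/n)**n."""
    return (1.0 / math.cos(math.pi / n)) ** n

for n in range(3, 7):
    thr = hill_threshold(n)
    print(n, round(thr, 2), [p for p in (3, 4) if p > thr])
# n = 3, 4: neither p = 3 nor p = 4 clears the threshold (8.0, 4.0),
# while n >= 5 admits oscillations for both, matching the text.
```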

  1. What are the most effective intervention techniques for changing physical activity self-efficacy and physical activity behaviour--and are they the same?

    PubMed

    Williams, S L; French, D P

    2011-04-01

    There is convincing evidence that targeting self-efficacy is an effective means of increasing physical activity. However, evidence concerning which techniques are most effective for changing self-efficacy, and thereby physical activity, is lacking. The present review aims to estimate the association between specific intervention techniques used in physical activity interventions and the change obtained in both self-efficacy and physical activity behaviour. A systematic search yielded 27 physical activity intervention studies for 'healthy' adults that reported self-efficacy and physical activity data. A small yet significant (P < 0.01) effect of the interventions was found on change in self-efficacy and physical activity (d = 0.16 and 0.21, respectively). When a technique was associated with a change in effect size for self-efficacy, it also tended to be associated with a change (r(s) = 0.690, P < 0.001) in effect size for physical activity. Moderator analyses found that 'action planning', 'provide instruction' and 'reinforcing effort towards behaviour' were associated with significantly higher levels of both self-efficacy and physical activity. 'Relapse prevention' and 'setting graded tasks' were associated with significantly lower self-efficacy and physical activity levels. This meta-analysis provides evidence for which psychological techniques are most effective for changing self-efficacy and physical activity.
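
    The pooled effect sizes above (d = 0.16 and 0.21) are standardized mean differences. A minimal sketch of how Cohen's d is computed from two groups' summary statistics (the walking-minutes numbers are illustrative, not data from the review):

```python
import math

def cohens_d(m1: float, s1: float, n1: int,
             m2: float, s2: float, n2: int) -> float:
    """Standardized mean difference between two groups, using the
    pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2)
                          / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Illustrative: intervention group walks 155 min/week (sd 48, n = 60)
# vs. control at 145 min/week (sd 50, n = 60)
print(round(cohens_d(155, 48, 60, 145, 50, 60), 2))  # → 0.2, a small effect
```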

  2. Application of Hyphenated Techniques in Speciation Analysis of Arsenic, Antimony, and Thallium

    PubMed Central

    Michalski, Rajmund; Szopa, Sebastian; Jabłońska, Magdalena; Łyko, Aleksandra

    2012-01-01

    Because metals and metalloids have a strong impact on the environment, the methods of their determination and speciation have received special attention in recent years. Arsenic, antimony, and thallium are important examples of such toxic elements. Their speciation is especially important in the environmental and biomedical fields because of their toxicity, bioavailability, and reactivity. Recently, speciation analytics has been playing a unique role in the studies of biogeochemical cycles of chemical compounds, determination of toxicity and ecotoxicity of selected elements, quality control of food products, control of medicines and pharmaceutical products, technological process control, research on the impact of technological installations on the environment, examination of occupational exposure, and clinical analysis. Conventional methods are usually labor intensive, time consuming, and susceptible to interferences. The hyphenated techniques, in which a separation method is coupled with multidimensional detectors, have become useful alternatives. The main advantages of these techniques are extremely low detection and quantification limits, insignificant interference effects, and high precision and repeatability of the determinations. In view of their importance, the present work overviews and discusses different hyphenated techniques used for arsenic, antimony, and thallium species analysis in different clinical, environmental, and food matrices. PMID:22654649

  3. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    NASA Astrophysics Data System (ADS)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  4. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high-power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, and they are characterized on the basis of their disparate time scales. Signal coding, a brief description of frequently used codes, and their limitations are then discussed. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
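
    The power-spectrum and autocovariance analyses mentioned above are linked by the Wiener-Khinchin theorem, so both can be obtained from a single FFT. A toy sketch on a simulated coherent echo (illustrative parameters, not an actual MST radar configuration):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                       # pulse repetition (sampling) rate, Hz
t = np.arange(0, 10, 1 / fs)     # 1000 complex samples
# Simulated coherent echo with a 5 Hz Doppler shift plus complex noise
x = np.exp(2j * np.pi * 5.0 * t) \
    + 0.5 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))

# Power spectrum (periodogram) and the Doppler estimate from its peak
spec = np.abs(np.fft.fft(x)) ** 2 / x.size
freqs = np.fft.fftfreq(x.size, 1 / fs)
doppler = freqs[np.argmax(spec)]          # → 5.0 Hz

# Autocovariance via the Wiener-Khinchin theorem (inverse FFT of the
# spectrum); acov[0].real is the total signal-plus-noise power
acov = np.fft.ifft(spec)
```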

  5. Effects of interactive instructional techniques in a web-based peripheral nervous system component for human anatomy.

    PubMed

    Allen, Edwin B; Walls, Richard T; Reilly, Frank D

    2008-02-01

    This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.

  6. The Coplane Analysis Technique for Three-Dimensional Wind Retrieval Using the HIWRAP Airborne Doppler Radar

    NASA Technical Reports Server (NTRS)

    Didlake, Anthony C., Jr.; Heymsfield, Gerald M.; Tian, Lin; Guimond, Stephen R.

    2015-01-01

    The coplane analysis technique for mapping the three-dimensional wind field of precipitating systems is applied to the NASA High Altitude Wind and Rain Airborne Profiler (HIWRAP). HIWRAP is a dual-frequency Doppler radar system with two downward pointing and conically scanning beams. The coplane technique interpolates radar measurements to a natural coordinate frame, directly solves for two wind components, and integrates the mass continuity equation to retrieve the unobserved third wind component. This technique is tested using a model simulation of a hurricane and compared to a global optimization retrieval. The coplane method produced lower errors for the cross-track and vertical wind components, while the global optimization method produced lower errors for the along-track wind component. Cross-track and vertical wind errors were dependent upon the accuracy of the estimated boundary condition winds near the surface and at nadir, which were derived by making certain assumptions about the vertical velocity field. The coplane technique was then applied successfully to HIWRAP observations of Hurricane Ingrid (2013). Unlike the global optimization method, the coplane analysis allows for a transparent connection between the radar observations and specific analysis results. With this ability, small-scale features can be analyzed more adequately and erroneous radar measurements can be identified more easily.

  7. Counselor Effectiveness As A Function of Varied Practicum Training Techniques.

    ERIC Educational Resources Information Center

    Snyder, John F.; And Others

    This study investigated the differential effects of various training techniques on the counselor effectiveness of beginning practicum students. It was hypothesized that there would be significant differences between the subjects pre- and post-test scores on both the Affective Sensitivity Scale (ASS) and on the Counselor Verbal Response Scale…

  8. Biomechanical Effect of Margin Convergence Techniques: Quantitative Assessment of Supraspinatus Muscle Stiffness.

    PubMed

    Hatta, Taku; Giambini, Hugo; Zhao, Chunfeng; Sperling, John W; Steinmann, Scott P; Itoi, Eiji; An, Kai-Nan

    2016-01-01

    Although the margin convergence (MC) technique has been recognized as an option for rotator cuff repair, little is known about its biomechanical effect on the repaired rotator cuff muscle, especially after supplemented footprint repair. The purpose of this study was to assess the passive stiffness changes of the supraspinatus (SSP) muscle after MC techniques using shear wave elastography (SWE). A 30 × 40-mm U-shaped rotator cuff tear was created in 8 cadaveric shoulders. Each specimen was repaired with 6 types of MC technique (1-, 2-, 3-suture MC with/without footprint repair, in a random order) at 30° glenohumeral abduction. Passive stiffness of four anatomical regions in the SSP muscle was measured based on an established SWE method. Data were obtained from the SSP muscle at 0° abduction under 8 different conditions: intact (before making a tear), torn, and postoperative conditions with 6 techniques. MC techniques using 1- or 2-suture combined with footprint repair showed significantly higher stiffness values than the intact condition. Passive stiffness of the SSP muscle was highest after a 1-suture MC with footprint repair for all regions when compared among all repair procedures. There was no significant difference between the intact condition and a 3-suture MC with footprint repair. MC techniques with a single stitch and subsequent footprint repair may have adverse effects on muscle properties and tensile loading on the repair, increasing the risk of retear.

  9. VLBI Analysis with the Multi-Technique Software GEOSAT

    NASA Technical Reports Server (NTRS)

    Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir

    2010-01-01

    GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.

  10. An Evaluation of Jordanian EFL Teachers' In-Service Training Courses Teaching Techniques Effectiveness

    ERIC Educational Resources Information Center

    AL-Wreikat, Yazan Abdel Aziz Semreen; Bin Abdullah, Muhamad Kamarul Kabilan

    2010-01-01

    This study aims to evaluate and investigate the influence of teaching techniques on the performance of English as a Foreign Language (EFL) teachers by evaluating the techniques' effectiveness and actual implementation, as well as to examine the role of teachers in influencing the effectiveness of in-service training courses. A total of 798…

  11. Techniques for detecting effects of urban and rural land-use practices on stream-water chemistry in selected watersheds in Texas, Minnesota, and Illinois

    USGS Publications Warehouse

    Walker, J.F.

    1993-01-01

    Selected statistical techniques were applied to three urban watersheds in Texas and Minnesota and three rural watersheds in Illinois. For the urban watersheds, single- and paired-site data-collection strategies were considered. The paired-site strategy was much more effective than the single-site strategy for detecting changes. Analysis of storm-load regression residuals demonstrated the potential utility of regressions for variability reduction. For the rural watersheds, none of the selected techniques were effective at identifying changes, primarily due to a small degree of management-practice implementation, potential errors introduced through the estimation of storm loads, and small sample sizes. A Monte Carlo sensitivity analysis was used to determine the percent change in water chemistry that could be detected for each watershed. In most instances, the use of regressions improved the ability to detect changes.
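
    The residual-based paired-site strategy found effective above can be sketched as: calibrate a regression of the treated watershed on the control watershed before management practices are implemented, then test whether post-implementation residuals shift. All numbers below are synthetic illustrations, not data from the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_pre, n_post = 40, 40
# Log-transformed storm loads at a control and a treated watershed
control = rng.normal(2.0, 0.5, n_pre + n_post)
treated = 0.3 + 0.9 * control + rng.normal(0.0, 0.15, n_pre + n_post)
treated[n_pre:] -= 0.25          # hypothetical post-BMP reduction (log scale)

# Calibrate the paired-site regression on the pre-treatment period only
b, a = np.polyfit(control[:n_pre], treated[:n_pre], 1)
resid = treated - (a + b * control)

# The regression removes control-watershed variability, so a shift in the
# residuals isolates the management effect
t_stat, p_val = stats.ttest_ind(resid[n_pre:], resid[:n_pre])
```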

  12. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors

    PubMed Central

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-01-01

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors. PMID:28574459
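
    The transient decaying response described above carries both quantities of interest: its oscillation rate gives the series resonant frequency, and its envelope decay gives the quality factor via Q = pi·f·tau. A sketch on a simulated ring-down (an ideal noiseless signal with a hypothetical sampling rate, not the authors' readout electronics):

```python
import numpy as np
from scipy.signal import hilbert

fs = 50e6                               # sampling rate, Hz (hypothetical)
f0_true, q_true = 4.432e6, 1.0e4
tau = q_true / (np.pi * f0_true)        # ring-down time constant, s
t = np.arange(0, 4 * tau, 1 / fs)
v = np.exp(-t / tau) * np.sin(2 * np.pi * f0_true * t)

# Resonant frequency from the mean spacing of zero crossings
zc = np.where(np.diff(np.signbit(v)))[0]
f_est = fs * (len(zc) - 1) / (2 * (zc[-1] - zc[0]))

# Decay constant from a linear fit to the log of the Hilbert envelope
env = np.abs(hilbert(v))
mid = slice(len(t) // 10, len(t) // 2)  # avoid end effects
slope, _ = np.polyfit(t[mid], np.log(env[mid]), 1)
q_est = -np.pi * f_est / slope          # Q = pi * f * tau, tau = -1/slope
```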

  13. Analysis and Validation of Contactless Time-Gated Interrogation Technique for Quartz Resonator Sensors.

    PubMed

    Baù, Marco; Ferrari, Marco; Ferrari, Vittorio

    2017-06-02

    A technique for contactless electromagnetic interrogation of AT-cut quartz piezoelectric resonator sensors is proposed based on a primary coil electromagnetically air-coupled to a secondary coil connected to the electrodes of the resonator. The interrogation technique periodically switches between interleaved excitation and detection phases. During the excitation phase, the resonator is set into vibration by a driving voltage applied to the primary coil, whereas in the detection phase, the excitation signal is turned off and the transient decaying response of the resonator is sensed without contact by measuring the voltage induced back across the primary coil. This approach ensures that the readout frequency of the sensor signal is to a first order approximation independent of the interrogation distance between the primary and secondary coils. A detailed theoretical analysis of the interrogation principle based on a lumped-element equivalent circuit is presented. The analysis has been experimentally validated on a 4.432 MHz AT-cut quartz crystal resonator, demonstrating the accurate readout of the series resonant frequency and quality factor over an interrogation distance of up to 2 cm. As an application, the technique has been applied to the measurement of liquid microdroplets deposited on a 4.8 MHz AT-cut quartz crystal. More generally, the proposed technique can be exploited for the measurement of any physical or chemical quantities affecting the resonant response of quartz resonator sensors.

  14. Fault detection in digital and analog circuits using an i(DD) temporal analysis technique

    NASA Technical Reports Server (NTRS)

    Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark

    1993-01-01

    An i(sub DD) temporal analysis technique is presented that detects defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents. A simple bias voltage applied to the inputs is required to excite the defects. Data from hardware tests supporting this technique are presented.
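
The pass/fail decision behind such a test can be caricatured in a few lines. The waveforms, time constants, and threshold below are all invented; a real tester would capture measured rail-current transients:

```python
import numpy as np

# Toy i(sub DD) signature test: pulse the rails, record the transient supply
# current, and flag a device whose signature deviates from a known-good
# ("golden") response by more than an empirically chosen bound.
t = np.linspace(0.0, 1e-6, 500)                 # 1 us capture window
golden = np.exp(-t / 2e-7)                      # known-good transient (a.u.)
faulty = golden + 0.05 * np.exp(-t / 8e-7)      # extra slow leakage path

def deviates(signature, reference, threshold=0.01):
    """RMS difference between transients, compared against a pass/fail bound."""
    rms = np.sqrt(np.mean((signature - reference) ** 2))
    return rms > threshold

flagged = deviates(faulty, golden)
```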

  15. Analysis techniques for multivariate root loci. [a tool in linear control systems]

    NASA Technical Reports Server (NTRS)

    Thompson, P. M.; Stein, G.; Laub, A. J.

    1980-01-01

    Analysis techniques are developed for the multivariable root locus and the multivariable optimal root locus. The generalized eigenvalue problem is used to compute angles and sensitivities for both types of loci, and an algorithm is presented that determines the asymptotic properties of the optimal root locus.
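
Tracing the branches of a multivariable root locus reduces to an eigenvalue sweep. The 2x2 system below is an assumed example, not from the paper (which additionally extracts angles and sensitivities from a generalized eigenvalue problem):

```python
import numpy as np

# Closed-loop poles of x' = (A - k*B*C)x, traced as the scalar gain k grows.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])   # open-loop poles at s = -1, -2
B = np.eye(2)
C = np.eye(2)

def locus(gains):
    """One row of closed-loop eigenvalues per gain value."""
    return np.array([np.linalg.eigvals(A - k * B @ C) for k in gains])

branches = locus(np.linspace(0.0, 10.0, 101))
```

Plotting the real and imaginary parts of `branches` column-wise gives the familiar locus picture; with `B = C = I` each pole simply migrates left as the gain increases.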

  16. Symmetry-Based Techniques for Qualitative Understanding of Rovibrational Effects in Spherical-Top Molecular Spectra and Dynamics

    NASA Astrophysics Data System (ADS)

    Mitchell, Justin Chadwick

    2011-12-01

    Using light to probe the structure of matter is as natural as opening our eyes. Modern physics and chemistry have turned this art into a rich science, measuring the delicate interactions possible at the molecular level. Perhaps the most commonly used tool in computational spectroscopy is that of matrix diagonalization. While this is invaluable for calculating everything from molecular structure and energy levels to dipole moments and dynamics, the process of numerical diagonalization is an opaque one. This work applies symmetry and semi-classical techniques to elucidate numerical spectral analysis for high-symmetry molecules. Semi-classical techniques, such as the Potential Energy Surfaces, have long been used to help understand molecular vibronic and rovibronic spectra and dynamics. This investigation focuses on newer semi-classical techniques that apply Rotational Energy Surfaces (RES) to rotational energy level clustering effects in high-symmetry molecules. Such clusters exist in rigid rotor molecules as well as deformable spherical tops. This study begins by using the simplicity of rigid symmetric top molecules to clarify the classical-quantum correspondence of RES semi-classical analysis and then extends it to a more precise and complete theory of modern high-resolution spectra. RES analysis is extended to molecules having more complex and higher rank tensorial rotational and rovibrational Hamiltonians than were possible to understand before. Such molecules are shown to produce an extraordinary range of rotational level clusters, corresponding to a panoply of symmetries ranging from C4v to C2 and C1 (no symmetry) with a corresponding range of new angular momentum localization and J-tunneling effects. Using RES topography analysis and the commutation duality relations between symmetry group operators in the lab-frame to those in the body-frame, it is shown how to better describe and catalog complex splittings found in rotational level clusters.

  17. Using the technique of computed tomography for nondestructive analysis of pharmaceutical dosage forms

    NASA Astrophysics Data System (ADS)

    de Oliveira, José Martins, Jr.; Mangini, F. Salvador; Carvalho Vila, Marta Maria Duarte; Vinícius Chaud, Marco

    2013-05-01

    This work presents an alternative, non-conventional technique for evaluating physico-chemical properties of pharmaceutical dosage forms: computed tomography (CT) is used as a nondestructive technique to visualize the internal structures of pharmaceutical dosage forms and to conduct both static and dynamic studies, based on tomographic images generated by the scanner at the University of Sorocaba (Uniso). We have shown that tomographic images make it possible to study porosity and density, to analyze morphological parameters, and to perform dissolution studies. Our results are in agreement with the literature, showing that CT is a powerful tool for use in the pharmaceutical sciences.
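
A porosity estimate of the kind such CT studies perform can be sketched on a single reconstructed slice. The gray levels, pore fraction, and threshold below are invented stand-ins for real reconstruction data:

```python
import numpy as np

# Voxels darker than a cutoff are counted as pore space; porosity is their
# volume fraction within the slice.
rng = np.random.default_rng(0)
slice_img = rng.uniform(0.6, 1.0, size=(128, 128))    # dense tablet matrix
pore_mask = rng.random((128, 128)) < 0.1              # ~10% of voxels
slice_img[pore_mask] = rng.uniform(0.0, 0.2, size=pore_mask.sum())

threshold = 0.3                                       # assumed gray-level cutoff
porosity = (slice_img < threshold).mean()             # pore volume fraction
```

On real reconstructions the cutoff is usually chosen from the gray-level histogram (e.g., Otsu's method) rather than fixed a priori.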

  18. Temperature analysis of laser ignited metalized material using spectroscopic technique

    NASA Astrophysics Data System (ADS)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    Temperature measurement of laser-ignited aluminized nanoenergetic mixtures using spectroscopy has great scope in analyzing material characteristics and combustion. Spectroscopic analysis permits an in-depth study of the combustion of materials that is difficult to achieve with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, with the same impact. The research presented here is primarily focused on the temperature analysis of an energetic material comprising explosive material mixed with nanomaterial and ignited by laser. A spectroscopic technique is used to estimate the temperature during the ignition process. The nanoenergetic mixture used in this research does not contain any material that is sensitive to high impact.
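
One standard spectroscopic route to temperature, consistent with what the abstract describes, is a two-color intensity ratio under the Wien approximation. The wavelengths and temperature below are illustrative, and gray-body emissivity is assumed to cancel in the ratio:

```python
import math

C2 = 1.4388e-2   # second radiation constant, m*K

def wien_intensity(lam, T):
    """Relative spectral intensity in the Wien approximation."""
    return lam ** -5 * math.exp(-C2 / (lam * T))

def two_color_temperature(i1, i2, lam1, lam2):
    """Invert the Wien intensity ratio i1/i2 at wavelengths lam1, lam2 for T."""
    return C2 * (1.0 / lam1 - 1.0 / lam2) / (
        5.0 * math.log(lam2 / lam1) - math.log(i1 / i2))

lam1, lam2, T_true = 500e-9, 700e-9, 3000.0           # m, m, K
T_est = two_color_temperature(wien_intensity(lam1, T_true),
                              wien_intensity(lam2, T_true), lam1, lam2)
```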

  19. Probabilistic bias analysis in pharmacoepidemiology and comparative effectiveness research: a systematic review.

    PubMed

    Hunnicutt, Jacob N; Ulbricht, Christine M; Chrysanthopoulou, Stavroula A; Lapane, Kate L

    2016-12-01

    We systematically reviewed pharmacoepidemiologic and comparative effectiveness studies that use probabilistic bias analysis to quantify the effects of systematic error, including confounding, misclassification, and selection bias, on study results. We found articles published between 2010 and October 2015 through a citation search using Web of Science and Google Scholar and a keyword search using PubMed and Scopus. Eligibility of studies was assessed by one reviewer. Three reviewers independently abstracted data from eligible studies. Fifteen studies used probabilistic bias analysis and were eligible for data abstraction: nine simulated an unmeasured confounder and six simulated misclassification. The majority of studies simulating an unmeasured confounder did not specify the range of plausible estimates for the bias parameters. Studies simulating misclassification were in general clearer when reporting the plausible distribution of bias parameters. Regardless of the bias simulated, the probability distributions assigned to bias parameters, the number of simulated iterations, sensitivity analyses, and diagnostics were not discussed in the majority of studies. Despite the prevalence of and concern about bias in pharmacoepidemiologic and comparative effectiveness studies, probabilistic bias analysis to quantitatively model the effect of bias was not widely used. The quality of reporting and use of this technique varied and was often unclear. Further discussion and dissemination of the technique are warranted. Copyright © 2016 John Wiley & Sons, Ltd.
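
The technique under review is easy to sketch for the most common case, an unmeasured binary confounder. The observed risk ratio and every parameter distribution below are invented for illustration; the point is that bias parameters are drawn from explicit distributions rather than fixed:

```python
import numpy as np

# Probabilistic bias analysis via the classical external-adjustment formula:
# the bias factor compares confounder prevalence in exposed (p1) vs.
# unexposed (p0), scaled by the confounder-outcome risk ratio (rr_cd).
rng = np.random.default_rng(42)
n_iter = 100_000
rr_obs = 1.8                                     # observed exposure-outcome RR

rr_cd = rng.lognormal(np.log(2.0), 0.2, n_iter)  # confounder-outcome RR draws
p1 = rng.beta(40, 60, n_iter)                    # prevalence in the exposed
p0 = rng.beta(20, 80, n_iter)                    # prevalence in the unexposed

bias = (p1 * (rr_cd - 1.0) + 1.0) / (p0 * (rr_cd - 1.0) + 1.0)
rr_adj = rr_obs / bias                           # bias-adjusted RR draws
lo, med, hi = np.percentile(rr_adj, [2.5, 50.0, 97.5])
```

Reporting the chosen distributions, the number of iterations, and the resulting simulation interval is precisely the documentation the review found lacking.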

  20. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis.

    PubMed

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-07-23

    This review describes microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amounts required, high-throughput performance) make these tools well suited to answering biological questions of interest about a single cell. This review introduces microfluidic techniques for the isolation, trapping, and manipulation of a single cell; presents the major approaches for detection in single-cell analysis; and then summarizes the applications of single-cell analysis. The review concludes with a discussion of future directions and opportunities for microfluidic systems applied to the analysis of a single cell.

  1. Characterization of rock populations on planetary surfaces - Techniques and a preliminary analysis of Mars and Venus

    NASA Technical Reports Server (NTRS)

    Garvin, J. B.; Mouginis-Mark, P. J.; Head, J. W.

    1981-01-01

    A data collection and analysis scheme developed for the interpretation of rock morphology from lander images is reviewed with emphasis on rock population characterization techniques. Data analysis techniques are also discussed in the context of identifying key characteristics of a rock that place it in a single category with similar rocks. Actual rock characteristics observed from Viking and Venera lander imagery are summarized. Finally, some speculations regarding the block fields on Mars and Venus are presented.

  2. Enrichment and separation techniques for large-scale proteomics analysis of the protein post-translational modifications.

    PubMed

    Huang, Junfeng; Wang, Fangjun; Ye, Mingliang; Zou, Hanfa

    2014-11-06

    Comprehensive analysis of the post-translational modifications (PTMs) on proteins at the proteome level is crucial to elucidating the regulatory mechanisms of various biological processes. In the past decades, thanks to the development of specific PTM enrichment techniques and efficient multidimensional liquid chromatography (LC) separation strategies, the identification of protein PTMs has made tremendous progress, and a huge number of modification sites for some major protein PTMs have been identified by proteomics analysis. In this review, we first introduce recent progress in PTM enrichment methods for the analysis of several major PTMs, including phosphorylation, glycosylation, ubiquitination, acetylation, methylation, and oxidation/reduction status. We then briefly summarize the challenges for PTM enrichment. Finally, we introduce the fractionation and separation techniques for efficient separation of PTM peptides in large-scale PTM analysis. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. Model reduction of the numerical analysis of Low Impact Developments techniques

    NASA Astrophysics Data System (ADS)

    Brunetti, Giuseppe; Šimůnek, Jirka; Wöhling, Thomas; Piro, Patrizia

    2017-04-01

    Mechanistic models have proven to be accurate and reliable tools for the numerical analysis of the hydrological behavior of Low Impact Development (LID) techniques. However, their widespread adoption is limited by their complexity and computational cost. Recent studies have tried to address this issue by investigating the application of new techniques, such as surrogate-based modeling, but current results are still limited and fragmented. One such approach, the Model Order Reduction (MOR) technique, can be a valuable tool for reducing the computational complexity of a numerical problem by computing an approximation of the original model. While this technique has been extensively used in water-related problems, no studies have evaluated its use in LID modeling. Thus, the main aim of this study is to apply the MOR technique to develop a reduced order model (ROM) for the numerical analysis of the hydrologic behavior of LIDs, in particular green roofs. The model should correctly reproduce all the hydrological processes of a green roof while reducing the computational cost. The proposed model decouples the subsurface water dynamics of a green roof into a) one-dimensional (1D) vertical flow through the green roof itself and b) 1D saturated lateral flow along the impervious rooftop. The green roof is horizontally discretized into N elements, each representing a vertical domain that can have different properties or boundary conditions. The 1D Richards equation is used to simulate flow in the substrate and drainage layers. Simulated outflow from the vertical domain is used as a recharge term for the saturated lateral flow, which is described using the kinematic wave approximation of the Boussinesq equation. The proposed model has been compared with the mechanistic model HYDRUS-2D, which numerically solves the Richards equation for the whole domain; the HYDRUS-1D code has been used for the description of vertical flow.
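
The decoupling can be caricatured in a few lines. In this sketch all parameters are invented and the Richards-equation recharge is replaced by a constant flux per column; only the second half of the scheme, the explicit kinematic-wave update of the saturated lateral flow, is shown:

```python
import numpy as np

N, dx, dt = 20, 0.5, 5.0           # columns, column width (m), time step (s)
K, slope = 1e-2, 0.02              # drainage-layer conductivity (m/s), roof slope
recharge = np.full(N, 2e-6)        # outflow of each vertical column (m/s)
h = np.zeros(N)                    # saturated thickness along the roof (m)

for _ in range(2000):              # explicit march (CFL: K*slope*dt/dx << 1)
    q = K * slope * h              # kinematic flux per unit width (m^2/s)
    dqdx = np.diff(np.concatenate(([0.0], q))) / dx   # upwind, no inflow at top
    h = np.maximum(h + dt * (recharge - dqdx), 0.0)
```

In the full ROM each element's recharge would instead come from its own 1D Richards solution, so heterogeneous substrates simply supply different recharge series.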

  4. Effect of restoration technique on stress distribution in roots with flared canals: an FEA study.

    PubMed

    Belli, Sema; Eraslan, Öznur; Eraslan, Oğuz; Eskitaşcıoğlu, Gürcan

    2014-04-01

    The aim of this finite element analysis (FEA) study was to test the effect of different restorative techniques on stress distribution in roots with flared canals. Five three-dimensional (3D) FEA models that simulated a maxillary incisor with excessive structure loss and flared root canals were created and restored with the following techniques/materials: 1) a prefabricated post; 2) one main and two accessory posts; 3) an i-TFC post-core (Sun Medical); 4) increasing the thickness of the root with composite resin and then restoring it with a prefabricated post; 5) an anatomic post created from composite resin and a prefabricated glass-fiber post. Composite cores and ceramic crowns were created. A 300-N static load was applied at the center of the palatal surface of the tooth to calculate stress distributions. SolidWorks/Cosmosworks structural analysis programs were used for the FEA. The analysis of the von Mises and tensile stress values revealed that the prefabricated post, accessory post, and i-TFC post systems showed similar stress distributions: all showed high stress areas at the buccal side of the root (3.67 MPa) and in the cervical region of the root (> 3.67 MPa), as well as low stress accumulation within the post space (0 to 1 MPa). The anatomic post kept the stress within its body and directed less stress towards the remaining tooth structure. The creation of an anatomic post may save the remaining tooth structure in roots with flared canals by reducing the stress levels.
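
The von Mises values the study compares are a standard invariant of the stress tensor, computable directly from its components. The numbers below are illustrative, not the study's FEA output:

```python
import math

# Von Mises equivalent stress from a general 3D stress state
# (normal components sx, sy, sz and shear components txy, tyz, tzx).
def von_mises(sx, sy, sz, txy, tyz, tzx):
    return math.sqrt(0.5 * ((sx - sy) ** 2 + (sy - sz) ** 2 + (sz - sx) ** 2)
                     + 3.0 * (txy ** 2 + tyz ** 2 + tzx ** 2))

sigma_eq = von_mises(3.2, 0.5, 0.1, 0.4, 0.0, 0.0)    # MPa
```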

  5. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information, computing, and networks, modern educational technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and Spectral Analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and the basic technical means of spectral analysis and testing, and then to let students use the principles and technology of spectroscopy to study the structure and state of materials and the development of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language developed by MathWorks; it allows matrix manipulations and the plotting of functions and data. Based on teaching practice, this paper summarizes the application of Matlab to the teaching of spectroscopy, an approach suitable for much of the current multimedia-assisted teaching in schools.
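
The kind of classroom exercise such a course builds on has a direct NumPy analogue to Matlab's `fft`/`abs` workflow; the signal and line frequencies below are arbitrary choices:

```python
import numpy as np

# Synthesize a two-line signal and recover the line positions from its
# amplitude spectrum.
fs = 1000.0                                    # sampling rate (Hz)
t = np.arange(0.0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

amp = np.abs(np.fft.rfft(x)) / len(t)          # amplitude spectrum (A/2 per line)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)
lines = sorted(freqs[np.argsort(amp)[-2:]])    # two strongest spectral lines
```

The one-second window places both lines exactly on FFT bins, which keeps the classroom demonstration free of leakage effects.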

  6. SU-E-J-172: Bio-Physical Effects of Patients Set-Up Errors According to Whole Breast Irradiation Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S; Suh, T; Park, S

    2015-06-15

    Purpose: The dose-related effects of patient setup errors on biophysical indices were evaluated for conventional wedge (CW) and field-in-field (FIF) whole breast irradiation techniques. Methods: The treatment plans for 10 patients receiving whole left breast irradiation were retrospectively selected. Radiobiological and physical effects caused by dose variations were evaluated by shifting the isocenters and gantry angles of the treatment plans. Dose-volume histograms of the planning target volume (PTV), heart, and lungs were generated, and conformity index (CI), homogeneity index (HI), tumor control probability (TCP), and normal tissue complication probability (NTCP) were determined. Results: For the isocenter-shift plan in the posterior direction, the D95 of the PTV decreased by approximately 15%, and the TCP of the PTV decreased by approximately 50% for the FIF technique and by 40% for CW; the NTCPs of the lungs and heart increased by about 13% and 1%, respectively, for both techniques. Increasing the gantry angle decreased the TCPs of the PTV by 24.4% (CW) and 34% (FIF), while the NTCPs for the two techniques differed by only 3%. The CIs and HIs for CW were much higher than those for FIF in all cases, a significant difference between the two techniques (p<0.01). In biophysical terms, however, FIF responded more sensitively to setup errors than CW. Conclusions: Radiobiological-based analysis can detect significant dosimetric errors and can thus provide a practical patient quality assurance method guided by radiobiological and physical effects.
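
NTCP values like those compared above are commonly obtained from a Lyman-Kutcher-Burman style model; a minimal sketch follows, with TD50 and m as illustrative placeholders rather than the study's fitted parameters:

```python
import math

# LKB-style complication probability: NTCP = Phi((EUD - TD50) / (m * TD50)),
# where Phi is the standard normal CDF and EUD is the equivalent uniform dose.
def ntcp_lkb(eud, td50, m):
    u = (eud - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

risk = ntcp_lkb(eud=24.5, td50=30.8, m=0.37)   # doses in Gy
```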

  7. Kinematic and kinetic analysis of overhand, sidearm and underhand lacrosse shot techniques.

    PubMed

    Macaulay, Charles A J; Katz, Larry; Stergiou, Pro; Stefanyshyn, Darren; Tomaghelli, Luciano

    2017-12-01

    Lacrosse requires the coordinated performance of many complex skills. One of these skills is shooting on the opponents' net using one of three techniques: overhand, sidearm or underhand. The purpose of this study was to (i) determine which technique generated the highest ball velocity and greatest shot accuracy and (ii) identify kinematic and kinetic variables that contribute to a high velocity and high accuracy shot. Twelve elite male lacrosse players participated in this study. Kinematic data were sampled at 250 Hz, while two-dimensional force plates collected ground reaction force data (1000 Hz). Statistical analysis showed significantly greater ball velocity for the sidearm technique than overhand (P < 0.001) and underhand (P < 0.001) techniques. No statistical difference was found for shot accuracy (P > 0.05). Kinematic and kinetic variables were not significantly correlated to shot accuracy or velocity across all shot types; however, when analysed independently, the lead foot horizontal impulse showed a negative correlation with underhand ball velocity (P = 0.042). This study identifies the technique with the highest ball velocity, defines kinematic and kinetic predictors related to ball velocity and provides information to coaches and athletes concerned with improving lacrosse shot performance.

  8. Application of radar chart array analysis to visualize effects of formulation variables on IgG1 particle formation as measured by multiple analytical techniques

    PubMed Central

    Kalonia, Cavan; Kumru, Ozan S.; Kim, Jae Hyun; Middaugh, C. Russell; Volkin, David B.

    2013-01-01

    This study presents a novel method to visualize protein aggregate and particle formation data to rapidly evaluate the effect of solution and stress conditions on the physical stability of an IgG1 monoclonal antibody (mAb). Radar chart arrays were designed so that hundreds of Microflow Digital Imaging (MFI) solution measurements, evaluating different mAb formulations under varying stresses, could be presented in a single figure with minimal loss of data resolution. These MFI radar charts show measured changes in subvisible particle number, size, and morphology distribution as a change in the shape of polygons. Radar charts were also created to visualize mAb aggregate and particle formation across a wide size range by combining data sets from size exclusion chromatography (SEC), Archimedes resonant mass measurements, and MFI. We found that the environmental/mechanical stress condition (e.g., heat vs. agitation) was the most important factor influencing the particle size and morphology distribution of this IgG1 mAb. Additionally, the presence of NaCl exhibited a pH- and stress-dependent behavior resulting in promotion or inhibition of mAb particle formation. This data visualization technique provides a comprehensive analysis of the aggregation tendencies of this IgG1 mAb in different formulations with varying stresses as measured by different analytical techniques. PMID:24122556

  9. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  10. A simple 2D composite image analysis technique for the crystal growth study of L-ascorbic acid.

    PubMed

    Kumar, Krishan; Kumar, Virender; Lal, Jatin; Kaur, Harmeet; Singh, Jasbir

    2017-06-01

    This work presents 2D crystal growth studies of L-ascorbic acid using a composite image analysis technique. Growth experiments on the L-ascorbic acid crystals were carried out by standard (optical) microscopy, laser diffraction analysis, and composite image analysis. For image analysis, the growth of L-ascorbic acid crystals was captured as digital 2D RGB images, which were then processed into composite images. After processing, the crystal boundaries emerged as white lines against the black (cancelled) background and were well differentiated by peaks in the intensity graphs generated for the composite images. The lengths of crystal boundaries measured from the intensity graphs of composite images were in good agreement (correlation coefficient r = 0.99) with the lengths measured by standard microscopy. In contrast, the lengths measured by laser diffraction were poorly correlated with both techniques. Therefore, composite image analysis can replace the standard microscopy technique for crystal growth studies of L-ascorbic acid. © 2017 Wiley Periodicals, Inc.
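
The agreement statistic reported (r = 0.99) is a plain Pearson correlation between the two length series; a sketch on invented measurements:

```python
import numpy as np

# Two series of crystal-length measurements of the same specimens
# (values are made up for illustration, arbitrary units).
microscopy = np.array([12.1, 15.4, 18.0, 21.3, 24.8])
composite = np.array([12.3, 15.1, 18.4, 21.0, 25.1])
r = np.corrcoef(microscopy, composite)[0, 1]
```

High r alone only shows the two methods rank and scale lengths consistently; a Bland-Altman plot would additionally expose any systematic offset between them.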

  11. Cost-effectiveness analysis: adding value to assessment of animal health welfare and production.

    PubMed

    Babo Martins, S; Rushton, J

    2014-12-01

    Cost-effectiveness analysis (CEA) has been used extensively for economic assessment in fields related to animal health, notably in human health, where it provides a decision-making framework for choices about the allocation of healthcare resources. In animal health, by contrast, cost-benefit analysis has been the preferred tool for economic analysis. In this paper, the use of CEA in related areas and the role of this technique in assessments of animal health, welfare, and production are reviewed. Cost-effectiveness analysis can add further value to these assessments, particularly in programmes targeting animal welfare or animal diseases with an impact on human health, where outcomes are best valued in natural effects rather than in monetary units. Importantly, CEA can be performed during programme implementation to assess alternative courses of action in real time.
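
The core CEA quantity, the incremental cost-effectiveness ratio, keeps effects in natural units exactly as the abstract advocates. The two programmes and their figures below are hypothetical:

```python
# Incremental cost per additional unit of effect (here, cases averted)
# of programme B over programme A.
def icer(cost_a, effect_a, cost_b, effect_b):
    return (cost_b - cost_a) / (effect_b - effect_a)

# A: cost 100,000 averting 400 cases; B: cost 150,000 averting 650 cases
cost_per_extra_case = icer(100_000, 400, 150_000, 650)   # → 200.0
```

A decision maker then compares the 200-per-extra-case figure against a willingness-to-pay threshold, with no need to monetize the welfare outcome itself.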

  12. Adaptive subdomain modeling: A multi-analysis technique for ocean circulation models

    NASA Astrophysics Data System (ADS)

    Altuntas, Alper; Baugh, John

    2017-07-01

    Many coastal and ocean processes of interest operate over large temporal and geographical scales and require a substantial amount of computational resources, particularly when engineering design and failure scenarios are also considered. This study presents an adaptive multi-analysis technique that improves the efficiency of these computations when multiple alternatives are being simulated. The technique, called adaptive subdomain modeling, concurrently analyzes any number of child domains, with each instance corresponding to a unique design or failure scenario, in addition to a full-scale parent domain providing the boundary conditions for its children. To contain the altered hydrodynamics originating from the modifications, the spatial extent of each child domain is adaptively adjusted during runtime depending on the response of the model. The technique is incorporated in ADCIRC++, a re-implementation of the popular ADCIRC ocean circulation model with an updated software architecture designed to facilitate this adaptive behavior and to utilize concurrent executions of multiple domains. The results of our case studies confirm that the method substantially reduces computational effort while maintaining accuracy.

  13. Constraint treatment techniques and parallel algorithms for multibody dynamic analysis. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chiou, Jin-Chern

    1990-01-01

    Computational procedures for the kinematic and dynamic analysis of three-dimensional multibody dynamic (MBD) systems are developed from the differential-algebraic equations (DAEs) viewpoint. To minimize constraint violations during the time integration process, penalty constraint stabilization techniques and partitioning schemes are developed. The governing equations of motion are treated with a two-stage staggered explicit-implicit numerical algorithm that takes advantage of a partitioned solution procedure. A robust and parallelizable integration algorithm is developed: a two-stage staggered central difference algorithm integrates the translational coordinates and the angular velocities, and the angular orientations of bodies in MBD systems are then obtained by an implicit algorithm via the kinematic relationship between Euler parameters and angular velocities. It is shown that the combination of the present solution procedures yields a computationally more accurate solution. To speed up the computations, the constraint treatment techniques and the two-stage staggered explicit-implicit numerical algorithm were efficiently implemented in parallel. The DAEs and the constraint treatment techniques were transformed into arrowhead matrices, from which a Schur complement form was derived. By fully exploiting sparse matrix structural analysis techniques, a parallel preconditioned conjugate gradient numerical algorithm is used to solve the system equations written in Schur complement form. A software testbed was designed and implemented on both sequential and parallel computers and used to demonstrate the robustness and efficiency of the constraint treatment techniques, the accuracy of the two-stage staggered explicit-implicit numerical algorithm, and the speedup of the Schur-complement-based parallel preconditioned conjugate gradient algorithm on a parallel computer.
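
The Schur-complement step can be shown densely in miniature. The thesis works with sparse arrowhead storage and a parallel preconditioned conjugate gradient on S; the sizes and entries below are illustrative, and a direct solve stands in for CG:

```python
import numpy as np

# Arrowhead system  [[D, B], [B.T, C]] [x, y] = [f, g]  with D diagonal:
# eliminate x first, solve the small Schur-complement system for y,
# then back-substitute.
rng = np.random.default_rng(1)
n, m = 8, 3
D = np.diag(rng.uniform(2.0, 4.0, n))        # "body" block (diagonal here)
B = rng.standard_normal((n, m))              # coupling to the constraints
C = 50.0 * np.eye(m)                         # constraint block (kept SPD)
f, g = rng.standard_normal(n), rng.standard_normal(m)

Dinv = np.diag(1.0 / np.diag(D))
S = C - B.T @ Dinv @ B                       # Schur complement
y = np.linalg.solve(S, g - B.T @ Dinv @ f)   # constraint unknowns first
x = Dinv @ (f - B @ y)                       # then back-substitute the body
```

Because inverting the diagonal D is trivial and S is only m x m, the expensive work concentrates in the small reduced system, which is what makes the decomposition attractive for parallel CG.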

  14. Effect of STOP technique on safety climate in a construction company.

    PubMed

    Darvishi, Ebrahim; Maleki, Afshin; Dehestaniathar, Saeed; Ebrahemzadih, Mehrzad

    2015-01-01

    Safety programs are a core part of safety management in workplaces and can reduce incidents and injuries. The aim of this study was to investigate the influence of the Safety Training Observation Program (STOP) technique, a behavior modification program, on the safety climate in a construction company. This cross-sectional study was carried out on workers of a petrochemical construction company in western Iran. To improve the safety climate, an unsafe-behavior modification program entitled STOP was run among project workers over 12 months, from April 2013 to April 2014. The effectiveness of the STOP technique in creating a positive safety climate was evaluated using the Safety Climate Assessment Toolkit. Overall, 76.78% of observed behaviors were unsafe, and 54.76% of unsafe acts/at-risk behaviors were related to fall hazards. Unsafe behaviors were most commonly associated with habit and the unavailability of safety equipment. After 12 months of continuous implementation of the STOP technique, unsafe behaviors among workers were reduced by 55.8%. The average safety climate scores evaluated with the Toolkit before and after implementation of the STOP technique were 5.77 and 7.24, respectively. The STOP technique can be considered an effective approach for eliminating at-risk behavior, reinforcing safe work practices, and creating a positive safety climate, thereby reducing incidents and injuries.

  15. Web image retrieval using an effective topic and content-based technique

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Cheng; Prabhakara, Rashmi

    2005-03-01

    There has been exponential growth in the amount of image data available on the World Wide Web since the early development of the Internet. With such a large amount of information and imagery available, and given its usefulness, an effective image retrieval system is greatly needed. In this paper, we present an effective approach with both image matching and indexing techniques that improves on existing integrated image retrieval methods. The technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. In the first phase, topic-based image retrieval is performed using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents; it employs a focused crawler that lets the user enter not only a keyword for the topic-based search but also the scope in which to find the images. In the second phase, we use query-by-example specification to perform a low-level content-based image match and retrieve a smaller set of results closer to the example image; information related to image features is automatically extracted from the query image. The main objective of our approach is to develop a functional image search and indexing technique and to demonstrate that better retrieval results can be achieved.
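
The two-phase pipeline can be shown in miniature. All image names, page texts, and histograms below are invented; phase 1 filters by topic keyword and phase 2 ranks the survivors by a simple color-histogram intersection with the query image:

```python
import numpy as np

# (page text, normalized 3-bin color histogram) per indexed image
images = {
    "img1.jpg": ("sunset over the beach", np.array([0.7, 0.2, 0.1])),
    "img2.jpg": ("beach volleyball match", np.array([0.3, 0.4, 0.3])),
    "img3.jpg": ("city skyline at night", np.array([0.6, 0.3, 0.1])),
}
topic, query_hist = "beach", np.array([0.6, 0.3, 0.1])

# Phase 1: keep only images whose surrounding text matches the topic
candidates = {name: hist for name, (text, hist) in images.items()
              if topic in text}

# Phase 2: rank survivors by histogram intersection with the query example
ranked = sorted(candidates, reverse=True,
                key=lambda k: np.minimum(candidates[k], query_hist).sum())
```

A production system would replace the substring match with the HTML-structure-aware IR scoring the paper describes and the 3-bin histogram with richer low-level features, but the control flow is the same.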

  16. Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach

    ERIC Educational Resources Information Center

    Lending, Diane; May, Jeffrey

    2013-01-01

    Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…

  17. Decomposition techniques

    USGS Publications Warehouse

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor in sample throughput, especially with the recent application of fast, modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose a sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, the elements to be determined, precision and accuracy requirements, sample throughput, the technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition, along with examples of their application to geochemical analysis. The chemical properties of reagents in their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire-assaying for noble metals or decomposition techniques for X-ray fluorescence and nuclear methods. © 1992.

  18. Development of High Speed Imaging and Analysis Techniques for Compressible Dynamic Stall

    NASA Technical Reports Server (NTRS)

    Chandrasekhara, M. S.; Carr, L. W.; Wilder, M. C.; Davis, Sanford S. (Technical Monitor)

    1996-01-01

    Dynamic stall has limited the flight envelope of helicopters for many years. The problem has been studied in the laboratory as well as in flight, but most research, even in the laboratory, has been restricted to surface measurement techniques such as pressure transducers or skin friction gauges, except at low speed. From this research, it became apparent that flow visualization tests performed at Mach numbers representing actual flight conditions were needed if the complex physics associated with dynamic stall was to be properly understood. However, visualization of the flow field under compressible conditions required carefully aligned and meticulously reconstructed holographic interferometry. As part of a long-range effort focused on exposing the physics of compressible dynamic stall, a research wind tunnel was developed at NASA Ames Research Center which permits visual access to the full flow field surrounding an oscillating airfoil during compressible dynamic stall. Initially, a stroboscopic schlieren technique was used for visualization of the stall process, but the primary research tool has been point diffraction interferometry (PDI), a technique carefully optimized for use in this project. A review of the development of PDI will be presented in the full paper. One of the most valuable aspects of PDI is that interferograms are produced in real time on a continuous basis. The use of a rapidly pulsed laser makes this practical; a discussion of this approach will be presented in the full paper. This rapid pulsing (up to 40,000 pulses/sec) produces interferograms of the rapidly developing dynamic stall field in sufficient resolution (both in space and time) that the fluid physics of the compressible dynamic stall flowfield can be quantitatively determined, including the gradients of pressure in space and time. This permits analysis of the effects of pitch rate, Mach number, Reynolds number, amplitude of oscillation, and other…

  19. The association of placenta previa and assisted reproductive techniques: a meta-analysis.

    PubMed

    Karami, Manoochehr; Jenabi, Ensiyeh; Fereidooni, Bita

    2018-07-01

    Several epidemiological studies have determined that assisted reproductive techniques (ART) can increase the risk of placenta previa. To date, only one meta-analysis has been performed to assess the relationship between placenta previa and ART. This meta-analysis was conducted to estimate the association between placenta previa and ART in singleton and twin pregnancies. A literature search was performed in the major databases PubMed, Web of Science, and Scopus from the earliest possible year to April 2017. Heterogeneity across studies was explored by the Q-test and the I² statistic. Publication bias was assessed using Begg's and Egger's tests. The results were reported as odds ratio (OR) and relative risk (RR) estimates with their 95% confidence intervals (CI) using a random-effects model. The literature search yielded 1529 publications through September 2016, with 1,388,592 participants. The overall estimate of the OR was 2.67 (95% CI: 2.01, 3.34) and of the RR was 3.62 (95% CI: 0.21, 7.03) based on singleton pregnancies. The overall estimate of the OR was 1.50 (95% CI: 1.26, 1.74) based on twin pregnancies. Based on the odds ratios reported in observational studies, we showed that ART procedures are a risk factor for placenta previa.
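
    The random-effects pooling named in this record can be sketched with the standard DerSimonian-Laird estimator; the per-study odds ratios and standard errors below are invented illustrative values, not the meta-analysis data.

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of log odds ratios.
# The study ORs and standard errors here are made-up illustrative values.
or_i = np.array([2.1, 3.0, 1.8, 2.9])      # per-study odds ratios
se_i = np.array([0.30, 0.25, 0.40, 0.35])  # standard errors of log(OR)

y = np.log(or_i)
w = 1.0 / se_i**2                          # fixed-effect (inverse-variance) weights
y_fixed = np.sum(w * y) / np.sum(w)

# Between-study variance tau^2 from the DL moment estimator.
q = np.sum(w * (y - y_fixed) ** 2)         # Cochran's Q
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

w_star = 1.0 / (se_i**2 + tau2)            # random-effects weights
y_re = np.sum(w_star * y) / np.sum(w_star)
se_re = np.sqrt(1.0 / np.sum(w_star))

pooled_or = np.exp(y_re)                   # pooled OR with 95% CI
ci = (np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re))
print(round(pooled_or, 2), tuple(round(v, 2) for v in ci))
```

    When Q is below its degrees of freedom, tau² is truncated at zero and the random-effects estimate coincides with the fixed-effect one, which is the usual behavior of this estimator.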

  20. Get to Understand More from Single-Cells: Current Studies of Microfluidic-Based Techniques for Single-Cell Analysis

    PubMed Central

    Lo, Shih-Jie; Yao, Da-Jeng

    2015-01-01

    This review describes the microfluidic techniques developed for the analysis of a single cell. The characteristics of microfluidics (e.g., the small sample amount required and high-throughput performance) make this tool suitable for answering and solving biological questions of interest about a single cell. This review aims to introduce microfluidic-related techniques for the isolation, trapping, and manipulation of a single cell. The major approaches to detection in single-cell analysis are introduced, and the applications of single-cell analysis are then summarized. The review concludes with discussions of future directions and opportunities for microfluidic systems applied to the analysis of a single cell. PMID:26213918

  1. Analysis of Learning Curve Fitting Techniques.

    DTIC Science & Technology

    1987-09-01

    1986. 15. Neter, John and others. Applied Linear Regression Models. Homewood IL: Irwin, 19-33. 16. SAS User's Guide: Basics, Version 5 Edition. SAS... Linear Regression Techniques (15:23-52). Random errors are assumed to be normally distributed when using ordinary least-squares, according to Johnston... lot estimated by the improvement curve formula. For a more detailed explanation of the ordinary least-squares technique, see Neter et al., Applied...
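
    As a sketch of the ordinary least-squares technique this record cites, the log-linear (Wright) learning-curve model y = a·x^b can be fit by OLS on log-transformed data; the unit costs below are synthetic, following an exact 80% curve.

```python
import numpy as np

# Wright's learning-curve model: y = a * x**b, where y is unit cost,
# x is the cumulative unit number, and 2**b is the learning rate.
# Taking logs gives log y = log a + b log x, a line fit by OLS.
units = np.array([1, 2, 4, 8, 16], dtype=float)
costs = np.array([100.0, 80.0, 64.0, 51.2, 40.96])  # synthetic 80% curve

b, log_a = np.polyfit(np.log(units), np.log(costs), 1)
a = np.exp(log_a)
learning_rate = 2.0 ** b  # cost ratio each time cumulative quantity doubles

print(round(a, 2), round(learning_rate, 2))  # → 100.0 0.8
```

    Real cost data would leave residuals around the fitted line; the normality assumption mentioned in the snippet applies to those residuals in log space.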

  2. Analytical techniques and instrumentation: A compilation. [analytical instrumentation, materials performance, and systems analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Technical information is presented covering the areas of: (1) analytical instrumentation useful in the analysis of physical phenomena; (2) analytical techniques used to determine the performance of materials; and (3) systems and component analyses for design and quality control.

  3. Multivariate reference technique for quantitative analysis of fiber-optic tissue Raman spectroscopy.

    PubMed

    Bergholt, Mads Sylvest; Duraipandian, Shiyamala; Zheng, Wei; Huang, Zhiwei

    2013-12-03

    We report a novel method that uses multivariate reference signals (fused silica and sapphire Raman signals generated by a ball-lens fiber-optic Raman probe) for quantitative analysis of in vivo tissue Raman measurements in real time. Partial least-squares (PLS) regression modeling is applied to extract the characteristic internal reference Raman signals (e.g., the shoulder of the prominent fused silica boson peak (~130 cm(-1)) and the distinct sapphire ball-lens peaks (380, 417, 646, and 751 cm(-1))) from the ball-lens fiber-optic Raman probe for quantitative analysis of fiber-optic Raman spectroscopy. To evaluate the analytical value of this novel multivariate reference technique, a rapid Raman spectroscopy system coupled with a ball-lens fiber-optic Raman probe is used for in vivo oral tissue Raman measurements (n = 25 subjects) under 785 nm laser excitation powers ranging from 5 to 65 mW. An accurate linear relationship (R(2) = 0.981) with a root-mean-square error of cross validation (RMSECV) of 2.5 mW can be obtained for predicting laser excitation power changes based on leave-one-subject-out cross-validation, which is superior to the normal univariate reference method (RMSE = 6.2 mW). A root-mean-square error of prediction (RMSEP) of 2.4 mW (R(2) = 0.985) can also be achieved for laser power prediction in real time when we apply the multivariate method independently to five new subjects (n = 166 spectra). We further apply the multivariate reference technique to quantitative analysis of gelatin tissue phantoms, which gives an RMSEP of ~2.0% (R(2) = 0.998) independent of laser excitation power variations. This work demonstrates that the multivariate reference technique can be advantageously used to monitor and correct for variations in laser excitation power and fiber coupling efficiency in situ, standardizing the tissue Raman intensity to realize quantitative analysis of tissue Raman measurements in vivo, which is particularly appealing in…
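
    The PLS step can be illustrated in a few lines. The sketch below simulates spectra whose reference-peak intensity scales with laser power and fits a one-component PLS1 model via the NIPALS construction; the peak shape, power range, and noise levels are invented stand-ins, not the study's data or calibration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 40 "spectra": a reference peak whose height scales linearly
# with laser power (a hypothetical stand-in for the sapphire / fused
# silica reference bands), plus additive noise.
powers = rng.uniform(5, 65, size=40)                 # laser power, mW
axis = np.linspace(0, 1, 50)                         # arbitrary spectral axis
peak = np.exp(-((axis - 0.4) ** 2) / 0.01)           # reference band shape
X = powers[:, None] * peak[None, :] + rng.normal(0, 0.5, (40, 50))
y = powers

# One-component PLS1 fit (NIPALS): weight vector, scores, regression.
Xc, yc = X - X.mean(0), y - y.mean()
w = Xc.T @ yc
w /= np.linalg.norm(w)          # weight vector (covariance direction)
t = Xc @ w                      # latent-variable scores
b = (t @ yc) / (t @ t)          # regression coefficient on the scores

y_pred = t * b + y.mean()       # predicted laser power per spectrum
rmse = np.sqrt(np.mean((y_pred - y) ** 2))
print(rmse)
```

    With a clean simulated reference band, the recovered power tracks the true power closely; the multivariate weight vector plays the role of the "internal reference" extracted by PLS in the record.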

  4. Analyzing the effectiveness of a frame-level redundancy scrubbing technique for SRAM-based FPGAs

    DOE PAGES

    Tonfat, Jorge; Lima Kastensmidt, Fernanda; Rech, Paolo; ...

    2015-12-17

    Radiation effects such as soft errors are the major threat to the reliability of SRAM-based FPGAs. This work analyzes the effectiveness in correcting soft errors of a novel scrubbing technique using internal frame redundancy called Frame-level Redundancy Scrubbing (FLR-scrubbing). This correction technique can be implemented in a coarse grain TMR design. The FLR-scrubbing technique was implemented on a mid-size Xilinx Virtex-5 FPGA device used as a case study. The FLR-scrubbing technique was tested under neutron radiation and fault injection. Implementation results demonstrated minimum area and energy consumption overhead when compared to other techniques. The time to repair the fault is also improved by using the Internal Configuration Access Port (ICAP). Lastly, neutron radiation test results demonstrated that the proposed technique is suitable for correcting accumulated SEUs and MBUs.

  5. Analyzing the effectiveness of a frame-level redundancy scrubbing technique for SRAM-based FPGAs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tonfat, Jorge; Lima Kastensmidt, Fernanda; Rech, Paolo

    Radiation effects such as soft errors are the major threat to the reliability of SRAM-based FPGAs. This work analyzes the effectiveness in correcting soft errors of a novel scrubbing technique using internal frame redundancy called Frame-level Redundancy Scrubbing (FLR-scrubbing). This correction technique can be implemented in a coarse grain TMR design. The FLR-scrubbing technique was implemented on a mid-size Xilinx Virtex-5 FPGA device used as a case study. The FLR-scrubbing technique was tested under neutron radiation and fault injection. Implementation results demonstrated minimum area and energy consumption overhead when compared to other techniques. The time to repair the fault is also improved by using the Internal Configuration Access Port (ICAP). Lastly, neutron radiation test results demonstrated that the proposed technique is suitable for correcting accumulated SEUs and MBUs.

  6. Stereotactic Radiofrequency Ablation (SRFA) of Liver Lesions: Technique Effectiveness, Safety, and Interoperator Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Widmann, Gerlig, E-mail: gerlig.widmann@i-med.ac.at; Schullian, Peter, E-mail: peter.schullian@i-med.ac.at; Haidu, Marion, E-mail: marion.haidu@i-med.ac.at

    Purpose: To evaluate technique effectiveness, safety, and interoperator performance of stereotactic radiofrequency ablation (SRFA) of liver lesions. Methods: Retrospective review including 90 consecutive patients from January 2008 to January 2010 with 106 computed tomography-guided SRFA sessions using both single and multiple electrodes for the treatment of 177 lesions: 72 hepatocellular carcinomas (HCC) and 105 metastases with a mean size of 2.9 cm (range 0.5-11 cm). Technique effectiveness and 1-year local recurrence were evaluated by computed tomographic scans. Complications, mortality, and hospital days were recorded. Performance was compared between an experienced and an inexperienced interventional radiologist. Results: The overall technique effectiveness after a single SRFA was 95.5% (93.1% for HCC and 97.1% for metastases). Four of the eight unsuccessfully treated lesions could be retreated (secondary technique effectiveness of 97.7%). Local recurrence at 1 year was 2.9%. Technique effectiveness differed significantly between lesions <5 cm (96.7%) and >5 cm (87.5%) (P = 0.044) but not between lesions <3 cm (95.9%) and 3-5 cm (100%). Compared to lesions in clear parenchyma (97.3%), technique effectiveness was not significantly lower near vessels (93.3%, P = 0.349) or in subcapsular locations (95.2%, P = 0.532), but was significantly lower near hollow viscera (83.3%, P = 0.020). The mortality rate was 0.9%. Major complications and hospital days were higher for cirrhosis Child-Pugh B patients (20%, 7.2 days) than Child-Pugh A patients (3.1%, 4.7 days) and for metastases (5.1%, 4.3 days). There was no significant difference in interoperator performance. Conclusions: SRFA allowed efficient, reliable, and safe ablation of large-volume liver disease.

  7. Microphotographs of cyanobacteria documenting the effects of various cell-lysis techniques

    USGS Publications Warehouse

    Rosen, Barry H.; Loftin, Keith A.; Smith, Christopher E.; Lane, Rachael F.; Keydel, Susan P.

    2011-01-01

    Cyanotoxins are a group of organic compounds biosynthesized intracellularly by many species of cyanobacteria found in surface water. The United States Environmental Protection Agency has listed cyanotoxins on the Safe Drinking Water Act's Contaminant Candidate List 3 for consideration for future regulation to protect public health. Cyanotoxins also pose a risk to humans and other organisms in a variety of other exposure scenarios. Accurate and precise analytical measurements of cyanotoxins are critical to the evaluation of concentrations in surface water to address human health and ecosystem effects. A common approach to total cyanotoxin measurement involves cell membrane disruption to release the cyanotoxins to the dissolved phase, followed by filtration to remove cellular debris. Several methods have been used historically; however, no standard protocols exist to ensure this process is consistent between laboratories before the dissolved phase is measured by an analytical technique for cyanotoxin identification and quantitation. No systematic evaluation has been conducted comparing the multiple laboratory sample-processing techniques for physical disruption of cell membranes or cyanotoxin recovery. Surface water samples collected from lakes, reservoirs, and rivers containing mixed assemblages of organisms dominated by cyanobacteria, as well as laboratory cultures of species-specific cyanobacteria, were used as part of this study evaluating multiple laboratory cell-lysis techniques in partnership with the U.S. Environmental Protection Agency. Evaluated extraction techniques included boiling, autoclaving, sonication, chemical treatment, and freeze-thaw. Both treated and untreated samples were evaluated for cell membrane integrity microscopically via light, epifluorescence, and epifluorescence in the presence of a DNA stain. The DNA stain, which does not permeate live cells with intact membrane structures, was used as an indicator for cyanotoxin release into the…

  8. [Analysis on the long-term effects of modified double endobutton technique in the treatment of Tossy type III acromioclavicular joint dislocations].

    PubMed

    Yan, Rui-Jian; Lu, Jian-Wei; Zhang, Chun

    2014-01-01

    To investigate the long-term clinical effects of the modified double Endobutton technique for the treatment of acromioclavicular joint dislocations of Tossy type III. A retrospective study was done in 42 patients with acromioclavicular joint dislocations of Tossy type III treated with the modified double Endobutton technique from December 2008 to December 2010. There were 24 males and 18 females, ranging in age from 21 to 56 years old (average, 32.5 years old). All the patients were treated with open reduction, coracoclavicular ligament reconstruction using the double Endobutton technique, and repair of the acromioclavicular ligament. The Karlsson system was used to evaluate therapeutic effects. The distance from coracoid to clavicle was measured to evaluate reduction loss. All the patients were followed up, with a duration ranging from 2.0 to 3.2 years (average, 2.4 years). According to the Karlsson system, 32 patients got an A grade and 10 patients got a B grade at three months post-operatively; 26 patients got an A grade and 16 patients got a B grade at the latest follow-up; 6 patients with an A grade at 3 months after operation were lowered to a B grade at the latest follow-up. The coracoid-clavicle distance increased from (26.91 +/- 0.91) mm at 3 months after operation to (27.41 +/- 1.10) mm at the latest follow-up. The patients treated with over-reduction during operation or engaged in heavy physical labor after operation had obviously widened coracoid-clavicle distances. Bone absorption was found around the plate in most cases, mainly on the clavicular side. Treatment of acromioclavicular joint dislocations of Tossy type III with the modified double Endobutton technique has satisfactory early clinical results, but over time, loss of reduction and bone absorption around the plate could be observed, and the clinical outcomes of some cases downgraded during long-term follow-up.

  9. Effectiveness of teaching cognitive-behavioral techniques on locus of control in hemodialysis patients.

    PubMed

    Mehrtak, Mohammad; Habibzadeh, Shahram; Farzaneh, Esmaeil; Rjaei-Khiavi, Abdollah

    2017-10-01

    Many of the cognitive-behavioral models and therapeutic protocols developed so far for psychological disorders and chronic diseases have proved effective through clinical research. This study aimed to determine the effectiveness of teaching cognitive-behavioral techniques on locus of control in hemodialysis patients. This controlled clinical trial was conducted in 2015 with 76 patients selected by census and treated with a hemodialysis machine in the dialysis department of Vali-Asr Hospital in the city of Meshkinshahr. A total of four patients were excluded because of their critical conditions, while the rest were randomly divided into two equal groups of 36 patients each, the intervention and control groups. First, the locus of control was measured in both groups through a pretest, and cognitive-behavioral techniques were then taught to the intervention group during eight 45- to 90-minute sessions. The locus of control in patients of both groups was finally re-measured through a posttest. Data were collected using Rotter's Locus of Control Inventory. The Wilcoxon test and Mann-Whitney U test were used in SPSS18 for data analysis. In the pretest and posttest stages respectively, 4.8% and 14.3% of the samples in the control group and 14.3% and 33.3% of the samples in the intervention group had an internal locus of control. The difference between the pretest and posttest scores of internal locus of control in the intervention group was significant (p=0.004), which indicates the positive effect of the cognitive-behavioral psychotherapeutic intervention on internalization of locus of control in this group. Given the external locus of control in most of the study patients and the significant positive effect of cognitive-behavioral psychotherapy on internalization of locus of control in this group of patients, it appears necessary to have a psychology resident present in the hemodialysis department to teach the necessary cognitive…

  10. Effectiveness of teaching cognitive-behavioral techniques on locus of control in hemodialysis patients

    PubMed Central

    Mehrtak, Mohammad; Habibzadeh, Shahram; Farzaneh, Esmaeil; Rjaei-Khiavi, Abdollah

    2017-01-01

    Background Many of the cognitive-behavioral models and therapeutic protocols developed so far for psychological disorders and chronic diseases have proved effective through clinical research. Objective This study aimed to determine the effectiveness of teaching cognitive-behavioral techniques on locus of control in hemodialysis patients. Methods This controlled clinical trial was conducted in 2015 with 76 patients selected by census and treated with a hemodialysis machine in the dialysis department of Vali-Asr Hospital in the city of Meshkinshahr. A total of four patients were excluded because of their critical conditions, while the rest were randomly divided into two equal groups of 36 patients each, the intervention and control groups. First, the locus of control was measured in both groups through a pretest, and cognitive-behavioral techniques were then taught to the intervention group during eight 45- to 90-minute sessions. The locus of control in patients of both groups was finally re-measured through a posttest. Data were collected using Rotter's Locus of Control Inventory. The Wilcoxon test and Mann-Whitney U test were used in SPSS18 for data analysis. Results In the pretest and posttest stages respectively, 4.8% and 14.3% of the samples in the control group and 14.3% and 33.3% of the samples in the intervention group had an internal locus of control. The difference between the pretest and posttest scores of internal locus of control in the intervention group was significant (p=0.004), which indicates the positive effect of the cognitive-behavioral psychotherapeutic intervention on internalization of locus of control in this group. Conclusions Given the external locus of control in most of the study patients and the significant positive effect of cognitive-behavioral psychotherapy on internalization of locus of control in this group of patients, it appears necessary to have a psychology resident present in the…

  11. The applicability and effectiveness of cluster analysis

    NASA Technical Reports Server (NTRS)

    Ingram, D. S.; Actkinson, A. L.

    1973-01-01

    An insight into the characteristics which determine the performance of a clustering algorithm is presented. In order for the techniques which are examined to accurately cluster data, two conditions must be simultaneously satisfied. First, the data must have a particular structure; second, the parameters chosen for the clustering algorithm must be correct. By examining the structure of the data from the C1 flight line, it is clear that no single set of parameters can be used to accurately cluster all the different crops. The effectiveness of either a noniterative or an iterative clustering algorithm in accurately clustering data representative of the C1 flight line is questionable. Thus, extensive a priori knowledge is required in order to use cluster analysis in its present form for applications like assisting in the definition of field boundaries and evaluating the homogeneity of a field. New or modified techniques are necessary for clustering to be a reliable tool.

  12. Transit Spectroscopy: new data analysis techniques and interpretation

    NASA Astrophysics Data System (ADS)

    Tinetti, Giovanna; Waldmann, Ingo P.; Morello, Giuseppe; Tessenyi, Marcell; Varley, Ryan; Barton, Emma; Yurchenko, Sergey; Tennyson, Jonathan; Hollis, Morgan

    2014-11-01

    Planetary science beyond the boundaries of our Solar System is today in its infancy. Until a couple of decades ago, the detailed investigation of the planetary properties was restricted to objects orbiting inside the Kuiper Belt. Today, we cannot ignore that the number of known planets has increased by two orders of magnitude nor that these planets resemble anything but the objects present in our own Solar System. A key observable for planets is the chemical composition and state of their atmosphere. To date, two methods can be used to sound exoplanetary atmospheres: transit and eclipse spectroscopy, and direct imaging spectroscopy. Although the field of exoplanet spectroscopy has been very successful in past years, there are a few serious hurdles that need to be overcome to progress in this area: in particular instrument systematics are often difficult to disentangle from the signal, data are sparse and often not recorded simultaneously causing degeneracy of interpretation. We will present here new data analysis techniques and interpretation developed by the “ExoLights” team at UCL to address the above-mentioned issues. Said techniques include statistical tools, non-parametric, machine-learning algorithms, optimized radiative transfer models and spectroscopic line-lists. These new tools have been successfully applied to existing data recorded with space and ground instruments, shedding new light on our knowledge and understanding of these alien worlds.

  13. Failure Analysis by Statistical Techniques (FAST). Volume 1. User’s Manual

    DTIC Science & Technology

    1974-10-31

    Report number DNA 3336F-1. Failure Analysis by Statistical Techniques (FAST), Volume I, User's… SS2), and a facility (SS7). The other three diagrams break down the three critical subsystems. The median probability of survival of the…

  14. Applying spectral data analysis techniques to aquifer monitoring data in Belvoir Ranch, Wyoming

    NASA Astrophysics Data System (ADS)

    Gao, F.; He, S.; Zhang, Y.

    2017-12-01

    This study uses spectral data analysis techniques to estimate hydraulic parameters from water-level fluctuations due to tidal and barometric effects. All water-level data used in this study were collected in Belvoir Ranch, Wyoming. The tide effect can be observed not only in coastal areas but also in inland confined aquifers: the force caused by the changing positions of the sun and moon affects not only the ocean but also the solid earth. The tide effect applies an oscillatory pumping or injection sequence to the aquifer and can be observed from dense water-level monitoring. The Belvoir Ranch data are collected once per hour and are thus dense enough to capture the tide effect. First, the de-trended data are transformed from the time domain to the frequency domain with the Fourier transform. Then the storage coefficient is estimated using the Bredehoeft-Jacob model. Next, the gain function, which expresses the amplification and attenuation of the output signal, is analyzed to derive the barometric efficiency. Effective porosity is then found from the storage coefficient and barometric efficiency with Jacob's model. Finally, aquifer transmissivity and hydraulic conductivity are estimated using Paul Hsieh's method. The estimated hydraulic parameters are compared with those from traditional pumping-test estimation. This study shows that hydraulic parameters can be estimated by analyzing water-level data in the frequency domain alone. The approach has the advantages of low cost and environmental friendliness, and should therefore be considered for future hydraulic parameter estimation.
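
    The first step of the workflow, transforming an hourly de-trended record to the frequency domain, can be sketched as follows. The water-level series is synthetic (the Belvoir Ranch data are not reproduced here); the M2 tidal period of ~12.42 h is a known constant.

```python
import numpy as np

# FFT of a synthetic hourly, de-trended water-level record to isolate
# the semidiurnal (M2, ~12.42 h) tidal component.
dt_hours = 1.0
t = np.arange(0, 30 * 24, dt_hours)               # 30 days, hourly samples
m2_freq = 1.0 / 12.42                             # cycles per hour
levels = 0.05 * np.sin(2 * np.pi * m2_freq * t)   # tidal fluctuation, m
levels += np.random.default_rng(1).normal(0, 0.01, t.size)  # noise

spectrum = np.fft.rfft(levels - levels.mean())
freqs = np.fft.rfftfreq(t.size, d=dt_hours)       # cycles per hour
peak_freq = freqs[np.argmax(np.abs(spectrum[1:])) + 1]  # skip the DC bin
print(1.0 / peak_freq)   # recovered period, close to the 12.42 h input
```

    In the actual analysis, the amplitude and phase at the tidal frequency feed the Bredehoeft-Jacob storage estimate, and the ratio of water-level to barometric spectra gives the gain function.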

  15. Analysis of the role of diffraction in topographic site effects using boundary element techniques

    NASA Astrophysics Data System (ADS)

    Gomez, Juan; Restrepo, Doriam; Jaramillo, Juan; Valencia, Camilo

    2013-10-01

    The role played by the diffraction field in the problem of seismic site effects is studied. For that purpose we solve and analyze simple scattering problems under P and SV in-plane wave assumptions, using two well-known direct boundary-element-based numerical methods. After establishing the difference between scattered and diffracted motions, and introducing the concept of artificial and physically based incoming fields, we obtain the amplitude of the Fourier spectra for the diffracted part of the response; this is achieved after establishing the connection between the spatial distribution of the transfer function over the studied simple topographies and the diffracted field. From the numerical simulations it is observed that this diffracted part of the response is responsible for the amplification of the surface ground motions due to the geometric effect. Furthermore, it is also found that the diffraction field leaves a fingerprint of the topographic effect on the total ground motions. These conclusions are further supported by observations in the time domain in terms of snapshots of the propagation patterns over the complete computational model. In this sense the geometric singularities are clearly identified as sources of diffraction, and for the considered range of dimensionless frequencies it is evident that larger amplifications are obtained for the geometries containing a larger number of diffraction sources, thus resulting in a stronger topographic effect. The need for closed-form solutions of canonical problems to construct a robust analysis method based on the diffraction field is identified.

  16. Infrared spectroscopy as a screening technique for colitis

    NASA Astrophysics Data System (ADS)

    Titus, Jitto; Ghimire, Hemendra; Viennois, Emilie; Merlin, Didier; Perera, A. G. Unil

    2017-05-01

    There remains a great need for diagnosis of inflammatory bowel disease (IBD), for which the current technique, colonoscopy, is not cost-effective and presents a non-negligible risk of complications. Attenuated Total Reflectance Fourier Transform Infrared (ATR-FTIR) spectroscopy is a new screening technique to evaluate colitis. Comparing infrared spectra of sera to study the differences between them can prove challenging due to the complexity of their biological constituents, which gives rise to a plethora of vibrational modes. We discuss how to overcome the inherent difficulties of infrared spectral analysis, which involve highly overlapping absorbance peaks, and how curve fitting of the data improves the resolution. The proposed technique uses dried serum from colitic and normal wild-type mice to obtain ATR-FTIR spectra that effectively differentiate colitic mice from normal mice. Using this method, the Amide I group frequency (specifically, the alpha-helix to beta-sheet ratio of the protein secondary structure) was identified as a disease-associated spectral signature, in addition to the previously reported glucose and mannose signatures in sera of chronic and acute mouse models of colitis. Hence, this technique will be able to identify changes in the sera due to various diseases.

  17. Comparative forensic soil analysis of New Jersey state parks using a combination of simple techniques with multivariate statistics.

    PubMed

    Bonetti, Jennifer; Quarino, Lawrence

    2014-05-01

    This study has shown that the combination of simple techniques with the use of multivariate statistics offers the potential for the comparative analysis of soil samples. Five samples were obtained from each of twelve state parks across New Jersey in both the summer and fall seasons. Each sample was examined using particle-size distribution, pH analysis in both water and 1 M CaCl2 , and a loss on ignition technique. Data from each of the techniques were combined, and principal component analysis (PCA) and canonical discriminant analysis (CDA) were used for multivariate data transformation. Samples from different locations could be visually differentiated from one another using these multivariate plots. Hold-one-out cross-validation analysis showed error rates as low as 3.33%. Ten blind study samples were analyzed resulting in no misclassifications using Mahalanobis distance calculations and visual examinations of multivariate plots. Seasonal variation was minimal between corresponding samples, suggesting potential success in forensic applications. © 2014 American Academy of Forensic Sciences.
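
    The multivariate transformation used in this record can be sketched as a standardize-then-SVD PCA; the two "sites" below are synthetic stand-ins for park samples (sand fraction, pH, loss on ignition), not the New Jersey data.

```python
import numpy as np

# Minimal PCA via SVD on standardized measurements, the kind of
# multivariate transformation applied to particle-size, pH, and
# loss-on-ignition data. All values here are invented.
rng = np.random.default_rng(2)
site_a = rng.normal([60.0, 5.5, 4.0], [2.0, 0.1, 0.3], size=(5, 3))
site_b = rng.normal([40.0, 7.0, 9.0], [2.0, 0.1, 0.3], size=(5, 3))
X = np.vstack([site_a, site_b])   # rows: samples; cols: sand %, pH, LOI %

Z = (X - X.mean(0)) / X.std(0)    # standardize each variable
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt.T                 # principal-component scores
explained = s**2 / np.sum(s**2)   # fraction of variance per component

# Samples from the two sites separate along PC1 when the
# between-site differences dominate the within-site variation.
print(scores[:5, 0].mean(), scores[5:, 0].mean())
```

    Plotting the first two score columns reproduces the kind of multivariate plot on which the record's samples were visually differentiated; discriminant analysis (CDA) would then operate on these standardized variables with known group labels.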

  18. Effect of dimethyl sulfoxide wet-bonding technique on hybrid layer quality and dentin bond strength.

    PubMed

    Stape, Thiago Henrique Scarabello; Tjäderhane, Leo; Marques, Marcelo Rocha; Aguiar, Flávio Henrique Baggio; Martins, Luís Roberto Marcondes

    2015-06-01

    This study examined the effect of a dimethyl sulfoxide (DMSO) wet bonding technique on the resin infiltration depths at the bonded interface and dentin bond strength of different adhesive systems. Flat dentin surfaces of 48 human third molars were treated with 50% DMSO (experimental groups) or with distilled water (controls) before bonding using an etch-and-rinse (SBMP: Scotchbond Multi-Purpose, 3M ESPE) or a self-etch (Clearfil: Clearfil SE Bond, Kuraray) adhesive system. The restored crown segments (n=12/group) were stored in distilled water (24h) and sectioned for interfacial analysis of exposed collagen using Masson's Trichrome staining and for microtensile bond strength testing. The extent of exposed collagen was measured using light microscopy and a histometric analysis software. Failure modes were examined by SEM. Data was analyzed by two-way ANOVA followed by Tukey Test (α=0.05). The interaction of bonding protocol and adhesive system had significant effects on the extension of exposed collagen matrix (p<0.0001) and bond strength (p=0.0091). DMSO-wet bonding significantly reduced the extent of exposed collagen matrix for SBMP and Clearfil (p<0.05). Significant increase in dentin bond strength was observed on DMSO-treated specimens bonded with SBMP (p<0.05), while no differences were observed for Clearfil (p>0.05). DMSO-wet bonding was effective to improve the quality of resin-dentin bonds of the tested etch-and-rinse adhesives by reducing the extent of exposed collagen matrix at the base of the resin-dentin biopolymer. The improved penetration of adhesive monomers is reflected as an increase in the immediate bond strength when the DMSO-wet bonding technique is used with a water-based etch-and-rinse adhesive. Copyright © 2015 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

  19. Reproducibility of Centric Relation Techniques by means of Condyle Position Analysis

    PubMed Central

    Galeković, Nikolina Holen; Fugošić, Vesna; Braut, Vedrana

    2017-01-01

    Purpose: The aim of this study was to determine the reproducibility of clinical centric relation (CR) registration techniques (bimanual manipulation, chin point guidance and Roth's method) by means of condyle position analysis. Material and methods: Thirty-two fully dentate asymptomatic subjects (16 female and 16 male) with normal occlusal relations (Angle class I) participated in the study (mean age, 22.6 ± 4.7 years). The mandibular position indicator (MPI) was used to analyze the three-dimensional (anteroposterior (ΔX), superoinferior (ΔZ), mediolateral (ΔY)) condylar shift generated by the difference between the centric relation position (CR) and the maximal intercuspation position (MI) observed in the dental arches. Results: The mean value and standard deviation of the three-dimensional condylar shift for the tested clinical CR techniques were 0.19 ± 0.34 mm. Significant differences within the tested clinical CR registration techniques were found for anteroposterior condylar shift on the right side posterior (ΔXrp; P ≤ 0.012) and superoinferior condylar shift on the left side inferior (ΔZli; P ≤ 0.011), whereas differences between the tested CR registration techniques were found for anteroposterior shift on the right side posterior (ΔXrp, P ≤ 0.037) and superoinferior shift on the right side inferior (ΔZri, P ≤ 0.004), on the left side inferior (ΔZli, P ≤ 0.005) and on the left side superior (ΔZls, P ≤ 0.007). Conclusion: Bimanual manipulation, chin point guidance and Roth's method are clinical CR registration techniques of equal accuracy and reproducibility in asymptomatic subjects with a normal occlusal relationship. PMID:28740266

  20. Reproducibility of Centric Relation Techniques by means of Condyle Position Analysis.

    PubMed

    Galeković, Nikolina Holen; Fugošić, Vesna; Braut, Vedrana; Ćelić, Robert

    2017-03-01

    The aim of this study was to determine the reproducibility of clinical centric relation (CR) registration techniques (bimanual manipulation, chin point guidance and Roth's method) by means of condyle position analysis. Thirty-two fully dentate asymptomatic subjects (16 female and 16 male) with normal occlusal relations (Angle class I) participated in the study (mean age, 22.6 ± 4.7 years). The mandibular position indicator (MPI) was used to analyze the three-dimensional (anteroposterior (ΔX), superoinferior (ΔZ), mediolateral (ΔY)) condylar shift generated by the difference between the centric relation position (CR) and the maximal intercuspation position (MI) observed in the dental arches. The mean value and standard deviation of the three-dimensional condylar shift for the tested clinical CR techniques were 0.19 ± 0.34 mm. Significant differences within the tested clinical CR registration techniques were found for anteroposterior condylar shift on the right side posterior (ΔXrp; P ≤ 0.012) and superoinferior condylar shift on the left side inferior (ΔZli; P ≤ 0.011), whereas differences between the tested CR registration techniques were found for anteroposterior shift on the right side posterior (ΔXrp, P ≤ 0.037) and superoinferior shift on the right side inferior (ΔZri, P ≤ 0.004), on the left side inferior (ΔZli, P ≤ 0.005) and on the left side superior (ΔZls, P ≤ 0.007). Bimanual manipulation, chin point guidance and Roth's method are clinical CR registration techniques of equal accuracy and reproducibility in asymptomatic subjects with a normal occlusal relationship.

  1. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Technical Reports Server (NTRS)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  2. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    NASA Astrophysics Data System (ADS)

    Cull, R. C.; Eltimsahy, A. H.

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  3. Piggyback technique in adult orthotopic liver transplantation: an analysis of 1067 liver transplants at a single center

    PubMed Central

    Nakamura, Noboru; Vaidya, Anil; Levi, David M.; Kato, Tomoaki; Nery, Jose R.; Madariaga, Juan R.; Molina, Enrique; Ruiz, Phillip; Gyamfi, Anthony; Tzakis, Andreas G.

    2006-01-01

    Background. Orthotopic liver transplantation (OLT) in adult patients has traditionally been performed using the conventional caval reconstruction technique (CV) with veno-venous bypass. Recently, the piggyback technique (PB) without veno-venous bypass has begun to be widely used. The aim of this study was to assess the effect of routine use of PB on OLTs in adult patients. Patients and methods. A retrospective analysis was undertaken of 1067 orthotopic cadaveric whole liver transplantations in adult patients treated between June 1994 and July 2001. PB was used as the routine procedure. Patient demographics and factors including cold ischemia time (CIT), warm ischemia time (WIT), operative time, transfusions, blood loss, and postoperative results were assessed. The effects of clinical factors on graft survival were assessed by univariate and multivariate analyses. In all, 918 transplantations (86%) were performed with PB. Blood transfusion, WIT, and use of veno-venous bypass were lower with PB. Seventy-five (8.3%) cases with PB had refractory ascites following OLT (p=NS). Five venous outflow stenosis cases (0.54%) with PB were noted (p=NS). Liver and renal function during the postoperative period were similar. Overall 1-, 3-, and 5-year patient survival rates were 85%, 78%, and 72% with PB. Univariate analysis showed that caval reconstruction method, CIT, WIT, amount of transfusion, length of hospital stay, donor age, and tumor presence were significant factors influencing graft survival. Multivariate analysis further confirmed that CIT, donor age, amount of transfusion, and hospital stay were prognostic factors for graft survival. Conclusions. PB can be performed safely in the majority of adult OLTs. Results of OLT with PB are the same as with CV: liver function, renal function, morbidity, mortality, and patient and graft survival are similar. However, amount of transfusion, WIT, and use of veno-venous bypass are lower with PB. PMID:18333273
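
    The 1-, 3-, and 5-year survival rates reported above are the kind of output a Kaplan-Meier survival analysis produces. As a hedged illustration (the follow-up times and event flags below are toy data, not the study's records), a minimal estimator can be sketched as:

```python
# Minimal Kaplan-Meier estimator sketch (hypothetical data, not the study's records).
def kaplan_meier(times, events):
    """times: follow-up in years; events: 1 = graft loss, 0 = censored.
    Returns [(time, survival probability)] at each event time."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    # Process subjects in time order; at ties, events before censorings.
    for t, e in sorted(zip(times, events), key=lambda p: (p[0], -p[1])):
        if e == 1:
            surv *= (at_risk - 1) / at_risk  # one event among those still at risk
            curve.append((t, surv))
        at_risk -= 1  # an event or a censoring removes the subject from the risk set
    return curve

times  = [0.5, 1.2, 2.0, 3.1, 4.0, 5.0, 5.0, 5.0]
events = [1,   0,   1,   1,   0,   0,   0,   0]
for t, s in kaplan_meier(times, events):
    print(f"t = {t:.1f} y, S(t) = {s:.3f}")
```

    Multivariate prognostic-factor results like those reported would typically come from a Cox proportional hazards model, which builds on this same risk-set bookkeeping.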

  4. Cost-effectiveness Analysis with Influence Diagrams.

    PubMed

    Arias, M; Díez, F J

    2015-01-01

    Cost-effectiveness analysis (CEA) is used increasingly in medicine to determine whether the health benefit of an intervention is worth its economic cost. Decision trees, the standard decision modeling technique for non-temporal domains, can perform CEA only for very small problems. Our objective was to develop a method for CEA in problems involving several dozen variables. We explain how to build influence diagrams (IDs) that explicitly represent cost and effectiveness, and we propose an algorithm for evaluating cost-effectiveness IDs directly, i.e., without expanding an equivalent decision tree. The evaluation of an ID returns a set of intervals for the willingness to pay, separated by cost-effectiveness thresholds, and, for each interval, the cost, the effectiveness, and the optimal intervention. The algorithm that evaluates the ID directly is in general much more efficient than the brute-force method, which is in turn more efficient than the expansion of an equivalent decision tree. Using OpenMarkov, an open-source software tool that implements this algorithm, we have been able to perform CEAs on several IDs whose equivalent decision trees contain millions of branches. IDs can thus perform CEA on large problems that cannot be analyzed with decision trees.
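
    The interval-based output described above can be illustrated with a small sketch: for each willingness-to-pay value λ, the optimal intervention is the one maximizing net monetary benefit, NMB = λ·effectiveness − cost, and the cost-effectiveness thresholds are the λ values where the optimum switches. The interventions and figures below are hypothetical, not from the paper:

```python
# Hedged sketch of the CEA output described: for each willingness-to-pay (lam)
# interval, report the intervention maximizing net monetary benefit
# NMB = lam * effectiveness - cost. (name, cost, effectiveness) are hypothetical.
interventions = [("no treatment", 0.0, 5.0),
                 ("drug A", 10000.0, 6.0),
                 ("drug B", 25000.0, 6.5)]

def optimal(lam):
    return max(interventions, key=lambda iv: lam * iv[2] - iv[1])[0]

# Scan lam to locate the thresholds where the optimal intervention switches.
prev, intervals = None, []
for lam in range(0, 60001, 100):
    best = optimal(lam)
    if best != prev:
        intervals.append((lam, best))
        prev = best
print(intervals)
```

    The switch points recovered by the scan correspond to the incremental cost-effectiveness ratios along the efficiency frontier (here 10,000 and 30,000 per unit of effectiveness).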

  5. Reliability analysis of a phaser measurement unit using a generalized fuzzy lambda-tau(GFLT) technique.

    PubMed

    Komal

    2018-05-01

    Power consumption is increasing day by day. To meet the requirement for failure-free power, the planning and implementation of an effective and reliable power management system is essential. The phasor measurement unit (PMU) is a key device in wide area measurement and control systems, and its reliable performance assures a failure-free power supply for any power system. The purpose of the present study is therefore to analyse the reliability of a PMU used for the controllability and observability of power systems, utilizing available uncertain data. In this paper, a generalized fuzzy lambda-tau (GFLT) technique is proposed for this purpose. In GFLT, the components' uncertain failure and repair rates are fuzzified using fuzzy numbers of different shapes, such as triangular, normal, Cauchy, sharp gamma and trapezoidal. To select a suitable fuzzy number for quantifying data uncertainty, system experts' opinions were considered. The GFLT technique applies fault trees, the lambda-tau method, data fuzzified using different membership functions, and alpha-cut-based fuzzy arithmetic operations to compute several important reliability indices. Furthermore, a ranking of the system's critical components using the RAM-Index and a sensitivity analysis were also performed. The developed technique may help improve system performance significantly and can be applied to analyse the fuzzy reliability of other engineering systems. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
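
    The alpha-cut fuzzy arithmetic step can be sketched for a single component, assuming triangular fuzzy numbers and the standard lambda-tau steady-state unavailability Q = λτ/(1+λτ); the failure and repair figures below are illustrative, not the PMU data:

```python
# Hedged sketch of the alpha-cut step in a fuzzy lambda-tau evaluation.
# Triangular fuzzy failure rate lam = (a, m, b) and repair time tau; numbers hypothetical.
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership level alpha."""
    a, m, b = tri
    return (a + (m - a) * alpha, b - (b - m) * alpha)

def fuzzy_unavailability(lam_tri, tau_tri, alpha):
    """Steady-state unavailability Q = lam*tau/(1+lam*tau), propagated through
    interval arithmetic at a given alpha level (Q is increasing in lam*tau,
    so the interval endpoints map directly to endpoints of Q)."""
    llo, lhi = alpha_cut(lam_tri, alpha)
    tlo, thi = alpha_cut(tau_tri, alpha)
    qlo = llo * tlo / (1 + llo * tlo)
    qhi = lhi * thi / (1 + lhi * thi)
    return qlo, qhi

lam = (0.001, 0.002, 0.003)   # failures per hour (triangular fuzzy number)
tau = (5.0, 10.0, 15.0)       # repair hours (triangular fuzzy number)
for a in (0.0, 0.5, 1.0):
    print(a, fuzzy_unavailability(lam, tau, a))
```

    At alpha = 1 the interval collapses to the crisp modal estimate; lower alpha levels widen the interval, which is how the technique carries data uncertainty through to the reliability indices.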

  6. Innovative Techniques Simplify Vibration Analysis

    NASA Technical Reports Server (NTRS)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  7. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  8. An objective isobaric/isentropic technique for upper air analysis

    NASA Technical Reports Server (NTRS)

    Mancuso, R. L.; Endlich, R. M.; Ehernberger, L. J.

    1981-01-01

    An objective meteorological analysis technique is presented whereby both horizontal and vertical upper air analyses are performed. The process used to interpolate grid-point values from the upper-air station data is the same for grid points on an isobaric surface and on a vertical cross-sectional plane. The nearby data surrounding each grid point are used in the interpolation by means of an anisotropic weighting scheme, which is described. The interpolation for a grid-point potential temperature is performed isobarically, whereas wind, mixing-ratio, and pressure height values are interpolated from data that lie on the isentropic surface that passes through the grid point. Two versions (A and B) of the technique are evaluated by qualitatively comparing computer analyses with subjective hand-drawn analyses. The objective products of version A generally correspond fairly well with the subjective analyses and with the station data, and depict the structure of the upper fronts, tropopauses, and jet streams reasonably well. The version B objective products correspond more closely to the subjective analyses, and show the same strong gradients across the upper front with only minor smoothing.
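
    The idea of an anisotropic weighting scheme can be sketched as inverse-distance weighting with a direction-dependent metric, so that stations along one axis (e.g. along-flow) influence the grid point more than equally distant stations along the other. The elliptical weight below is an illustrative assumption, not the paper's exact formula:

```python
# Sketch of anisotropic inverse-distance weighting for grid-point interpolation.
# The elliptical stretching here is an illustrative assumption, not the paper's scheme.
def anisotropic_weight(dx, dy, scale_x=2.0, scale_y=1.0):
    """Distance stretched along x, so stations displaced along x from the grid
    point count more than stations at the same separation along y."""
    d2 = (dx / scale_x) ** 2 + (dy / scale_y) ** 2
    return 1.0 / (d2 + 1e-6)  # small constant avoids division by zero

def interpolate(grid_pt, stations):
    """stations: list of ((x, y), value). Returns the weighted mean at grid_pt."""
    gx, gy = grid_pt
    wsum = vsum = 0.0
    for (x, y), v in stations:
        w = anisotropic_weight(x - gx, y - gy)
        wsum += w
        vsum += w * v
    return vsum / wsum

# Two stations at equal distance: the result is pulled toward the along-x one.
stations = [((1.0, 0.0), 280.0), ((0.0, 1.0), 290.0)]
print(interpolate((0.0, 0.0), stations))
```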

  9. Macro elemental analysis of food samples by nuclear analytical technique

    NASA Astrophysics Data System (ADS)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analytical technique compared with other detection methods, which makes it applicable to food inspection. The macro elements calcium and potassium are important nutrients required by the human body for optimal physiological function, so the Ca and K content of various foods needs to be determined. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with that of two other analytical techniques, neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The methods were compared to cross-check the analytical results and to overcome the limitations of each. The results showed that Ca concentrations in food obtained by EDXRF and AAS were not significantly different (p-value 0.9687), and the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
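
    The cross-check reported above boils down to computing a Pearson correlation between concentrations measured by two techniques on the same samples. A minimal sketch, using hypothetical concentrations rather than the study's data:

```python
# Sketch of the method cross-check: Pearson correlation between element
# concentrations measured by two techniques (all values hypothetical).
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical Ca concentrations (mg/100 g) by EDXRF and AAS for six foods.
edxrf = [120.0, 85.0, 210.0, 45.0, 150.0, 95.0]
aas   = [118.0, 88.0, 205.0, 47.0, 155.0, 92.0]
print(round(pearson(edxrf, aas), 4))
```

    A correlation near 1, together with a non-significant paired difference test, is what supports using one method as an alternative to the other.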

  10. Finite element analysis for edge-to-edge technique to treat post-mitral valve repair systolic anterior motion.

    PubMed

    Zhong, Qi; Zeng, Wenhua; Huang, Xiaoyang; Zhao, Xiaojia

    2014-01-01

    Systolic anterior motion of the mitral valve is an uncommon complication of mitral valve repair, which requires immediate supplementary surgical action. Edge-to-edge suture is considered by clinical researchers to be an effective technique to treat post-mitral valve repair systolic anterior motion. However, fundamental and quantitative analyses validating the effectiveness of the additional edge-to-edge surgery in repairing systolic anterior motion are lacking. In the present work, finite element models were developed to simulate a specific clinical surgery for patients with posterior leaflet prolapse, so as to analyze the edge-to-edge technique quantitatively. The simulated surgical procedure included several steps, such as quadrangular resection, mitral annuloplasty and edge-to-edge suture. The simulated results were compared with echocardiography and measurement data from patients undergoing mitral valve surgery, showing good agreement. The leaflet model with the additional edge-to-edge suture has a shorter mismatch length at systole than the model with only quadrangular resection and mitral annuloplasty, which assures better coaptation. The stress on the leaflets after edge-to-edge suture is lessened as well.

  11. Evaluation of different screw fixation techniques and screw diameters in sagittal split ramus osteotomy: finite element analysis method.

    PubMed

    Sindel, A; Demiralp, S; Colok, G

    2014-09-01

    Sagittal split ramus osteotomy (SSRO) is used to correct numerous congenital or acquired deformities in the facial region. Several techniques have been developed and used to maintain fixation and stabilisation following SSRO. In this study, the effect of the insertion configuration of bicortical screws of different sizes on the stresses generated by applied forces was studied. Three-dimensional finite element analysis (FEA) and static linear analysis methods were used to investigate the differences between application areas in the forces acting on the screws and transmitted to the bone. No significant difference was found between 1.5- and 2-mm screws used in SSRO fixation. In addition, the 'inverted L' configuration was found to be the most successful, followed by the 'L' and 'linear' configurations, which showed similar results to each other. Few studies have investigated the effect of the thickness and application areas of bicortical screws. This study was performed on both advanced and regressed jaw positions. © 2014 John Wiley & Sons Ltd.

  12. Computational techniques for ECG analysis and interpretation in light of their contribution to medical advances

    PubMed Central

    Mincholé, Ana; Martínez, Juan Pablo; Laguna, Pablo; Rodriguez, Blanca

    2018-01-01

    Widely developed for clinical screening, electrocardiogram (ECG) recordings capture the cardiac electrical activity from the body surface. ECG analysis can therefore be a crucial first step to help diagnose, understand and predict cardiovascular disorders responsible for 30% of deaths worldwide. Computational techniques, and more specifically machine learning techniques and computational modelling are powerful tools for classification, clustering and simulation, and they have recently been applied to address the analysis of medical data, especially ECG data. This review describes the computational methods in use for ECG analysis, with a focus on machine learning and 3D computer simulations, as well as their accuracy, clinical implications and contributions to medical advances. The first section focuses on heartbeat classification and the techniques developed to extract and classify abnormal from regular beats. The second section focuses on patient diagnosis from whole recordings, applied to different diseases. The third section presents real-time diagnosis and applications to wearable devices. The fourth section highlights the recent field of personalized ECG computer simulations and their interpretation. Finally, the discussion section outlines the challenges of ECG analysis and provides a critical assessment of the methods presented. The computational methods reported in this review are a strong asset for medical discoveries and their translation to the clinical world may lead to promising advances. PMID:29321268

  13. Comparison of three‐dimensional analysis and stereological techniques for quantifying lithium‐ion battery electrode microstructures

    PubMed Central

    TAIWO, OLUWADAMILOLA O.; FINEGAN, DONAL P.; EASTWOOD, DAVID S.; FIFE, JULIE L.; BROWN, LEON D.; DARR, JAWWAD A.; LEE, PETER D.; BRETT, DANIEL J.L.

    2016-01-01

    Summary Lithium‐ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium‐ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3‐D imaging techniques, quantitative assessment of 3‐D microstructures from 2‐D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two‐dimensional (2‐D) data sets. In this study, stereological prediction and three‐dimensional (3‐D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium‐ion battery electrodes were imaged using synchrotron‐based X‐ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2‐D image sections generated from tomographic imaging, whereas direct 3‐D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2‐D image sections is bound to be associated with ambiguity and that volume‐based 3‐D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially‐dependent parameters, such as tortuosity and pore‐phase connectivity. PMID:26999804

  14. Comparison of three-dimensional analysis and stereological techniques for quantifying lithium-ion battery electrode microstructures.

    PubMed

    Taiwo, Oluwadamilola O; Finegan, Donal P; Eastwood, David S; Fife, Julie L; Brown, Leon D; Darr, Jawwad A; Lee, Peter D; Brett, Daniel J L; Shearing, Paul R

    2016-09-01

    Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. © 2016 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of Royal Microscopical Society.
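
    The core comparison above can be illustrated for one scalar parameter: by the Delesse principle of stereology, the mean area fraction measured on 2-D sections estimates the 3-D volume fraction, whereas spatially dependent parameters like tortuosity genuinely require the full volume. A sketch on a synthetic binary volume (an assumption, not the tomography data):

```python
# Sketch of the stereology-vs-3-D comparison for one scalar (volume fraction).
# Synthetic 20x20x20 binary "electrode" volume: 1 = active material, 0 = pore.
import random

random.seed(0)
N = 20
vol = [[[1 if random.random() < 0.6 else 0 for _ in range(N)]
        for _ in range(N)] for _ in range(N)]

# Direct 3-D measurement: fraction of solid voxels in the whole volume.
vol_frac = sum(v for plane in vol for row in plane for v in row) / N**3

# Stereological estimate: mean area fraction over a few sampled 2-D sections.
sections = random.sample(range(N), 3)
stereo_est = sum(sum(v for row in vol[z] for v in row) / N**2
                 for z in sections) / len(sections)

print(f"3-D volume fraction:    {vol_frac:.3f}")
print(f"mean 2-D area fraction: {stereo_est:.3f}")
```

    For a parameter like tortuosity or pore connectivity no such 2-D estimator exists, which is the paper's argument for volume-based characterization of nonconvex, interconnected particles.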

  15. Feathering effect detection and artifact agglomeration index-based video deinterlacing technique

    NASA Astrophysics Data System (ADS)

    Martins, André Luis; Rodrigues, Evandro Luis Linhari; de Paiva, Maria Stela Veludo

    2018-03-01

    Several video deinterlacing techniques have been developed, and each performs better under certain conditions. Occasionally, even the most modern deinterlacing techniques create frames of worse quality than primitive deinterlacing processes. This paper demonstrates that final image quality can be improved by combining different types of deinterlacing techniques. The proposed strategy selects between two types of deinterlaced frames and, if necessary, makes local corrections of the defects. This decision is based on an artifact agglomeration index obtained from a feathering effect detection map. Starting from a deinterlaced frame produced by the "interfield average" method, the defective areas are identified and, if deemed appropriate, replaced by pixels generated through the "edge-based line average" method. Test results show that, by taking what is good from both intra- and interfield methods, the proposed technique produces video frames of higher quality than any single deinterlacing technique.
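
    The selection logic can be sketched in simplified 1-D form: fill a missing line with the interfield (temporal) value, but where a feathering measure flags motion, fall back to an edge-based line average (ELA) from the lines above and below. The threshold and feathering test below are illustrative assumptions, not the paper's artifact agglomeration index:

```python
# Hedged 1-D sketch of combining interfield and edge-based line average (ELA)
# deinterlacing, switching on a simple feathering indicator (values are grayscale).
def ela(above, below, j):
    """Edge-based line average: pick the direction (/, |, \\) whose endpoints
    differ least, then average along that direction."""
    n = len(above)
    best = min(((-1, 1), (0, 0), (1, -1)),
               key=lambda d: abs(above[(j + d[0]) % n] - below[(j + d[1]) % n]))
    return (above[(j + best[0]) % n] + below[(j + best[1]) % n]) / 2

def deinterlace_line(above, below, other_field_line, thresh=40):
    out = []
    for j in range(len(above)):
        temporal = other_field_line[j]
        # Feathering indicator: temporal sample inconsistent with both neighbours.
        if abs(temporal - above[j]) > thresh and abs(temporal - below[j]) > thresh:
            out.append(ela(above, below, j))   # motion: interpolate spatially
        else:
            out.append(temporal)               # static area: keep interfield value
    return out

above = [100, 100, 100, 100]
below = [100, 100, 100, 100]
moving_field = [100, 255, 255, 100]  # a bright object moved between fields
print(deinterlace_line(above, below, moving_field))
```

    In the static columns the temporal value is kept; in the columns where the other field disagrees with both spatial neighbours, the spatial interpolation suppresses the feathering artifact.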

  16. Effect of Jigsaw I Technique on Teaching Turkish Grammar

    ERIC Educational Resources Information Center

    Arslan, Akif

    2016-01-01

    The purpose of this study is to find out the effect of Jigsaw I technique on students' academic success and attitude towards the course in teaching Turkish grammar. For that purpose, three grammar topics (spelling and punctuation marks rules) were determined and an experimental study conforming to "control group preliminary-testing final…

  17. The effects of the Bowen technique on hamstring flexibility over time: a randomised controlled trial.

    PubMed

    Marr, Michelle; Baker, Julian; Lambon, Nicky; Perry, Jo

    2011-07-01

    The hamstring muscles are regularly implicated in recurrent injuries, movement dysfunction and low back pain. Links between limited flexibility and development of neuromusculoskeletal symptoms are frequently reported. The Bowen Technique is used to treat many conditions including lack of flexibility. The study set out to investigate the effect of the Bowen Technique on hamstring flexibility over time. An assessor-blind, prospective, randomised controlled trial was performed on 120 asymptomatic volunteers. Participants were randomly allocated into a control group or Bowen group. Three flexibility measurements occurred over one week, using an active knee extension test. The intervention group received a single Bowen treatment. A repeated measures univariate analysis of variance, across both groups for the three time periods, revealed significant within-subject and between-subject differences for the Bowen group. Continuing increases in flexibility levels were observed over one week. No significant change over time was noted for the control group. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Application of Computational Stability and Control Techniques Including Unsteady Aerodynamics and Aeroelastic Effects

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Edwards, John W.

    2004-01-01

    The motivation behind the inclusion of unsteady aerodynamics and aeroelastic effects in the computation of stability and control (S&C) derivatives will be discussed as they pertain to aeroelastic and aeroservoelastic analysis. This topic will be addressed in the context of two applications, the first being the estimation of S&C derivatives for a cable-mounted aeroservoelastic wind tunnel model tested in the NASA Langley Research Center (LaRC) Transonic Dynamics Tunnel (TDT). The second application will be the prediction of the nonlinear aeroservoelastic phenomenon known as Residual Pitch Oscillation (RPO) on the B-2 Bomber. Techniques and strategies used in these applications to compute S&C derivatives and perform flight simulations will be reviewed, and computational results will be presented.

  19. Computer program uses Monte Carlo techniques for statistical system performance analysis

    NASA Technical Reports Server (NTRS)

    Wohl, D. P.

    1967-01-01

    Computer program with Monte Carlo sampling techniques determines the effect of a component part of a unit upon the overall system performance. It utilizes the full statistics of the disturbances and misalignments of each component to provide unbiased results through simulated random sampling.
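
    The idea described, drawing each component's disturbance from its full statistical distribution and observing the spread of a system-level output, can be sketched as follows. The two-lens pointing-error "system" and the distributions are hypothetical stand-ins for the program's component models:

```python
# Sketch of Monte Carlo tolerance analysis: sample each component misalignment
# from its own distribution and study the resulting system performance spread.
# The toy response function and distributions are illustrative assumptions.
import math
import random
import statistics

random.seed(1)

def system_error(misalignments):
    """Toy response: pointing error grows with the RSS of component misalignments."""
    return math.sqrt(sum(m * m for m in misalignments))

trials = []
for _ in range(10000):
    # Each component drawn from its full distribution, not a worst-case bound.
    sample = [random.gauss(0.0, 0.1),       # component 1: normal
              random.gauss(0.0, 0.2),       # component 2: normal, looser tolerance
              random.uniform(-0.05, 0.05)]  # component 3: uniform
    trials.append(system_error(sample))

print(f"mean error = {statistics.mean(trials):.3f}")
print(f"95th pct   = {sorted(trials)[int(0.95 * len(trials))]:.3f}")
```

    Unlike worst-case stacking, the simulated random sampling yields unbiased estimates of the output distribution, which is the point the abstract makes.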

  20. Analytical techniques: A compilation

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A compilation, containing articles on a number of analytical techniques for quality control engineers and laboratory workers, is presented. Data cover techniques for testing electronic, mechanical, and optical systems, nondestructive testing techniques, and gas analysis techniques.

  1. Comparison of time-frequency distribution techniques for analysis of spinal somatosensory evoked potential.

    PubMed

    Hu, Y; Luk, K D; Lu, W W; Holmes, A; Leong, J C

    2001-05-01

    Spinal somatosensory evoked potential (SSEP) has been employed to monitor the integrity of the spinal cord during surgery. To detect both temporal and spectral changes in SSEP waveforms, an investigation of the application of time-frequency analysis (TFA) techniques was conducted. SSEP signals from 30 scoliosis patients were analysed using different techniques: short-time Fourier transform (STFT), Wigner-Ville distribution (WVD), Choi-Williams distribution (CWD), cone-shaped distribution (CSD) and adaptive spectrogram (ADS). The time-frequency distributions (TFD) computed using these methods were assessed and compared with each other. WVD, ADS, CSD and CWD showed better resolution than STFT. Comparing normalised peak widths, CSD showed the sharpest peak width (0.13+/-0.1) in the frequency dimension, and a mean peak width of 0.70+/-0.12 in the time dimension. Both WVD and CWD produced cross-term interference, distorting the TFA distribution, but this was not seen with CSD and ADS. CSD appeared to give a lower mean peak power bias (10.3%+/-6.2%) than ADS (41.8%+/-19.6%). Application of the CSD algorithm yielded both good resolution and accurate spectrograms, and CSD is therefore recommended as the most appropriate TFA technique for the analysis of SSEP signals.
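
    The simplest of the compared techniques, the STFT, can be sketched directly: slide a window along the signal and take the DFT of each segment. The naive pure-Python DFT and the toy two-tone signal below are illustrative, not SSEP data:

```python
# Sketch of the short-time Fourier transform (STFT): windowed naive DFT.
# Toy signal of two concatenated tones, not SSEP data.
import cmath
import math

def stft(signal, win=16, hop=8):
    frames = []
    for start in range(0, len(signal) - win + 1, hop):
        seg = signal[start:start + win]
        # Naive DFT magnitudes of this segment (rectangular window).
        mags = [abs(sum(seg[n] * cmath.exp(-2j * math.pi * k * n / win)
                        for n in range(win)))
                for k in range(win // 2)]
        frames.append(mags)
    return frames  # frames[time][frequency bin]

# Two tones back to back: the dominant frequency bin should jump mid-signal.
sig = ([math.sin(2 * math.pi * 2 * n / 16) for n in range(32)]
       + [math.sin(2 * math.pi * 5 * n / 16) for n in range(32)])
peaks = [max(range(len(f)), key=lambda k: f[k]) for f in stft(sig)]
print(peaks)
```

    The fixed window forces the time-frequency resolution trade-off noted in the abstract; the WVD, CWD and CSD are quadratic distributions designed to sharpen it, at the cost (for WVD and CWD) of cross-term interference.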

  2. The Scientific Status of Projective Techniques.

    PubMed

    Lilienfeld, S O; Wood, J M; Garb, H N

    2000-11-01

    Although projective techniques continue to be widely used in clinical and forensic settings, their scientific status remains highly controversial. In this monograph, we review the current state of the literature concerning the psychometric properties (norms, reliability, validity, incremental validity, treatment utility) of three major projective instruments: Rorschach Inkblot Test, Thematic Apperception Test (TAT), and human figure drawings. We conclude that there is empirical support for the validity of a small number of indexes derived from the Rorschach and TAT. However, the substantial majority of Rorschach and TAT indexes are not empirically supported. The validity evidence for human figure drawings is even more limited. With a few exceptions, projective indexes have not consistently demonstrated incremental validity above and beyond other psychometric data. In addition, we summarize the results of a new meta-analysis intended to examine the capacity of these three instruments to detect child sexual abuse. Although some projective instruments were better than chance at detecting child sexual abuse, there were virtually no replicated findings across independent investigative teams. This meta-analysis also provides the first clear evidence of substantial file drawer effects in the projectives literature, as the effect sizes from published studies markedly exceeded those from unpublished studies. We conclude with recommendations regarding the (a) construction of projective techniques with adequate validity, (b) forensic and clinical use of projective techniques, and (c) education and training of future psychologists regarding projective techniques. © 2000 Association for Psychological Science.

  3. Five decades of promotion techniques in cigarette advertising: a longitudinal content analysis.

    PubMed

    Paek, Hye-Jin; Reid, Leonard N; Jeong, Hyun Ju; Choi, Hojoon; Krugman, Dean

    2012-01-01

    This study examines the frequencies and types of promotion techniques featured in five decades of cigarette advertising relative to five major smoking eras. Analysis of 1,133 cigarette advertisements, collected through multistage sampling of 1954 through 2003 issues of three youth-oriented magazines, found that 7.6% of the analyzed ads featured at least one promotion technique. Across smoking eras, the proportion of promotion in the ads steadily increased from 1.6% in the "pre-broadcast ban era" to 10.9% in the "pre-Master Settlement Agreement (MSA) era" and 9% in the "post-MSA era." The increased use of sponsorships/events in cigarette ads for youth-oriented brands warrants more attention from tobacco control experts and government regulators.

  4. Application of Spectral Analysis Techniques in the Intercomparison of Aerosol Data. Part II: Using Maximum Covariance Analysis to Effectively Compare Spatiotemporal Variability of Satellite and AERONET Measured Aerosol Optical Depth

    NASA Technical Reports Server (NTRS)

    Li, Jing; Carlson, Barbara E.; Lacis, Andrew A.

    2014-01-01

    Moderate Resolution Imaging SpectroRadiometer (MODIS) and Multi-angle Imaging SpectroRadiometer (MISR) provide regular aerosol observations with global coverage. It is essential to examine the coherency between space- and ground-measured aerosol parameters in representing aerosol spatial and temporal variability, especially in the climate forcing and model validation context. In this paper, we introduce Maximum Covariance Analysis (MCA), also known as Singular Value Decomposition analysis, as an effective way to compare correlated aerosol spatial and temporal patterns between satellite measurements and AERONET data. This technique not only successfully extracts the variability of major aerosol regimes but also allows the simultaneous examination of aerosol variability both spatially and temporally. More importantly, it accommodates well the sparsely distributed AERONET data, for which other spectral decomposition methods, such as Principal Component Analysis, do not yield satisfactory results. The comparison shows overall good agreement between MODIS/MISR and AERONET AOD variability. The correlations between the first three modes of the MCA results for both MODIS/AERONET and MISR/AERONET are above 0.8 for the full data set and above 0.75 for the AOD anomaly data. The correlations between MODIS and MISR modes are also quite high (greater than 0.9). We also examine the extent of spatial agreement between satellite and AERONET AOD data at the selected stations. Some sites with disagreements in the MCA results, such as Kanpur, also have low spatial coherency. This should be associated partly with high AOD spatial variability and partly with uncertainties in satellite retrievals due to seasonally varying aerosol types and surface properties.
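    MCA as described, an SVD of the cross-covariance matrix between two anomaly fields, can be sketched as follows. The toy "satellite" and "station" fields, their sizes, and the shared signal are invented for illustration; they are not the MODIS/MISR or AERONET data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in fields: one shared temporal signal observed at
# 40 dense "satellite" grid points and 8 sparse "station" sites.
n = 120                               # time samples
shared = np.sin(np.linspace(0, 6 * np.pi, n))
X = np.outer(shared, rng.normal(size=40)) + 0.3 * rng.normal(size=(n, 40))
Y = np.outer(shared, rng.normal(size=8)) + 0.3 * rng.normal(size=(n, 8))

# Maximum Covariance Analysis: SVD of the cross-covariance matrix.
Xa = X - X.mean(axis=0)               # anomalies
Ya = Y - Y.mean(axis=0)
C = Xa.T @ Ya / (n - 1)               # (40, 8) cross-covariance
U, s, Vt = np.linalg.svd(C, full_matrices=False)

# Expansion coefficients (temporal amplitudes) of the leading mode pair.
a1 = Xa @ U[:, 0]
b1 = Ya @ Vt[0]
r = np.corrcoef(a1, b1)[0, 1]
print(round(abs(r), 3))               # leading modes of the two fields co-vary strongly
```

    Unlike PCA of either field alone, the paired singular vectors isolate only the variability the two datasets share, which is why the method tolerates the sparse station coverage.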

  5. Effect of airway clearance techniques on the efficacy of the sputum induction procedure.

    PubMed

    Elkins, M R; Lane, T; Goldberg, H; Pagliuso, J; Garske, L A; Hector, E; Marchetto, L; Alison, J A; Bye, P T P

    2005-11-01

    Sputum induction is used in the early identification of tuberculosis (TB) and pneumocystis infections of the lung. Although manual physiotherapy techniques to clear the airways are often incorporated in the sputum induction procedure, their efficacy in this setting is unknown. This randomised, crossover trial enrolled adults referred for sputum induction for suspected TB and pneumocystis infections of the lung. All participants underwent two sputum induction procedures, inhaling 3% saline via ultrasonic nebuliser. During one randomly allocated procedure, airway clearance techniques (chest wall percussion, vibration, huffing) were incorporated. In total, 59 participants completed the trial. The airway clearance techniques had no significant effect on how the test was tolerated, the volume expectorated or the quality of the sample obtained (assessed by the presence of alveolar macrophages). The techniques did not significantly affect how often the test identified a suspected organism, nor the sensitivity or specificity of sputum induction. In conclusion, the study was unable to demonstrate any effect of airway clearance techniques on the sputum induction procedure. The results provide some justification for not including airway clearance techniques as part of the sputum induction procedure.

  6. Predictive analysis effectiveness in determining the epidemic disease infected area

    NASA Astrophysics Data System (ADS)

    Ibrahim, Najihah; Akhir, Nur Shazwani Md.; Hassan, Fadratul Hafinaz

    2017-10-01

    Epidemic disease outbreaks have raised great public concern over methods for controlling, preventing and handling infectious disease, with the aim of reducing both the percentage of disease dissemination and the size of the infected area. In this preliminary study, the backpropagation method was used for countermeasure and prediction analysis of epidemic disease. Predictive analysis based on backpropagation is a machine learning process that draws on artificial intelligence for pattern recognition, statistics and feature selection. This computational learning process is integrated with data mining by treating the output score as a classifier for a given set of input features. The features are the disease dissemination factors, which are likely to be strongly interconnected in causing infectious disease outbreaks. Building on observations from previous studies, this work classifies the dissemination factors as features for weight adjustment in predicting epidemic disease outbreaks. Through this preliminary study, predictive analysis is shown to be an effective method for determining the infected area of an epidemic disease by minimising the error value through feature classification.
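    A minimal backpropagation classifier of the kind the abstract describes can be sketched as follows. The two "dissemination factor" features, the labels, the network size and the learning rate are all hypothetical, chosen only to show the weight-adjustment loop; they are not the study's data or configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: two dissemination-factor features per area,
# label 1 if the area becomes infected (linearly separable by design).
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer, sigmoid activations, cross-entropy loss.
W1 = rng.normal(scale=0.5, size=(2, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1)); b2 = np.zeros(1)
sig = lambda z: 1.0 / (1.0 + np.exp(-z))
lr, n = 1.0, len(X)

for _ in range(2000):
    h = sig(X @ W1 + b1)                 # forward pass
    out = sig(h @ W2 + b2)
    d_out = out - y                      # error signal at the output
    d_h = (d_out @ W2.T) * h * (1 - h)   # backpropagated to hidden layer
    W2 -= lr * h.T @ d_out / n; b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / n;   b1 -= lr * d_h.mean(axis=0)

out = sig(sig(X @ W1 + b1) @ W2 + b2)
acc = ((out > 0.5) == (y > 0.5)).mean()
print(round(acc, 2))                     # error shrinks as the weights adjust
```

    The weight updates are exactly the "weight adjustment on the prediction" the abstract refers to: each epoch propagates the classification error backwards and nudges the feature weights to reduce it.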

  7. Techniques for Fault Detection and Visualization of Telemetry Dependence Relationships for Root Cause Fault Analysis in Complex Systems

    NASA Astrophysics Data System (ADS)

    Guy, Nathaniel

    This thesis explores new ways of looking at telemetry data from a time-correlative perspective, in order to see patterns within the data that may suggest root causes of system faults. It was thought initially that visualizing an animated Pearson Correlation Coefficient (PCC) matrix for telemetry channels would be sufficient to give new understanding; however, testing showed that the high dimensionality of this approach, and the inability to easily see change over time, impeded understanding. Different correlative techniques, combined with the time curve visualization proposed by Bach et al. (2015), were adapted to visualize both raw telemetry and telemetry data correlations. Review revealed that these new techniques give insights into the data and an intuitive grasp of data families, demonstrating the effectiveness of this approach for enhancing system understanding and assisting with root cause analysis for complex aerospace systems.
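    The animated-PCC-matrix idea, one Pearson correlation matrix per sliding time window, can be sketched as follows. The three toy telemetry channels and the injected fault coupling are invented for illustration, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical telemetry: three channels; ch0 and ch1 become correlated
# halfway through the run, mimicking an emerging fault coupling.
n = 400
ch0 = rng.normal(size=n)
ch1 = rng.normal(size=n)
ch1[200:] = ch0[200:] + 0.1 * rng.normal(size=200)   # coupling appears
ch2 = rng.normal(size=n)
telemetry = np.stack([ch0, ch1, ch2])                # (channels, time)

def windowed_pcc(data, start, width=100):
    """One frame of the animated PCC matrix: Pearson correlation
    coefficients over a sliding time window."""
    return np.corrcoef(data[:, start:start + width])

before = windowed_pcc(telemetry, 50)[0, 1]     # ch0-ch1 PCC, early window
after = windowed_pcc(telemetry, 280)[0, 1]     # ch0-ch1 PCC, late window
print(round(before, 2), round(after, 2))       # the correlation jump flags the pair
```

    With many channels the matrix has O(channels²) cells per frame, which is the dimensionality problem the thesis notes; the time-curve visualization is one way to compress that sequence of frames into a single readable trajectory.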

  8. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields, in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  9. Evaluation of agave fiber delignification by means of microscopy techniques and image analysis.

    PubMed

    Hernández-Hernández, Hilda M; Chanona-Pérez, Jorge J; Calderón-Domínguez, Georgina; Perea-Flores, María J; Mendoza-Pérez, Jorge A; Vega, Alberto; Ligero, Pablo; Palacios-González, Eduardo; Farrera-Rebollo, Reynold R

    2014-10-01

    Recently, the use of different types of natural fibers to produce paper and textiles from agave plants has been proposed. Agave atrovirens can be a good source of cellulose and lignin; nevertheless, the microstructural changes that happen during delignification have scarcely been studied. The aim of this work was to study the microstructural changes that occur during the delignification of agave fibers by means of microscopy techniques and image analysis. The fibers of A. atrovirens were obtained from leaves using convective drying, milling, and sieving. Fibers were processed using the Acetosolv pulping method at different concentrations of acetic acid; increasing acid concentration promoted higher levels of delignification, structural damage, and the breakdown of fiber clumps. Delignification was followed by spectrometric analysis, and microstructural studies carried out by light, confocal laser scanning, and scanning electron microscopy showed that the delignification process follows three stages: initial, bulk, and residual. Microscopy techniques and image analysis were efficient tools for microstructural characterization during delignification of agave fibers, allowing quantitative evaluation of the process and the development of linear prediction models. The data obtained integrated numerical and microstructural information that could be valuable for the study of pulping of lignocellulosic materials.

  10. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    NASA Technical Reports Server (NTRS)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  11. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  12. An interdisciplinary analysis of ERTS data for Colorado mountain environments using ADP techniques

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M. (Principal Investigator)

    1972-01-01

    There are no author-identified significant results in this report. Research efforts have focused on: (1) location, acquisition, and preparation of baseline information necessary for the computer analysis, and (2) refinement of techniques for analysis of MSS data obtained from ERTS-1. Analysis of the first frame of data collected by the ERTS-1 multispectral scanner system over the Lake Texoma area has proven very valuable for determining the best procedures to follow in working with and analyzing ERTS data. Progress on the following projects is described: (1) cover type mapping; (2) geomorphology; and (3) hydrologic feature surveys.

  13. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  14. SIR/CAR Analysis Technique for Voluntary School Sport or Amateur Athletic Organizations. A SIR/CAR Application.

    ERIC Educational Resources Information Center

    Moriarty, Dick; Zarebski, John

    This paper delineates the exact methodology developed by the Sports Institute for Research/Change Agent Research (SIR/CAR) for applying a systems analysis technique to a voluntary mutual benefit organization, such as a school or amateur athletic group. The functions of the technique are to compare avowed and actual behavior, to utilize group…

  15. Analysis of motor fan radiated sound and vibration waveform by automatic pattern recognition technique using "Mahalanobis distance"

    NASA Astrophysics Data System (ADS)

    Toma, Eiji

    2018-06-01

    In recent years, as IT equipment has become lighter, demand for motor fans to cool the interiors of electronic equipment has been rising. In the field, sensory testing by inspectors is the mainstream method for quality inspection of motor fans. Such sensory testing requires extensive experience to diagnose subtle differences in fan sounds (sound pressures) accurately, and judgments vary with the condition of the inspector and the environment. Solving these quality problems requires an analysis method that can diagnose the sound/vibration level of a fan quantitatively and automatically. In this study, it was clarified that an analysis method applying the MT (Mahalanobis-Taguchi) system to the waveform information of noise and vibration is more effective for discriminating normal from abnormal items than the conventional frequency analysis method. Furthermore, it was found that in automating the vibration waveform analysis system, the relation between the fan's installation posture and the vibration waveform is a factor influencing discrimination accuracy.

  16. Advantage of the modified Lunn-McNeil technique over Kalbfleisch-Prentice technique in competing risks

    NASA Astrophysics Data System (ADS)

    Lukman, Iing; Ibrahim, Noor A.; Daud, Isa B.; Maarof, Fauziah; Hassan, Mohd N.

    2002-03-01

    Survival analysis algorithms are often applied in the data mining process. Cox regression is one survival analysis tool that has been used in many areas; it can be used, for example, to analyze the failure times of crashed aircraft. Another survival analysis tool is competing risks, where more than one cause of failure acts simultaneously. Lunn and McNeil analyzed competing risks in the survival model using Cox regression with censored data. The modified Lunn-McNeil technique is a simplification of the Lunn-McNeil technique. The Kalbfleisch-Prentice technique involves fitting models separately for each type of failure, treating other failure types as censored. To compare the two techniques (the modified Lunn-McNeil and the Kalbfleisch-Prentice), a simulation study was performed. Samples with various sizes and censoring percentages were generated and fitted using both techniques. The study compared the inference of the models using root mean square error (RMSE), power tests, and Schoenfeld residual analysis. The power tests in this study were the likelihood ratio test, the Rao score test, and the Wald statistic. The Schoenfeld residual analysis was conducted to check the proportionality of the model through its covariates. The estimated parameters were computed for the cause-specific hazard situation. Results showed that the modified Lunn-McNeil technique was better than the Kalbfleisch-Prentice technique based on the RMSE measurement and the Schoenfeld residual analysis. However, the Kalbfleisch-Prentice technique was better based on the power test measurements.

  17. The feasibility of implementing the data analysis and reporting techniques (DART) package in Virginia.

    DOT National Transportation Integrated Search

    1980-01-01

    This project was undertaken for the Virginia Department of Transportation Safety to assess the feasibility of implementing the Data Analysis and Reporting Techniques (DART) computer software system in Virginia. Following a review of available literat...

  18. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    NASA Astrophysics Data System (ADS)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produces highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.

  19. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Tolson, Bryan

    2017-04-01

    The increasing complexity and runtime of environmental models lead to the current situation in which calibrating all model parameters, or estimating all of their uncertainties, is often computationally infeasible. Hence, techniques that determine the sensitivity of model parameters are used to identify the most important parameters or model processes. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive for large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards; the latter case enables the checking of already processed sensitivity indexes. To demonstrate the method independency of the convergence testing method, we applied it to three widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991, Campolongo et al. 2000), the variance-based Sobol' method (Sobol' 1993, Saltelli et al. 2010), and a derivative-based method known as the Parameter Importance index (Goehler et al. 2013). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned three methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different
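    The first of the three SA methods, Morris elementary effects, can be sketched as follows. The toy model and the sampling settings (r trajectories, step delta) are hypothetical, not those of the benchmark functions in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

def elementary_effects(f, k, r=50, delta=0.1):
    """Morris screening (minimal sketch): for r random base points in
    the unit cube, perturb one parameter at a time by delta and record
    the scaled change in model output (the elementary effect)."""
    ee = np.zeros((r, k))
    for i in range(r):
        x = rng.uniform(0, 1 - delta, size=k)
        fx = f(x)
        for j in range(k):
            xp = x.copy()
            xp[j] += delta
            ee[i, j] = (f(xp) - fx) / delta
    return np.abs(ee).mean(axis=0)        # mu*: mean absolute elementary effect

# Hypothetical model: output depends strongly on x0, weakly on x2.
model = lambda x: 10 * x[0] + x[1] ** 2 + 0.01 * x[2]
mu_star = elementary_effects(model, k=3)
print(np.argsort(mu_star)[::-1])          # the ranking puts parameter 0 first
```

    The convergence question the abstract raises is whether mu* has stabilised for the chosen r; the MVA approach checks this without spending further model runs.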

  20. Cost Effective Repair Techniques for Turbine Airfoils. Volume I

    DTIC Science & Technology

    1978-11-01

    Turbine blades and vanes in current engines are subjected to the most hostile environment... payoff potential in turbine vanes/blades. The criteria used included: incidence of damage (scrapped or damaged turbine airfoils at the ALC centers)... Corporate author: General Electric Co, Cincinnati, Ohio, Aircraft Engine Group.

  1. An effective ostrich oil bleaching technique using peroxide value as an indicator.

    PubMed

    Palanisamy, Uma Devi; Sivanathan, Muniswaran; Radhakrishnan, Ammu Kutty; Haleagrahara, Nagaraja; Subramaniam, Thavamanithevi; Chiew, Gan Seng

    2011-07-05

    Ostrich oil has been used extensively in the cosmetic and pharmaceutical industries. However, rancidity causes undesirable chemical changes in flavour, colour, odour and nutritional value. Bleaching is an important process in refining ostrich oil. Bleaching refers to the removal of certain minor constituents (colour pigments, free fatty acids, peroxides, odour and non-fatty materials) from crude fats and oils to yield purified glycerides. There is a need to optimize the bleaching process of crude ostrich oil prior to its use for therapeutic purposes. The objective of our study was to establish an effective method to bleach ostrich oil using peroxide value as an indicator of refinement. In our study, we showed that natural earth clay was better than bentonite and acid-activated clay for bleaching ostrich oil. It was also found that a 1-hour incubation at 150 °C was sufficient to lower the peroxide value by 90%. In addition, the nitrogen trap technique in the bleaching process was as effective as the continuous nitrogen flow technique, and as such is the recommended technique due to its cost effectiveness.

  2. Effective gene prediction by high resolution frequency estimator based on least-norm solution technique

    PubMed Central

    2014-01-01

    The linear algebraic concept of subspace plays a significant role in recent spectrum estimation techniques. In this article, the authors utilize the noise subspace concept to find hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to identify protein-coding regions in DNA accurately is rising. Several DNA feature extraction techniques spanning various fields have emerged in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not share this feature. One of the most important subspace-based spectrum analysis techniques is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions while completely eliminating background noise. The proposed method is compared with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, on several genes from various organisms, and the results show that it is a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
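    The period-3 property the abstract relies on can be demonstrated with a plain DFT (the periodogram-style baseline the paper compares against, not the authors' least-norm estimator). The sequences below are synthetic stand-ins, not genes from the study:

```python
import numpy as np

def period3_power(seq):
    """Spectral power at the period-3 frequency: map the sequence to
    four binary indicator signals (one per base), and sum their DFT
    power at bin k = N/3. Coding regions show a sharp peak here."""
    n = len(seq)
    k = n // 3                                   # the period-3 bin
    total = 0.0
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])
        total += np.abs(np.fft.fft(u)[k]) ** 2
    return total / n

rng = np.random.default_rng(4)
# Synthetic sequences: a strong 3-base repeat standing in for a coding
# region, versus a uniformly random "non-coding" stretch.
coding = "ATG" * 100
noncoding = "".join(rng.choice(list("ACGT"), size=300))
print(period3_power(coding) > period3_power(noncoding))
```

    Subspace methods such as the least-norm estimator sharpen exactly this contrast, suppressing the background spectrum that a raw periodogram leaves in place.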

  3. The effects of perineal management techniques on labor complications

    PubMed Central

    Fahami, Fariba; Shokoohi, Zohreh; Kianpour, Mariam

    2012-01-01

    Background: Many women suffer perineal trauma during normal vaginal delivery. Perineal trauma is mainly associated with pain and complications after childbirth, and perineal management techniques can play a significant role in reducing it. This study aimed to compare the effects of three perineal management techniques (the hands-off technique, the Ritgen maneuver, and perineal massage using a lubricant during delivery) on labor complications. Materials and Methods: This quasi-experimental clinical trial was conducted on 99 primiparous women who were referred to Daran Hospital, Isfahan, Iran for normal vaginal delivery in 2009. The subjects were selected by convenience sampling and randomly assigned to three groups: Ritgen maneuver, hands-off technique, and perineal massage with lubricant. A questionnaire was used to record the demographic characteristics of the participants and complications after birth; the short form of the McGill Pain Questionnaire and the visual analogue scale for pain were also employed. The incidence and degree of perineal tears were evaluated immediately after delivery, and the incidence and severity of perineal pain were assessed 24 hours and 6 weeks after delivery. Findings: In the Ritgen maneuver group, the frequency of tears, the relative frequency of tear degrees, the severity of perineal pain 24 hours after delivery, and the frequency and severity of perineal pain 6 weeks after delivery differed significantly from those of the other two methods. Conclusions: The hands-off technique during delivery of the neonate's head was associated with fewer complications after delivery, and performed even better than perineal massage. PMID:23493441

  4. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    NASA Astrophysics Data System (ADS)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  5. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.

  6. TOPICAL REVIEW: Human soft tissue analysis using x-ray or gamma-ray techniques

    NASA Astrophysics Data System (ADS)

    Theodorakou, C.; Farquharson, M. J.

    2008-06-01

This topical review describes the x-ray techniques used for human soft tissue analysis. X-ray techniques have been applied to human soft tissue characterization, and interesting results have been presented over the last few decades. The motivation behind such studies is to improve patient outcomes by using the data obtained to better understand disease processes and improve diagnosis. An overview of the theoretical background as well as a complete set of references is presented. For each study, a brief summary of the methodology and results is given. The x-ray techniques include x-ray diffraction, x-ray fluorescence, Compton scattering, the Compton-to-coherent scattering ratio and attenuation measurements. The soft tissues that have been classified using x-rays or gamma rays include brain, breast, colon, fat, kidney, liver, lung, muscle, prostate, skin, thyroid and uterus.

  7. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and underwent RNA isolation (technique 1) or were placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no-RNA-extraction method provided quantitative PCR data similar to those of the RNA extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled); the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no-RNA-extraction technique was also tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.
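Fold changes like the 13.9-fold result reported here are conventionally derived from cycle-threshold (Ct) data via the 2^-ΔΔCt method. A minimal sketch with illustrative Ct values (not data from the study):

```python
def fold_change(ct_target_exp, ct_ref_exp, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt method: normalize the target
    gene to a reference gene within each condition (dCt), difference the
    conditions (ddCt), and assume ~100% amplification efficiency
    (one doubling per cycle)."""
    d_ct_exp = ct_target_exp - ct_ref_exp    # dCt, exercised sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl # dCt, control sample
    return 2.0 ** -(d_ct_exp - d_ct_ctrl)

# Illustrative Ct values: the target crosses threshold ~3.8 cycles
# earlier after exercise, relative to the reference gene.
print(round(fold_change(22.2, 18.0, 26.0, 18.0), 1))  # -> 13.9
```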

  8. Improving Students' Learning With Effective Learning Techniques: Promising Directions From Cognitive and Educational Psychology.

    PubMed

    Dunlosky, John; Rawson, Katherine A; Marsh, Elizabeth J; Nathan, Mitchell J; Willingham, Daniel T

    2013-01-01

    Many students are being left behind by an educational system that some people believe is in crisis. Improving educational outcomes will require efforts on many fronts, but a central premise of this monograph is that one part of a solution involves helping students to better regulate their learning through the use of effective learning techniques. Fortunately, cognitive and educational psychologists have been developing and evaluating easy-to-use learning techniques that could help students achieve their learning goals. In this monograph, we discuss 10 learning techniques in detail and offer recommendations about their relative utility. We selected techniques that were expected to be relatively easy to use and hence could be adopted by many students. Also, some techniques (e.g., highlighting and rereading) were selected because students report relying heavily on them, which makes it especially important to examine how well they work. The techniques include elaborative interrogation, self-explanation, summarization, highlighting (or underlining), the keyword mnemonic, imagery use for text learning, rereading, practice testing, distributed practice, and interleaved practice. To offer recommendations about the relative utility of these techniques, we evaluated whether their benefits generalize across four categories of variables: learning conditions, student characteristics, materials, and criterion tasks. Learning conditions include aspects of the learning environment in which the technique is implemented, such as whether a student studies alone or with a group. Student characteristics include variables such as age, ability, and level of prior knowledge. Materials vary from simple concepts to mathematical problems to complicated science texts. Criterion tasks include different outcome measures that are relevant to student achievement, such as those tapping memory, problem solving, and comprehension. 

  9. An Effective Technique for Endoscopic Resection of Advanced Stage Angiofibroma

    PubMed Central

    Mohammadi Ardehali, Mojtaba; Samimi, Seyyed Hadi; Bakhshaee, Mehdi

    2014-01-01

Introduction: In recent years, the surgical management of angiofibroma has been greatly influenced by the use of endoscopic techniques. However, large tumors that extend into difficult anatomic sites present major challenges for management by either endoscopy or an open-surgery approach, and call for new techniques for complete en bloc resection. Materials and Methods: In a prospective observational study we developed an endoscopic transnasal technique for the resection of angiofibroma via pushing and pulling the mass with tampons soaked in 1/100,000 adrenaline. Thirty-two patients were treated using this endoscopic technique over 7 years. The mean follow-up period was 36 months. The main outcomes measured were tumor staging, average blood loss, complications, length of hospitalization, and residual and/or recurrence rate of the tumor. Results: According to the Radkowski staging, 23, 5, and 4 patients were at stage IIC, IIIA, and IIIB, respectively. Twenty-five patients were operated on exclusively via transnasal endoscopy while 7 patients were managed using endoscopy-assisted open-surgery techniques. Mean blood loss was 1261 ± 893 cc. The recurrence rate was 21.88% (7 cases) at two years following surgery. Mean hospitalization time was 3.56 ± 0.6 days. Conclusion: Using this effective technique, endoscopic removal of more highly advanced angiofibroma is possible. Better visualization, less intraoperative blood loss, lower rates of complication and recurrence, and shorter hospitalization time are some of the advantages. PMID:24505571

  10. Electron-Beam-Induced Deposition as a Technique for Analysis of Precursor Molecule Diffusion Barriers and Prefactors.

    PubMed

    Cullen, Jared; Lobo, Charlene J; Ford, Michael J; Toth, Milos

    2015-09-30

    Electron-beam-induced deposition (EBID) is a direct-write chemical vapor deposition technique in which an electron beam is used for precursor dissociation. Here we show that Arrhenius analysis of the deposition rates of nanostructures grown by EBID can be used to deduce the diffusion energies and corresponding preexponential factors of EBID precursor molecules. We explain the limitations of this approach, define growth conditions needed to minimize errors, and explain why the errors increase systematically as EBID parameters diverge from ideal growth conditions. Under suitable deposition conditions, EBID can be used as a localized technique for analysis of adsorption barriers and prefactors.
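The Arrhenius analysis described amounts to fitting ln(rate) against 1/(k_B·T), reading the diffusion energy from the slope and the preexponential factor from the intercept. A numpy sketch on synthetic rates; the energy and prefactor values are illustrative, not measured EBID data:

```python
import numpy as np

k_B = 8.617e-5  # Boltzmann constant, eV/K

# Synthetic deposition rates following rate = A * exp(-E / (k_B * T)),
# standing in for rates measured from EBID-grown nanostructures.
E_true, A_true = 0.35, 1.0e7               # illustrative values
T = np.linspace(280.0, 400.0, 8)           # substrate temperatures, K
rate = A_true * np.exp(-E_true / (k_B * T))

# Linearize: ln(rate) = ln(A) - E * 1/(k_B*T), then fit a straight line.
slope, intercept = np.polyfit(1.0 / (k_B * T), np.log(rate), 1)
E_fit, A_fit = -slope, float(np.exp(intercept))
print(f"E = {E_fit:.3f} eV, A = {A_fit:.2e}")
```

With real data, the systematic errors the authors describe would appear as curvature in this Arrhenius plot rather than as a clean straight line.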

  11. Effect of workload setting on propulsion technique in handrim wheelchair propulsion.

    PubMed

    van Drongelen, Stefan; Arnet, Ursina; Veeger, Dirkjan H E J; van der Woude, Lucas H V

    2013-03-01

To investigate the influence of workload setting (speed at constant power, method to impose power) on the propulsion technique (i.e. force and timing characteristics) in handrim wheelchair propulsion. Twelve able-bodied men participated in this study. External forces were measured during handrim wheelchair propulsion on a motor-driven treadmill at different velocities and constant power output (to test the forced effect of speed) and at power outputs imposed by incline vs. pulley system (to test the effect of the method to impose power). Outcome measures were the force and timing variables of the propulsion technique. The fraction of effective force (FEF) and timing variables showed significant differences between the speed conditions when propelling at the same power output (p < 0.01). Push time was reduced while push angle increased. The method to impose power showed only slight differences in the timing variables, but not in the force variables. Researchers and clinicians must be aware of testing and evaluation conditions that may differently affect propulsion technique parameters despite an overall constant power output. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.

  12. Spectroscopic analysis of solar and cosmic X-ray spectra. 1: The nature of cosmic X-ray spectra and proposed analytical techniques

    NASA Technical Reports Server (NTRS)

    Walker, A. B. C., Jr.

    1975-01-01

    Techniques for the study of the solar corona are reviewed as an introduction to a discussion of modifications required for the study of cosmic sources. Spectroscopic analysis of individual sources and the interstellar medium is considered. The latter was studied via analysis of its effect on the spectra of selected individual sources. The effects of various characteristics of the ISM, including the presence of grains, molecules, and ionization, are first discussed, and the development of ISM models is described. The expected spectral structure of individual cosmic sources is then reviewed with emphasis on supernovae remnants and binary X-ray sources. The observational and analytical requirements imposed by the characteristics of these sources are identified, and prospects for the analysis of abundances and the study of physical parameters within them are assessed. Prospects for the spectroscopic study of other classes of X-ray sources are also discussed.

  13. Spartan service module finite element modeling technique and analysis

    NASA Technical Reports Server (NTRS)

    Lindenmoyer, A. J.

    1985-01-01

    Sounding rockets have served as a relatively inexpensive and easy method of carrying experiments into the upper atmosphere. Limited observation time and pointing capabilities suggested the development of a new sounding rocket type carrier compatible with NASA's Space Transportation System. This concept evolved into the Spartan program, now credited with a successful Spartan 101 mission launched in June 1985. The next series of Spartans will use a service module primary structure. This newly designed reusable and universal component in the Spartan carrier system required thorough analysis and evaluation for flight certification. Using advanced finite element modeling techniques, the structure was analyzed and determined acceptable by meeting strict design goals and will be tested for verification of the analytical results.

  14. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

In order to develop image processing methods that are widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window-resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron- and nano-scale information which is characteristic of each rock type and its history. These results could be used for mineral identification and studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
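Rq, the root-mean-square roughness parameter cited above, is the RMS deviation of the measured heights from their mean. A minimal sketch on a synthetic height map; real input would be the CSI-derived surface:

```python
import numpy as np

def rms_roughness(heights):
    """Rq: root-mean-square deviation of surface heights from the mean plane."""
    h = np.asarray(heights, dtype=float)
    return float(np.sqrt(np.mean((h - h.mean()) ** 2)))

# Synthetic 64x64 height map (micrometres) standing in for the
# fringe-derived surface data produced by the CSI scan.
rng = np.random.default_rng(0)
surface = 0.5 * rng.standard_normal((64, 64))
print(f"Rq = {rms_roughness(surface):.3f} um")
```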

  15. Comparative Analysis of RF Emission Based Fingerprinting Techniques for ZigBee Device Classification

    DTIC Science & Technology

quantify the differences in various RF fingerprinting techniques via comparative analysis of MDA/ML classification results. The findings herein demonstrate...correct classification rates followed by COR-DNA and then RF-DNA in most test cases and especially in low Eb/N0 ranges, where ZigBee is designed to operate.

  16. Assessing the effect of different operation techniques on postoperative duplex ultrasound quality after carotid endarterectomy.

    PubMed

    Grambow, E; Heller, T; Wieneke, P; Weiß, C; Klar, E; Weinrich, M

    2018-01-01

Duplex ultrasound is the first choice for diagnosis and surveillance of stenoses of the internal carotid arteries before and even after surgery. The quality of duplex ultrasound is therefore crucial for investigating these vascular pathologies. The aim of this study was to evaluate whether different surgical techniques affect the postoperative quality of duplex ultrasound. Between January and May 2015, duplex ultrasound of the cervical vessels was performed in 75 patients who had undergone unilateral endarterectomy of the internal carotid artery at our department between 2006 and 2012. The non-operated contralateral side served as a control. Study groups were defined by the surgical techniques of eversion endarterectomy or thrombendarterectomy with patch plasty, using different patch materials and/or a haemostatic sealant. Duplex ultrasound analysis included acoustic impedance, extinction of ultrasound, thickness of skin and individual anatomic aspects of the patients. Carotid endarterectomy itself reduced intravascular grey levels and skin thickness and increased extinction of duplex ultrasound when compared with the non-operated side of the neck. In contrast, neither the chosen operative technique nor the use of different patch materials or the application of a haemostatic sealant showed an effect in this regard. Whereas carotid endarterectomy per se worsens the quality of postoperative duplex ultrasound, the different surgical techniques analysed, as well as the patches used and the application of a haemostatic sealant, can be assumed to be equal regarding the quality of postoperative ultrasound.

  17. Edge compression techniques for visualization of dense directed graphs.

    PubMed

    Dwyer, Tim; Henry Riche, Nathalie; Marriott, Kim; Mears, Christopher

    2013-12-01

We explore the effectiveness of visualizing dense directed graphs by replacing individual edges with edges connected to 'modules' (groups of nodes) such that the new edges imply aggregate connectivity. We only consider techniques that offer a lossless compression: that is, where the entire graph can still be read from the compressed version. The techniques considered are: a simple grouping of nodes with identical neighbor sets; Modular Decomposition, which permits internal structure in modules and allows them to be nested; and Power Graph Analysis, which further allows edges to cross module boundaries. These techniques all have the same goal--to compress the set of edges that need to be rendered to fully convey connectivity--but each successive relaxation of the module definition permits fewer edges to be drawn in the rendered graph. Each successive technique also, we hypothesize, requires a higher degree of mental effort to interpret. We test this hypothesized trade-off with two studies involving human participants. For Power Graph Analysis we propose a novel optimal technique based on constraint programming. This enables us to explore the parameter space for the technique more precisely than could be achieved with a heuristic. Although applicable to many domains, we are motivated by--and discuss in particular--the application to software dependency analysis.
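The simplest of the three compressions, grouping nodes with identical neighbor sets, can be sketched directly; the graph and labels below are illustrative:

```python
from collections import defaultdict

def group_identical_neighbors(edges):
    """Lossless grouping: nodes with identical successor AND predecessor
    sets collapse into one module; an edge to the module implies an edge
    to every member, so the original edge set stays recoverable."""
    succ, pred, nodes = defaultdict(set), defaultdict(set), set()
    for u, v in edges:
        succ[u].add(v)
        pred[v].add(u)
        nodes.update((u, v))
    groups = defaultdict(list)
    for n in sorted(nodes):
        groups[(frozenset(succ[n]), frozenset(pred[n]))].append(n)
    return list(groups.values())

# 'a' and 'b' share successors {c, d} and have no predecessors, so they
# merge into one module; likewise 'c' and 'd' on the receiving side.
print(group_identical_neighbors([("a", "c"), ("a", "d"), ("b", "c"), ("b", "d")]))
# -> [['a', 'b'], ['c', 'd']]
```

Here four edges compress to a single module-to-module edge; Modular Decomposition and Power Graph Analysis relax this exact-signature requirement to merge more nodes.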

  18. An Analysis of Nondestructive Evaluation Techniques for Polymer Matrix Composite Sandwich Materials

    NASA Technical Reports Server (NTRS)

    Cosgriff, Laura M.; Roberts, Gary D.; Binienda, Wieslaw K.; Zheng, Diahua; Averbeck, Timothy; Roth, Donald J.; Jeanneau, Philippe

    2006-01-01

Structural sandwich materials composed of triaxially braided polymer matrix composite material face sheets sandwiching a foam core are being utilized for applications including aerospace components and recreational equipment. Since full scale components are being made from these sandwich materials, it is necessary to develop proper inspection practices for their manufacture and in-field use. Specifically, nondestructive evaluation (NDE) techniques need to be investigated for analysis of components made from these materials. Hockey blades made from sandwich materials and a flat sandwich sample were examined with multiple NDE techniques including thermographic, radiographic, and shearographic methods to investigate damage induced in the blades and flat panel components. Hockey blades used during actual play and a flat polymer matrix composite sandwich sample with damage inserted into the foam core were investigated with each technique. NDE images from the samples were presented and discussed. Structural elements within each blade were observed with radiographic imaging. Damaged regions and some structural elements of the hockey blades were identified with thermographic imaging. Structural elements, damaged regions, and other material variations were detected in the hockey blades with shearography. Each technique's advantages and disadvantages were considered in making recommendations for inspection of components made from these types of materials.

  19. High resolution frequency analysis techniques with application to the redshift experiment

    NASA Technical Reports Server (NTRS)

    Decher, R.; Teuber, D.

    1975-01-01

High resolution frequency analysis methods, with application to the gravitational probe redshift experiment, are discussed. For this experiment a resolution of 0.00001 Hz is required to measure a slowly varying, low frequency signal of approximately 1 Hz. Major building blocks include the fast Fourier transform, discrete Fourier transform, Lagrange interpolation, golden section search, and adaptive matched filter techniques. Accuracy, resolution, and computer effort of these methods are investigated, including test runs on an IBM 360/65 computer.
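Two of the building blocks listed, the discrete Fourier transform and golden-section search, combine naturally: a coarse FFT locates the spectral peak, then a golden-section search on the continuous-frequency DFT magnitude refines it far below the FFT bin spacing. A sketch under assumed signal parameters (sample rate, record length and frequency are illustrative):

```python
import numpy as np

def dft_mag(x, t, f):
    """|DFT| of samples x at an arbitrary (non-bin) frequency f."""
    return abs(np.sum(x * np.exp(-2j * np.pi * f * t)))

def golden_refine(fun, lo, hi, tol=1e-7):
    """Golden-section search for the maximum of a unimodal fun on [lo, hi]."""
    g = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - g * (b - a), a + g * (b - a)
    while b - a > tol:
        if fun(c) > fun(d):
            b, d = d, c
            c = b - g * (b - a)
        else:
            a, c = c, d
            d = a + g * (b - a)
    return 0.5 * (a + b)

fs, dur = 8.0, 1000.0                  # sample rate (Hz), record length (s)
t = np.arange(0.0, dur, 1.0 / fs)
f_true = 1.0000123                     # ~1 Hz tone, offset well below a bin
x = np.sin(2.0 * np.pi * f_true * t)

# Coarse peak from an FFT bin, then refine between the adjacent bins.
spec = np.abs(np.fft.rfft(x))
k = int(np.argmax(spec[1:]) + 1)       # skip the DC bin
df = fs / len(x)                       # bin spacing: 0.001 Hz here
f_est = golden_refine(lambda f: dft_mag(x, t, f), (k - 1) * df, (k + 1) * df)
print(f"estimated {f_est:.7f} Hz")
```

The refinement resolves the tone to well under the 0.001 Hz bin spacing, illustrating how sub-bin resolution of the kind the experiment required can be reached.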

  20. Coupling Analysis of Heat Island Effects, Vegetation Coverage and Urban Flood in Wuhan

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Liu, Q.; Fan, W.; Wang, G.

    2018-04-01

In this paper, satellite imagery, remote sensing techniques and geographic information system techniques form the main technical basis, with comprehensive spectral analysis and visual interpretation as the main methods. We use GF-1 and Landsat8 remote sensing satellite images of Wuhan as the data source, from which we extract the vegetation distribution, the urban heat island relative intensity distribution map and the urban flood submergence range. Based on the extracted information, through spatial analysis and regression analysis, we find correlations among the heat island effect, vegetation coverage and urban flood. The results show a high degree of overlap between the urban heat island and urban flood areas. Urban heat island areas contain buildings with little vegetation cover, which may be one of the reasons for the local heavy rainstorms. Furthermore, urban heat island intensity has a negative correlation with vegetation coverage, and the heat island effect can be alleviated by vegetation to a certain extent. The new industrial zones and commercial areas under construction throughout the city, whose land surfaces are bare or have low vegetation coverage, can therefore easily form new heat islands.

  1. Laser techniques in conservation in Europe

    NASA Astrophysics Data System (ADS)

    Salimbeni, Renzo

    2005-06-01

The state of the art of laser techniques employed in conservation of cultural heritage is continuously advancing in Europe. Many research projects organised at the European level have contributed to this achievement, complementing development carried out at the national level. The COST Action G7 has played a unique role since the year 2000 in promoting experimentation, comparing experiences and disseminating best practices. This role has been particularly effective in monitoring the results of the many short-term research projects completed over the G7 Action's lifetime. Now that several laser cleaning techniques have been followed and evaluated, a clear picture emerges: an evolution of the systems, a specialization of the cleaning task, and the achievement of side-effect-free procedures. The validation of these advanced cleaning techniques has been extensive and widespread in many European countries, especially for stone and metals. Laser-based diagnostics have also specialised their tasks toward material analysis, defect detection and multidimensional documentation. Laser and optical methods successfully monitor deterioration effects. In many European countries, interdisciplinary networks are managing the experimentation of these techniques, giving them a sound scientific approach but also providing technology transfer to end-users. In this way, appreciation for these techniques is growing in all the conservation institutions involved at the national level, disseminating a positive evaluation of the benefits provided by laser techniques in conservation. Several laser systems have become products for the activity of professional restorers, and their increasing sales demonstrate growing utilisation throughout Europe.

  2. Geometric parameter analysis to predetermine optimal radiosurgery technique for the treatment of arteriovenous malformation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mestrovic, Ante; Clark, Brenda G.; Department of Medical Physics, British Columbia Cancer Agency, Vancouver, British Columbia

    2005-11-01

Purpose: To develop a method of predicting the values of dose distribution parameters of different radiosurgery techniques for treatment of arteriovenous malformation (AVM) based on internal geometric parameters. Methods and Materials: For each of 18 previously treated AVM patients, four treatment plans were created: circular collimator arcs, dynamic conformal arcs, fixed conformal fields, and intensity-modulated radiosurgery. An algorithm was developed to characterize the target and critical structure shape complexity and the position of the critical structures with respect to the target. Multiple regression was employed to establish the correlation between the internal geometric parameters and the dose distribution for different treatment techniques. The results from the model were applied to predict the dosimetric outcomes of different radiosurgery techniques and select the optimal radiosurgery technique for a number of AVM patients. Results: Several internal geometric parameters showing statistically significant correlation (p < 0.05) with the treatment planning results for each technique were identified. The target volume and the average minimum distance between the target and the critical structures were the most effective predictors for normal tissue dose distribution. The structure overlap volume with the target and the mean distance between the target and the critical structure were the most effective predictors for critical structure dose distribution. The predicted values of dose distribution parameters of different radiosurgery techniques were in close agreement with the original data. Conclusions: A statistical model has been described that successfully predicts the values of dose distribution parameters of different radiosurgery techniques and may be used to predetermine the optimal technique on a patient-to-patient basis.
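The multiple-regression step described can be sketched with ordinary least squares; the predictors, coefficients and patient data below are illustrative stand-ins, not values from the study:

```python
import numpy as np

# Hypothetical predictors per patient: target volume (cc) and mean
# target-to-critical-structure distance (mm) -- two of the geometric
# parameters the study found predictive.
X = np.array([[1.2, 8.0], [3.5, 5.1], [0.8, 9.6], [2.9, 4.2], [4.1, 3.0]])
# Synthetic outcome generated from a known linear relation, standing in
# for a normal-tissue dose-distribution parameter.
y = 2.0 + 3.0 * X[:, 0] + 0.6 * X[:, 1]

# Ordinary least squares with an intercept column; the fitted model can
# then score a new patient's geometry before planning.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
new_patient = np.array([1.0, 2.0, 6.0])   # [intercept, volume, distance]
print(coef.round(3), float(new_patient @ coef))
```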

  3. Characteristic vector analysis of inflection ratio spectra: New technique for analysis of ocean color data

    NASA Technical Reports Server (NTRS)

    Grew, G. W.

    1985-01-01

Characteristic vector analysis applied to inflection ratio spectra is a new approach to analyzing spectral data. The technique, applied to remote data collected with the multichannel ocean color sensor (MOCS), a passive sensor, simultaneously maps the distribution of two different phytopigments, chlorophyll a and phycoerythrin, in the ocean. The data set presented is from a series of warm core ring missions conducted during 1982. The data compare favorably with a theoretical model and with data collected on the same mission by an active sensor, the airborne oceanographic lidar (AOL).
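Characteristic vector analysis is, in essence, an eigenvector decomposition of the band-to-band covariance of the measured spectra. A hedged numpy sketch on synthetic two-pigment mixtures; the basis shapes, band centres and widths are illustrative, not MOCS values:

```python
import numpy as np

def characteristic_vectors(spectra, n=3):
    """Leading eigenvalues/eigenvectors of the spectral covariance
    matrix -- the 'characteristic vectors' onto which each measured
    spectrum is decomposed."""
    X = np.asarray(spectra, float)
    cov = np.cov(X - X.mean(axis=0), rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1][:n]
    return vals[order], vecs[:, order]

# Synthetic spectra: random mixtures of two Gaussian basis shapes
# standing in for chlorophyll-a and phycoerythrin signatures.
rng = np.random.default_rng(1)
bands = np.linspace(400.0, 700.0, 20)
basis = np.vstack([np.exp(-((bands - 440.0) / 30.0) ** 2),
                   np.exp(-((bands - 560.0) / 30.0) ** 2)])
spectra = rng.random((50, 2)) @ basis + 0.01 * rng.standard_normal((50, 20))
vals, vecs = characteristic_vectors(spectra)
print(vals.round(4))  # two dominant eigenvalues, then noise-level values
```

With two underlying pigments, two eigenvalues dominate and the remaining ones sit at the noise floor, which is what lets the method map two pigment distributions simultaneously.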

  4. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System

    PubMed Central

    Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning, to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligence techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligence techniques in leukaemia and to categorize these studies based on leukaemia type and also the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice. PMID:28812013

  6. Evidence for the effectiveness of Alexander Technique lessons in medical and health-related conditions: a systematic review.

    PubMed

    Woodman, J P; Moore, N R

    2012-01-01

    Complementary medicine and alternative approaches to chronic and intractable health conditions are increasingly being used, and require critical evaluation. The aim of this review was to systematically evaluate available evidence for the effectiveness and safety of instruction in the Alexander Technique in health-related conditions. PUBMED, EMBASE, PSYCHINFO, ISI Web-of-Knowledge, AMED, CINHAL-plus, Cochrane library and Evidence-based Medicine Reviews were searched to July 2011. Inclusion criteria were prospective studies evaluating Alexander Technique instruction (individual lessons or group delivery) as an intervention for any medical indication/health-related condition. Studies were categorised and data extracted on study population, randomisation method, nature of intervention and control, practitioner characteristics, validity and reliability of outcome measures, completeness of follow-up and statistical analyses.   Of 271 publications identified, 18 were selected: three randomised, controlled trials (RCTs), two controlled non-randomised studies, eight non-controlled studies, four qualitative analyses and one health economic analysis. One well-designed, well-conducted RCT demonstrated that, compared with usual GP care, Alexander Technique lessons led to significant long-term reductions in back pain and incapacity caused by chronic back pain. The results were broadly supported by a smaller, earlier RCT in chronic back pain. The third RCT, a small, well-designed, well-conducted study in individuals with Parkinson's disease, showed a sustained increased ability to carry out everyday activities following Alexander lessons, compared with usual care. The 15 non-RCT studies are also reviewed. Strong evidence exists for the effectiveness of Alexander Technique lessons for chronic back pain and moderate evidence in Parkinson's-associated disability. Preliminary evidence suggests that Alexander Technique lessons may lead to improvements in balance skills in the

  7. Sensitivity analysis for direct and indirect effects in the presence of exposure-induced mediator-outcome confounders

    PubMed Central

    Chiba, Yasutaka

    2014-01-01

    Questions of mediation are often of interest in reasoning about mechanisms, and methods have been developed to address these questions. However, these methods make strong assumptions about the absence of confounding. Even if exposure is randomized, there may be mediator-outcome confounding variables. Inference about direct and indirect effects is particularly challenging if these mediator-outcome confounders are affected by the exposure because in this case these effects are not identified irrespective of whether data is available on these exposure-induced mediator-outcome confounders. In this paper, we provide a sensitivity analysis technique for natural direct and indirect effects that is applicable even if there are mediator-outcome confounders affected by the exposure. We give techniques for both the difference and risk ratio scales and compare the technique to other possible approaches. PMID:25580387

  8. Harmonic versus LigaSure hemostasis technique in thyroid surgery: A meta-analysis

    PubMed Central

    Upadhyaya, Arun; Hu, Tianpeng; Meng, Zhaowei; Li, Xue; He, Xianghui; Tian, Weijun; Jia, Qiang; Tan, Jian

    2016-01-01

    Harmonic scalpel and LigaSure vessel sealing systems have been suggested as options for saving surgical time and reducing postoperative complications. The aim of the present meta-analysis was to compare surgical time, postoperative complications and other parameters between them for the open thyroidectomy procedure. Studies were retrieved from MEDLINE, the Cochrane Library, EMBASE and ISI Web of Science until December 2015. All randomized controlled trials (RCTs) comparing the Harmonic scalpel and LigaSure during open thyroidectomy were selected. Following data extraction, statistical analyses were performed. Among the 24 studies evaluated for eligibility, 7 RCTs with 981 patients were included. The Harmonic scalpel significantly reduced surgical time compared with LigaSure techniques (mean difference, −8.79 min; 95% confidence interval, −15.91 to −1.67; P=0.02). However, no significant difference between the groups was observed for intraoperative blood loss, postoperative blood loss, duration of hospital stay, thyroid weight or postoperative serum calcium level. The present meta-analysis indicated superiority of the Harmonic scalpel over LigaSure hemostasis techniques in open thyroid surgery only in terms of surgical time. PMID:27446546
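The pooled time difference above comes from inverse-variance weighting of per-study effects. A minimal fixed-effect sketch, using made-up mean differences and standard errors rather than the seven RCTs' actual data:

```python
import math

def pooled_mean_difference(diffs, ses):
    """Fixed-effect inverse-variance pooling of per-study mean differences.

    diffs: per-study mean differences (e.g., minutes of operative time saved)
    ses:   corresponding standard errors
    Returns (pooled_md, ci_low, ci_high) with a 95% confidence interval.
    """
    weights = [1.0 / se**2 for se in ses]                      # w_i = 1 / SE_i^2
    pooled = sum(w * d for w, d in zip(weights, diffs)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical per-study mean differences in surgical time (minutes);
# negative values favour the first technique.
md, lo, hi = pooled_mean_difference([-10.0, -6.5, -9.0], [3.0, 2.5, 4.0])
```

A random-effects model (e.g., DerSimonian-Laird) would add a between-study variance term to each weight; the fixed-effect version above is the simplest case.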

  9. Depth resolved compositional analysis of aluminium oxide thin film using non-destructive soft x-ray reflectivity technique

    NASA Astrophysics Data System (ADS)

    Sinha, Mangalika; Modi, Mohammed H.

    2017-10-01

    In-depth compositional analysis of a 240 Å thick aluminium oxide thin film has been carried out using soft x-ray reflectivity (SXR) and x-ray photoelectron spectroscopy (XPS) techniques. The compositional details of the film are estimated by modelling the optical index profile obtained from the SXR measurements over the 60-200 Å wavelength region. The SXR measurements were carried out at the Indus-1 reflectivity beamline. The method suggests that the principal film region comprises Al2O3 and AlOx (x = 1.6) phases, whereas the interface region comprises a mixture of SiO2 and AlOx (x = 1.6). The soft x-ray reflectivity technique combined with XPS measurements explains the compositional details of the principal layer. Since the interface region cannot be analyzed non-destructively with XPS, the SXR technique is a powerful tool for nondestructive compositional analysis of the interface region.

  10. Development and verification of local/global analysis techniques for laminated composites

    NASA Technical Reports Server (NTRS)

    Griffin, O. Hayden, Jr.

    1989-01-01

    Analysis and design methods for laminated composite materials have been the subject of considerable research over the past 20 years, and are currently well developed. In performing the detailed three-dimensional analyses which are often required in proximity to discontinuities, however, analysts often encounter difficulties due to large models. Even with the current availability of powerful computers, the required models are often too large to run, from either a resource or a time standpoint. There are several approaches which can permit such analyses, including substructuring, use of superelements or transition elements, and the global/local approach. This effort is based on the so-called zoom technique for global/local analysis, where a global analysis is run and its results are applied to a smaller region as boundary conditions, in as many iterations as are required to attain an analysis of the desired region. Before beginning the global/local analyses, it was necessary to evaluate the accuracy of the three-dimensional elements currently implemented in the Computational Structural Mechanics (CSM) Testbed. It was also desired to install, using the Experimental Element Capability, a number of displacement formulation elements which have well known behavior when used for analysis of laminated composites.
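The zoom idea — run a global analysis, then reapply its results as boundary conditions on a refined subregion — can be illustrated on a toy 1-D Poisson problem. This is a deliberately simplified stand-in for the Testbed's 3-D laminate models; the grid sizes, domain, and loading are arbitrary:

```python
import numpy as np

def poisson_dirichlet(n, a, b, ua, ub, f=1.0):
    """Solve -u'' = f on [a, b] with u(a)=ua, u(b)=ub, n interior FD points."""
    h = (b - a) / (n + 1)
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    rhs = np.full(n, f)
    rhs[0] += ua / h**2          # fold known boundary values into the RHS
    rhs[-1] += ub / h**2
    x = a + h * np.arange(1, n + 1)
    return x, np.linalg.solve(A, rhs)

# Step 1: global (coarse) analysis over the full domain
xg, ug = poisson_dirichlet(9, 0.0, 1.0, 0.0, 0.0)

# Step 2: "zoom" — interpolate the global solution onto the boundary of a
# smaller region of interest and re-solve there on a much finer grid
a, b = 0.3, 0.7
ua = np.interp(a, xg, ug)
ub = np.interp(b, xg, ug)
xl, ul = poisson_dirichlet(39, a, b, ua, ub)

# Exact solution of -u'' = 1 with homogeneous BCs, for comparison
exact = xl * (1.0 - xl) / 2.0
```

In a real global/local analysis the local solve would be iterated and the interpolated boundary displacements checked for convergence; here one pass suffices.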

  11. Correlative visualization techniques for multidimensional data

    NASA Technical Reports Server (NTRS)

    Treinish, Lloyd A.; Goettsche, Craig

    1989-01-01

    Critical to the understanding of data is the ability to provide pictorial or visual representation of those data, particularly in support of correlative data analysis. Despite the advancement of visualization techniques for scientific data over the last several years, there are still significant problems in bringing today's hardware and software technology into the hands of the typical scientist. For example, computer science domains outside of computer graphics, such as data management, are also required to make visualization effective. Well-defined, flexible mechanisms for data access and management must be combined with rendering algorithms, data transformation, etc., to form a generic visualization pipeline. A generalized approach to data visualization is critical for the correlative analysis of distinct, complex, multidimensional data sets in the space and Earth sciences. Different classes of data representation techniques must be used within such a framework, which can range from simple, static two- and three-dimensional line plots to animation, surface rendering, and volumetric imaging. Static examples of actual data analyses illustrate the importance of an effective pipeline in a data visualization system.

  12. Demonstration of Wavelet Techniques in the Spectral Analysis of Bypass Transition Data

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques; Ashpis, David E.; Sohn, Ki-Hyeon

    1997-01-01

    A number of wavelet-based techniques for the analysis of experimental data are developed and illustrated. A multiscale analysis based on the Mexican hat wavelet is demonstrated as a tool for acquiring physical and quantitative information not obtainable by standard signal analysis methods. Experimental data for the analysis came from simultaneous hot-wire velocity traces in a bypass transition of the boundary layer on a heated flat plate. A pair of traces (two components of velocity) at one location was excerpted. A number of ensemble and conditional statistics related to dominant time scales for energy and momentum transport were calculated. The analysis revealed a lack of energy-dominant time scales inside turbulent spots but identified transport-dominant scales inside spots that account for the largest part of the Reynolds stress. Momentum transport was much more intermittent than were energetic fluctuations. This work is the first step in a continuing study of the spatial evolution of these scale-related statistics, the goal being to apply the multiscale analysis results to improve the modeling of transitional and turbulent industrial flows.
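A minimal numpy sketch of the Mexican-hat multiscale analysis described above, applied to a synthetic intermittent "spot" signal rather than the hot-wire data (the scale widths and the test signal are illustrative choices, not the paper's parameters):

```python
import numpy as np

def ricker(points, a):
    """Mexican hat (Ricker) wavelet on `points` samples with width parameter a."""
    t = np.arange(points) - (points - 1) / 2.0
    norm = 2.0 / (np.sqrt(3.0 * a) * np.pi**0.25)
    return norm * (1.0 - (t / a)**2) * np.exp(-t**2 / (2.0 * a**2))

def cwt_mexican_hat(signal, widths):
    """Continuous wavelet transform: one row of coefficients per scale."""
    out = np.empty((len(widths), len(signal)))
    for i, a in enumerate(widths):
        wav = ricker(min(10 * int(a), len(signal)), a)
        out[i] = np.convolve(signal, wav, mode="same")
    return out

# Synthetic trace: a localized burst of high-frequency activity (a crude
# analogue of a turbulent spot) on an otherwise quiet background
t = np.linspace(0, 1, 1024)
sig = np.where((t > 0.4) & (t < 0.6), np.sin(2 * np.pi * 80 * t), 0.0)
coeffs = cwt_mexican_hat(sig, widths=[2, 4, 8, 16])
# Small-scale wavelet energy concentrates inside the burst, which is the
# kind of scale-localization the multiscale statistics in the paper exploit.
```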

  13. Effects on Hamstring Muscle Extensibility, Muscle Activity, and Balance of Different Stretching Techniques

    PubMed Central

    Lim, Kyoung-Il; Nam, Hyung-Chun; Jung, Kyoung-Sim

    2014-01-01

    [Purpose] The purpose of this study was to investigate the effects of two different stretching techniques on range of motion (ROM), muscle activation, and balance. [Subjects] For the present study, 48 adults with hamstring muscle tightness were recruited and randomly divided into three groups: a static stretching group (n=16), a PNF stretching group (n=16), and a control group (n=16). [Methods] Each stretching technique was applied to the hamstring once. Active knee extension angle, muscle activation during maximum voluntary isometric contraction (MVC), and static balance were measured before and after the application of each stretching technique. [Results] Both the static stretching and PNF stretching groups showed significant increases in knee extension angle compared to the control group. However, there were no significant differences in muscle activation or balance between the groups. [Conclusion] Static stretching and PNF stretching techniques improved ROM without a decrease in muscle activation, but neither exerted statistically significant effects on balance. PMID:24648633

  14. Cost-effectiveness analysis of implants versus autologous perforator flaps using the BREAST-Q.

    PubMed

    Matros, Evan; Albornoz, Claudia R; Razdan, Shantanu N; Mehrara, Babak J; Macadam, Sheina A; Ro, Teresa; McCarthy, Colleen M; Disa, Joseph J; Cordeiro, Peter G; Pusic, Andrea L

    2015-04-01

    Reimbursement has been recognized as a physician barrier to autologous reconstruction. Autologous reconstructions are more expensive than prosthetic reconstructions, but provide greater health-related quality of life. The authors' hypothesis is that autologous tissue reconstructions are cost-effective compared with prosthetic techniques when considering health-related quality of life and patient satisfaction. A cost-effectiveness analysis from the payer perspective, including patient input, was performed for unilateral and bilateral reconstructions with deep inferior epigastric perforator (DIEP) flaps and implants. The effectiveness measure was derived using the BREAST-Q and interpreted as the cost for obtaining 1 year of perfect breast health-related quality-adjusted life-year. Costs were obtained from the 2010 Nationwide Inpatient Sample. The incremental cost-effectiveness ratio was generated. A sensitivity analysis for age and stage at diagnosis was performed. BREAST-Q scores from 309 patients with implants and 217 DIEP flap reconstructions were included. The additional cost for obtaining 1 year of perfect breast-related health for a unilateral DIEP flap compared with implant reconstruction was $11,941. For bilateral DIEP flaps compared with implant reconstructions, the cost for an additional breast health-related quality-adjusted life-year was $28,017. The sensitivity analysis demonstrated that the cost for an additional breast health-related quality-adjusted life-year for DIEP flaps compared with implants was less for younger patients and earlier stage breast cancer. DIEP flaps are cost-effective compared with implants, especially for unilateral reconstructions. Cost-effectiveness of autologous techniques is maximized in women with longer life expectancy. Patient-reported outcomes findings can be incorporated into cost-effectiveness analyses to demonstrate the relative value of reconstructive procedures.
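The incremental cost-effectiveness ratio behind figures like $11,941 per breast health-related QALY is a simple quotient of cost and effectiveness differences. The inputs below are hypothetical placeholders for illustration, not the study's actual cost or BREAST-Q data:

```python
def icer(cost_new, effect_new, cost_ref, effect_ref):
    """Incremental cost-effectiveness ratio: extra dollars spent per extra
    unit of effectiveness (here, one breast health-related QALY)."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Hypothetical inputs: the flap reconstruction costs more than the implant
# but yields a higher BREAST-Q-derived quality-of-life weight.
ratio = icer(cost_new=25000.0, effect_new=0.80,
             cost_ref=15000.0, effect_ref=0.72)
```

A sensitivity analysis, as in the paper, would recompute this ratio while varying inputs such as patient age and disease stage.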

  15. Effects of finishing/polishing techniques on microleakage of resin-modified glass ionomer cement restorations.

    PubMed

    Yap, Adrian U J; Yap, W Y; Yeo, Egwin J C; Tan, Jane W S; Ong, Debbie S B

    2003-01-01

    This study investigated the effect of finishing/polishing techniques on the microleakage of resin-modified glass ionomer restorations. Class V preparations were made on the buccal and lingual/palatal surfaces of freshly extracted teeth. The cavities on each tooth were restored with Fuji II LC (FT [GC]) and Photac-Fil Quick (PF [3M-ESPE]) according to the manufacturers' instructions. Immediately after light-polymerization, gross finishing was done with eight-fluted tungsten carbide burs. The teeth were then randomly divided into four groups, and finishing/polishing was done with one of the following systems: (a) Robot Carbides (RC); (b) Super-Snap system (SS); (c) OneGloss (OG); and (d) CompoSite Polishers (CS). The sample size for each material-finishing/polishing system combination was eight. After finishing/polishing, the teeth were stored in distilled water at 37 degrees C for one week. The root apices were then sealed with acrylic, and two coats of varnish were applied 1 mm beyond the restoration margins. The teeth were subsequently subjected to dye penetration testing (0.5% basic fuchsin), sectioned and scored. Data were analyzed using Kruskal-Wallis and Mann-Whitney U tests at a significance level of 0.05. Regardless of finishing/polishing technique, leakage at dentin margins was significantly greater than at enamel margins for FT. For PF, no significant difference in leakage scores was observed between dentin and enamel, with the exception of finishing/polishing with OG. FT restorations had significantly less enamel and dentin leakage than PF restorations when treated with OG. The effect of finishing/polishing techniques on microleakage was both tissue and material dependent.
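The Kruskal-Wallis omnibus test with Mann-Whitney U follow-ups at the 0.05 level, as used above, can be sketched with SciPy. The ordinal leakage scores below are invented for illustration and do not reproduce the study's data:

```python
from scipy import stats

# Hypothetical ordinal microleakage scores (0 = none ... 3 = severe) for
# restorations finished with three of the four systems; illustrative only.
rc = [0, 1, 0, 1, 1, 0, 2, 1]   # Robot Carbides
ss = [1, 1, 2, 1, 0, 2, 1, 1]   # Super-Snap
og = [2, 3, 2, 3, 1, 2, 3, 2]   # OneGloss

# Omnibus nonparametric test across the groups
h_stat, p_kw = stats.kruskal(rc, ss, og)

# Pairwise follow-up between two systems (two-sided, alpha = 0.05)
u_stat, p_mw = stats.mannwhitneyu(rc, og, alternative="two-sided")
```

With several pairwise comparisons, a multiplicity correction (e.g., Bonferroni on the Mann-Whitney p-values) would normally be applied.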

  16. Effects of myofascial release leg pull and sagittal plane isometric contract-relax techniques on passive straight-leg raise angle.

    PubMed

    Hanten, W P; Chandler, S D

    1994-09-01

    Experimental evidence does not currently exist to support the claims of clinical effectiveness for myofascial release techniques, which presents an obvious need to document their effects. The purpose of this study was to compare the effects of two techniques, sagittal plane isometric contract-relax and myofascial release leg pull, for increasing hip flexion range of motion (ROM) as measured by the angle of passive straight-leg raise. Seventy-five nondisabled female subjects 18-29 years of age were randomly assigned to contract-relax, leg pull, or control groups. Pretest hip flexion ROM was measured for each subject's right hip with a passive straight-leg raise test using a fluid-filled goniometer. Subjects in the treatment groups received either contract-relax or leg pull treatment applied to the right lower extremity; subjects in the control group remained supine quietly for 5 minutes. Following treatment, posttest straight-leg raise measurements were performed. A one-way analysis of variance followed by a Newman-Keuls post hoc comparison of mean gain scores showed that subjects receiving contract-relax treatment increased their ROM significantly more than those who received leg pull treatment, and that the increases in ROM in both treatment groups were significantly greater than that of the control group. The results suggest that while both contract-relax and leg pull techniques can significantly increase hip flexion ROM in normal subjects, contract-relax treatment may be more effective and efficient than leg pull treatment.
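The one-way ANOVA on gain scores used above is a one-liner in SciPy. The gain-score values below are fabricated for illustration, and the Newman-Keuls post hoc step is omitted (SciPy does not provide it; a Tukey HSD or pairwise t-tests with correction would be the usual substitute):

```python
from scipy import stats

# Hypothetical ROM gain scores (degrees of hip flexion, post minus pre)
contract_relax = [12, 10, 14, 9, 11, 13]
leg_pull       = [7, 6, 8, 5, 9, 7]
control        = [1, 0, 2, -1, 1, 0]

# One-way ANOVA across the three groups
f_stat, p_anova = stats.f_oneway(contract_relax, leg_pull, control)
```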

  17. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Turbine Bladed Disks

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD-generated flow field into its harmonic components and then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications raise the question of whether frequency domain analysis captures the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis have therefore been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD results used in the J2X engine program. The results showed that the transient analysis responses were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.
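The standard decomposition of a periodic forcing into harmonics of the vane-passing frequency amounts to a Fourier transform of the pressure history over one revolution. A sketch with a synthetic 13-vane forcing (not CFD output; the vane count and amplitudes are illustrative):

```python
import numpy as np

# Synthetic periodic forcing at a blade: a mean load plus the vane-passing
# harmonic (13 upstream vanes assumed) and its first multiple.
n_rev = 256                                   # samples per revolution
theta = np.linspace(0, 2 * np.pi, n_rev, endpoint=False)
forcing = 1.0 + 0.8 * np.sin(13 * theta) + 0.3 * np.sin(26 * theta)

# One-sided harmonic amplitudes per engine order
spec = np.fft.rfft(forcing) / n_rev
amps = 2.0 * np.abs(spec)
amps[0] /= 2.0                                # DC term is not doubled
```

Each recovered amplitude would then drive a separate frequency response analysis at the corresponding engine-order frequency; the paper's point is that "messy" excitation content between these clean harmonics is lost in this step.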

  18. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    NASA Technical Reports Server (NTRS)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations, and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
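MACSYMA is long superseded, but the same differentiate-integrate-emit-Fortran workflow can be sketched with SymPy. The monomial Ritz basis and unit bar properties below are simplifications for illustration, not the paper's laminated elliptic plate formulation:

```python
import sympy as sp

x = sp.symbols("x")
# Toy polynomial Ritz basis on [0, 1] satisfying phi(0) = 0 (clamped end)
basis = [x**(i + 1) for i in range(3)]

# Symbolic mass and stiffness coefficients from Hamilton's principle for a
# uniform bar with unit properties: M_ij = ∫ φi φj dx, K_ij = ∫ φi' φj' dx
M = sp.Matrix(3, 3, lambda i, j: sp.integrate(basis[i] * basis[j], (x, 0, 1)))
K = sp.Matrix(3, 3, lambda i, j: sp.integrate(sp.diff(basis[i], x)
                                              * sp.diff(basis[j], x), (x, 0, 1)))

# Emit Fortran for one coefficient, mirroring MACSYMA's code generation
fortran_src = sp.fcode(M[1, 2], assign_to="M23")
```

Solving the generalized eigenproblem K a = ω² M a for these matrices would then give the Ritz estimates of the natural frequencies.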

  19. Subspace techniques to remove artifacts from EEG: a quantitative analysis.

    PubMed

    Teixeira, A R; Tome, A M; Lang, E W; Martins da Silva, A

    2008-01-01

    In this work we discuss and apply projective subspace techniques to both multichannel and single-channel recordings. The single-channel approach is based on singular spectrum analysis (SSA), and the multichannel approach uses the extended infomax algorithm implemented in the open-source toolbox EEGLAB. Both approaches are evaluated using artificial mixtures of a set of selected EEG signals. The latter were selected visually so that the dominant activity was one of the characteristic bands of an electroencephalogram (EEG). The evaluation is performed in both the time and frequency domains, using correlation coefficients and the coherence function, respectively.
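The single-channel SSA approach — embed the signal in a trajectory matrix, truncate its SVD, and diagonal-average back to a series — can be sketched as follows. The window length, rank, and synthetic 10 Hz "alpha-like" signal are illustrative choices, not the paper's settings:

```python
import numpy as np

def ssa_denoise(x, window, rank):
    """Singular spectrum analysis: embed, SVD-truncate, diagonal-average.

    Keeping the `rank` leading components retains the dominant rhythm and
    discards low-variance components (noise or artifact subspace).
    """
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: lagged copies of the signal as columns
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    approx = (u[:, :rank] * s[:rank]) @ vt[:rank]
    # Diagonal averaging (Hankelization) back to a 1-D series
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window] += approx[:, col]
        counts[col:col + window] += 1.0
    return recon / counts

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512)
clean = np.sin(2 * np.pi * 10 * t)              # dominant 10 Hz rhythm
noisy = clean + 0.5 * rng.standard_normal(512)  # broadband contamination
denoised = ssa_denoise(noisy, window=64, rank=2)
```

A pure sinusoid spans a rank-2 subspace of the trajectory matrix, which is why rank=2 suffices here; real EEG would need the rank chosen from the singular value spectrum.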

  20. Effectiveness of Cognitive Behavioral Therapy Techniques for Control of Pain in Lung Cancer Patients: An Integrated Review.

    PubMed

    Phianmongkhol, Yupin; Thongubon, Kannika; Woottiluk, Pakapan

    2015-01-01

    Experience of lung cancer includes negative impacts on both physical and psychological health, and pain is one of its negative experiences. Cognitive behavioral therapy techniques are often recommended as treatments for lung cancer pain. The objective of this review was to synthesize the evidence on the effectiveness of cognitive behavioral therapy techniques in treating lung cancer pain. This review considered studies that included lung cancer patients who were required to 1) be at least 18 years old; 2) speak and read English or Thai; 3) have a life expectancy of at least two months; 4) experience daily cancer pain requiring an opioid medication; 5) have a positive response to opioid medication; 6) have "average or usual" pain between 4 and 7 on a scale of 0-10 for the day before the clinic visit or for a typical day; and 7) be able to participate in a pain evaluation and treatment program. This review considered studies examining interventions for the treatment of pain in lung cancer patients, including biofeedback, cognitive/attentional distraction, imagery, hypnosis, and meditation. Any randomized controlled trials (RCTs) that examined cognitive behavioral therapy techniques for pain specifically in lung cancer patients were included. In the absence of RCTs, quasi-experimental designs were reviewed for possible inclusion in a narrative summary. Outcome measures were pain intensity before and after cognitive behavioral therapy techniques. The search strategy aimed to find both published and unpublished literature. A three-step search was utilised using identified keywords and text terms. An initial limited search of MEDLINE and CINAHL was undertaken, followed by analysis of the text words contained in the title and abstract, and of the index terms used to describe the article. A second search using all the identified keywords and index terms was then undertaken across all included databases. 
Thirdly, the reference list of all identified reports