Sample records for traditional analysis techniques

  1. [Applications of near-infrared spectroscopy to analysis of traditional Chinese herbal medicine].

    PubMed

    Li, Yan-Zhou; Min, Shun-Geng; Liu, Xia

    2008-07-01

    Analysis of traditional Chinese herbal medicine is of great importance to its quality control. Conventional analysis methods cannot meet the requirement for rapid and on-line analysis because they involve complex processing and require considerable experience. In recent years, the near-infrared spectroscopy technique has been used for rapid determination of active components, on-line quality control, identification of counterfeits, and discrimination of the geographical origins of herbal medicines, owing to its advantages of simple pretreatment, high efficiency, and the convenience of solid diffuse-reflectance spectroscopy and fiber optics. In the present paper, the principles and methods of the near-infrared spectroscopy technique are introduced concisely. In particular, the applications of this technique in quantitative and qualitative analysis of traditional Chinese herbal medicine are reviewed.
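
    As a concrete illustration of the quantitative side of such NIR work, the sketch below fits a partial least squares (PLS) regression mapping NIR spectra to an active-component concentration. It is a minimal, hypothetical example on synthetic spectra using scikit-learn; the component count and data are assumptions, not details from the paper.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 60, 200

    # Synthetic stand-in for NIR spectra: a concentration-dependent
    # absorption band plus random noise.
    concentration = rng.uniform(0.1, 1.0, n_samples)
    wavelengths = np.linspace(0.0, 1.0, n_wavelengths)
    band = np.exp(-((wavelengths - 0.5) ** 2) / 0.005)
    spectra = (concentration[:, None] * band
               + rng.normal(0, 0.02, (n_samples, n_wavelengths)))

    # PLS compresses the collinear spectral channels into a few latent
    # variables before regressing on concentration.
    pls = PLSRegression(n_components=3)
    scores = cross_val_score(pls, spectra, concentration, cv=5)  # R^2 per fold
    print(f"cross-validated R^2: {scores.mean():.3f}")
    ```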

  2. Endoscopic versus traditional saphenous vein harvesting: a prospective, randomized trial.

    PubMed

    Allen, K B; Griffith, G L; Heimansohn, D A; Robison, R J; Matheny, R G; Schier, J J; Fitzgerald, E B; Shaar, C J

    1998-07-01

    Saphenous vein harvested with a traditional longitudinal technique often results in leg wound complications. An alternative endoscopic harvest technique may decrease these complications. One hundred twelve patients scheduled for elective coronary artery bypass grafting were prospectively randomized to have vein harvested using either an endoscopic (group A, n = 54) or traditional technique (group B, n = 58). Groups A and B, respectively, were similar with regard to length of vein harvested (41 ± 8 cm versus 40 ± 14 cm), bypasses done (4.1 ± 1.1 versus 4.2 ± 1.4), age, preoperative risk stratification, and risks for wound complication (diabetes, sex, obesity, preoperative anemia, hypoalbuminemia, and peripheral vascular disease). Leg wound complications were significantly (p ≤ 0.02) reduced in group A (4% [2 of 51] versus 19% [11 of 58]). Univariate analysis identified traditional incision (p ≤ 0.02) and diabetes (p ≤ 0.05) as wound complication risk factors. Multiple logistic regression analysis identified only the traditional harvest technique (p ≤ 0.03) as a risk factor for leg wound complications, with no significant interaction between harvest technique and any preoperative risk factor. Harvest rate (0.9 ± 0.4 cm/min versus 1.2 ± 0.5 cm/min) was slower for group A (p ≤ 0.02), and conversion from endoscopic to a traditional harvest occurred in 5.6% (3 of 54) of patients. In a prospective, randomized trial, saphenous vein harvested endoscopically was associated with fewer wound complications than the traditional longitudinal method.

  3. Quality assessment of internet pharmaceutical products using traditional and non-traditional analytical techniques.

    PubMed

    Westenberger, Benjamin J; Ellison, Christopher D; Fussner, Andrew S; Jenney, Susan; Kolinski, Richard E; Lipe, Terra G; Lyon, Robbe C; Moore, Terry W; Revelle, Larry K; Smith, Anjanette P; Spencer, John A; Story, Kimberly D; Toler, Duckhee Y; Wokovich, Anna M; Buhse, Lucinda F

    2005-12-08

    This work investigated the use of non-traditional analytical methods to evaluate the quality of a variety of pharmaceutical products purchased via internet sites from foreign sources and compared the results with those obtained from conventional quality assurance methods. Traditional analytical techniques employing HPLC for potency, content uniformity, chromatographic purity and drug release profiles were used to evaluate the quality of five selected drug products (fluoxetine hydrochloride, levothyroxine sodium, metformin hydrochloride, phenytoin sodium, and warfarin sodium). Non-traditional techniques, such as near infrared spectroscopy (NIR), NIR imaging and thermogravimetric analysis (TGA), were employed to verify the results and investigate their potential as alternative testing methods. Two of 20 samples failed USP monographs for quality attributes. The additional analytical methods found 11 of 20 samples had different formulations when compared to the U.S. product. Seven of the 20 samples arrived in questionable containers, and 19 of 20 had incomplete labeling. Only 1 of the 20 samples had final packaging similar to the U.S. products. The non-traditional techniques complemented the traditional techniques used and highlighted additional quality issues for the products tested. For example, these methods detected suspect manufacturing issues (such as blending), which were not evident from traditional testing alone.

  4. Plasma spectroscopy analysis technique based on optimization algorithms and spectral synthesis for arc-welding quality assurance.

    PubMed

    Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M

    2007-02-19

    A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with those offered by the traditional spectroscopic analysis technique.
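
    A minimal sketch of the general idea: generate a synthetic spectrum from parameterized line profiles and adjust the parameters by least squares until it matches the measured spectrum. The Gaussian line model and SciPy optimizer are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    wavelength = np.linspace(500.0, 520.0, 400)

    def synthetic_spectrum(params, wl):
        # Two Gaussian emission lines on a constant background.
        a1, c1, w1, a2, c2, w2, bg = params
        return (a1 * np.exp(-((wl - c1) ** 2) / (2 * w1 ** 2))
                + a2 * np.exp(-((wl - c2) ** 2) / (2 * w2 ** 2)) + bg)

    # Stand-in "measured" arc-plasma spectrum with noise.
    true_params = [1.0, 505.0, 0.4, 0.6, 512.0, 0.3, 0.05]
    measured = synthetic_spectrum(true_params, wavelength) \
        + np.random.default_rng(1).normal(0, 0.01, wavelength.size)

    # Optimize the line parameters so the synthetic spectrum fits the data.
    fit = least_squares(
        lambda p: synthetic_spectrum(p, wavelength) - measured,
        x0=[0.8, 504.0, 0.5, 0.5, 511.0, 0.5, 0.0])
    print("fitted line centers:", fit.x[1], fit.x[4])
    ```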

  5. High-performance liquid chromatography coupled with tandem mass spectrometry technology in the analysis of Chinese Medicine Formulas: A bibliometric analysis (1997-2015).

    PubMed

    He, Xi-Ran; Li, Chun-Guang; Zhu, Xiao-Shu; Li, Yuan-Qing; Jarouche, Mariam; Bensoussan, Alan; Li, Ping-Ping

    2017-01-01

    There is a recognized challenge in analyzing traditional Chinese medicine formulas because of their complex chemical compositions. The application of modern analytical techniques such as high-performance liquid chromatography coupled with tandem mass spectrometry has significantly improved the characterization of various compounds from traditional Chinese medicine formulas. This study aims to conduct a bibliometric analysis to recognize the overall trend of high-performance liquid chromatography coupled with tandem mass spectrometry approaches in the analysis of traditional Chinese medicine formulas, its significance, and possible underlying interactions between individual herbs in these formulas. Electronic databases were searched systematically, and the identified studies were collected and analyzed using Microsoft Access 2010, GraphPad 5.0 software and the Ucinet software package. 338 publications between 1997 and 2015 were identified and analyzed in terms of annual growth and accumulated publications, top journals, forms of traditional Chinese medicine preparations, highly studied formulas and single herbs, and social network analysis of single herbs. There is a significant increasing trend in the use of high-performance liquid chromatography coupled with tandem mass spectrometry related techniques in the analysis of commonly used forms of traditional Chinese medicine formulas in the last 3 years. Stringent quality control is of great significance for the modernization and globalization of traditional Chinese medicine, and this bibliometric analysis provides the first comprehensive summary of this field. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
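
    The social network analysis of single herbs mentioned above is essentially a co-occurrence network: herbs are nodes, and two herbs are linked whenever they appear in the same formula. A minimal sketch with networkx and made-up formula data:

    ```python
    import itertools
    import networkx as nx

    # Hypothetical formulas, each listing its component herbs.
    formulas = [
        ["ginseng", "licorice", "ginger"],
        ["ginseng", "astragalus"],
        ["licorice", "ginger", "astragalus"],
    ]

    G = nx.Graph()
    for herbs in formulas:
        for a, b in itertools.combinations(sorted(set(herbs)), 2):
            if G.has_edge(a, b):
                G[a][b]["weight"] += 1   # count repeated pairings
            else:
                G.add_edge(a, b, weight=1)

    # Degree centrality flags the most widely combined herbs.
    print(nx.degree_centrality(G))
    ```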

  6. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    NASA Technical Reports Server (NTRS)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
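
    The Modal Assurance Criterion used in such comparisons has a simple closed form, MAC(i, j) = |phi_i^T phi_j|^2 / ((phi_i^T phi_i)(phi_j^T phi_j)). The sketch below computes a full MAC matrix between two mode sets and tracks modes by best match; the mode shapes are synthetic stand-ins, not output of the NASA tool.

    ```python
    import numpy as np

    def mac_matrix(phi_a: np.ndarray, phi_b: np.ndarray) -> np.ndarray:
        """MAC between columns of phi_a (n_dof x n_modes) and phi_b."""
        cross = phi_a.T @ phi_b                 # phi_i^T phi_j, before squaring
        norm_a = np.sum(phi_a * phi_a, axis=0)  # phi_i^T phi_i
        norm_b = np.sum(phi_b * phi_b, axis=0)
        return (cross ** 2) / np.outer(norm_a, norm_b)

    # Mode tracking: pair each mode of model B with its best MAC match in A.
    rng = np.random.default_rng(0)
    phi_a = rng.normal(size=(100, 5))
    phi_b = phi_a[:, [1, 0, 2, 4, 3]] + 0.05 * rng.normal(size=(100, 5))
    mac = mac_matrix(phi_a, phi_b)
    print("tracked mode pairs:", mac.argmax(axis=0))
    ```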

  7. Determination of T-2 and HT-2 toxins from maize by direct analysis in real time - mass spectrometry (DART-MS)

    USDA-ARS's Scientific Manuscript database

    Ambient desorption ionization techniques, such as laser desorption with electrospray ionization assistance (ELDI), direct analysis in real time (DART) and desorption electrospray ionization (DESI) have been developed as alternatives to traditional mass spectrometric-based methods. Such techniques al...

  8. RECENT ADVANCES IN ULTRA-HIGH PERFORMANCE LIQUID CHROMATOGRAPHY FOR THE ANALYSIS OF TRADITIONAL CHINESE MEDICINE

    PubMed Central

    Huang, Huilian; Liu, Min; Chen, Pei

    2014-01-01

    Traditional Chinese medicine has been widely used for the prevention and treatment of various diseases for thousands of years in China. Ultra-high performance liquid chromatography (UHPLC) is a relatively new technique offering new possibilities. This paper reviews recent developments in UHPLC in the separation and identification, fingerprinting, quantification, and metabolism of traditional Chinese medicine. Recently, the combination of UHPLC with MS has improved the efficiency of the analysis of these materials. PMID:25045170

  9. Figure analysis: A teaching technique to promote visual literacy and active Learning.

    PubMed

    Wiles, Amy M

    2016-07-08

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based biology courses. An additional challenge is that visual literacy is often overlooked in undergraduate science education. To address both of these challenges, a technique called figure analysis was developed and implemented in three different levels of undergraduate biology courses. Here, students learn content while gaining practice in interpreting visual information by discussing figures with their peers. Student groups also make connections between new and previously learned concepts on their own while in class. The instructor summarizes the material for the class only after students grapple with it in small groups. Students reported a preference for learning by figure analysis over traditional lecture, and female students in particular reported increased confidence in their analytical abilities. The technique has no technology requirement; therefore, it may be used both in classrooms and in nontraditional spaces. Additionally, the amount of preparation required is comparable to that of a traditional lecture. © 2016 The International Union of Biochemistry and Molecular Biology, 44(4):336-344, 2016.

  10. In vivo analysis of insertional torque during pedicle screwing using cortical bone trajectory technique.

    PubMed

    Matsukawa, Keitaro; Yato, Yoshiyuki; Kato, Takashi; Imabayashi, Hideaki; Asazuma, Takashi; Nemoto, Koichi

    2014-02-15

    The insertional torque of pedicle screws using the cortical bone trajectory (CBT) was measured in vivo. To investigate the effectiveness of the CBT technique by measurement of the insertional torque. The CBT follows a mediolateral and caudocephalad directed path, engaging with cortical bone maximally from the pedicle to the vertebral body. Some biomechanical studies have demonstrated favorable characteristics of the CBT technique in the cadaveric lumbar spine. However, no in vivo study has been reported on the mechanical behavior of this new trajectory. The insertional torque of pedicle screws using the CBT and traditional techniques was measured intraoperatively in 48 consecutive patients. A total of 162 screws using the CBT technique and 36 screws using the traditional technique were compared. In 8 of the 48 patients, a side-by-side comparison of the 2 insertional techniques for each vertebra was performed; these patients formed the H group. In addition, the insertional torque was correlated with bone mineral density. The mean maximum insertional torque of CBT screws and traditional screws was 2.49 ± 0.99 Nm and 1.24 ± 0.54 Nm, respectively. The CBT screws showed 2.01 times higher torque, and the difference between the 2 techniques was significant (P < 0.01). In the H group, the insertional torque was 2.71 ± 1.36 Nm for the CBT screws and 1.58 ± 0.44 Nm for the traditional screws. The CBT screws demonstrated 1.71 times higher torque, and statistical significance was achieved (P < 0.01). Positive linear correlations between maximum insertional torque and bone mineral density were found for both techniques; the correlation coefficient of the traditional screws (r = 0.63, P < 0.01) was higher than that of the CBT screws (r = 0.59, P < 0.01). The insertional torque using the CBT technique is about 1.7 times higher than that of the traditional technique.

  11. Quality evaluation of fish and other seafood by traditional and nondestructive instrumental methods: Advantages and limitations.

    PubMed

    Hassoun, Abdo; Karoui, Romdhane

    2017-06-13

    Although fish and other seafoods are among the most vulnerable and perishable products, they provide a wide range of health-promoting compounds. Recently, the growing interest of consumers in food quality and safety issues has contributed to the increasing demand for sensitive and rapid analytical technologies. Several traditional physicochemical, textural, sensory, and electrical methods have been used to evaluate freshness and authentication of fish and other seafood products. Despite the importance of these standard methods, they are expensive and time-consuming, and often susceptible to large sources of variation. Recently, spectroscopic methods and other emerging techniques have shown great potential due to speed of analysis, minimal sample preparation, high repeatability, low cost, and, most of all, the fact that these techniques are noninvasive and nondestructive and, therefore, could be applied to any online monitoring system. This review first briefly describes the basic principles of multivariate data analysis, followed by the traditional methods most commonly used for determining the freshness and authenticity of fish and other seafood products. A special focus is put on the use of rapid and nondestructive techniques (spectroscopic techniques and instrumental sensors) to address several issues related to the quality of these products. Moreover, the advantages and limitations of each technique are reviewed and some perspectives are given.

  12. Recent advances in ultra-high performance liquid chromatography for the analysis of traditional chinese medicine

    USDA-ARS's Scientific Manuscript database

    Traditional Chinese medicines (TCMs) have been widely used for the prevention and treatment of various diseases for thousands of years in China. Ultra-high performance liquid chromatography (UHPLC) is a relatively new technique offering new possibilities in liquid chromatography. This paper reviews recen...

  13. Characterization of Ultra-fine Grained and Nanocrystalline Materials Using Transmission Kikuchi Diffraction

    PubMed Central

    Proust, Gwénaëlle; Trimby, Patrick; Piazolo, Sandra; Retraint, Delphine

    2017-01-01

    One of the challenges in microstructure analysis nowadays resides in the reliable and accurate characterization of ultra-fine grained (UFG) and nanocrystalline materials. The traditional techniques associated with scanning electron microscopy (SEM), such as electron backscatter diffraction (EBSD), do not possess the required spatial resolution due to the large interaction volume between the electrons from the beam and the atoms of the material. Transmission electron microscopy (TEM) has the required spatial resolution. However, due to a lack of automation in the analysis system, the rate of data acquisition is slow, which limits the area of the specimen that can be characterized. This paper presents a new characterization technique, Transmission Kikuchi Diffraction (TKD), which enables the analysis of the microstructure of UFG and nanocrystalline materials using an SEM equipped with a standard EBSD system. The spatial resolution of this technique can reach 2 nm. This technique can be applied to a large range of materials that would be difficult to analyze using traditional EBSD. After presenting the experimental setup and describing the different steps necessary to realize a TKD analysis, examples of its use on metal alloys and minerals are shown to illustrate the resolution of the technique and its flexibility in terms of the materials that can be characterized. PMID:28447998

  14. Characterization of Ultra-fine Grained and Nanocrystalline Materials Using Transmission Kikuchi Diffraction.

    PubMed

    Proust, Gwénaëlle; Trimby, Patrick; Piazolo, Sandra; Retraint, Delphine

    2017-04-01

    One of the challenges in microstructure analysis nowadays resides in the reliable and accurate characterization of ultra-fine grained (UFG) and nanocrystalline materials. The traditional techniques associated with scanning electron microscopy (SEM), such as electron backscatter diffraction (EBSD), do not possess the required spatial resolution due to the large interaction volume between the electrons from the beam and the atoms of the material. Transmission electron microscopy (TEM) has the required spatial resolution. However, due to a lack of automation in the analysis system, the rate of data acquisition is slow, which limits the area of the specimen that can be characterized. This paper presents a new characterization technique, Transmission Kikuchi Diffraction (TKD), which enables the analysis of the microstructure of UFG and nanocrystalline materials using an SEM equipped with a standard EBSD system. The spatial resolution of this technique can reach 2 nm. This technique can be applied to a large range of materials that would be difficult to analyze using traditional EBSD. After presenting the experimental setup and describing the different steps necessary to realize a TKD analysis, examples of its use on metal alloys and minerals are shown to illustrate the resolution of the technique and its flexibility in terms of the materials that can be characterized.

  15. Introducing Students to Protein Analysis Techniques: Separation and Comparative Analysis of Gluten Proteins in Various Wheat Strains

    ERIC Educational Resources Information Center

    Pirinelli, Alyssa L.; Trinidad, Jonathan C.; Pohl, Nicola L. B.

    2016-01-01

    Polyacrylamide gel electrophoresis (PAGE) is commonly taught in undergraduate laboratory classes as a traditional method to analyze proteins. An experiment has been developed to teach these basic protein gel skills in the context of gluten protein isolation from various types of wheat flour. A further goal is to relate this technique to current…

  16. The Japan Society for Innovative Cuisine: Exploring New Visions of Japanese Cuisine.

    PubMed

    Yamazaki, Hanae; Fushiki, Tohru

    2015-01-01

    Kyoto cuisine has a long history and its traditions have been practiced for hundreds of years. In Kyoto, a group of scientists and renowned chefs strives to better understand traditional Kyoto cuisine in order to foster culinary innovation within traditional Kyoto cuisine. We launched a research project in April 2009 using a specially equipped "laboratory-kitchen" located in Kyoto University. Chefs chose a variety of topics related to basic concepts and techniques for cooking. We conducted culinary experimentation, thorough analysis, and diligent discussion on each topic for approximately 6 mo. In the symposium, chefs will present the results of their experiments, discussing their techniques and bringing samples of final products.

  17. What defines an Expert? - Uncertainty in the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Bond, C. E.

    2008-12-01

    Studies focusing on the elicitation of information from experts are concentrated primarily in economics and world markets, medical practice and expert witness testimonies. Expert elicitation theory has been applied in the natural sciences, most notably in the prediction of fluid flow in hydrological studies. In the geological sciences, expert elicitation has been limited to theoretical analysis, with studies focusing on the elicitation element, gaining expert opinion rather than necessarily understanding the basis behind the expert view. In these cases experts are defined in a traditional sense, based for example on standing in the field, number of years of experience, number of peer-reviewed publications, or the expert's position in a company hierarchy or academia. Here, traditional indicators of expertise have been compared for their significance to effective seismic interpretation. Polytomous regression analysis has been used to assess the relative significance of length and type of experience on the outcome of a seismic interpretation exercise. Following the initial analysis, the techniques used by participants to interpret the seismic image were added as additional variables to the analysis. Specific technical skills and techniques were found to be more important for the effective geological interpretation of seismic data than the traditional indicators of expertise. The results of a seismic interpretation exercise, the techniques used to interpret the seismic data, and the participants' prior experience have been combined and analysed to answer the question: who is and what defines an expert?
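
    The polytomous regression step described can be sketched with statsmodels: a categorical interpretation outcome is regressed on experience variables. All data and variable names below are invented for illustration.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200

    # Hypothetical predictors: years of experience and a technique-use flag.
    df = pd.DataFrame({
        "years": rng.integers(0, 30, n),
        "uses_technique": rng.integers(0, 2, n),
    })
    # Hypothetical 3-level outcome (0 = poor, 1 = fair, 2 = good interpretation).
    df["outcome"] = rng.integers(0, 3, n)

    # Multinomial (polytomous) logistic regression of outcome on experience.
    X = sm.add_constant(df[["years", "uses_technique"]])
    result = sm.MNLogit(df["outcome"], X).fit(disp=0)
    print(result.params)  # one coefficient column per non-reference category
    ```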

  18. Figure Analysis: A Teaching Technique to Promote Visual Literacy and Active Learning

    ERIC Educational Resources Information Center

    Wiles, Amy M.

    2016-01-01

    Learning often improves when active learning techniques are used in place of traditional lectures. For many of these techniques, however, students are expected to apply concepts that they have already grasped. A challenge, therefore, is how to incorporate active learning into the classroom of courses with heavy content, such as molecular-based…

  19. The patient relationship and therapeutic techniques of the South Sotho traditional healer.

    PubMed

    Pinkoane, M G; Greeff, M; Williams, M J S

    2005-11-01

    Until 1996 the practice of traditional healers was outlawed in South Africa and not afforded a legal position in the community of health care providers. In 1978 the World Health Organization (WHO) identified traditional healers as those people forming an essential core of primary health care workers for rural people in Third World countries. However, in 1994 the new South African government identified traditional healers as forming an essential element of primary health care workers. It is estimated that 80% of the black population uses traditional medicine because it is deeply rooted in their culture, which is linked to their religion. The traditional healer shares with the patient a world view which is completely alien to biomedical personnel. Therapeutic techniques typically used in traditional healing conflict with the therapeutic techniques used in biomedicine. The patients' perceptions of traditional healing, their needs and expectations, may be the driving force behind their continuing persistence in consulting a traditional healer, even after these patients may have sought the therapeutic techniques of biomedical personnel. The operation of both systems in the same society creates a problem for both providers and recipients of health care. Confusion then arises, and the consumer consequently chooses the services closer to her. The researcher aimed to investigate the characteristics of the relationship between the traditional healers and the patients, explored the therapeutic techniques that are used in the South Sotho traditional healing process, and investigated the views of both the traditional healers and the patients about the South Sotho traditional healing process, to facilitate incorporation of the traditional healers in the National Health Care Delivery System. A qualitative research design was followed. Participants were identified by means of a non-probable, purposive voluntary sample. Data were collected by means of a video camera and semi-structured interviews with the six traditional healers and twelve patients, as well as by taking field notes after each session. Data analysis was achieved by means of a checklist for the video recordings, and decoding was done for the interviews. A co-coder and the researcher analysed the data independently, after which three consensus discussions took place to finalise the analysed data. The researcher drew conclusions, identified shortcomings, and made recommendations for application to nursing education, nursing research and nursing practice. The recommendations for nursing are reflected in the form of guidelines for the incorporation of the traditional healers in the National Health Care Delivery System.

  20. An Introduction to Modern Missing Data Analyses

    ERIC Educational Resources Information Center

    Baraldi, Amanda N.; Enders, Craig K.

    2010-01-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous over traditional techniques (e.g., deletion and mean imputation techniques) because they require less stringent assumptions and mitigate the pitfalls of traditional…
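
    A brief sketch contrasting the two families named above, traditional mean imputation versus model-based iterative (multiple-imputation-style) imputation, using scikit-learn as one assumed implementation; it shows how mean imputation attenuates a known correlation.

    ```python
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import SimpleImputer, IterativeImputer

    rng = np.random.default_rng(0)
    X = rng.multivariate_normal([0, 0], [[1, 0.8], [0.8, 1]], size=500)
    X_missing = X.copy()
    X_missing[rng.random(500) < 0.3, 1] = np.nan  # 30% missing in column 1

    # Traditional approach: replace with the column mean (distorts variance).
    X_mean = SimpleImputer(strategy="mean").fit_transform(X_missing)

    # Modern approach: model each feature from the others, iteratively.
    X_mi = IterativeImputer(random_state=0).fit_transform(X_missing)

    print("true corr:        ", np.corrcoef(X.T)[0, 1].round(2))
    print("mean-imputed corr:", np.corrcoef(X_mean.T)[0, 1].round(2))
    print("iterative corr:   ", np.corrcoef(X_mi.T)[0, 1].round(2))
    ```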

  1. Multiscale analysis of heart rate dynamics: entropy and time irreversibility measures.

    PubMed

    Costa, Madalena D; Peng, Chung-Kang; Goldberger, Ary L

    2008-06-01

    Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and non-equilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools, multiscale entropy and multiscale time irreversibility, are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs.
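
    Multiscale entropy, as referenced here, coarse-grains the interbeat series at successive scales and computes sample entropy at each scale. Below is a compact educational sketch of that procedure on synthetic data, a simplified implementation rather than the authors' code.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r_frac=0.2):
        """-log of the conditional probability that sequences matching
        for m points also match for m+1 (tolerance r = r_frac * std)."""
        x = np.asarray(x, dtype=float)
        r = r_frac * x.std()
        def match_count(mm):
            t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.abs(t[:, None] - t[None, :]).max(axis=2)
            return ((d <= r).sum() - len(t)) / 2  # exclude self-matches
        return -np.log(match_count(m + 1) / match_count(m))

    def multiscale_entropy(x, max_scale=5):
        out = []
        for tau in range(1, max_scale + 1):
            n = len(x) // tau
            # Coarse-grain: average non-overlapping windows of length tau.
            coarse = np.asarray(x[:n * tau]).reshape(n, tau).mean(axis=1)
            out.append(sample_entropy(coarse))
        return out

    rr = np.random.default_rng(0).normal(0.8, 0.05, 1000)  # stand-in RR series
    print(multiscale_entropy(rr))
    ```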

  2. Multiscale Analysis of Heart Rate Dynamics: Entropy and Time Irreversibility Measures

    PubMed Central

    Peng, Chung-Kang; Goldberger, Ary L.

    2016-01-01

    Cardiovascular signals are largely analyzed using traditional time and frequency domain measures. However, such measures fail to account for important properties related to multiscale organization and nonequilibrium dynamics. The complementary role of conventional signal analysis methods and emerging multiscale techniques is, therefore, an important frontier area of investigation. The key finding of this presentation is that two recently developed multiscale computational tools, multiscale entropy and multiscale time irreversibility, are able to extract information from cardiac interbeat interval time series not contained in traditional methods based on mean, variance or Fourier spectrum (two-point correlation) techniques. These new methods, with careful attention to their limitations, may be useful in diagnostics, risk stratification and detection of toxicity of cardiac drugs. PMID:18172763

  3. Surgical accuracy with the mini-subvastus total knee arthroplasty: a computed tomography scan analysis of postoperative implant alignment.

    PubMed

    Schroer, William C; Diesfeld, Paul J; Reedy, Mary E; Lemarr, Angela R

    2008-06-01

    A total of 50 total knee arthroplasty (TKA) patients, 25 traditional and 25 minimally invasive surgical (MIS), underwent computed tomography scans to determine if a loss of accuracy in implant alignment occurred when a surgeon switched from a traditional medial parapatellar arthrotomy to a mini-subvastus surgical technique. Surgical accuracy was determined by comparing the computed tomography measured implant alignment with the surgical alignment goals. There was no loss in accuracy in the implantation of the tibial component with the mini-subvastus technique. The mean variance for the tibial coronal alignment was 1.03 degrees for the traditional TKA and 1.00 degrees for the MIS TKA (P = .183). Similarly, there was no difference in the mean variance for the posterior tibial slope (P = .054). Femoral coronal alignment was less accurate with the MIS procedure, mean variance of 1.04 degrees and 1.71 degrees for the traditional and MIS TKA, respectively (P = .045). Instrumentation and surgical technique concerns that led to this loss in accuracy were determined.

  4. A study for high accuracy measurement of residual stress by deep hole drilling technique

    NASA Astrophysics Data System (ADS)

    Kitano, Houichi; Okano, Shigetaka; Mochizuki, Masahito

    2012-08-01

    The deep hole drilling (DHD) technique has received much attention in recent years as a method for measuring through-thickness residual stresses. However, some accuracy problems occur when residual stress evaluation is performed by the DHD technique. One reason is that the traditional DHD evaluation formula applies to the plane stress condition. The second is that the effects of the plastic deformation produced in the drilling process and the deformation produced in the trepanning process are ignored. In this study, a modified evaluation formula, which applies to the plane strain condition, is proposed. In addition, a new procedure is proposed which can account for the effects of the deformation produced in the DHD process, these effects having been investigated in detail by finite element (FE) analysis. The evaluation results obtained by the new procedure are then compared by FE analysis with those obtained by the traditional DHD procedure. As a result, the new procedure evaluates the residual stress fields better than the traditional DHD procedure when the measured object is thick enough that the stress condition can be assumed to be plane strain, as in the model used in this study.

  5. Meta-Analysis: Application to Clinical Dentistry and Dental Education.

    ERIC Educational Resources Information Center

    Cohen, Peter A.

    1992-01-01

    Meta-analysis is proposed as an effective alternative to conventional narrative review for extracting trends from research findings. This type of analysis is explained, advantages over more traditional review techniques are discussed, basic procedures and limitations are outlined, and potential applications in dental education and clinical…

  6. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
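
    A minimal hierarchical Bayesian sketch in the spirit described: accident counts are pooled across data sources while each source keeps its own rate, which is one way to capture source-to-source variability. PyMC is an assumed tool choice, and the counts are invented.

    ```python
    import numpy as np
    import pymc as pm

    # Hypothetical accident counts and exposure years from 4 data sources.
    events = np.array([1, 0, 3, 2])
    exposure = np.array([10.0, 8.0, 25.0, 15.0])

    with pm.Model() as model:
        # Hyperpriors: population distribution of source-specific rates.
        alpha = pm.Exponential("alpha", 1.0)
        beta = pm.Exponential("beta", 1.0)
        # Each source draws its own rate (source-to-source variability).
        lam = pm.Gamma("lam", alpha=alpha, beta=beta, shape=len(events))
        pm.Poisson("obs", mu=lam * exposure, observed=events)
        idata = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    # Posterior mean rate per source, shrunk toward the population.
    print(idata.posterior["lam"].mean(dim=("chain", "draw")).values)
    ```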

  7. Forecasting--A Systematic Modeling Methodology. Paper No. 489.

    ERIC Educational Resources Information Center

    Mabert, Vincent A.; Radcliffe, Robert C.

    In an attempt to bridge the gap between academic understanding and practical business use, the Box-Jenkins technique of time series analysis for forecasting future events is presented with a minimum of mathematical notation. The method is presented in three stages: a discussion of traditional forecasting techniques, focusing on traditional…
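
    The Box-Jenkins technique referenced here is now commonly run through ARIMA implementations such as the one in statsmodels. A minimal sketch on an assumed synthetic series, following the identify-estimate-forecast stages:

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Synthetic monthly series with drift, standing in for business data.
    rng = np.random.default_rng(0)
    y = np.cumsum(rng.normal(0.5, 1.0, 120))

    # Box-Jenkins: choose orders (p, d, q), estimate, then check residuals.
    model = ARIMA(y, order=(1, 1, 1)).fit()
    print(model.summary())
    print("12-step forecast:", model.forecast(steps=12))
    ```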

  8. Real time automatic detection of bearing fault in induction machine using kurtogram analysis.

    PubMed

    Tafinine, Farid; Mokrani, Karim

    2012-11-01

    A signal processing technique for incipient real-time bearing fault detection based on kurtogram analysis is presented in this paper. The kurtogram is a fourth-order spectral analysis tool introduced for detecting and characterizing non-stationarities in a signal. This technique starts from investigating the resonance signatures over selected frequency bands to extract the representative features. Traditional spectral analysis is not appropriate for non-stationary vibration signals or for real-time diagnosis. The performance of the proposed technique is examined by a series of experimental tests corresponding to different bearing conditions. Test results show that this signal processing technique is an effective automatic bearing fault detection method and gives a good basis for an integrated induction machine condition monitor.
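
    As a simplified illustration of the idea behind the kurtogram, the sketch below computes spectral kurtosis per frequency band from a single STFT; a full kurtogram additionally sweeps the analysis window size. The signal and parameters are invented.

    ```python
    import numpy as np
    from scipy.signal import stft
    from scipy.stats import kurtosis

    fs = 12_000
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(0)

    # Stand-in vibration: noise plus brief repetitive impacts (fault-like
    # bursts) that excite a 3 kHz structural resonance.
    x = rng.normal(0, 1, t.size)
    bursts = np.sin(2 * np.pi * 3000 * t) * (np.sin(2 * np.pi * 97 * t) > 0.999)
    x += 20 * bursts

    # Spectral kurtosis: kurtosis of |STFT| over time, per frequency bin.
    # Impulsive (faulty) bands show high kurtosis; steady bands do not.
    f, _, Z = stft(x, fs=fs, nperseg=256)
    sk = kurtosis(np.abs(Z), axis=1, fisher=True)
    print(f"most impulsive band: {f[np.argmax(sk)]:.0f} Hz")
    ```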

  9. Presentation-Oriented Visualization Techniques.

    PubMed

    Kosara, Robert

    2016-01-01

    Data visualization research focuses on data exploration and analysis, yet the vast majority of visualizations people see were created for a different purpose: presentation. Whether we are talking about charts showing data to help make a presenter's point, data visuals created to accompany a news story, or the ubiquitous infographics, many more people consume charts than make them. Traditional visualization techniques treat presentation as an afterthought, but are there techniques uniquely suited to data presentation but not necessarily ideal for exploration and analysis? This article focuses on presentation-oriented techniques, considering their usefulness for presentation first and any other purposes as secondary.

  10. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
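
    A rough modern analogue of the two-part scheme, under stated assumptions: a cheap sequential pass proposes initial cluster centers (standing in for the sequential variance analysis), which then seed an iterative K-means refinement. The distance-threshold rule and data are illustrative inventions.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def sequential_initial_centers(X, threshold):
        """Part 1: assign each sample to the nearest existing center;
        spawn a new center when the distance exceeds the threshold."""
        centers = [X[0]]
        for x in X[1:]:
            d = np.linalg.norm(np.asarray(centers) - x, axis=1)
            if d.min() > threshold:
                centers.append(x)
        return np.asarray(centers)

    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(c, 0.3, (100, 2))
                   for c in ((0, 0), (3, 3), (0, 4))])

    init = sequential_initial_centers(X, threshold=1.5)
    # Part 2: generalized K-means refinement of the sequential clusters.
    km = KMeans(n_clusters=len(init), init=init, n_init=1).fit(X)
    print("refined centers:\n", km.cluster_centers_)
    ```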

  11. Persuasion Analysis: A Companion to Composition.

    ERIC Educational Resources Information Center

    Rank, Hugh

    Paying less attention to the traditionally taught rational, logical argument analysis format, this book focuses on analysis of the emotional, non-logical persuasive language and techniques often seen in television advertisements. In so doing, readers become more discerning consumers and hone their writing skills. Designed as both a self-study…

  12. Comparative analysis of profitability of honey production using traditional and box hives.

    PubMed

    Al-Ghamdi, Ahmed A; Adgaba, Nuru; Herab, Ahmed H; Ansari, Mohammad J

    2017-07-01

    Information on the profitability and productivity of box hives is important to encourage beekeepers to adopt the technology. However, a comparative analysis of the profitability and productivity of box and traditional hives has not been adequately available. The study was carried out on 182 beekeepers using a cross-sectional survey and a random sampling technique. The data were analyzed using descriptive statistics, analysis of variance (ANOVA), the Cobb-Douglas (CD) production function and partial budgeting. The CD production function revealed that supplementary bee feeds, labor and medication were statistically significant for both box and traditional hives. Generally, labor for bee management, supplementary feeding, and medication led to productivity differences of approximately 42.83%, 7.52%, and 5.34%, respectively, between box and traditional hives. The study indicated that the productivity of box hives was 72% higher than that of traditional hives. The average net incomes of beekeepers using box and traditional hives were 33,699.7 SR/annum and 16,461.4 SR/annum, respectively. The incremental net benefit of box hives over traditional hives was nearly double. Our results clearly show the importance of adopting box hives for better productivity in the beekeeping subsector.
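
    The Cobb-Douglas production function used in such studies is linear in logs, ln Y = b0 + sum_k b_k ln X_k, so it can be estimated by ordinary least squares. A sketch with entirely invented input-output data:

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 182  # same sample size as the survey; the data here are synthetic

    # Hypothetical inputs: labor hours, feed cost, medication cost.
    labor = rng.uniform(50, 400, n)
    feed = rng.uniform(10, 200, n)
    meds = rng.uniform(5, 80, n)
    honey = (2.0 * labor**0.4 * feed**0.1 * meds**0.05
             * rng.lognormal(0, 0.2, n))

    # Linear in logs: the fitted coefficients are output elasticities.
    X = sm.add_constant(np.column_stack(
        [np.log(labor), np.log(feed), np.log(meds)]))
    res = sm.OLS(np.log(honey), X).fit()
    print(res.params)  # [const, elast_labor, elast_feed, elast_meds]
    ```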

  13. A rapid method for the sampling of atmospheric water vapour for isotopic analysis.

    PubMed

    Peters, Leon I; Yakir, Dan

    2010-01-01

    Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce δ¹⁸O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 µL aliquots to CO. This yielded an average error of < ±0.5‰, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.

  14. Lightweight and Statistical Techniques for Petascale Debugging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Barton

    2014-06-30

    This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or they used tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted in either reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine learning based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundation infrastructure work on our MRNet multicast-reduction framework for scalability and on the Dyninst binary analysis and instrumentation toolkits.
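
    One of the scalability ideas described, identifying equivalence classes of nodes by stack trace and sampling debug targets from them, can be sketched in a few lines. This is illustrative only; STAT itself gathers traces from live MPI jobs.

    ```python
    import random
    from collections import defaultdict

    # Hypothetical per-rank stack traces gathered from a parallel job.
    traces = {
        0: ("main", "solve", "mpi_waitall"),
        1: ("main", "solve", "mpi_waitall"),
        2: ("main", "solve", "compute_flux"),
        3: ("main", "io_write"),
        4: ("main", "solve", "mpi_waitall"),
    }

    # Equivalence classes: ranks with identical traces behave alike.
    classes = defaultdict(list)
    for rank, trace in traces.items():
        classes[trace].append(rank)

    # Sample one representative rank per class for a traditional debugger.
    targets = [random.choice(ranks) for ranks in classes.values()]
    print(f"{len(traces)} ranks reduced to {len(targets)} targets:", targets)
    ```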

  15. Fast discrimination of traditional Chinese medicine according to geographical origins with FTIR spectroscopy and advanced pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Li, Ning; Wang, Yan; Xu, Kexin

    2006-08-01

    Combined with Fourier transform infrared (FTIR) spectroscopy and three kinds of pattern recognition techniques, 53 traditional Chinese medicine danshen samples were rapidly discriminated according to geographical origin. The results showed that discrimination by FTIR spectroscopy combined with principal component analysis (PCA) was feasible. An effective model was built by employing Soft Independent Modeling of Class Analogy (SIMCA) with PCA, and 82% of the samples were discriminated correctly. Using an artificial neural network (ANN) with back propagation (BP), the origins of danshen were completely classified.
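
    A schematic of the processing chain described, PCA for dimension reduction followed by a back-propagation (MLP) network for origin classification, using scikit-learn on synthetic "spectra". The layer sizes, component count, and data are assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_per_origin, n_channels = 18, 300

    # Synthetic FTIR-like spectra for 3 geographical origins.
    X, y = [], []
    for origin in range(3):
        base = np.sin(np.linspace(0, 6, n_channels) + origin)
        X.append(base + rng.normal(0, 0.3, (n_per_origin, n_channels)))
        y += [origin] * n_per_origin
    X, y = np.vstack(X), np.array(y)

    # PCA compresses the spectra; a small BP (MLP) network classifies origin.
    clf = make_pipeline(
        StandardScaler(), PCA(n_components=5),
        MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
    print("CV accuracy:", cross_val_score(clf, X, y, cv=3).mean())
    ```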

  16. Effectiveness of Video Demonstration over Conventional Methods in Teaching Osteology in Anatomy.

    PubMed

    Viswasom, Angela A; Jobby, Abraham

    2017-02-01

    Technology and its applications are the most happening things in the world, and the field of medical education is no exception. This study evaluated whether conventional methods can stand the test of technology: a comparative study of the traditional method of teaching osteology in human anatomy against an innovative visually aided method. The study was conducted on 94 students admitted to the 2014 to 2015 MBBS batch of Travancore Medical College. The students were divided into two academically validated groups. They were taught using conventional and video demonstration techniques in a systematic manner, and post evaluation tests were conducted. Analysis of the mark pattern revealed that the group taught using the traditional method scored better than the group taught with the visually aided method. Feedback analysis showed that the students were able to identify bony features better, with clearer visualisation and a three dimensional view, when taught using the video demonstration method. The students identified the visually aided method as the more interesting one for learning, which helped them in applying the knowledge gained. In most of the questions asked, the two methods of teaching were found to be comparable on the same scale. As the study ends, we find that no new technique can substitute for time tested techniques of teaching and learning; the ideal method would incorporate newer multimedia techniques into traditional classes.

  17. Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.

    PubMed

    Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E

    2017-01-01

    Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.]. Copyright 2017, SLACK Incorporated.
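
    A worked instance of the dimensional analysis technique itself, on a titration calculation of the kind used in the final examination; the numbers are invented, and each conversion factor is written so the units cancel.

    ```python
    # How many mL of 0.1500 M NaOH neutralize 0.2550 g of KHP (204.22 g/mol)?
    # Chain the factors so each unit cancels the previous one:
    #   g KHP -> mol KHP -> mol NaOH (1:1) -> L NaOH -> mL NaOH
    grams_khp = 0.2550
    mol_khp = grams_khp * (1 / 204.22)      # g * (mol / g) = mol
    mol_naoh = mol_khp * (1 / 1)            # 1:1 stoichiometry
    liters_naoh = mol_naoh * (1 / 0.1500)   # mol * (L / mol) = L
    ml_naoh = liters_naoh * 1000            # L * (mL / L) = mL
    print(f"{ml_naoh:.2f} mL")              # ~8.32 mL
    ```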

  18. Edge enhancement and noise suppression for infrared image based on feature analysis

    NASA Astrophysics Data System (ADS)

    Jiang, Meng

    2018-06-01

    Infrared images often suffer from background noise, blurred edges, few details and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To this end, we propose a novel algorithm based on feature analysis in the shearlet domain. First, we introduce the theory and advantages of the shearlet transform, a multi-scale geometric analysis (MGA) tool. Second, after analyzing the shortcomings of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well and use it to improve the traditional thresholding technique. Third, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are constructed to improve generalized unsharp masking (GUM). Finally, experimental results with infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
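
    To give the flavor of transform-domain thresholding, the sketch below uses wavelets via PyWavelets as a stand-in for shearlets (which require specialized libraries): detail coefficients below a threshold are treated as noise and shrunk, then the image is reconstructed. The test image and threshold are invented.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    img = np.zeros((128, 128))
    img[32:96, 32:96] = 1.0                       # a bright square "target"
    noisy = img + rng.normal(0, 0.2, img.shape)   # infrared-like noise

    # Decompose, soft-threshold the detail subbands, reconstruct.
    coeffs = pywt.wavedec2(noisy, "db2", level=3)
    sigma = 0.2
    denoised_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, value=3 * sigma, mode="soft") for c in level)
        for level in coeffs[1:]
    ]
    denoised = pywt.waverec2(denoised_coeffs, "db2")[:128, :128]
    print("residual std before/after:",
          (noisy - img).std().round(3), (denoised - img).std().round(3))
    ```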

  19. The holistic analysis of gamma-ray spectra in instrumental neutron activation analysis

    NASA Astrophysics Data System (ADS)

    Blaauw, Menno

    1994-12-01

    A method for the interpretation of γ-ray spectra as obtained in INAA using linear least squares techniques is described. Results obtained using this technique and the traditional method previously in use at IRI are compared. It is concluded that the method presented performs better with respect to the number of detected elements, the resolution of interferences and the estimation of the accuracies of the reported element concentrations. It is also concluded that the technique is robust enough to obviate the deconvolution of multiplets.
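
    Holistic interpretation by linear least squares amounts to modeling the whole measured spectrum as a linear combination of known single-element response spectra and solving for their amplitudes in one step, rather than fitting isolated peaks. A toy sketch with invented Gaussian photopeaks:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_channels = 512
    energy = np.arange(n_channels)

    # Assumed single-element reference spectra (columns of A): one
    # Gaussian photopeak per element on this toy energy axis.
    def peak(center, width=4.0):
        return np.exp(-((energy - center) ** 2) / (2 * width ** 2))
    A = np.column_stack([peak(100), peak(250), peak(400)])

    # "Measured" spectrum: mixture of the elements plus counting noise.
    true_activities = np.array([50.0, 20.0, 5.0])
    y = A @ true_activities + rng.normal(0, 0.5, n_channels)

    # Solve the whole spectrum at once by linear least squares.
    est, *_ = np.linalg.lstsq(A, y, rcond=None)
    print("estimated activities:", est.round(1))
    ```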

  20. Increasing Effectiveness in Teaching Ethics to Undergraduate Business Students.

    ERIC Educational Resources Information Center

    Lampe, Marc

    1997-01-01

    Traditional approaches to teaching business ethics (philosophical analysis, moral quandaries, executive cases) may not be effective in persuading undergraduates of the importance of ethical behavior. Better techniques include values education, ethical decision-making models, analysis of ethical conflicts, and role modeling. (SK)

  1. An Historical Perspective on the Theory and Practice of Soil Mechanical Analysis.

    ERIC Educational Resources Information Center

    Miller, W. P.; And Others

    1988-01-01

    Traces the history of soil mechanical analysis. Evaluates this history in order to place current concepts in perspective, from both a research and teaching viewpoint. Alternatives to traditional separation techniques for use in soils teaching laboratories are discussed. (TW)

  2. 21ST CENTURY MOLD ANALYSIS IN FOOD

    EPA Science Inventory

    Traditionally, the indoor air community has relied on mold analysis performed by either microscopic observations or the culturing of molds on various media to assess indoor air quality. These techniques were developed in the 19th century and are very laborious and time consumin...

  3. Modelling and multi objective optimization of WEDM of commercially Monel super alloy using evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Varun, Sajja; Reddy, Kalakada Bhargav Bal; Vardhan Reddy, R. R. Vishnu

    2016-09-01

    In this research work, a multi response optimization technique has been developed using traditional desirability analysis and non-traditional particle swarm optimization techniques (for different customers' priorities) in wire electrical discharge machining (WEDM). Monel 400 was selected as the work material for experimentation. The effects of key process parameters such as pulse on time (TON), pulse off time (TOFF), peak current (IP), and wire feed (WF) on material removal rate (MRR) and surface roughness (SR) in the WEDM operation were investigated. Further, the responses MRR and SR were modelled empirically through regression analysis. The developed models can be used by machinists to predict MRR and SR over a wide range of input parameters. The optimization of multiple responses was performed to satisfy the priorities of multiple users using the Taguchi-desirability function method and the particle swarm optimization technique. Analysis of variance (ANOVA) was also applied to investigate the effect of the influential parameters. Finally, confirmation experiments were conducted for the optimal set of machining parameters, and the improvement was verified.
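
    The desirability side of such multi-response optimization maps each response onto [0, 1] and combines the values geometrically. A small sketch with invented MRR/SR values, ranges, and weights:

    ```python
    import numpy as np

    def d_larger_is_better(y, low, high, w=1.0):
        """Desirability for a response to maximize (e.g., MRR)."""
        return np.clip((y - low) / (high - low), 0.0, 1.0) ** w

    def d_smaller_is_better(y, low, high, w=1.0):
        """Desirability for a response to minimize (e.g., SR)."""
        return np.clip((high - y) / (high - low), 0.0, 1.0) ** w

    # Hypothetical predictions from the regression models for one setting.
    mrr, sr = 12.5, 2.1   # mm^3/min and micrometers (invented values)
    d1 = d_larger_is_better(mrr, low=5.0, high=20.0)
    d2 = d_smaller_is_better(sr, low=1.0, high=4.0)

    # Overall desirability: geometric mean, optionally weighted per user.
    D = (d1 * d2) ** 0.5
    print(f"d_MRR={d1:.2f}, d_SR={d2:.2f}, D={D:.2f}")
    ```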

  4. Visualization techniques for tongue analysis in traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Pham, Binh L.; Cai, Yang

    2004-05-01

    Visual inspection of the tongue has been an important diagnostic method of Traditional Chinese Medicine (TCM). Clinical data have shown significant connections between various visceral cancers and abnormalities in the tongue and the tongue coating. Visual inspection of the tongue is simple and inexpensive, but the current practice in TCM is mainly experience-based and the quality of the visual inspection varies between individuals. The computerized inspection method provides quantitative models to evaluate color, texture and surface features on the tongue. In this paper, we investigate visualization techniques and processes to allow interactive data analysis, with the aim of merging computerized measurements with human experts' diagnostic variables based on five-scale diagnostic conditions: Healthy (H), History of Cancers (HC), History of Polyps (HP), Polyps (P) and Colon Cancer (C).

  5. Investigating cardiorespiratory interaction by cross-spectral analysis of event series

    NASA Astrophysics Data System (ADS)

    Schäfer, Carsten; Rosenblum, Michael G.; Pikovsky, Arkady S.; Kurths, Jürgen

    2000-02-01

    The human cardiovascular and respiratory systems interact with each other and show effects of modulation and synchronization. Here we present a cross-spectral technique that specifically considers the event-like character of the heartbeat and avoids typical restrictions of other spectral methods. Using models as well as experimental data, we demonstrate how modulation and synchronization can be distinguished. Finally, we compare the method to traditional techniques and to the analysis of instantaneous phases.
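
    A schematic of the cross-spectral step: in practice the heartbeat event series is first converted to an evenly sampled instantaneous-rate signal, whose coherence with respiration is then estimated. The signals below are made up, and SciPy is an assumed implementation.

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 4.0                           # resampling rate for the event series, Hz
    t = np.arange(0, 300, 1 / fs)
    rng = np.random.default_rng(0)

    # Respiration at ~0.25 Hz, and a heart-rate signal modulated by it
    # (respiratory sinus arrhythmia); both stand in for measured data.
    resp = np.sin(2 * np.pi * 0.25 * t) + 0.1 * rng.normal(size=t.size)
    heart_rate = (1.0 + 0.05 * np.sin(2 * np.pi * 0.25 * t)
                  + 0.02 * rng.normal(size=t.size))

    # High coherence near the respiratory frequency indicates modulation.
    f, Cxy = coherence(heart_rate, resp, fs=fs, nperseg=256)
    print(f"peak coherence {Cxy.max():.2f} at {f[np.argmax(Cxy)]:.2f} Hz")
    ```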

  6. An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques

    ERIC Educational Resources Information Center

    de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.

    2017-01-01

    This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…

  7. A Comparison of Traditional, Step-Path, and Geostatistical Techniques in the Stability Analysis of a Large Open Pit

    NASA Astrophysics Data System (ADS)

    Mayer, J. M.; Stead, D.

    2017-04-01

    With the increased drive towards deeper and more complex mine designs, geotechnical engineers are often forced to reconsider traditional deterministic design techniques in favour of probabilistic methods. These alternative techniques allow for the direct quantification of uncertainties within a risk and/or decision analysis framework. However, conventional probabilistic practices typically discretize geological materials into discrete, homogeneous domains, with attributes defined by spatially constant random variables, despite the fact that geological media display inherent heterogeneous spatial characteristics. This research directly simulates this phenomenon using a geostatistical approach, known as sequential Gaussian simulation. The method utilizes the variogram which imposes a degree of controlled spatial heterogeneity on the system. Simulations are constrained using data from the Ok Tedi mine site in Papua New Guinea and designed to randomly vary the geological strength index and uniaxial compressive strength using Monte Carlo techniques. Results suggest that conventional probabilistic techniques have a fundamental limitation compared to geostatistical approaches, as they fail to account for the spatial dependencies inherent to geotechnical datasets. This can result in erroneous model predictions, which are overly conservative when compared to the geostatistical results.
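
    A pared-down illustration of the geostatistical ingredient: a variogram model fixes the spatial covariance, and correlated Gaussian values are drawn via a Cholesky factor (sequential Gaussian simulation achieves the same covariance point by point). The 1-D setting and parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # 1-D profile of a rock property, e.g. GSI along a borehole.
    x = np.linspace(0.0, 100.0, 200)
    sill, vrange, mean = 1.0, 30.0, 50.0

    # Exponential variogram -> covariance: C(h) = sill * exp(-3h / range).
    h = np.abs(x[:, None] - x[None, :])
    C = sill * np.exp(-3.0 * h / vrange)

    # One realization = mean + L z, with C = L L^T and z ~ N(0, I).
    L = np.linalg.cholesky(C + 1e-9 * np.eye(x.size))
    realizations = mean + (L @ rng.standard_normal((x.size, 100))).T

    # Monte Carlo over realizations, e.g. P(property < 48.5) near x = 50.
    print("P(value < 48.5):", (realizations[:, 100] < 48.5).mean())
    ```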

  8. RIPARIAN CHARACTERIZATION USING SUB-PIXEL ANALYSIS OF LANDSAT TM IMAGERY FOR USE IN ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    Landuse/land cover and riparian corridor characterization for 7 major watersheds in western Ohio was accomplished using sub-pixel analysis and traditional classification techniques. Areas representing forest, woodland, shrub, and herbaceous vegetation were delineated using a ...

  9. Performance evaluation of the RITG148+ set of TomoTherapy quality assurance tools using RTQA2 radiochromic film.

    PubMed

    Lobb, Eric C

    2016-07-08

    Version 6.3 of the RITG148+ software package offers eight automated analysis routines for quality assurance of the TomoTherapy platform. A performance evaluation of each routine was performed in order to compare RITG148+ results with traditionally accepted analysis techniques and verify that simulated changes in machine parameters are correctly identified by the software. Reference films were exposed according to AAPM TG-148 methodology for each routine, and the RITG148+ results were compared with either alternative software analysis techniques or manual analysis techniques in order to assess baseline agreement. Changes in machine performance were simulated through translational and rotational adjustments to subsequently irradiated films, and these films were analyzed to verify that the applied changes were accurately detected by each of the RITG148+ routines. For the Hounsfield unit routine, an assessment of the "Frame Averaging" functionality and the effects of phantom roll on the routine results are presented. All RITG148+ routines reported acceptable baseline results consistent with alternative analysis techniques, with 9 of the 11 baseline test results showing agreement of 0.1 mm/0.1° or better. Simulated changes were correctly identified by the RITG148+ routines within approximately 0.2 mm/0.2°, with the exception of the Field Center vs. Jaw Setting routine, which was found to have limited accuracy in cases where field centers were not aligned for all jaw settings due to inaccurate autorotation of the film during analysis. The performance of the RITG148+ software package was found to be acceptable for introduction into our clinical environment as an automated alternative to traditional analysis techniques for routine TomoTherapy quality assurance testing.

  10. An unsupervised classification technique for multispectral remote sensing data.

    NASA Technical Reports Server (NTRS)

    Su, M. Y.; Cummings, R. E.

    1973-01-01

    Description of a two-part clustering technique consisting of (a) a sequential statistical clustering, which is essentially a sequential variance analysis, and (b) a generalized K-means clustering. In this composite clustering technique, the output of (a) is a set of initial clusters which are input to (b) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum-likelihood classification techniques.
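
    A minimal sketch of this two-part idea, assuming a simple distance threshold for the sequential seeding pass and using scikit-learn's KMeans for the refinement stage; the data, the threshold, and the function name sequential_seed_clusters are illustrative, not from the paper.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def sequential_seed_clusters(X, threshold):
        """One sequential pass over the data: a sample joins the nearest existing
        cluster if it lies within `threshold`, otherwise it starts a new one."""
        centers, counts = [], []
        for x in X:
            if centers:
                d = np.linalg.norm(np.asarray(centers) - x, axis=1)
                j = int(np.argmin(d))
                if d[j] < threshold:
                    counts[j] += 1
                    centers[j] += (x - centers[j]) / counts[j]   # running mean
                    continue
            centers.append(x.astype(float).copy())
            counts.append(1)
        return np.asarray(centers)

    # Synthetic 4-band "pixels" from three spectral classes (illustrative only)
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(m, 0.5, size=(200, 4)) for m in (0.0, 3.0, 6.0)])

    seeds = sequential_seed_clusters(X, threshold=2.0)                            # stage (a)
    labels = KMeans(n_clusters=len(seeds), init=seeds, n_init=1).fit_predict(X)   # stage (b)
    print(len(seeds), "clusters:", np.bincount(labels))
    ```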

  11. Metamodels for Computer-Based Engineering Design: Survey and Recommendations

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing application in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
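
    As a concrete illustration of the metamodel idea, a quadratic response surface fitted by least squares can stand in for an expensive analysis code during optimization; the function expensive_analysis below is a hypothetical stand-in, not an actual engineering code.

    ```python
    import numpy as np

    # Hypothetical stand-in for an expensive analysis code.
    def expensive_analysis(x1, x2):
        return np.sin(x1) + 0.5 * x2**2 + 0.1 * x1 * x2

    # Small designed experiment: a 5 x 5 full factorial over the design space.
    g = np.linspace(-2.0, 2.0, 5)
    X1, X2 = np.meshgrid(g, g)
    x1, x2 = X1.ravel(), X2.ravel()
    y = expensive_analysis(x1, x2)

    # Quadratic response surface fitted by least squares:
    # y ~ b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def metamodel(x1, x2):
        """Cheap surrogate used in place of the expensive code during exploration."""
        return coef @ np.array([1.0, x1, x2, x1**2, x2**2, x1 * x2])

    print(metamodel(0.5, -1.0), expensive_analysis(0.5, -1.0))
    ```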

  12. Detecting subtle hydrochemical anomalies with multivariate statistics: an example from homogeneous groundwaters in the Great Artesian Basin, Australia

    NASA Astrophysics Data System (ADS)

    O'Shea, Bethany; Jankowski, Jerzy

    2006-12-01

    The major ion composition of Great Artesian Basin groundwater in the lower Namoi River valley is relatively homogeneous in chemical composition. Traditional graphical techniques have been combined with multivariate statistical methods to determine whether subtle differences in the chemical composition of these waters can be delineated. Hierarchical cluster analysis and principal components analysis were successful in delineating minor variations within the groundwaters of the study area that were not visually identified in the graphical techniques applied. Hydrochemical interpretation allowed geochemical processes to be identified in each statistically defined water type and illustrated how these groundwaters differ from one another. Three main geochemical processes were identified in the groundwaters: ion exchange, precipitation, and mixing between waters from different sources. Both statistical methods delineated an anomalous sample suspected of being influenced by magmatic CO2 input. The use of statistical methods to complement traditional graphical techniques for waters appearing homogeneous is emphasized for all investigations of this type.
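
    A minimal sketch of the statistical workflow described above, combining hierarchical cluster analysis (Ward linkage) with principal components analysis on standardized major-ion data; the ion concentrations below are synthetic placeholders, not Great Artesian Basin measurements.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical major-ion concentrations (mg/L): Na, Ca, Mg, Cl, HCO3, SO4.
    rng = np.random.default_rng(1)
    ions = rng.normal([150, 10, 5, 120, 300, 20], [10, 2, 1, 15, 25, 5], size=(40, 6))

    Z = StandardScaler().fit_transform(ions)      # put all ions on a common scale

    # Hierarchical cluster analysis (Ward linkage) defines statistical water types.
    water_type = fcluster(linkage(Z, method="ward"), t=3, criterion="maxclust")

    # PCA shows which ion combinations drive the (subtle) separation.
    pca = PCA(n_components=2)
    scores = pca.fit_transform(Z)
    print(pca.explained_variance_ratio_, np.bincount(water_type)[1:])
    ```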

  13. Large space antennas: A systems analysis case history

    NASA Technical Reports Server (NTRS)

    Keafer, Lloyd S. (Compiler); Lovelace, U. M. (Compiler)

    1987-01-01

    The value of systems analysis and engineering is aptly demonstrated by the work on Large Space Antennas (LSA) by the NASA Langley Spacecraft Analysis Branch. This work was accomplished over the last half-decade by augmenting traditional system engineering, analysis, and design techniques with computer-aided engineering (CAE) techniques using the Langley-developed Interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system. This report chronicles the research highlights and special systems analyses that focused the LSA work on deployable truss antennas. It notes developmental trends toward greater use of CAE techniques in their design and analysis. A look to the future envisions the application of improved systems analysis capabilities to advanced space systems such as an advanced space station or to lunar and Martian missions and human habitats.

  14. Real-time continuous visual biofeedback in the treatment of speech breathing disorders following childhood traumatic brain injury: report of one case.

    PubMed

    Murdoch, B E; Pitt, G; Theodoros, D G; Ward, E C

    1999-01-01

    The efficacy of traditional and physiological biofeedback methods for modifying abnormal speech breathing patterns was investigated in a child with persistent dysarthria following severe traumatic brain injury (TBI). An A-B-A-B single-subject experimental research design was utilized to provide the subject with two exclusive periods of therapy for speech breathing, based on traditional therapy techniques and physiological biofeedback methods, respectively. Traditional therapy techniques included establishing optimal posture for speech breathing, explanation of the movement of the respiratory muscles, and a hierarchy of non-speech and speech tasks focusing on establishing an appropriate level of sub-glottal air pressure, and improving the subject's control of inhalation and exhalation. The biofeedback phase of therapy utilized variable inductance plethysmography (or Respitrace) to provide real-time, continuous visual biofeedback of ribcage circumference during breathing. As in traditional therapy, a hierarchy of non-speech and speech tasks was devised to improve the subject's control of his respiratory pattern. Throughout the project, the subject's respiratory support for speech was assessed both instrumentally and perceptually. Instrumental assessment included kinematic and spirometric measures, and perceptual assessment included the Frenchay Dysarthria Assessment, Assessment of Intelligibility of Dysarthric Speech, and analysis of a speech sample. The results of the study demonstrated that real-time continuous visual biofeedback techniques for modifying speech breathing patterns were not only effective, but superior to the traditional therapy techniques for modifying abnormal speech breathing patterns in a child with persistent dysarthria following severe TBI. These results show that physiological biofeedback techniques are potentially useful clinical tools for the remediation of speech breathing impairment in the paediatric dysarthric population.

  15. Infusing Reliability Techniques into Software Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  16. Techniques for forced response involving discrete nonlinearities. I - Theory. II - Applications

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; Callahan, John O.

    Several new techniques developed for the forced response analysis of systems containing discrete nonlinear connection elements are presented and compared to the traditional methods. In particular, the techniques examined are the Equivalent Reduced Model Technique (ERMT), Modal Modification Response Technique (MMRT), and Component Element Method (CEM). The general theory of the techniques is presented, and applications are discussed with particular reference to the beam nonlinear system model using ERMT, MMRT, and CEM; frame nonlinear response using the three techniques; and comparison of the results obtained by using the ERMT, MMRT, and CEM models.

  17. Investigating Cultural Evolution Using Phylogenetic Analysis: The Origins and Descent of the Southeast Asian Tradition of Warp Ikat Weaving

    PubMed Central

    Buckley, Christopher D.

    2012-01-01

    The warp ikat method of making decorated textiles is one of the most geographically widespread in southeast Asia, being used by Austronesian peoples in Indonesia, Malaysia and the Philippines, and Daic peoples on the Asian mainland. In this study a dataset consisting of the decorative characters of 36 of these warp ikat weaving traditions is investigated using Bayesian and NeighborNet techniques, and the results are used to construct a phylogenetic tree and taxonomy for warp ikat weaving in southeast Asia. The results and analysis show that these diverse traditions have a common ancestor amongst neolithic cultures on the Asian mainland, and parallels exist between the patterns of textile weaving descent and linguistic phylogeny for the Austronesian group. Ancestral state analysis is used to reconstruct some of the features of the ancestral weaving tradition. The widely held theory that weaving motifs originated in the late Bronze Age Dong-Son culture is shown to be inconsistent with the data. PMID:23272211

  18. The influence of surface finishing methods on touch-sensitive reactions

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.

    2017-02-01

    This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. Today's information-oriented society heightens the emphasis on the visual aesthetics of new jewelry forms, decoration techniques (depth and surface), and the synthesis of different materials, all of which reveal a bias towards the positive effects of visual design. The jewelry industry now includes not only traditional techniques but also improved techniques such as computer-assisted design, 3D prototyping and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface) and a comparative analysis of different approaches to surface finishing. Evaluation criteria are proposed for assessing the appearance and effect of jewelry, grounding its visual aesthetics in touch-sensitive responses.

  19. The use of a resource-based relative value scale (RBRVS) to determine practice expense costs: a novel technique of practice management for the vascular surgeon.

    PubMed

    Mabry, C D

    2001-03-01

    Vascular surgeons have had to contend with rising costs while their reimbursements have undergone steady reductions. The use of newer accounting techniques can help vascular surgeons better manage their practices, plan for future expansion, and control costs. This article reviews traditional accounting methods, together with activity-based costing (ABC) principles that have been used in the past for practice expense analysis. The main focus is on a new technique, resource-based costing (RBC), which uses the widely available Resource-Based Relative Value Scale (RBRVS) as its basis. The RBC technique promises easier implementation as well as more flexibility in determining the true costs of performing various procedures, as opposed to more traditional accounting methods. It is hoped that RBC will assist vascular surgeons in coping with decreasing reimbursement.

  20. A Coordinated Focused Ion Beam/Ultramicrotomy Technique for Serial Sectioning of Hayabusa Particles and Other Returned Samples

    NASA Technical Reports Server (NTRS)

    Berger, E. L.; Keller, L. P.

    2014-01-01

    Recent sample return missions, such as NASA's Stardust mission to comet 81P/Wild 2 and JAXA's Hayabusa mission to asteroid 25143 Itokawa, have returned particulate samples (typically 5-50 µm) that pose tremendous challenges to coordinated analysis using a variety of nano- and micro-beam techniques. The ability to glean maximal information from individual particles has become increasingly important and depends critically on how the samples are prepared for analysis. This also holds true for other extraterrestrial materials, including interplanetary dust particles, micrometeorites and lunar regolith grains. Traditionally, particulate samples have been prepared using microtomy techniques (e.g., [1]). However, for hard mineral particles ≥20 µm, microtome thin sections are compromised by severe chatter and sample loss. For these difficult samples, we have developed a hybrid technique that combines traditional ultramicrotomy with focused ion beam (FIB) techniques, allowing for the in situ investigation of grain surfaces and interiors. Using this method, we have increased the number of FIB-SEM prepared sections that can be recovered from a particle with dimensions on the order of tens of µm. These sections can be subsequently analyzed using a variety of electron beam techniques. Here, we demonstrate this sample preparation technique on individual lunar regolith grains in order to study their space-weathered surfaces. We plan to extend these efforts to analyses of individual Hayabusa samples.

  1. Seventy-meter antenna performance predictions: GTD analysis compared with traditional ray-tracing methods

    NASA Technical Reports Server (NTRS)

    Schredder, J. M.

    1988-01-01

    A comparative analysis was performed, using both the Geometrical Theory of Diffraction (GTD) and traditional pathlength error analysis techniques, for predicting RF antenna gain performance and pointing corrections. The NASA/JPL 70 meter antenna with its shaped surface was analyzed for gravity loading over the range of elevation angles. Also analyzed were the effects of lateral and axial displacements of the subreflector. Significant differences were noted between the predictions of the two methods, in the effect of subreflector displacements, and in the optimal subreflector positions to focus a gravity-deformed main reflector. The results are of relevance to future design procedures.

  2. Foliar nutrient analysis of sugar maple decline: retrospective vector diagnosis

    Treesearch

    Victor R. Timmer; Yuanxin Teng

    1999-01-01

    Accuracy of traditional foliar analysis of nutrient disorders in sugar maple (Acer saccharum Marsh) is limited by lack of validation and confounding by nutrient interactions. Vector nutrient diagnosis is relatively free of these problems. The technique is demonstrated retrospectively on four case studies. Diagnostic interpretations consistently...

  3. Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine

    PubMed Central

    Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit

    2017-01-01

    In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and of ongoing TTM research activities related to metabolomics. This review also focuses on three important elements of systems biology analysis of TTM, including analytical techniques, statistical approaches and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM will also be discussed. PMID:28769804

  4. Doing That Thing That Scientists Do: A Discovery-Driven Module on Protein Purification and Characterization for the Undergraduate Biochemistry Laboratory Classroom

    ERIC Educational Resources Information Center

    Garrett, Teresa A.; Osmundson, Joseph; Isaacson, Marisa; Herrera, Jennifer

    2015-01-01

    In traditional introductory biochemistry laboratory classes students learn techniques for protein purification and analysis by following provided, established, step-by-step procedures. Students are exposed to a variety of biochemical techniques but are often not developing procedures or collecting new, original data. In this laboratory module,…

  5. Neural net diagnostics for VLSI test

    NASA Technical Reports Server (NTRS)

    Lin, T.; Tseng, H.; Wu, A.; Dogan, N.; Meador, J.

    1990-01-01

    This paper discusses the application of neural network pattern analysis algorithms to the IC fault diagnosis problem. A fault diagnostic is a decision rule combining what is known about an ideal circuit test response with information about how it is distorted by fabrication variations and measurement noise. The rule is used to detect fault existence in fabricated circuits using real test equipment. Traditional statistical techniques may be used to achieve this goal, but they can employ unrealistic a priori assumptions about measurement data. Our approach to this problem employs an adaptive pattern analysis technique based on feedforward neural networks. During training, a feedforward network automatically captures unknown sample distributions. This is important because distributions arising from the nonlinear effects of process variation can be more complex than is typically assumed. A feedforward network is also able to extract measurement features which contribute significantly to making a correct decision. Traditional feature extraction techniques employ matrix manipulations which can be particularly costly for large measurement vectors. In this paper we discuss a software system which we are developing that uses this approach. We also provide a simple example illustrating the use of the technique for fault detection in an operational amplifier.
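
    A minimal sketch of the approach described above, assuming synthetic measurement vectors and using a small feedforward network from scikit-learn; this illustrates the idea of learning the measurement distribution rather than assuming its form, and is not the authors' actual network or data.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    # Synthetic test-measurement vectors: fault-free responses with measurement
    # noise versus faulty responses shifted by a process-dependent offset.
    rng = np.random.default_rng(7)
    good = rng.normal(0.0, 0.3, size=(500, 12))
    bad = rng.normal(0.8, 0.5, size=(500, 12))
    X = np.vstack([good, bad])
    y = np.r_[np.zeros(500), np.ones(500)]        # 0 = pass, 1 = fault

    # A small feedforward network learns the measurement distribution directly,
    # with no a priori assumption that it is Gaussian or linearly separable.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```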

  6. Modeling and Hazard Analysis Using STPA

    NASA Astrophysics Data System (ADS)

    Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka

    2010-09-01

    A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describe most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process where hazard analysis drives the design decisions, rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.

  7. Reachability analysis of real-time systems using time Petri nets.

    PubMed

    Wang, J; Deng, Y; Xu, G

    2000-01-01

    Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.

  8. Performance Analysis of a De-correlated Modified Code Tracking Loop for Synchronous DS-CDMA System under Multiuser Environment

    NASA Astrophysics Data System (ADS)

    Wu, Ya-Ting; Wong, Wai-Ki; Leung, Shu-Hung; Zhu, Yue-Sheng

    This paper presents the performance analysis of a De-correlated Modified Code Tracking Loop (D-MCTL) for synchronous direct-sequence code-division multiple-access (DS-CDMA) systems under a multiuser environment. Previous studies have shown that the imbalance of multiple access interference (MAI) in the time-lead and time-lag portions of the signal causes tracking bias or instability problems in traditional correlating tracking loops such as the delay lock loop (DLL) or modified code tracking loop (MCTL). In this paper, we exploit the de-correlating technique to combat the MAI at the on-time code position of the MCTL. Unlike the application of the same technique to the DLL, which requires an extensive search algorithm to compensate for the noise imbalance and may introduce a small tracking bias at low signal-to-noise ratio (SNR), the proposed D-MCTL has much lower computational complexity and exhibits zero tracking bias over the whole range of SNR, regardless of the number of interfering users. Furthermore, performance analysis and simulations based on Gold codes show that the proposed scheme has better mean square tracking error, mean time to lose lock and near-far resistance than the other tracking schemes, including traditional DLL (T-DLL), traditional MCTL (T-MCTL) and modified de-correlated DLL (MD-DLL).

  9. Microwave-Assisted Hydro-Distillation of Essential Oil from Rosemary: Comparison with Traditional Distillation

    PubMed Central

    Moradi, Sara; Fazlali, Alireza; Hamedi, Hamid

    Background: Hydro-distillation (HD) method is a traditional technique which is used in most industrial companies. Microwave-assisted Hydro-distillation (MAHD) is an advanced HD technique utilizing a microwave oven in the extraction process. Methods: In this research, MAHD of essential oils from the aerial parts (leaves) of rosemary (Rosmarinus officinalis L.) was studied and the results were compared with those of the conventional HD in terms of extraction time, extraction efficiency, chemical composition, quality of the essential oils and cost of the operation. Results: Microwave hydro-distillation was superior in terms of saving energy and extraction time (30 min, compared to 90 min in HD). Chromatography was used for quantity analysis of the essential oils composition. Quality of essential oil improved in MAHD method due to an increase of 17% in oxygenated compounds. Conclusion: Consequently, microwave hydro-distillation can be used as a substitute of traditional hydro-distillation. PMID:29296263

  10. Microwave-Assisted Hydro-Distillation of Essential Oil from Rosemary: Comparison with Traditional Distillation.

    PubMed

    Moradi, Sara; Fazlali, Alireza; Hamedi, Hamid

    2018-01-01

    Hydro-distillation (HD) method is a traditional technique which is used in most industrial companies. Microwave-assisted Hydro-distillation (MAHD) is an advanced HD technique utilizing a microwave oven in the extraction process. In this research, MAHD of essential oils from the aerial parts (leaves) of rosemary (Rosmarinus officinalis L.) was studied and the results were compared with those of the conventional HD in terms of extraction time, extraction efficiency, chemical composition, quality of the essential oils and cost of the operation. Microwave hydro-distillation was superior in terms of saving energy and extraction time (30 min, compared to 90 min in HD). Chromatography was used for quantity analysis of the essential oils composition. Quality of essential oil improved in MAHD method due to an increase of 17% in oxygenated compounds. Consequently, microwave hydro-distillation can be used as a substitute of traditional hydro-distillation.

  11. Role of capillary electrophoresis in the fight against doping in sports.

    PubMed

    Harrison, Christopher R

    2013-08-06

    At present the role of capillary electrophoresis in the detection of doping agents in athletes is, for the most part, nonexistent. More traditional techniques, namely gas and liquid chromatography with mass spectrometric detection, remain the gold standard of antidoping tests. This Feature will investigate the in-roads that capillary electrophoresis has made, the limitations that the technique suffers from, and where the technique may grow into being a key tool for antidoping analysis.

  12. Spatially resolved δ13C analysis using laser ablation isotope ratio mass spectrometry

    NASA Astrophysics Data System (ADS)

    Moran, J.; Riha, K. M.; Nims, M. K.; Linley, T. J.; Hess, N. J.; Nico, P. S.

    2014-12-01

    Inherent geochemical, organic matter, and microbial heterogeneity over small spatial scales can complicate studies of carbon dynamics through soils. Stable isotope analysis has a strong history of helping track substrate turnover, delineate rhizosphere activity zones, and identify transitions in vegetation cover, but most traditional isotope approaches are limited in spatial resolution by a combination of physical separation techniques (manual dissection) and IRMS instrument sensitivity. We coupled laser ablation sampling with isotope measurement via IRMS to enable spatially resolved analysis over solid surfaces. Once a targeted sample region is ablated, the resulting particulates are entrained in a helium carrier gas and passed through a combustion reactor where carbon is converted to CO2. Cryotrapping of the resulting CO2 enables a reduction in carrier gas flow, which improves overall measurement sensitivity versus traditional, high-flow sample introduction. Currently we are performing sample analysis at 50 μm resolution, require 65 ng C per analysis, and achieve measurement precision consistent with other continuous flow techniques. We will discuss applications of the laser ablation IRMS (LA-IRMS) system to microbial communities and fish ecology studies to demonstrate the merits of this technique and how similar analytical approaches can be transitioned to soil systems. Preliminary efforts at analyzing soil samples will be used to highlight strengths and limitations of the LA-IRMS approach, paying particular attention to sample preparation requirements, spatial resolution, sample analysis time, and the types of questions most conducive to analysis via LA-IRMS.
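
    For reference, the reported δ13C value is the sample's 13C/12C ratio expressed relative to the VPDB standard in per mil, δ13C = (R_sample/R_VPDB − 1) × 1000; a small sketch with illustrative example values:

    ```python
    # Delta notation for carbon isotope measurements: the sample 13C/12C ratio is
    # reported relative to the VPDB standard, in per mil. Example values are
    # illustrative only.
    R_VPDB = 0.0111802                      # 13C/12C of the VPDB reference

    def delta13C(r_sample):
        """Return delta-13C in per mil for a measured 13C/12C ratio."""
        return (r_sample / R_VPDB - 1.0) * 1000.0

    print(round(delta13C(0.0109), 1))       # ~ -25 per mil, typical organic matter
    ```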

  13. The Impact of Multiple Endpoint Dependency on "Q" and "I²" in Meta-Analysis

    ERIC Educational Resources Information Center

    Thompson, Christopher Glen; Becker, Betsy Jane

    2014-01-01

    A common assumption in meta-analysis is that effect sizes are independent. When correlated effect sizes are analyzed using traditional univariate techniques, this assumption is violated. This research assesses the impact of dependence arising from treatment-control studies with multiple endpoints on homogeneity measures "Q" and…

  14. Using Structural Equation Models with Latent Variables to Study Student Growth and Development.

    ERIC Educational Resources Information Center

    Pike, Gary R.

    1991-01-01

    Analysis of data on freshman-to-senior developmental gains in 722 University of Tennessee-Knoxville students provides evidence of the advantages of structural equation modeling with latent variables and suggests that the group differences identified by traditional analysis of variance and covariance techniques may be an artifact of measurement…

  15. Merging Traditional Technique Vocabularies with Democratic Teaching Perspectives in Dance Education: A Consideration of Aesthetic Values and Their Sociopolitical Contexts

    ERIC Educational Resources Information Center

    Dyer, Becky

    2009-01-01

    This article suggests how movement analysis from a socially contextualized perspective can inform understanding about the significance of sociopolitical contexts and aesthetic values in Western dance training. Perspectives of movement analysis provide groundwork for discussing perceivable ways to address discrepancies between democratic and…

  16. Real time on-chip sequential adaptive principal component analysis for data feature extraction and image compression

    NASA Technical Reports Server (NTRS)

    Duong, T. A.

    2004-01-01

    In this paper, we present a new, simple, and optimized sequential learning technique and hardware architecture for adaptive Principal Component Analysis (PCA), which will help optimize the hardware implementation in VLSI and overcome the difficulties of traditional gradient descent in learning convergence and hardware implementation.
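
    A classic sequential learning rule for adaptive PCA is Oja's update, sketched below as a generic illustration of online principal-component extraction; it is not the specific architecture proposed in the paper, and all data values are invented.

    ```python
    import numpy as np

    # Oja's sequential learning rule for the first principal component: a classic
    # online PCA update of the kind hardware-friendly sequential techniques build
    # on. Generic sketch only, not the architecture proposed in the paper.
    rng = np.random.default_rng(3)
    C = np.array([[3.0, 1.0], [1.0, 1.0]])              # true covariance
    X = rng.multivariate_normal([0.0, 0.0], C, size=5000)

    w = rng.normal(size=2)
    w /= np.linalg.norm(w)
    eta = 0.01
    for x in X:
        y = w @ x
        w += eta * y * (x - y * w)                      # update keeps ||w|| near 1

    true_pc = np.linalg.eigh(C)[1][:, -1]               # dominant eigenvector
    print(abs(w @ true_pc) / np.linalg.norm(w))         # approaches 1 when aligned
    ```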

  17. Characterization of Natural Dyes and Traditional Korean Silk Fabric by Surface Analytical Techniques.

    PubMed

    Lee, Jihye; Kang, Min Hwa; Lee, Kang-Bong; Lee, Yeonhee

    2013-05-15

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) and X-ray photoelectron spectroscopy (XPS) are well-established surface techniques that provide both elemental and organic information from several monolayers of a sample surface, while also allowing depth profiling or image mapping to be carried out. Static TOF-SIMS with improved performance has expanded the application of TOF-SIMS to the study of a variety of organic, polymeric and biological materials. In this work, TOF-SIMS, XPS and Fourier transform infrared (FTIR) measurements were used to characterize commercial natural dyes and traditional silk fabric dyed with plant-extract dyes, avoiding the time-consuming and destructive extraction procedures necessary for the spectrophotometric and chromatographic methods previously used. Silk textiles dyed with plant extracts were then analyzed for chemical and functional group identification of their dye components and mordants. TOF-SIMS spectra for the dyed silk fabric showed element ions from metallic mordants, and specific fragment ions and molecular ions from plant-extracted dyes. The results of TOF-SIMS, XPS and FTIR are very useful as a reference database for comparison with data about traditional Korean silk fabric and for understanding traditional dyeing materials. This study therefore shows that surface techniques are useful for micro-destructive analysis of plant-extracted dyes and Korean dyed silk fabric.

  18. Extending the knowledge in histochemistry and cell biology.

    PubMed

    Heupel, Wolfgang-Moritz; Drenckhahn, Detlev

    2010-01-01

    Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has been developed that bridge traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from a bird's-eye down to a worm's-eye view, focusing on tissues, cells, proteins or, finally, single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies which have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles standing for the progress made in "histo-biochemical" techniques and their manifold applications.

  19. Novel near-infrared sampling apparatus for single kernel analysis of oil content in maize.

    PubMed

    Janni, James; Weinstock, B André; Hagen, Lisa; Wright, Steve

    2008-04-01

    A method of rapid, nondestructive chemical and physical analysis of individual maize (Zea mays L.) kernels is needed for the development of high value food, feed, and fuel traits. Near-infrared (NIR) spectroscopy offers a robust nondestructive method of trait determination. However, traditional NIR bulk sampling techniques cannot be applied successfully to individual kernels. Obtaining optimized single kernel NIR spectra for applied chemometric predictive analysis requires a novel sampling technique that can account for the heterogeneous forms, morphologies, and opacities exhibited in individual maize kernels. In this study such a novel technique is described and compared to less effective means of single kernel NIR analysis. Results of the application of a partial least squares (PLS) derived model for predictive determination of percent oil content per individual kernel are shown.

  1. Development Context Driven Change Awareness and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Sarma, Anita; Branchaud, Josh; Dwyer, Matthew B.; Person, Suzette; Rungta, Neha; Wang, Yurong; Elbaum, Sebastian

    2014-01-01

    Recent work on workspace monitoring allows conflict prediction early in the development process; however, these approaches mostly use syntactic differencing techniques to compare different program versions. In contrast, traditional change-impact analysis techniques analyze related versions of the program only after the code has been checked into the master repository. We propose a novel approach, DeCAF (Development Context Analysis Framework), that leverages the development context to scope a change impact analysis technique. The goal is to characterize the impact of each developer on other developers in the team. Various client applications, such as task prioritization, early conflict detection, and providing advice on testing, can benefit from such a characterization. The DeCAF framework leverages information from the development context to bound the iDiSE change impact analysis technique to analyze only the parts of the code base that are of interest. Bounding the analysis enables DeCAF to efficiently compute the impact of changes using a combination of program dependence and symbolic execution based approaches.

  2. Advanced Navigation Strategies For Asteroid Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.

    2010-01-01

    Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.

  3. Time-frequency and advanced frequency estimation techniques for the investigation of bat echolocation calls.

    PubMed

    Kopsinis, Yannis; Aboutanios, Elias; Waters, Dean A; McLaughlin, Steve

    2010-02-01

    In this paper, techniques for time-frequency analysis and investigation of bat echolocation calls are studied. In particular, enhanced-resolution techniques are developed and/or used in this specific context for the first time. Compared to traditional time-frequency representation methods, the proposed techniques reveal previously unseen features in the structure of bat echolocation calls. It should be emphasized that although the study is focused on bat echolocation recordings, the results are more general and applicable to many other types of signal.
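
    For context, the conventional baseline that such enhanced-resolution estimators are judged against is the short-time Fourier spectrogram; below is a minimal sketch on a synthetic FM sweep standing in for a bat call, with all signal parameters invented for illustration.

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    # Conventional spectrogram of a synthetic downward FM sweep standing in for a
    # bat echolocation call; enhanced-resolution estimators are compared against
    # this kind of baseline. All signal parameters are invented.
    fs = 250_000                                    # 250 kHz sampling rate
    t = np.arange(0, 0.005, 1 / fs)                 # 5 ms call
    f0, f1 = 80_000, 30_000                         # sweep from 80 kHz to 30 kHz
    call = np.sin(2 * np.pi * (f0 * t + (f1 - f0) / (2 * t[-1]) * t**2))

    f, tt, Sxx = spectrogram(call, fs=fs, nperseg=256, noverlap=192)
    ridge = f[np.argmax(Sxx, axis=0)]               # crude frequency-track estimate
    print(ridge[0], ridge[-1])                      # ~80 kHz falling to ~30 kHz
    ```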

  4. [Modified Misgav-Ladach at a tertiary hospital].

    PubMed

    Martínez Ceccopieri, David Alejandro; Barrios Prieto, Ernesto; Martínez Ríos, David

    2012-08-01

    According to several studies from around the globe, the modified Misgav Ladach technique simplifies the surgical procedure for cesarean section, reduces operation time, costs, and complications, and optimizes obstetric and perinatal outcomes. The aim was to compare obstetric outcomes between patients operated on using the traditional cesarean section technique and those operated on using the modified Misgav Ladach technique. The study included 49 patients operated on using the traditional cesarean section technique and 47 patients operated on using the modified Misgav Ladach technique. The modified Misgav Ladach technique was associated with more benefits than the traditional technique: less surgical bleeding, shorter operation time, lower total analgesic doses, fewer rescue analgesic doses and less need for more than one analgesic drug. The modified Misgav Ladach surgical technique was associated with better obstetric results than the traditional surgical technique; this concurs with the results reported by other national and international studies.

  5. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    PubMed

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation.
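
    A highly simplified sketch of the landmark-based morphing idea, using a least-squares affine map between landmark sets; the actual pipeline is more elaborate (mesh-wide morphing followed by mapping nodes onto the bone boundary), and all coordinates below are hypothetical.

    ```python
    import numpy as np

    def landmark_affine(source_pts, target_pts):
        """Least-squares affine map taking source landmarks onto target landmarks.
        Simplified stand-in for landmark-based mesh morphing; a real pipeline
        follows this with local mapping of nodes to the bone boundary."""
        A = np.hstack([source_pts, np.ones((len(source_pts), 1))])  # homogeneous
        T, *_ = np.linalg.lstsq(A, target_pts, rcond=None)
        return T                                                    # shape (4, 3)

    def morph(nodes, T):
        return np.hstack([nodes, np.ones((len(nodes), 1))]) @ T

    # Hypothetical anatomical landmarks on source and target pelves (mm).
    src = np.array([[0, 0, 0], [100, 0, 0], [0, 80, 0], [0, 0, 60], [50, 40, 30.]])
    tgt = src * 1.05 + np.array([2.0, -1.0, 0.5])   # scaled and shifted target

    T = landmark_affine(src, tgt)
    print(np.max(np.abs(morph(src, T) - tgt)))      # near-zero landmark residual
    ```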

  6. Fourier transform infrared microspectroscopy for the analysis of the biochemical composition of C. elegans worms.

    PubMed

    Sheng, Ming; Gorzsás, András; Tuck, Simon

    2016-01-01

    Changes in intermediary metabolism have profound effects on many aspects of C. elegans biology including growth, development and behavior. However, many traditional biochemical techniques for analyzing chemical composition require relatively large amounts of starting material, precluding the analysis of mutants that cannot be grown in large amounts as homozygotes. Here we describe a technique for detecting changes in the chemical compositions of C. elegans worms by Fourier transform infrared microspectroscopy. We demonstrate that the technique can be used to detect changes in the relative levels of carbohydrates, proteins and lipids in one and the same worm. We suggest that Fourier transform infrared microspectroscopy represents a useful addition to the arsenal of techniques for metabolic studies of C. elegans worms.

  7. Comparing digital data processing techniques for surface mine and reclamation monitoring

    NASA Technical Reports Server (NTRS)

    Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.

    1982-01-01

    The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.

  8. Defining the questions: a research agenda for nontraditional authentication in arms control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hauck, Danielle K; Mac Arthur, Duncan W; Smith, Morag K

    Many traditional authentication techniques have been based on hardware solutions. Thus authentication of measurement system hardware has been considered in terms of physical inspection and destructive analysis. Software authentication has implied hash function analysis or authentication tools such as Rose. Continuity of knowledge is maintained through TIDs and cameras. Although there is ongoing progress improving all of these authentication methods, there has been little discussion of the human factors involved in authentication. Issues of non-traditional authentication include sleight-of-hand substitutions, monitor perception vs. reality, and visual diversions. Since monitor confidence in a measurement system depends on the product of their confidences in each authentication element, it is important to investigate all authentication techniques, including the human factors. This paper will present an initial effort to identify the most important problems that traditional authentication approaches in safeguards have not addressed and that are especially relevant to arms control verification. This will include a survey of the literature and direct engagement with nontraditional experts in areas like psychology and human factors. Based on the identification of problem areas, potential research areas will be identified and a possible research agenda will be developed.

  9. Assessment of recent advances in measurement techniques for atmospheric carbon dioxide and methane observations

    NASA Astrophysics Data System (ADS)

    Zellweger, Christoph; Emmenegger, Lukas; Firdaus, Mohd; Hatakka, Juha; Heimann, Martin; Kozlova, Elena; Spain, T. Gerard; Steinbacher, Martin; van der Schoot, Marcel V.; Buchmann, Brigitte

    2016-09-01

    Until recently, atmospheric carbon dioxide (CO2) and methane (CH4) measurements were made almost exclusively using nondispersive infrared (NDIR) absorption and gas chromatography with flame ionisation detection (GC/FID) techniques, respectively. Recently, commercially available instruments based on spectroscopic techniques such as cavity ring-down spectroscopy (CRDS), off-axis integrated cavity output spectroscopy (OA-ICOS) and Fourier transform infrared (FTIR) spectroscopy have become more widely available and affordable. This resulted in a widespread use of these techniques at many measurement stations. This paper is focused on the comparison between a CRDS "travelling instrument" that has been used during performance audits within the Global Atmosphere Watch (GAW) programme of the World Meteorological Organization (WMO) with instruments incorporating other, more traditional techniques for measuring CO2 and CH4 (NDIR and GC/FID). We demonstrate that CRDS instruments and likely other spectroscopic techniques are suitable for WMO/GAW stations and allow a smooth continuation of historic CO2 and CH4 time series. Moreover, the analysis of the audit results indicates that the spectroscopic techniques have a number of advantages over the traditional methods which will lead to the improved accuracy of atmospheric CO2 and CH4 measurements.

  10. Analysis of thrips distribution: application of spatial statistics and Kriging

    Treesearch

    John Aleong; Bruce L. Parker; Margaret Skinner; Diantha Howard

    1991-01-01

    Kriging is a statistical technique that provides predictions for spatially and temporally correlated data. Observations of thrips distribution and density in Vermont soils are made in both space and time. Traditional statistical analysis of such data assumes that the counts taken over space and time are independent, which is not necessarily true. Therefore, to analyze...
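
    A minimal ordinary-kriging sketch of the kind of spatial prediction described above, assuming an exponential variogram; the site coordinates, counts, and variogram parameters are invented for illustration.

    ```python
    import numpy as np

    def exp_variogram(h, sill=1.0, rng_=50.0):
        """Exponential variogram model (no nugget, for simplicity)."""
        return sill * (1.0 - np.exp(-3.0 * h / rng_))

    obs_xy = np.array([[10, 10], [40, 15], [25, 40], [60, 55.]])  # sample sites (m)
    obs_z = np.array([5.0, 9.0, 4.0, 12.0])                       # thrips counts

    def krige(x0):
        """Ordinary kriging prediction at location x0 (weights sum to one)."""
        n = len(obs_xy)
        d = np.linalg.norm(obs_xy[:, None] - obs_xy[None, :], axis=2)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = exp_variogram(d)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = exp_variogram(np.linalg.norm(obs_xy - x0, axis=1))
        w = np.linalg.solve(A, b)            # kriging weights + Lagrange multiplier
        return w[:n] @ obs_z

    print(krige(np.array([30.0, 30.0])))     # prediction at an unsampled location
    ```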

  11. Something Old, Something New: MBA Program Evaluation Using Shift-Share Analysis and Google Trends

    ERIC Educational Resources Information Center

    Davis, Sarah M.; Rodriguez, A. E.

    2014-01-01

    Shift-share analysis is a decomposition technique that is commonly used to measure attributes of regional change. In this method, regional change is decomposed into its relevant functional and competitive parts. This paper introduces traditional shift-share method and its extensions with examples of its applicability and usefulness for program…
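
    The classical shift-share decomposition splits the change in each regional sector into national-growth, industry-mix, and competitive-share components that sum to the observed change; a small sketch with made-up figures:

    ```python
    # Classical shift-share decomposition: change in each regional sector splits
    # into national-growth, industry-mix, and competitive-share components that
    # sum to the observed change. All figures are made up for illustration.
    e = {"mba_fulltime": 200.0, "mba_online": 100.0}    # regional base-year size
    g_nat = 0.04                                        # national all-sector growth
    g_ind = {"mba_fulltime": -0.02, "mba_online": 0.10} # national sectoral growth
    g_reg = {"mba_fulltime": -0.05, "mba_online": 0.15} # regional sectoral growth

    for sector, base in e.items():
        national = base * g_nat
        mix = base * (g_ind[sector] - g_nat)
        competitive = base * (g_reg[sector] - g_ind[sector])
        assert abs(national + mix + competitive - base * g_reg[sector]) < 1e-9
        print(sector, national, mix, competitive)
    ```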

  12. Planning and Scheduling of Software Manufacturing Projects

    DTIC Science & Technology

    1991-03-01

    Based on previous results in social analysis of computing, operations research in manufacturing, artificial intelligence in manufacturing planning and scheduling, and traditional approaches to planning in artificial intelligence, this work extends the techniques that have been developed in those fields.

  13. Holographic analysis as an inspection method for welded thin-wall tubing

    NASA Technical Reports Server (NTRS)

    Brooks, Lawrence; Mulholland, John; Genin, Joseph; Matthews, Larryl

    1990-01-01

    The feasibility of using holographic interferometry for locating flaws in welded tubing is explored. Two holographic techniques are considered: traditional holographic interferometry and electronic speckle pattern interferometry. Several flaws including cold laps, discontinuities, and tube misalignments are detected.

  14. Navigating Microbiological Food Safety in the Era of Whole-Genome Sequencing

    PubMed Central

    Nasheri, Neda; Petronella, Nicholas; Pagotto, Franco

    2016-01-01

    The epidemiological investigation of a foodborne outbreak, including identification of related cases, source attribution, and development of intervention strategies, relies heavily on the ability to subtype the etiological agent at a high enough resolution to differentiate related from nonrelated cases. Historically, several different molecular subtyping methods have been used for this purpose; however, emerging techniques that use whole-genome sequencing (WGS), such as single nucleotide polymorphism (SNP)-based approaches, offer a resolution that was previously not possible. With WGS, unlike traditional subtyping methods that lack complete information, data can be used to elucidate phylogenetic relationships, and disease-causing lineages can be tracked and monitored over time. The subtyping resolution and evolutionary context provided by WGS data allow investigators to connect related illnesses that would be missed by traditional techniques. The added advantage of data generated by WGS is that these data can also be used for secondary analyses, such as virulence gene detection, antibiotic resistance gene profiling, synteny comparisons, mobile genetic element identification, and geographic attribution. In addition, several software packages are now available to generate in silico results for traditional molecular subtyping methods from the whole-genome sequence, allowing for efficient comparison with historical databases. Metagenomic approaches using next-generation sequencing have also been successful in the detection of nonculturable foodborne pathogens. This review addresses state-of-the-art techniques in microbial WGS and analysis and then discusses how this technology can be used to help support food safety investigations. Retrospective outbreak investigations using WGS are presented to provide organism-specific examples of the benefits, and challenges, associated with WGS in comparison to traditional molecular subtyping techniques. PMID:27559074
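
    At the core of SNP-based subtyping is a pairwise SNP distance computed between aligned genomes; a toy sketch follows (the sequences are not real genomes, and outbreak thresholds are organism- and context-specific).

    ```python
    import itertools

    # Pairwise SNP distances from aligned core-genome sequences: the basic
    # quantity behind SNP-based subtyping. Toy sequences, not real genomes.
    isolates = {
        "case_1":  "ATGGCTAACGTT",
        "case_2":  "ATGGCTGACGTT",
        "control": "ATGACTGACGAT",
    }

    def snp_distance(a, b):
        return sum(x != y for x, y in zip(a, b))

    for (n1, s1), (n2, s2) in itertools.combinations(isolates.items(), 2):
        print(f"{n1} vs {n2}: {snp_distance(s1, s2)} SNPs")
    # Isolates separated by few SNPs are candidate members of one outbreak
    # cluster; actual thresholds are organism- and context-specific.
    ```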

  15. Biotechnological advances in the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp.

    PubMed

    Zhao, Guang-Hui; Li, Juan; Blair, David; Li, Xiao-Yan; Elsheikha, Hany M; Lin, Rui-Qing; Zou, Feng-Cai; Zhu, Xing-Quan

    2012-01-01

    Schistosomiasis is a serious parasitic disease caused by blood-dwelling flukes of the genus Schistosoma. Throughout the world, schistosomiasis is associated with high rates of morbidity and mortality, with close to 800 million people at risk of infection. Precise methods for identification of Schistosoma species and diagnosis of schistosomiasis are crucial for an enhanced understanding of parasite epidemiology that informs effective antiparasitic treatment and preventive measures. Traditional approaches for the diagnosis of schistosomiasis include etiological, immunological and imaging techniques. Diagnosis of schistosomiasis has been revolutionized by the advent of new molecular technologies to amplify parasite nucleic acids. Among these, polymerase chain reaction-based methods have been useful in the analysis of genetic variation among Schistosoma spp. Mass spectrometry is now extending the range of biological molecules that can be detected. In this review, we summarize traditional, non-DNA-based diagnostic methods and then describe and discuss the current and developing molecular techniques for the diagnosis, species differentiation and phylogenetic analysis of Schistosoma spp. These exciting techniques provide foundations for further development of more effective and precise approaches to differentiate schistosomes and diagnose schistosomiasis in the clinic, and also have important implications for exploring novel measures to control schistosomiasis in the near future.

  16. Comparative Study of Powdered Ginger Drink Processed by Different Method:Traditional and using Evaporation Machine

    NASA Astrophysics Data System (ADS)

    Apriyana, Wuri; Taufika Rosyida, Vita; Nur Hayati, Septi; Darsih, Cici; Dewi Poeloengasih, Crescentiana

    2017-12-01

    Ginger drink is a traditional beverage and one of the products of greatest interest to consumers in Indonesia. This drink is believed to have excellent properties for the health of the body. In this study, we compared the moisture content, ash content, metal content and the identified compounds of products processed with the traditional technique and with an evaporator machine. The results show that both products fulfilled several parameters of the Indonesian National Standard for traditional powdered drinks. GC-MS analysis identified the compounds present in both products. The major flavor-influencing hydrocarbons, such as zingiberene, camphene, beta-phellandrene, beta-sesquiphellandrene, curcumene, and beta-bisabolene, were found at higher levels in the ginger drink powder processed with the machine than in the traditionally processed product.

  17. Estimating of Soil Texture Using Landsat Imagery: a Case Study in Thatta Tehsil, Sindh

    NASA Astrophysics Data System (ADS)

    Khalil, Zahid

    2016-07-01

    Soil texture is considered an important environmental factor for agricultural growth and is essential for soil classification at large scale. Today, precise large-scale soil information is in great demand from various stakeholders, including soil scientists, environmental managers, land use planners and traditional agricultural users. The increasing demand for soil properties at fine spatial resolution has made traditional laboratory methods inadequate, and soil analysis with precision agriculture systems is more expensive than traditional methods. In this regard, geo-spatial techniques can be used as an alternative for soil analysis. This study aims to examine the ability of geo-spatial techniques to identify the spatial patterns of soil attributes at fine scale. Around 28 soil samples were collected from different areas of Thatta Tehsil, Sindh, Pakistan for analyzing soil texture. An Ordinary Least Squares (OLS) regression analysis was used to relate the reflectance values of Landsat 8 OLI imagery to the soil variables. The analysis showed a significant relationship (p<0.05) of bands 2 and 5 with silt% (R² = 0.52), and of bands 4 and 6 with clay% (R² = 0.40). The equations derived from the OLS analysis were then applied across the whole study area to derive soil attributes. The USDA textural classification triangle was implemented for the derivation of the soil texture map in a GIS environment. The outcome revealed that sandy loam was the most abundant texture, followed by loam, sandy clay loam and clay loam, and showed that geo-spatial techniques can be used efficiently for mapping the soil texture of a large area at fine scale. This technology helps decrease cost and time and increases the level of detail by reducing field work considerably.
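
    A minimal sketch of the OLS step described above, regressing a soil attribute on band reflectances and reporting R²; the reflectance and silt values below are synthetic placeholders, not the Thatta field data.

    ```python
    import numpy as np

    # OLS regression of a soil attribute on Landsat 8 OLI band reflectances.
    # The reflectance and silt values are synthetic placeholders, not the
    # Thatta field data.
    rng = np.random.default_rng(11)
    n = 28                                        # number of soil samples
    band2 = rng.uniform(0.05, 0.15, n)            # blue reflectance
    band5 = rng.uniform(0.15, 0.35, n)            # NIR reflectance
    silt = 40 + 120 * band2 - 60 * band5 + rng.normal(0, 3, n)

    X = np.column_stack([np.ones(n), band2, band5])
    beta, *_ = np.linalg.lstsq(X, silt, rcond=None)

    pred = X @ beta
    r2 = 1 - np.sum((silt - pred) ** 2) / np.sum((silt - silt.mean()) ** 2)
    print("coefficients:", beta.round(1), "R^2:", round(r2, 2))
    # The fitted equation is then applied per pixel across the image and the
    # predicted fractions classified with the USDA texture triangle.
    ```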

  18. Fostering multiple repertoires in undergraduate behavior analysis students

    PubMed Central

    Polson, David A. D.

    1995-01-01

    Eight techniques used by the author in teaching an introductory applied behavior analysis course are described: (a) a detailed study guide, (b) frequent tests, (c) composition of practice test questions, (d) in-class study groups, (e) fluency building with a computerized flash-card program, (f) bonus marks for participation during question-and-answer sessions, (g) student presentations that summarize and analyze recently published research, and (h) in-class behavior analysis of comic strips. Together, these techniques require an extensive amount of work by students. Nevertheless, students overwhelmingly prefer this approach to the traditional lecture-midterm-final format, and most earn an A as their final course grade. PMID:22478226

  19. The new Zero-P implant can effectively reduce the risk of postoperative dysphagia and complications compared with the traditional anterior cage and plate: a systematic review and meta-analysis.

    PubMed

    Yin, Mengchen; Ma, Junming; Huang, Quan; Xia, Ye; Shen, Qixing; Zhao, Chenglong; Tao, Jun; Chen, Ni; Yu, Zhingxing; Ye, Jie; Mo, Wen; Xiao, Jianru

    2016-10-18

    The low-profile angle-stable spacer Zero-P is a new kind of cervical fusion system that is claimed to limit the potential drawbacks and complications of earlier devices. The purpose of this meta-analysis was to compare the clinical and radiological results of the new Zero-P implant with those of the traditional anterior cage and plate in the treatment of symptomatic cervical spondylosis, and to provide clinicians with evidence on which to base their clinical decision making. The following electronic databases were searched: Medline, PubMed, EMBASE, the Cochrane Central Register of Controlled Trials, Evidence Based Medicine Reviews, VIP, and CNKI. Conference posters and abstracts were also electronically searched. Efficacy was evaluated in terms of intraoperative time, intraoperative blood loss, fusion rate and dysphagia. For intraoperative time and intraoperative blood loss, the meta-analysis revealed that the Zero-P surgical technique is not superior to the cage and plate technique. Both techniques achieved good bone fusion, with no statistically significant difference between them. For JOA score change and dysphagia, the pooled data showed that the Zero-P surgical technique is superior to the cage and plate technique. Zero-P interbody fusion can attain good clinical efficacy and a satisfactory fusion rate in the treatment of symptomatic cervical spondylosis, and can effectively reduce the risk of postoperative dysphagia and its complications. However, owing to the lack of long-term follow-up, its long-term efficacy remains unknown.

  20. Acoustics based assessment of respiratory diseases using GMM classification.

    PubMed

    Mayorga, P; Druzgalski, C; Morelos, R L; Gonzalez, O H; Vidales, J

    2010-01-01

    The focus of this paper is to present a method utilizing lung sounds for a quantitative assessment of patient health as it relates to respiratory disorders. To accomplish this, applicable traditional techniques from the speech processing domain were used to evaluate lung sounds obtained with a digital stethoscope. Traditional methods for the evaluation of asthma involve auscultation and spirometry, but the more sensitive electronic stethoscopes now available, combined with quantitative signal analysis methods, offer opportunities for improved diagnosis. In particular, we propose an acoustic evaluation methodology based on Gaussian Mixture Models (GMM), which should assist in broader analysis, identification, and diagnosis of asthma based on frequency-domain analysis of wheezes and crackles.
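
    A minimal sketch of the classification idea, assuming per-frame acoustic feature vectors (such as cepstral coefficients) have already been extracted; the feature dimensions, component counts and class structure below are invented for illustration.

    ```python
    # Hypothetical sketch of GMM-based classification of lung-sound features.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(1)
    normal_feats = rng.normal(0.0, 1.0, size=(200, 12))   # synthetic "normal" frames
    wheeze_feats = rng.normal(1.5, 1.2, size=(200, 12))   # synthetic "wheeze" frames

    # One GMM per class, fit on that class's training frames.
    gmm_normal = GaussianMixture(n_components=4, random_state=0).fit(normal_feats)
    gmm_wheeze = GaussianMixture(n_components=4, random_state=0).fit(wheeze_feats)

    def classify(frames):
        # Assign the class whose GMM gives the higher mean log-likelihood.
        ll_n = gmm_normal.score(frames)
        ll_w = gmm_wheeze.score(frames)
        return "wheeze" if ll_w > ll_n else "normal"

    print(classify(rng.normal(1.4, 1.2, size=(50, 12))))  # expected: "wheeze"
    ```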

  1. CleAir Monitoring System for Particulate Matter: A Case in the Napoleonic Museum in Rome

    PubMed Central

    Bonacquisti, Valerio; Di Michele, Marta; Frasca, Francesca; Chianese, Angelo; Siani, Anna Maria

    2017-01-01

    Monitoring the air particulate concentration both outdoors and indoors has become an increasingly relevant issue over the past few decades. An innovative, fully automatic monitoring system called CleAir is presented. The system goes beyond the traditional technique (gravimetric analysis) by allowing a double monitoring approach: traditional gravimetric analysis as well as optical spectroscopic analysis of scattering on the same filters under steady-state conditions. The experimental data are interpreted in terms of light percolation through highly scattering matter by means of a stretched-exponential evolution. CleAir has been applied to investigate the daily distribution of particulate matter within the Napoleonic Museum in Rome as a test case. PMID:28892016
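
    The abstract does not give the fitted variables, but as a generic illustration of the stretched-exponential model invoked above, one might fit I(t) = a*exp(-(t/tau)^beta) to measured decay data; everything below is synthetic.

    ```python
    # Hypothetical sketch: fitting a stretched-exponential decay of the kind
    # used to describe light transmitted through highly scattering filters.
    import numpy as np
    from scipy.optimize import curve_fit

    def stretched_exp(t, a, tau, beta):
        return a * np.exp(-(t / tau) ** beta)

    t = np.linspace(0.1, 10, 50)                       # synthetic abscissa
    y = stretched_exp(t, 1.0, 2.5, 0.7)
    y += np.random.default_rng(2).normal(0, 0.01, t.size)

    popt, _ = curve_fit(stretched_exp, t, y, p0=(1.0, 1.0, 1.0))
    print(popt)  # recovered (a, tau, beta); beta < 1 indicates stretching
    ```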

  2. Characterization of Natural Dyes and Traditional Korean Silk Fabric by Surface Analytical Techniques

    PubMed Central

    Lee, Jihye; Kang, Min Hwa; Lee, Kang-Bong; Lee, Yeonhee

    2013-01-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) and X-ray photoelectron spectroscopy (XPS) are well-established surface techniques that provide both elemental and organic information from the top few monolayers of a sample surface, while also allowing depth profiling or image mapping. Static TOF-SIMS, with its improved performance, has expanded the application of TOF-SIMS to a variety of organic, polymeric and biological materials. In this work, TOF-SIMS, XPS and Fourier transform infrared (FTIR) measurements were used to characterize commercial natural dyes and traditional silk fabric dyed with plant-extract dyes, avoiding the time-consuming and destructive extraction procedures required by the spectrophotometric and chromatographic methods used previously. Silk textiles dyed with plant extracts were analyzed for chemical and functional-group identification of their dye components and mordants. TOF-SIMS spectra of the dyed silk fabric showed elemental ions from metallic mordants, and specific fragment ions and molecular ions from the plant-extracted dyes. The results of TOF-SIMS, XPS and FTIR are very useful as a reference database for comparison with data on traditional Korean silk fabric and for understanding traditional dyeing materials. This study therefore shows that surface techniques are useful for micro-destructive analysis of plant-extracted dyes and Korean dyed silk fabric. PMID:28809257

  3. Psychobiographical Assessment

    ERIC Educational Resources Information Center

    Munter, Pamela Osborne

    1975-01-01

    Psychobiography, the analysis of public persons by competent clinicians, is discussed as a possible assessment technique. Its position in relation to traditional personality assessment is considered as well as major previous efforts. Psychobiography is a part of the curriculum at several leading universities and suggestions are made for future…

  4. Dynamic Relaxation: A Technique for Detailed Thermo-Elastic Structural Analysis of Transportation Structures

    NASA Astrophysics Data System (ADS)

    Shoukry, Samir N.; William, Gergis W.; Riad, Mourad Y.; McBride, Kevyn C.

    2006-08-01

    Dynamic relaxation is a technique developed to solve static problems through explicit integration in finite element analysis. The main advantage of the technique is the ability to solve a large problem in a relatively short time compared with traditional implicit techniques, especially when nonlinear material models are used. This paper describes the use of the technique in analyzing large transportation structures such as dowel-jointed concrete pavements and a 306-m-long reinforced concrete bridge superstructure under the effect of temperature variations. The main feature of the pavement model is the detailed modeling of dowel bars and their interfaces with the surrounding concrete using an extremely fine mesh of solid elements, while in the bridge structure it is the detailed modeling of the girder-deck interface as well as the bracing members between the girders. The 3DFE results were found to be in good agreement with experimentally measured data obtained from instrumented pavement sections and a highway bridge constructed in West Virginia. The technique thus provides a good tool for analyzing the response of large structures to static loads in a fraction of the time required by traditional, implicit finite element methods.
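
    The core of the method is an explicit, damped pseudo-time march toward static equilibrium. A minimal sketch on a linear spring chain; the fictitious mass, damping and step size are chosen here purely for illustration, not taken from the paper.

    ```python
    # Minimal dynamic relaxation sketch for a static problem on a small
    # fixed-fixed spring chain; real pavement/bridge models apply the same
    # idea with full finite element stiffness and nonlinear materials.
    import numpy as np

    n = 10
    k = 1.0e3                                   # spring stiffness
    K = (np.diag(np.full(n, 2 * k)) - np.diag(np.full(n - 1, k), 1)
         - np.diag(np.full(n - 1, k), -1))      # chain stiffness matrix
    f = np.zeros(n); f[n // 2] = 50.0           # static point load at midspan

    u = np.zeros(n); v = np.zeros(n)
    m, c, dt = 1.0, 15.0, 1.0e-3                # fictitious mass, damping, step
    for _ in range(20000):                      # explicit damped pseudo-time march
        a = (f - K @ u - c * v) / m
        v += a * dt
        u += v * dt
        if np.linalg.norm(v) < 1e-10:           # equilibrium: motion has died out
            break

    print(np.allclose(K @ u, f, atol=1e-6))     # converged static solution
    ```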

  5. Algorithms for Efficient Computation of Transfer Functions for Large Order Flexible Systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Giesy, Daniel P.

    1998-01-01

    An efficient and robust computational scheme is given for the calculation of the frequency response function of a large-order flexible system implemented with a linear, time-invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant, based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new scheme is a linear function of system size, a significant improvement over traditional full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits practical frequency domain analysis of systems of much larger order than traditional full-matrix techniques allow. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, the present method was up to two orders of magnitude faster than a traditional method. The present method generally showed good to excellent accuracy throughout the range of test frequencies, while traditional methods gave adequate accuracy for lower frequencies but generally deteriorated in performance at higher frequencies, with worst-case errors many orders of magnitude times the correct values.
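
    A sketch of why the modal form is cheap, assuming the standard uncoupled second-order modal decomposition (the modal data below are randomly generated stand-ins, not the paper's model): with uncoupled modes, each frequency point is a sum of n scalar terms rather than a dense linear solve.

    ```python
    # Hedged sketch: frequency response of a structure in normal-mode
    # coordinates. Each frequency point costs O(n) in the number of modes,
    # instead of the O(n^3) dense solve of full-matrix methods.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 703                                  # number of structural modes (as in the paper)
    wn = np.sort(rng.uniform(1.0, 500.0, n)) # modal natural frequencies (rad/s)
    zeta = np.full(n, 0.005)                 # modal damping ratios (assumed)
    b = rng.normal(size=n)                   # modal input gains
    c = rng.normal(size=n)                   # modal output gains

    def frf(w):
        # Sum of second-order modal contributions; one pass over the modes.
        den = wn**2 - w**2 + 2j * zeta * wn * w
        return np.sum(c * b / den)

    freqs = np.linspace(0.1, 600.0, 2000)
    H = np.array([frf(w) for w in freqs])    # open-loop frequency response
    print(np.abs(H).max())
    ```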

  6. Rasmussen's legacy: A paradigm change in engineering for safety.

    PubMed

    Leveson, Nancy G

    2017-03-01

    This paper describes three applications of Rasmussen's ideas to systems engineering practice. The first is the application of the abstraction hierarchy to engineering specifications, particularly requirements specification. The second is the use of Rasmussen's ideas in safety modeling and analysis to create a new, more powerful type of accident causation model that extends traditional models to better handle human-operated, software-intensive, sociotechnical systems. Because this new model has a formal, mathematical foundation built on systems theory (as was Rasmussen's original model), new modeling and analysis tools become possible. The third application is to engineering hazard analysis. Engineers have traditionally either omitted humans from consideration in system hazard analysis or treated them rather superficially, for example by assuming that they behave randomly. Applying Rasmussen's model of human error to a powerful new hazard analysis technique allows human behavior to be included in engineering hazard analysis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Cluster analysis and subgrouping to investigate inter-individual variability to non-invasive brain stimulation: a systematic review.

    PubMed

    Pellegrini, Michael; Zoghi, Maryam; Jaberzadeh, Shapour

    2018-01-12

    Cluster analysis and other subgrouping techniques have risen in popularity in recent years in non-invasive brain stimulation research in an attempt to investigate the issue of inter-individual variability - the issue of why some individuals respond, as traditionally expected, to non-invasive brain stimulation protocols and others do not. Cluster analysis and subgrouping techniques have been used to categorise individuals, based on their response patterns, as responders or non-responders. There is, however, a lack of consensus and consistency on the most appropriate technique to use. This systematic review aimed to provide a systematic summary of the cluster analysis and subgrouping techniques used to date and suggest recommendations moving forward. Twenty studies were included that utilised subgrouping techniques, while seven of these additionally utilised cluster analysis techniques. The results of this systematic review appear to indicate that statistical cluster analysis techniques are effective in identifying subgroups of individuals based on response patterns to non-invasive brain stimulation. This systematic review also reports a lack of consensus amongst researchers on the most effective subgrouping technique and the criteria used to determine whether an individual is categorised as a responder or a non-responder. This systematic review provides a step-by-step guide to carrying out statistical cluster analyses and subgrouping techniques to provide a framework for analysis when developing further insights into the contributing factors of inter-individual variability in response to non-invasive brain stimulation.
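
    As a toy illustration of the subgrouping step (the response measure, time points and group sizes are invented), k-means can partition individual post-stimulation response patterns into two data-driven subgroups:

    ```python
    # Hypothetical sketch: clustering individual response patterns to a
    # brain stimulation protocol into responder-like and non-responder-like
    # subgroups.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(4)
    # Rows: participants; columns: normalised response at several
    # post-stimulation time points (e.g., amplitude relative to baseline).
    responders = rng.normal(1.4, 0.15, size=(15, 4))      # facilitation pattern
    non_responders = rng.normal(1.0, 0.15, size=(15, 4))  # no change from baseline
    X = np.vstack([responders, non_responders])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    for label in (0, 1):
        print(label, X[km.labels_ == label].mean())  # cluster mean response
    ```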

  8. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  9. Application of a novel multispectral nanoparticle tracking technique

    NASA Astrophysics Data System (ADS)

    McElfresh, Cameron; Harrington, Tyler; Vecchio, Kenneth S.

    2018-06-01

    Fast, reliable, and accurate particle size analysis techniques must meet the demands of evolving industrial and academic research in areas of functionalized nanoparticle synthesis, advanced materials development, and other nanoscale-enabled technologies. In this study, a new multispectral particle tracking analysis (m-PTA) technique enabled by the ViewSizer™ 3000 (MANTA Instruments, USA) was evaluated using solutions of monomodal and multimodal gold and polystyrene latex nanoparticles, as well as a spark-eroded polydisperse 316L stainless steel nanopowder and large (non-Brownian) borosilicate particles. It was found that m-PTA performed comparably to DLS in the evaluation of monomodal particle size distributions. When measuring bimodal, trimodal and polydisperse solutions, the m-PTA technique overwhelmingly outperformed traditional dynamic light scattering (DLS) in both peak detection and relative particle concentration analysis. It was also observed that the m-PTA technique is less susceptible to large-particle overexpression errors. The ViewSizer™ 3000 was also found to accurately evaluate the sizes and concentrations of monomodal and bimodal sinking borosilicate particles.

  10. Analysis, comparison, and contrast of two primary maintenance contracting techniques used by the Florida Department of Transportation.

    DOT National Transportation Integrated Search

    2016-08-01

    The Florida Department of Transportation's (FDOT) Asset Maintenance Contracting Program (AMC) was analyzed during this study to determine if it reduced cost or affected work quality when compared with more traditional contracts. A survey was conduc...

  11. Bioremediation techniques applied to aqueous media contaminated with mercury.

    PubMed

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.

  12. Assessment of autonomic response by broad-band respiration

    NASA Technical Reports Server (NTRS)

    Berger, R. D.; Saul, J. P.; Cohen, R. J.

    1989-01-01

    We present a technique for introducing broad-band respiratory perturbations so that the response characteristics of the autonomic nervous system can be determined noninvasively over a wide range of physiologically relevant frequencies. A subject's respiratory bandwidth was broadened by breathing on cue to a sequence of audible tones spaced by Poisson intervals. The transfer function between the respiratory input and the resulting instantaneous heart rate was then computed using spectral analysis techniques. Results using this method are comparable to those found using traditional techniques, but are obtained with an economy of data collection.
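
    A sketch of the transfer-function estimation step, assuming the respiratory signal and an instantaneous heart rate series have already been derived and resampled to a common rate; the rate, segment length and toy "autonomic" dynamics below are assumptions for illustration.

    ```python
    # Sketch of the transfer-function estimate H(f) = Pxy(f) / Pxx(f)
    # between a broad-band respiratory input and instantaneous heart rate;
    # the signals are synthetic stand-ins for the recordings described above.
    import numpy as np
    from scipy.signal import csd, welch

    fs = 4.0                                     # resampled rate (Hz), assumed
    t = np.arange(0, 600, 1 / fs)
    rng = np.random.default_rng(5)
    resp = rng.normal(size=t.size)               # broad-band respiration input
    # Toy response: low-pass filtered respiration plus noise.
    hr = np.convolve(resp, np.exp(-np.arange(40) / 8.0), mode="same")
    hr += 0.5 * rng.normal(size=t.size)

    f, Pxy = csd(resp, hr, fs=fs, nperseg=256)   # cross-spectral density
    _, Pxx = welch(resp, fs=fs, nperseg=256)     # input power spectral density
    H = Pxy / Pxx                                # transfer function estimate
    print(np.abs(H[:5]))                         # gain at the lowest frequencies
    ```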

  13. Sleep Neurophysiological Dynamics Through the Lens of Multitaper Spectral Analysis

    PubMed Central

    Prerau, Michael J.; Brown, Ritchie E.; Bianchi, Matt T.; Ellenbogen, Jeffrey M.; Purdon, Patrick L.

    2016-01-01

    During sleep, cortical and subcortical structures within the brain engage in highly structured oscillatory dynamics that can be observed in the electroencephalogram (EEG). The ability to accurately describe changes in sleep state from these oscillations has thus been a major goal of sleep medicine. While numerous studies over the past 50 years have shown sleep to be a continuous, multifocal, dynamic process, long-standing clinical practice categorizes sleep EEG into discrete stages through visual inspection of 30-s epochs. By representing sleep as a coarsely discretized progression of stages, vital neurophysiological information on the dynamic interplay between sleep and arousal is lost. However, by using principled time-frequency spectral analysis methods, the rich dynamics of the sleep EEG are immediately visible—elegantly depicted and quantified at time scales ranging from a full night down to individual microevents. In this paper, we review the neurophysiology of sleep through this lens of dynamic spectral analysis. We begin by reviewing spectral estimation techniques traditionally used in sleep EEG analysis and introduce multitaper spectral analysis, a method that makes EEG spectral estimates clearer and more accurate than traditional approaches. Through the lens of the multitaper spectrogram, we review the oscillations and mechanisms underlying the traditional sleep stages. In doing so, we will demonstrate how multitaper spectral analysis makes the oscillatory structure of traditional sleep states instantaneously visible, closely paralleling the traditional hypnogram, but with a richness of information that suggests novel insights into the neural mechanisms of sleep, as well as novel clinical and research applications. PMID:27927806
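
    As a minimal illustration of the multitaper idea (the signal, sampling rate and taper parameters are invented), a power spectral density can be formed by averaging periodograms taken through orthogonal DPSS tapers:

    ```python
    # Minimal multitaper PSD sketch using DPSS (Slepian) tapers; averaging
    # the eigenspectra reduces variance relative to a single-taper estimate.
    import numpy as np
    from scipy.signal.windows import dpss

    fs = 100.0                                 # sampling rate (Hz), assumed
    t = np.arange(0, 30, 1 / fs)
    rng = np.random.default_rng(6)
    eeg = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)  # alpha + noise

    NW, K = 4, 7                               # time-bandwidth product, n tapers
    tapers = dpss(t.size, NW, Kmax=K)          # shape (K, N)
    # One periodogram per taper, then average across tapers.
    spectra = np.abs(np.fft.rfft(tapers * eeg, axis=1)) ** 2
    psd = spectra.mean(axis=0) / fs
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    print(freqs[np.argmax(psd)])               # should sit near 10 Hz
    ```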

  14. Elemental investigation of Syrian medicinal plants using PIXE analysis

    NASA Astrophysics Data System (ADS)

    Rihawy, M. S.; Bakraji, E. H.; Aref, S.; Shaban, R.

    2010-09-01

    Particle induced X-ray emission (PIXE) has been employed to perform elemental analysis of K, Ca, Mn, Fe, Cu, Zn, Br and Sr in Syrian medicinal plants used traditionally to enhance the body's immunity. Plant samples were prepared in a simple dried base. The results were verified by comparison with those obtained from both IAEA-359 and IAEA-V10 reference materials. Relative standard deviations, mostly within ±5-10%, suggest good precision. A correlation between the elemental content of each medicinal plant and its traditional remedial usage is proposed. Both K and Ca were found to be the major elements in the samples. Fe, Mn and Zn were detected at appreciable levels in most of these plants, clarifying their possible contribution to keeping the immune system in good condition. The contribution of the elements in these plants to the dietary recommended intakes (DRI) has been evaluated. Advantages and limitations of the PIXE analytical technique in this investigation are reviewed.

  15. Emerging technologies for the non-invasive characterization of physical-mechanical properties of tablets.

    PubMed

    Dave, Vivek S; Shahin, Hend I; Youngren-Ortiz, Susanne R; Chougule, Mahavir B; Haware, Rahul V

    2017-10-30

    The density, porosity, breaking force, viscoelastic properties, and the presence or absence of structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms like tablets. Irregularities in these attributes may influence drug product functionality. Thus, accurate and efficient characterization of these properties is critical for the successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods, which face several challenges such as a lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for non-invasive characterization of the physical-mechanical properties of tablets. These techniques include near infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of tablet development and manufacturing at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools to support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development of the instrumentation and software, and further studies of the applications, are necessary for their adoption in the routine analysis and monitoring of tablet physical-mechanical properties. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Atomic spectrometry methods for wine analysis: a critical evaluation and discussion of recent applications.

    PubMed

    Grindlay, Guillermo; Mora, Juan; Gras, Luis; de Loos-Vollebregt, Margaretha T C

    2011-04-08

    The analysis of wine is of great importance since wine components strongly determine its stability and its organoleptic and nutritional characteristics. Wine analysis is also important to prevent fraud and to assess toxicological issues. Among the different analytical techniques described in the literature, atomic spectrometry has traditionally been employed for elemental wine analysis due to its simplicity and good analytical figures of merit. The scope of this review is to summarize the main advantages and drawbacks of various atomic spectrometry techniques for elemental wine analysis. Special attention is paid to interferences (i.e. matrix effects) affecting the analysis as well as the strategies available to mitigate them. Finally, the latest studies on wine speciation are briefly discussed. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Comparing the Efficiency of Two Different Extraction Techniques in Removal of Maxillary Third Molars: A Randomized Controlled Trial.

    PubMed

    Edward, Joseph; Aziz, Mubarak A; Madhu Usha, Arjun; Narayanan, Jyothi K

    2017-12-01

    Extractions are routine procedures in dental surgery. Traditional extraction techniques use a combination of severing the periodontal attachment, luxation with an elevator, and removal with forceps. This study introduces a new technique for extraction of maxillary third molars, the Joedds technique, and compares it with the conventional technique. One hundred people were included in the study and divided into two groups by simple random sampling. In one group the conventional technique of maxillary third molar extraction was used, and in the second the Joedds technique was used. Statistical analysis was carried out with Student's t-test. Analysis of the 100 patients showed that the novel Joedds technique produced minimal trauma to surrounding tissues and fewer tuberosity and root fractures, and the time taken for extraction was under 2 minutes, compared with the other group. This novel technique proved better than the conventional third molar extraction technique, with minimal complications, provided that cases are properly selected and the technique is applied correctly.

  18. LOCAD-PTS: Operation of a New System for Microbial Monitoring Aboard the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Maule, J.; Wainwright, N.; Steele, A.; Gunter, D.; Flores, G.; Effinger, M.; Damon, M.; Wells, M.; Williams, S.; Morris, H.; et al.

    2008-01-01

    Microorganisms within the space stations Salyut, Mir and the International Space Station (ISS) have traditionally been monitored with culture-based techniques. These techniques involve growing environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies, and return of samples to Earth for ground-based analysis. This approach has provided a wealth of useful data and enhanced our understanding of the microbial ecology within space stations. However, the approach is also limited by the following: i) more than 95% of microorganisms in the environment cannot grow on conventional growth media; ii) significant time lags occur between onboard sampling and colony visualization (3-5 days) and ground-based analysis (as long as several months); iii) colonies are often difficult to visualize due to condensation within contact slide media plates; and iv) the techniques involve growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and beta-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. This technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device. This handheld device and sampling system is known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). A poster will be presented that describes a comparative study between LOCAD-PTS analysis and existing culture-based methods onboard the ISS, together with an exploratory survey of surface endotoxin throughout the ISS. It is concluded that while a general correlation between LOCAD-PTS and traditional culture-based methods should not necessarily be expected, a combinatorial approach can be adopted in which both sets of data are used together to generate a more complete picture of the microbial ecology on the ISS.

  19. [An object-oriented intelligent engineering design approach for lake pollution control].

    PubMed

    Zou, Rui; Zhou, Jing; Liu, Yong; Zhu, Xiang; Zhao, Lei; Yang, Ping-Jian; Guo, Huai-Cheng

    2013-03-01

    To address the shortcomings of traditional lake pollution control engineering techniques, a new engineering approach based on object-oriented intelligent design (OOID) is proposed in this study. It provides a new methodology and framework for effectively controlling lake pollution and improving water quality. The differences between traditional engineering techniques and the OOID approach are compared. The key points of OOID are described as the object perspective, a cause-and-effect foundation, extension from points to surfaces, and temporal and spatial optimization. Control of blue-green algae in a lake was taken as the example in this study. The effects of algae control and water quality improvement were analyzed in detail from the OOID perspective for two engineering techniques (a vertical hydrodynamic mixer and pumped algaecide recharge). The modeling results showed that the traditional engineering design paradigm cannot provide scientific and effective guidance for engineering design and decision-making regarding lake pollution. The intelligent design approach in this case is based on the object perspective and quantitative causal analysis. It identified that the mixers were much more efficient than the pumps in achieving low to moderate water quality improvement. However, when the water quality objective exceeded a certain value (such as a peak Chla concentration objective above 100 microg x L(-1) in this experimental water), the mixer could not achieve the goal; the pump technique could, but at higher cost. Combining the two techniques was more efficient than using either technique alone. Moreover, the quantitative scale control of the two engineering techniques has a significant impact on actual project benefits and costs.

  20. Physicochemical characterization of an Indian traditional medicine, Jasada Bhasma: detection of nanoparticles containing non-stoichiometric zinc oxide

    NASA Astrophysics Data System (ADS)

    Bhowmick, Tridib Kumar; Suresh, Akkihebbal K.; Kane, Shantaram G.; Joshi, Ajit C.; Bellare, Jayesh R.

    2009-04-01

    Herbs and minerals are integral parts of traditional systems of medicine in many countries. Herbo-mineral medicinal preparations called Bhasma are unique to the Ayurvedic and Siddha systems of Indian traditional medicine. These preparations have long been used and are claimed to be very effective and potent dosage forms. However, there is a dearth of scientific analytical studies on these products, and even the existing ones suffer from incomplete analysis. Jasada Bhasma, a preparation of zinc, belongs to this class. It has been used successfully by traditional practitioners for the treatment of diabetes and age-related eye diseases. This work presents a first comprehensive physicochemical characterization of Jasada Bhasma using modern state-of-the-art techniques: X-ray photoelectron spectroscopy (XPS), inductively coupled plasma (ICP) spectroscopy, elemental analysis with energy dispersive X-ray analysis (EDAX), dynamic light scattering (DLS), and transmission electron microscopy (TEM). Our analysis shows that the Jasada Bhasma particles are in an oxygen-deficient state and that a clearly identifiable fraction of the particles is in the nanometer size range. These properties, oxygen deficiency and nanosized particles, might impart the therapeutic properties of this particular type of medicine.

  1. Laser stimulating ST36 with optical fiber induce blood component changes in mice: a Raman spectroscopy study.

    PubMed

    Zhang, Heng; Chen, Zhenyi; Wu, Jiping; Chen, Na; Xu, Wenjie; Li, Taihao; Liu, Shupeng

    2018-02-15

    ST36 is a commonly used acupoint in traditional Chinese medicine (TCM) for the treatment of inflammation, pain and gastrointestinal disturbances. For decades, low-power laser acupuncture has been widely applied as an alternative to traditional metal needle acupuncture, achieving relatively good therapeutic effects for ST36-related symptoms while reducing discomfort and infection risk. However, its low penetration and the lack of manipulation skills limit its potential performance. An optical fiber laser acupuncture introduced in a previous study combines traditional needling acupuncture with laser stimulation, producing a stronger therapeutic effect and showing potential value in clinical application. To evaluate its acupunctural effect on blood, mice were taken as the experimental model and Raman spectroscopy was used to analyze changes in blood components after stimulation at ST36. The results show that both traditional needling acupuncture and optical fiber acupuncture led to spectral changes in mouse blood. This study explores the optical fiber acupuncture's effect on blood in mice using Raman spectroscopy to probe the mechanism of acupuncture therapy. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
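
    A compact sketch of the two multivariate steps named above, on synthetic stand-in spectra; the channel count, component counts and target variable are illustrative assumptions, not the paper's data.

    ```python
    # Hedged sketch: PLS regression for a calibration model and PCA for
    # class structure, applied to synthetic "spectra".
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(7)
    n_samples, n_channels = 18, 500            # 18 rock samples, spectral channels
    X = rng.normal(size=(n_samples, n_channels))     # stand-in LIBS spectra
    y = X[:, 40] * 3.0 + X[:, 200] + rng.normal(0, 0.1, n_samples)  # e.g. an oxide wt%

    pls = PLSRegression(n_components=5).fit(X, y)    # calibration model
    print(float(pls.score(X, y)))                    # fit quality (R^2)

    scores = PCA(n_components=2).fit_transform(X)    # 2-D view for grouping rock types
    print(scores.shape)                              # (18, 2)
    ```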

  3. Comparing Geologic Data Sets Collected by Planetary Analog Traverses and by Standard Geologic Field Mapping: Desert Rats Data Analysis

    NASA Technical Reports Server (NTRS)

    Feng, Wanda; Evans, Cynthia; Gruener, John; Eppler, Dean

    2014-01-01

    Geologic mapping involves interpreting relationships between identifiable units and landforms to understand the formative history of a region. Traditional field techniques are used to accomplish this on Earth. Mapping proves more challenging for other planets, which are studied primarily by orbital remote sensing and, less frequently, by robotic and human surface exploration. Systematic comparative assessments of geologic maps created by traditional mapping versus photogeology together with data from planned traverses are limited. The objective of this project is to produce a geologic map from data collected on the Desert Research and Technology Studies (RATS) 2010 analog mission using Apollo-style traverses in conjunction with remote sensing data. This map is compared with a geologic map produced using standard field techniques.

  4. Using Multilevel Modeling in Language Assessment Research: A Conceptual Introduction

    ERIC Educational Resources Information Center

    Barkaoui, Khaled

    2013-01-01

    This article critiques traditional single-level statistical approaches (e.g., multiple regression analysis) to examining relationships between language test scores and variables in the assessment setting. It highlights the conceptual, methodological, and statistical problems associated with these techniques in dealing with multilevel or nested…

  5. Creating a smART Camp

    ERIC Educational Resources Information Center

    Maurer, Matthew J.; Tokarsky, Rebecca; Zalewsky, Laura

    2011-01-01

    Many of the skills and talents required to be a successful scientist, such as analysis, experimentation, and creativity, can be developed and reinforced through art. Both science and art challenge students to make observations, experiment with different techniques, and use both traditional and nontraditional methods to express their ideas. The…

  6. Direct Allocation Costing: Informed Management Decisions in a Changing Environment.

    ERIC Educational Resources Information Center

    Mancini, Cesidio G.; Goeres, Ernest R.

    1995-01-01

    It is argued that colleges and universities can use direct allocation costing to provide quantitative information needed for decision making. This method of analysis requires institutions to modify traditional ideas of costing, looking to the private sector for examples of accurate costing techniques. (MSE)

  7. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture in which the functionality to complete a mission is disseminated across multiple UAVs (distributed), as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
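
    MCDM can take many forms; as one hypothetical, minimal illustration of a trade-study scoring step, a weighted-sum model ranks the two architectures. The criteria, weights and scores below are all invented for illustration, not taken from the study.

    ```python
    # Illustrative weighted-sum MCDM comparing a distributed and a
    # monolithic UAV architecture (all numbers are invented).
    criteria = ("mission_performance", "resilience", "risk", "cost")
    weights = {"mission_performance": 0.35, "resilience": 0.35,
               "risk": 0.15, "cost": 0.15}

    # Normalised scores in [0, 1]; for risk and cost, higher means *better*
    # (i.e., lower risk / lower cost).
    alternatives = {
        "distributed": {"mission_performance": 0.8, "resilience": 0.9,
                        "risk": 0.6, "cost": 0.4},
        "monolithic":  {"mission_performance": 0.7, "resilience": 0.4,
                        "risk": 0.7, "cost": 0.8},
    }

    for name, scores in alternatives.items():
        total = sum(weights[c] * scores[c] for c in criteria)
        print(f"{name}: {total:.3f}")   # higher weighted score = preferred design
    ```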

  8. New method for stock-tank oil compositional analysis.

    PubMed

    McAndrews, Kristine; Nighswander, John; Kotzakoulakis, Konstantin; Ross, Paul; Schroeder, Helmut

    2009-01-01

    A new method for accurately determining stock-tank oil composition up to normal pentatriacontane (n-C35) using gas chromatography is developed and validated. The new method addresses the potential errors associated with the traditional equipment and technique employed for extended hydrocarbon gas chromatography outside a controlled laboratory environment, such as on an offshore oil platform. In particular, the experimental measurement of stock-tank oil molecular weight with the freezing point depression technique and the use of an internal standard to find the unrecovered sample fraction are replaced with correlations for estimating these properties. The use of correlations reduces the number of experimental steps needed to complete the required sample preparation and analysis, resulting in reduced uncertainty in the analysis.

  9. From air to rubber: New techniques for measuring and replicating mouthpieces, bocals, and bores

    NASA Astrophysics Data System (ADS)

    Fuks, Leonardo

    2002-11-01

    The history of musical instruments comprises a long genealogy of models and prototypes that result from copying existing specimens, changing constructive parameters, and adding new devices. In making wind instruments, several techniques have traditionally been employed for extracting the external and internal dimensions of toneholes, air columns, bells, and mouthpieces. In the twentieth century, methods such as pulse reflectometry, x-ray, magnetic resonance, and ultrasound imaging became available for bore measurement. Advantages and drawbacks of the existing methods are discussed, and a new method is presented that uses the injection and coating of silicone rubber for accurate molding of the instrument. This technique is harmless to all traditional materials and is therefore indicated also for measurements of historical instruments. The paper presents dimensional data obtained from clarinet and saxophone mouthpieces. A set of replicas of top-quality clarinet and saxophone mouthpieces, trombone bocals, and flute headjoints is shown, with comparative acoustical and performance analyses. The application of such techniques for historical and modern instrument analysis, restoration, and manufacturing is proposed.

  10. Nanomanipulation using near field photonics.

    PubMed

    Erickson, David; Serey, Xavier; Chen, Yih-Fan; Mandal, Sudeep

    2011-03-21

    In this article we review the use of near-field photonics for trapping, transport and handling of nanomaterials. While the advantages of traditional optical tweezing are well known at the microscale, direct application of these techniques to the handling of nanoscale materials has proven difficult due to unfavourable scaling of the fundamental physics. Recently a number of research groups have demonstrated how the evanescent fields surrounding photonic structures like photonic waveguides, optical resonators, and plasmonic nanoparticles can be used to greatly enhance optical forces. Here, we introduce some of the most common implementations of these techniques, focusing on those with relevance to microfluidic or optofluidic applications. Since the field is still relatively nascent, we spend much of the article laying out the fundamental and practical advantages that near-field optical manipulation offers over both traditional optical tweezing and other particle handling techniques. In addition we highlight three application areas where these techniques could be of interest to the lab-on-a-chip community, namely: single molecule analysis, nanoassembly, and optical chromatography. This journal is © The Royal Society of Chemistry 2011

  11. Use of the fluorescence of rhodamine B for the pH sensing of a glycine solution

    NASA Astrophysics Data System (ADS)

    Zhang, Weiwei; Shi, Kaixing; Shi, Jiulin; He, Xingdao

    2016-10-01

    The fluorescence of rhodamine B is strongly affected by the pH of its environment. By directly introducing the dye into various glycine solutions, the fluorescence was used to monitor pH in the range 5.9-6.7. Two newly developed techniques for broadband analysis, the barycenter technique and the self-referenced intensity ratio technique, were employed to retrieve the pH sensing functions. Compared with traditional techniques, e.g. peak-shift monitoring, both new techniques offered finer precision. The obtained sensing functions may find applications in the testing of biochemical samples, body tissue fluid, water quality, etc.
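
    The barycenter readout is simply the intensity-weighted mean wavelength of the emission band. A toy sketch follows; the emission window and band shape below are invented, not the paper's measured spectra.

    ```python
    # Sketch of the "barycenter" (spectral centroid) readout: the
    # intensity-weighted mean wavelength shifts as the spectrum responds to pH.
    import numpy as np

    wavelengths = np.linspace(540, 640, 500)            # nm, emission window (assumed)
    peak = 578.0                                        # band position for one pH value
    spectrum = np.exp(-0.5 * ((wavelengths - peak) / 15.0) ** 2)  # synthetic band

    barycenter = np.sum(wavelengths * spectrum) / np.sum(spectrum)
    print(barycenter)   # ~578 nm; calibrating barycenter vs pH gives the sensing function
    ```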

  12. Big data in medical science--a biostatistical view.

    PubMed

    Binder, Harald; Blettner, Maria

    2015-02-27

    Inexpensive techniques for measurement and data storage now enable medical researchers to acquire far more data than can conveniently be analyzed by traditional methods. The expression "big data" refers to quantities on the order of magnitude of a terabyte (10^12 bytes); special techniques must be used to evaluate such huge quantities of data in a scientifically meaningful way. Whether data sets of this size are useful and important is an open question that currently confronts medical science. In this article, we give illustrative examples of the use of analytical techniques for big data and discuss them in the light of a selective literature review. We point out some critical aspects that should be considered to avoid errors when large amounts of data are analyzed. Machine learning techniques enable the recognition of potentially relevant patterns. When such techniques are used, certain additional steps should be taken that are unnecessary in more traditional analyses; for example, patient characteristics should be differentially weighted. If this is not done as a preliminary step before similarity detection, which is a component of many data analysis operations, characteristics such as age or sex will be weighted no higher than any one out of 10 000 gene expression values. Experience from the analysis of conventional observational data sets can be called upon to draw conclusions about potential causal effects from big data sets. Big data techniques can be used, for example, to evaluate observational data derived from the routine care of entire populations, with clustering methods used to analyze therapeutically relevant patient subgroups. Such analyses can provide complementary information to clinical trials of the classic type. As big data analyses become more popular, various statistical techniques for causality analysis in observational data are becoming more widely available. This is likely to be of benefit to medical science, but specific adaptations will have to be made according to the requirements of the applications.
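
    A toy sketch of the weighting point above (the column counts, weights and clustering choice are illustrative assumptions): standardising alone leaves two clinical covariates with the same distance contribution as any two of 10,000 gene columns, so they are explicitly upweighted before clustering.

    ```python
    # Sketch: standardise features, then upweight the few clinical covariates
    # so they are not drowned out by thousands of gene-expression values
    # during similarity-based analysis (e.g., clustering).
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(8)
    n = 100
    clinical = np.column_stack([rng.integers(40, 80, n),      # age
                                rng.integers(0, 2, n)])       # sex
    genes = rng.normal(size=(n, 10_000))                      # expression values

    X = StandardScaler().fit_transform(np.column_stack([clinical, genes]))
    w = np.ones(X.shape[1])
    w[:2] = np.sqrt(5_000.0)     # squared distances scale with w^2, so the 2
    X_weighted = X * w           # clinical columns now weigh like ~10,000 genes

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_weighted)
    print(np.bincount(labels))
    ```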

  13. Automatic phase aberration compensation for digital holographic microscopy based on deep learning background detection.

    PubMed

    Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George

    2017-06-26

    We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback in real-time implementation and for dynamic processes such as cell migration phenomena. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards higher-order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations, and automatic segmentation would ideally make real-time measurement possible. However, existing automatic unsupervised segmentation techniques perform poorly when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique using a convolutional neural network (CNN) with Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection, allowing ZPF to compute the self-conjugated phase to compensate for most aberrations.
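
    A hedged sketch of the Zernike-fitting step only (not the CNN): fit a few low-order Zernike-like terms to phase samples taken from a background mask, then subtract the fitted surface. The random mask below is a stand-in for a learned segmentation, and the aberration terms are illustrative.

    ```python
    # Sketch of Zernike-polynomial fitting for phase aberration compensation.
    import numpy as np

    def zernike_basis(x, y):
        # First few Zernike-like terms in Cartesian form on the unit disk:
        # piston, tilt x, tilt y, defocus, two astigmatisms.
        return np.column_stack([np.ones_like(x), x, y,
                                2 * (x**2 + y**2) - 1, x**2 - y**2, 2 * x * y])

    rng = np.random.default_rng(9)
    yy, xx = np.mgrid[-1:1:256j, -1:1:256j]
    true_aberr = 0.8 * (2 * (xx**2 + yy**2) - 1) + 0.3 * xx       # defocus + tilt
    phase = true_aberr + 0.05 * rng.normal(size=xx.shape)          # measured phase

    bg = rng.random(xx.shape) < 0.1          # stand-in for the CNN background mask
    A = zernike_basis(xx[bg], yy[bg])
    coeffs, *_ = np.linalg.lstsq(A, phase[bg], rcond=None)

    fitted = (zernike_basis(xx.ravel(), yy.ravel()) @ coeffs).reshape(phase.shape)
    corrected = phase - fitted
    print(float(np.std(corrected)))          # residual ~ noise level after compensation
    ```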

  14. Rapid Monitoring of Bacteria and Fungi aboard the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Gunter, D.; Flores, G.; Effinger, M.; Maule, J.; Wainwright, N.; Steele, A.; Damon, M.; Wells, M.; Williams, S.; Morris, H.; et al.

    2009-01-01

    Microorganisms within spacecraft have traditionally been monitored with culture-based techniques. These techniques involve growth of environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies or return of samples to Earth for ground-based analysis. Data obtained over the past four decades have enhanced our understanding of the microbial ecology within space stations. However, the approach has been limited by the following factors: i) many microorganisms (estimated > 95%) in the environment cannot grow on conventional growth media; ii) significant time lags (3-5 days for incubation and up to several months to return samples to ground); iii) condensation in contact slides hinders colony counting by crew; and iv) growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and beta-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. The technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device, known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). LOCAD-PTS was launched to the ISS in December 2006, and here we present data obtained from March 2007 until the present day. These data include a comparative study between LOCAD-PTS analysis and existing culture-based methods, and an exploratory survey of surface endotoxin and beta-1,3-glucan throughout the ISS. While a general correlation between LOCAD-PTS and traditional culture-based methods should not be expected, we will suggest new requirements for microbial monitoring based upon culture-independent parameters measured by LOCAD-PTS.

  15. Evaluating the use of laser radiation in cleaning of copper embroidery threads on archaeological Egyptian textiles

    NASA Astrophysics Data System (ADS)

    Abdel-Kareem, Omar; Harith, M. A.

    2008-07-01

    Cleaning of copper embroidery threads on archaeological textiles remains a complicated conservation process, as most textile conservators believe that the disadvantages of traditional cleaning techniques outweigh their advantages. In this study, a laser cleaning method and two modified wet-cleaning recipes were evaluated for cleaning corroded archaeological Egyptian copper embroidery threads on an archaeological Egyptian textile fabric. Some corroded copper thread samples were cleaned using the modified wet-cleaning recipes; other samples were cleaned with Q-switched Nd:YAG laser radiation at a wavelength of 532 nm. All metal thread samples were investigated before and after cleaning using a light microscope and a scanning electron microscope with an energy dispersive X-ray analysis unit. The laser-induced breakdown spectroscopy (LIBS) technique was also used for elemental analysis of the laser-cleaned samples to follow up the laser cleaning procedure. The results show that laser cleaning is the most effective of the tested methods for cleaning corroded copper threads: it can safely remove the corrosion products without damage to either the metal strips or the fibrous core. The tested laser cleaning technique solved the problems caused by the traditional cleaning techniques commonly used on metal threads on museum textiles.

  16. Advancing statistical analysis of ambulatory assessment data in the study of addictive behavior: A primer on three person-oriented techniques.

    PubMed

    Foster, Katherine T; Beltz, Adriene M

    2018-08-01

    Ambulatory assessment (AA) methodologies have the potential to increase understanding and treatment of addictive behavior in seemingly unprecedented ways, due in part, to their emphasis on intensive repeated assessments of an individual's addictive behavior in context. But, many analytic techniques traditionally applied to AA data - techniques that average across people and time - do not fully leverage this potential. In an effort to take advantage of the individualized, temporal nature of AA data on addictive behavior, the current paper considers three underutilized person-oriented analytic techniques: multilevel modeling, p-technique, and group iterative multiple model estimation. After reviewing prevailing analytic techniques, each person-oriented technique is presented, AA data specifications are mentioned, an example analysis using generated data is provided, and advantages and limitations are discussed; the paper closes with a brief comparison across techniques. Increasing use of person-oriented techniques will substantially enhance inferences that can be drawn from AA data on addictive behavior and has implications for the development of individualized interventions. Copyright © 2017. Published by Elsevier Ltd.
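
    In the spirit of the primer's generated-data examples, here is a minimal multilevel-model sketch for the first technique; the variable names and effect sizes are invented, not from the paper: momentary craving ratings nested within persons, with a random intercept per person.

    ```python
    # Generated-data sketch of a multilevel (mixed-effects) model for
    # intensive repeated AA measurements nested within individuals.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(10)
    n_people, n_obs = 30, 20
    person = np.repeat(np.arange(n_people), n_obs)
    stress = rng.normal(size=n_people * n_obs)               # momentary predictor
    intercepts = rng.normal(0, 1, n_people)[person]          # person-level variation
    craving = 2.0 + 0.5 * stress + intercepts + rng.normal(0, 1, person.size)

    df = pd.DataFrame({"person": person, "stress": stress, "craving": craving})
    fit = smf.mixedlm("craving ~ stress", df, groups=df["person"]).fit()
    print(fit.params["stress"])   # within-person stress-craving slope, ~0.5
    ```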

  17. Neural networks and traditional time series methods: a synergistic combination in state economic forecasts.

    PubMed

    Hansen, J V; Nelson, R D

    1997-01-01

    Ever since the initial planning for the 1997 Utah legislative session, neural-network forecasting techniques have provided valuable insights for analysts forecasting tax revenues. These revenue estimates are critically important since agency budgets, support for education, and improvements to infrastructure all depend on their accuracy. Underforecasting generates windfalls that concern taxpayers, whereas overforecasting produces budget shortfalls that cause inadequately funded commitments. The pattern-finding ability of neural networks gives insightful and alternative views of the seasonal and cyclical components commonly found in economic time series data. Two applications of neural networks to revenue forecasting clearly demonstrate how these models complement traditional time series techniques. In the first, preoccupation with a potential downturn in the economy distracts analysis based on traditional time series methods, causing it to overlook an emerging new phenomenon in the data. In this case, neural networks identify the new pattern, which then allows modification of the time series models and finally gives more accurate forecasts. In the second application, data structure found by traditional statistical tools allows analysts to provide neural networks with important information that the networks then use to create more accurate models. In summary, for the Utah revenue outlook, the insight that results from a portfolio of forecasts including neural networks exceeds the understanding generated by strictly statistical forecasting techniques. In this case, the synergy clearly makes the whole portfolio of forecasts more accurate than the sum of its individual parts.

  18. Analysis and Defense of Vulnerabilities in Binary Code

    DTIC Science & Technology

    2008-09-29

    [Only fragments of this report's front matter survive extraction.] We demonstrate our techniques by automatically generating input filters from vulnerable binary programs. Table-of-contents fragments reference the Vine intermediate language, normalized memory, the traditional weakest-precondition semantics, and the guarded command language.

  19. Developing Distance Learning Courses in a "Traditional" University.

    ERIC Educational Resources Information Center

    Lawton, Sally; Barnes, Richard

    1998-01-01

    Comparison of distance learning that was developed with a business-planning approach (market research, cost-benefit analysis, feasibility study, strategic marketing) with one that did not use these techniques showed that business planning ensures that distance-learning courses are not viewed as a "cheap" option. The method identifies…

  20. Collected Notes on the Workshop for Pattern Discovery in Large Databases

    NASA Technical Reports Server (NTRS)

    Buntine, Wray (Editor); Delalto, Martha (Editor)

    1991-01-01

    These collected notes are a record of material presented at the workshop. They address core data analysis tasks that have traditionally required statistical or pattern recognition techniques, including classification, discrimination, clustering, supervised and unsupervised learning, and discovery and diagnosis, i.e., general pattern discovery.

  1. A Content Analysis of Multinationals' Web Communication Strategies: Cross-Cultural Research Framework and Pre-Testing.

    ERIC Educational Resources Information Center

    Okazaki, Shintaro; Alonso Rivas, Javier

    2002-01-01

    Discussion of research methodology for evaluating the degree of standardization in multinational corporations' online communication strategies across differing cultures focuses on a research framework for cross-cultural comparison of corporate Web pages, applying traditional advertising content study techniques. Describes pre-tests that examined…

  2. Artificial Neural Networks: A New Approach to Predicting Application Behavior.

    ERIC Educational Resources Information Center

    Gonzalez, Julie M. Byers; DesJardins, Stephen L.

    2002-01-01

    Applied the technique of artificial neural networks to predict which students were likely to apply to one research university. Compared the results to the traditional analysis tool, logistic regression modeling. Found that the addition of artificial intelligence models was a useful new tool for predicting student application behavior. (EV)

  3. Microcomputer Applications for Teaching Microeconomic Concepts: Some Old and New Approaches.

    ERIC Educational Resources Information Center

    Smith, L. Murphy; Smith, L. C., Jr.

    1989-01-01

    Presents microcomputer programs and programming techniques and demonstrates how these programs can be used by teachers to explain economics concepts and to help students make judgments. Each microcomputer application is supplemented by traditional graphic and mathematical analysis. Discusses applications dealing with supply, demand, elasticity,…

  4. Tumescent and syringe liposculpture: a logical partnership.

    PubMed

    Hunstad, J P

    1995-01-01

    Liposuction has traditionally been performed under general anesthesia. Standard instrumentation for the procedure has included blunt-tipped suction cannulae connected to an electric vacuum pump by noncollapsible tubing. A subcutaneous injection of lidocaine with epinephrine is routinely employed to minimize blood loss during the procedure. This infiltration has been described as the "wet technique," but it is not a method to supplant general anesthesia. The tumescent technique, a method of infusing very large volumes of dilute lidocaine with epinephrine solutions, has been advocated as a satisfactory means of providing conscious anesthesia for liposuction procedures, avoiding the need for general anesthesia. The syringe technique employs blunt-tipped suction cannulae connected to a syringe. Drawing back the syringe plunger generates the negative pressure needed to remove fat during liposuction and replaces the electric vacuum pump and connecting tubing traditionally used for this procedure. This study evaluates the combined tumescent and syringe techniques for liposuction. One hundred consecutive patients were treated with the tumescent technique as the sole means of anesthesia and the syringe technique as the sole means of performing liposuction. A modified tumescent formula is presented. Liposuction aspirates obtained using this modified tumescent technique are compared and contrasted with aspirates obtained using the "dry technique" and the "wet technique." A historical review of the syringe technique and its perceived attributes is also presented. Technical descriptions of the tumescent infusion method, the tumescent fluid formulation, and suggested patient sedation and monitoring are presented. Photographic documentation of patients who underwent combined tumescent and syringe liposculpture treating various body areas is shown. A critical analysis of the limitations of this combined technique is also given, noting added time requirements; difficulties with under-correction of deformities and the need for reoperation; methods for determining the "end point" of the procedure; and large-volume liposuction problems. The conclusion reached by this study is that combining the tumescent technique and the syringe technique is a logical partnership. Each method complements the other, allowing liposuction to be performed with considerable advantage over traditional methods. These advantages include eliminating the need for general anesthesia, lessened blood loss and postoperative bruising, greater accuracy and precision, and overall high patient satisfaction.

  5. Comprehensive Analysis of LC/MS Data Using Pseudocolor Plots

    NASA Astrophysics Data System (ADS)

    Crutchfield, Christopher A.; Olson, Matthew T.; Gourgari, Evgenia; Nesterova, Maria; Stratakis, Constantine A.; Yergey, Alfred L.

    2013-02-01

    We have developed new applications of the pseudocolor plot for the analysis of LC/MS data. These applications include spectral averaging, analysis of variance, differential comparison of spectra, and qualitative filtering by compound class. They have been motivated by the need to better understand LC/MS data generated from the analysis of human biofluids. The examples presented use data generated to profile steroid hormones in urine extracts from a Cushing's disease patient relative to a healthy control, but are general to any discovery-based scanning mass spectrometry technique. In addition to new visualization techniques, we introduce a new metric of variance: the relative maximum difference from the mean. We also introduce the concept of substructure-dependent analysis of steroid hormones using precursor ion scans. These new analytical techniques provide an alternative approach to the traditional untargeted metabolomics workflow. We present an approach to discovery using MS that essentially eliminates alignment or preprocessing of spectra. Moreover, we demonstrate the concept that untargeted metabolomics can be achieved using low mass resolution instrumentation.
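
    The abstract names a new variance metric, the relative maximum difference from the mean, without defining it. One plausible reading, offered here strictly as an assumption, is the largest absolute deviation of replicate intensities from their mean, scaled by that mean:

```python
import numpy as np

def relative_max_difference(intensities):
    """Assumed form of the metric: max|x_i - mean| / mean.
    An interpretation of the abstract, not the authors' published definition."""
    x = np.asarray(intensities, dtype=float)
    m = x.mean()
    return np.abs(x - m).max() / m

# Toy replicate intensities for one m/z channel across several LC/MS runs.
print(relative_max_difference([980.0, 1010.0, 1005.0, 995.0]))  # -> ~0.0175
```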

  6. A Comparison of Collaborative and Traditional Instruction in Higher Education

    ERIC Educational Resources Information Center

    Gubera, Chip; Aruguete, Mara S.

    2013-01-01

    Although collaborative instructional techniques have become popular in college courses, it is unclear whether collaborative techniques can replace more traditional instructional methods. We examined the efficacy of collaborative courses (in-class, collaborative activities with no lectures) compared to traditional lecture courses (in-class,…

  7. A Chromosome-Scale Assembly of the Bactrocera cucurbitae Genome Provides Insight to the Genetic Basis of white pupae

    PubMed Central

    Sim, Sheina B.; Geib, Scott M.

    2017-01-01

    Genetic sexing strains (GSS) used in sterile insect technique (SIT) programs are textbook examples of how classical Mendelian genetics can be directly implemented in the management of agricultural insect pests. Although the foundation of traditionally developed GSS is single-locus, autosomal recessive traits, their genetic basis is largely unknown. With the advent of modern genomic techniques, the genetic basis of sexing traits in GSS can now be further investigated. This study is the first of its kind to integrate traditional genetic techniques with emerging genomics to characterize a GSS, using the tephritid fruit fly pest Bactrocera cucurbitae as a model. These techniques include whole-genome sequencing, the development of a mapping population and linkage map, and quantitative trait analysis. The experiment designed to map the genetic sexing trait in B. cucurbitae, white pupae (wp), also enabled the generation of a chromosome-scale genome assembly by integrating the linkage map with the assembly. Quantitative trait loci analysis revealed SNP loci near position 42 Mb on chromosome 3 to be tightly linked to wp. Gene annotation and synteny analysis show a near-perfect relationship between chromosomes in B. cucurbitae and Muller elements A–E in Drosophila melanogaster. This chromosome-scale genome assembly is complete, has high contiguity, was generated using minimal input DNA, and will be used to further characterize the genetic mechanisms underlying wp. Knowledge of the genetic basis of genetic sexing traits can be used to improve SIT in this species and expand it to other economically important Diptera. PMID:28450369

  8. Lessons from Trees.

    ERIC Educational Resources Information Center

    Elrick, Mike

    2003-01-01

    Traditional techniques and gear are better suited for comfortable extended wilderness trips with high school students than are emerging technologies and techniques based on low-impact camping and petroleum-based clothing, which send students the wrong messages about ecological relatedness and sustainability. Traditional travel techniques and…

  9. Application of phyto-indication and radiocesium indicative methods for microrelief mapping

    NASA Astrophysics Data System (ADS)

    Panidi, E.; Trofimetz, L.; Sokolova, J.

    2016-04-01

    Remote sensing technologies are widely used for the production of Digital Elevation Models (DEMs), and geomorphometry techniques are valuable tools for DEM analysis. One of the broad applications of these technologies and techniques is relief mapping. In the simplest case, we can identify relief structures using DEM analysis and produce a map or map series to show the relief condition. However, traditional techniques might fail when used for mapping microrelief structures (structures below ten meters in size), where high microrelief dynamics lead to technological and conceptual difficulties. Moreover, erosion of microrelief structures cannot be detected at the initial evolution stage using DEM modelling and analysis alone. In our study, we investigate the possibilities and specific techniques for delineating erosion microrelief structures, and mapping techniques for microrelief derivatives (e.g., quantitative parameters of microrelief). Our toolset includes the analysis of the spatial redistribution of soil pollutants and phyto-indication analysis, which complement common DEM modelling and geomorphometric analysis. We use field surveys produced at the test area, which is arable territory with high erosion risks. Our main conclusion at the current stage is that the indicative methods (i.e., the radiocesium and phyto-indication methods) are effective for delineating erosion microrelief structures, although these methods need to be formalized for convenient use.

  10. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    PubMed Central

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers that can be removed without causing damage were applied to the surface of the structures, an important consideration for historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  11. Image Analysis Technique for Material Behavior Evaluation in Civil Structures.

    PubMed

    Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca

    2017-07-08

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers that can be removed without causing damage were applied to the surface of the structures, an important consideration for historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing the results with those derived from traditional measuring techniques.

  12. A review of second law techniques applicable to basic thermal science research

    NASA Astrophysics Data System (ADS)

    Drost, M. Kevin; Zamorski, Joseph R.

    1988-11-01

    This paper reports the results of a review of second law analysis techniques that can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and locations of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as particularly useful. First, system and component investigations can provide information on the source and nature of irreversibilities on a macroscopic scale. This information helps to identify new research topics and supports the evaluation of current research efforts. Second, the differential approach can provide information on the causes and the spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.

  13. A systematic mapping study of process mining

    NASA Astrophysics Data System (ADS)

    Maita, Ana Rocío Cárdenas; Martins, Lucas Corrêa; López Paz, Carlos Ramón; Rafferty, Laura; Hung, Patrick C. K.; Peres, Sarajane Marques; Fantinato, Marcelo

    2018-05-01

    This study systematically assesses the process mining scenario from 2005 to 2014. The analysis of 705 papers evidenced 'discovery' (71%) as the main type of process mining addressed and 'categorical prediction' (25%) as the main mining task solved. The most widely applied traditional techniques are 'graph structure-based' ones (38%). Concerning computational intelligence and machine learning techniques specifically, we concluded that little attention has been given to them; the most applied are 'evolutionary computation' (9%) and 'decision trees' (6%). Process mining challenges, such as balancing robustness, simplicity, accuracy, and generalization, could benefit from wider use of such techniques.

  14. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews of modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
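
    HMMER's core computation scores sequences against profile hidden Markov models, and the acceleration work the paper reviews targets exactly this kind of dynamic-programming kernel. As a minimal illustration of the underlying recurrence, not HMMER's actual profile HMM implementation, the forward algorithm on a toy two-state HMM looks like this:

```python
import numpy as np

# Toy two-state HMM; all probabilities are invented for illustration.
start = np.array([0.6, 0.4])                # initial state distribution
trans = np.array([[0.7, 0.3], [0.4, 0.6]])  # state transition matrix
emit = np.array([[0.5, 0.5], [0.1, 0.9]])   # emission probabilities, symbols {0, 1}

def forward(obs):
    """Forward algorithm: total probability of the observations, summed over
    all hidden state paths, in O(T * n_states^2) time."""
    alpha = start * emit[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
    return alpha.sum()

print(forward([0, 1, 1, 0]))   # likelihood of a short observation sequence
```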

  15. Alternatives to current flow cytometry data analysis for clinical and research studies.

    PubMed

    Gondhalekar, Carmen; Rajwa, Bartek; Patsekin, Valery; Ragheb, Kathy; Sturgis, Jennifer; Robinson, J Paul

    2018-02-01

    Flow cytometry has well-established methods for data analysis based on traditional data collection techniques. These techniques typically involved manual insertion of tube samples into an instrument that, historically, could only measure 1-3 colors. The field has since evolved to incorporate new technologies for faster and highly automated sample preparation and data collection. For example, the use of microwell plates on benchtop instruments is now standard on virtually every new instrument, so users can accumulate multiple data sets quickly. Further, because the user must carefully define the layout of the plate, this information is already available when the analytical process is considered, expanding the opportunities for automated analysis. Advances in multi-parametric data collection, as demonstrated by the development of hyperspectral flow cytometry, 20-40 color polychromatic flow cytometry, and mass cytometry (CyTOF), are game-changing. As data and assay complexity increase, so too does the complexity of data analysis, which already challenges traditional flow cytometry software. New methods for reviewing large and complex data sets can provide rapid insight into processes difficult to define without more advanced analytical tools. In settings such as clinical labs where rapid and accurate data analysis is a priority, efficient and intuitive software is needed. This paper outlines opportunities for the analysis of complex data sets using examples of multiplexed bead-based assays, drug screens, and cell cycle analysis, any of which could become integrated into the clinical environment. Copyright © 2017. Published by Elsevier Inc.

  16. Current application of chemometrics in traditional Chinese herbal medicine research.

    PubMed

    Huang, Yipeng; Wu, Zhenwei; Su, Rihui; Ruan, Guihua; Du, Fuyou; Li, Gongke

    2016-07-15

    Traditional Chinese herbal medicines (TCHMs) are a promising approach for the treatment of various diseases and have attracted increasing attention all over the world. Chemometric techniques are useful tools in the quality control of TCHMs, harnessing mathematics, statistics, and other methods to extract the maximum information from the data obtained by various analytical approaches. This feature article focuses on recent studies that evaluate the pharmacological efficacy and quality of TCHMs by determining, identifying, and discriminating the bioactive or marker components in different samples with the help of chemometric techniques. The application of chemometric techniques to the classification of TCHMs based on their efficacy and usage is introduced, and recent advances of chemometrics applied in the chemical analysis of TCHMs are reviewed in detail. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Fundamentals of Digital Engineering: Designing for Reliability

    NASA Technical Reports Server (NTRS)

    Katz, R.; Day, John H. (Technical Monitor)

    2001-01-01

    The concept of designing for reliability is introduced, along with a brief overview of reliability, redundancy, and traditional methods of fault tolerance as applied to current logic devices. The fundamentals of advanced circuit design and analysis techniques are the primary focus. The introduction covers the definitions of key device parameters and how analysis is used to prove circuit correctness. Basic design techniques such as synchronous versus asynchronous design, metastable-state resolution time and arbiter design, and finite state machine structure and implementation are reviewed. Advanced topics are explored, including skew-tolerant circuit design, the use of triple-modular redundancy, circuit hazards, device transient characteristics and preventative circuit design, lock-up states in finite state machines generated by logic synthesizers, radiation mitigation techniques, worst-case analysis, and the use of timing analyzers and simulators. Case studies and lessons learned from spaceflight designs are given as examples.

  18. Profiling of Piper betle Linn. cultivars by direct analysis in real time mass spectrometric technique.

    PubMed

    Bajpai, Vikas; Sharma, Deepty; Kumar, Brijesh; Madhusudanan, K P

    2010-12-01

    Piper betle Linn. is a traditional plant associated with the Asian and southeast Asian cultures. Its use is also recorded in folk medicines in these regions. Several of its medicinal properties have recently been proven. Phytochemical analysis showed the presence of mainly terpenes and phenols in betel leaves. These constituents vary in the different cultivars of Piper betle. In this paper we have attempted to profile eight locally available betel cultivars using the recently developed mass spectral ionization technique of direct analysis in real time (DART). Principal component analysis has also been employed to analyze the DART MS data of these betel cultivars. The results show that the cultivars of Piper betle could be differentiated using DART MS data. Copyright © 2010 John Wiley & Sons, Ltd.
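
    The record states that principal component analysis was applied to the DART MS data without giving details. A minimal sketch of that style of analysis, with invented intensity values standing in for real spectra, might look like the following:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Rows: leaf samples from two hypothetical cultivars; columns: intensities at
# four m/z channels. All values are synthetic stand-ins for DART MS spectra.
cultivar_a = rng.normal([100, 20, 5, 60], 5, size=(4, 4))
cultivar_b = rng.normal([40, 80, 30, 10], 5, size=(4, 4))
spectra = np.vstack([cultivar_a, cultivar_b])

scores = PCA(n_components=2).fit_transform(spectra)
for label, pc in zip(["A"] * 4 + ["B"] * 4, scores):
    print(label, pc.round(1))  # the cultivars separate along the first component
```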

  19. Simultaneous analysis and design

    NASA Technical Reports Server (NTRS)

    Haftka, R. T.

    1984-01-01

    Optimization techniques are increasingly being used for performing nonlinear structural analysis. The development of element by element (EBE) preconditioned conjugate gradient (CG) techniques is expected to extend this trend to linear analysis. Under these circumstances the structural design problem can be viewed as a nested optimization problem. There are computational benefits to treating this nested problem as a large single optimization problem. The response variables (such as displacements) and the structural parameters are all treated as design variables in a unified formulation which performs simultaneously the design and analysis. Two examples are used for demonstration. A seventy-two bar truss is optimized subject to linear stress constraints and a wing box structure is optimized subject to nonlinear collapse constraints. Both examples show substantial computational savings with the unified approach as compared to the traditional nested approach.
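
    As a toy illustration of the unified (simultaneous analysis and design) formulation, not the paper's truss or wing-box problems, the sketch below treats both a bar's cross-sectional area (design variable) and its displacement (response variable) as optimization unknowns, imposing equilibrium as a constraint instead of solving it in a nested analysis step. All numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Single elastic bar of length L under axial load P; minimize weight.
E, L, P, rho, sigma_max = 70e9, 1.0, 1e5, 2700.0, 100e6

def weight(x):
    A, u = x   # area (design variable) and displacement (response variable)
    return rho * A * L

cons = [
    # Equilibrium (E*A/L)*u = P enters as a constraint, not a nested solve.
    {"type": "eq", "fun": lambda x: (E * x[0] / L) * x[1] / P - 1.0},
    # Stress limit sigma = E*u/L <= sigma_max, normalized for conditioning.
    {"type": "ineq", "fun": lambda x: 1.0 - E * x[1] / (L * sigma_max)},
]
res = minimize(weight, x0=[1e-3, 1e-3], constraints=cons,
               bounds=[(1e-6, None), (0.0, None)], method="SLSQP")
A_opt, u_opt = res.x
print(f"area = {A_opt:.3e} m^2, displacement = {u_opt:.3e} m")
```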

  20. Constitutive parameter measurements of lossy materials

    NASA Technical Reports Server (NTRS)

    Dominek, A.; Park, A.

    1989-01-01

    The electrical constitutive parameters of lossy materials are considered. A discussion of the NRL arch for lossy coatings is presented, involving analysis of the reflected field using the geometrical theory of diffraction (GTD) and physical optics (PO). The actual values of these parameters can be obtained through a traditional transmission technique, which is examined from an error analysis standpoint. Alternate sample geometries are suggested for this technique to reduce sample tolerance requirements for accurate parameter determination. The performance of one alternate geometry is given.

  1. Hardware Demonstration: Frequency Spectra of Transients

    NASA Technical Reports Server (NTRS)

    McCloskey, John; Dimov, Jen

    2017-01-01

    Radiated emissions measurements as specified by MIL-STD-461 are performed in the frequency domain, which is best suited to continuous wave (CW) types of signals. However, many platforms implement signals that are single event pulses or transients. Such signals can potentially generate momentary radiated emissions that can cause interference in the system, but they may be missed with traditional measurement techniques. This demonstration provides measurement and analysis techniques that effectively evaluate the potential emissions from such signals in order to evaluate their potential impacts to system performance.

  2. Radioimmunoassays and 2-site immunoradiometric "sandwich" assays: basic principles.

    PubMed

    Rodbard, D

    1988-10-01

    The "sandwich" or noncompetitive reagent-excess, 2-site immunoradiometric assay (2-site IRMA), ELISA, USERIA, and related techniques, have several advantages compared with the traditional or competitive radioimmunoassays. IRMAs can provide improved sensitivity and specificity. However, IRMAs present some practical problems with nonspecific binding, increased consumption of antibody, biphasic dose response curve, (high dose hook effect), and may require special techniques for dose response curve analysis. We anticipate considerable growth in the popularity and importance of 2-site IRMA.

  3. New optical frequency domain differential mode delay measurement method for a multimode optical fiber.

    PubMed

    Ahn, T; Moon, S; Youk, Y; Jung, Y; Oh, K; Kim, D

    2005-05-30

    A novel mode analysis method and differential mode delay (DMD) measurement technique for a multimode optical fiber based on optical frequency domain reflectometry (OFDR) is proposed for the first time. We used a conventional OFDR with a tunable external cavity laser and a Michelson interferometer. A few-mode multimode optical fiber was prepared to test the proposed measurement technique. We also compared the OFDR measurement results with those obtained using a traditional time-domain measurement method.

  4. Comparison of DGT with traditional extraction methods for assessing arsenic bioavailability to Brassica chinensis in different soils.

    PubMed

    Dai, Yunchao; Nasir, Mubasher; Zhang, Yulin; Gao, Jiakai; Lv, Yamin; Lv, Jialong

    2018-01-01

    Several predictive models and methods have been used to assess heavy metal bioavailability, but there is no universally accepted approach for evaluating the bioavailability of arsenic (As) in soil. The technique of diffusive gradients in thin films (DGT) is a promising tool, but there is considerable debate with respect to its suitability. The DGT method was compared with other traditional chemical extraction techniques (soil solution, NaHCO3, NH4Cl, HCl, and total As methods) for estimating As bioavailability in soil, based on a greenhouse experiment using Brassica chinensis grown in various soils from 15 provinces in China. In addition, we assessed whether these methods are independent of soil properties. The correlations between plant and soil As concentrations measured with the traditional extraction techniques were pH- and iron oxide (Feox)-dependent, indicating that these methods are influenced by soil properties. In contrast, DGT measurements were independent of soil properties and also showed a better correlation coefficient than the other traditional techniques. Thus, the DGT technique is superior to traditional techniques and should be preferred for evaluating As bioavailability in different types of soils. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Comparison of traditional nondestructive analysis of RERTR fuel plates with digital radiographic techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidsmeier, T.; Koehl, R.; Lanham, R.

    2008-07-15

    The current design and fabrication process for RERTR fuel plates utilizes film radiography for nondestructive testing and characterization. Digital radiographic methods offer potential increases in efficiency and accuracy. The traditional and digital radiographic methods are described and demonstrated on a fuel plate constructed, using the dispersion method, with an average fuel loading of 51% by volume. Fuel loading data from each method are analyzed and compared to a third baseline method to assess accuracy. The new digital method is shown to be more accurate, to save hours of work, and to provide additional information not easily available with the traditional method. Additional possible improvements suggested by the new digital method are also raised. (author)

  6. Determination of the transmission coefficients for quantum structures using FDTD method.

    PubMed

    Peng, Yangyang; Wang, Xiaoying; Sui, Wenquan

    2011-12-01

    The purpose of this work is to develop a simple method to incorporate quantum effects in traditional finite-difference time-domain (FDTD) simulators, which could make it possible to co-simulate systems that include both quantum structures and traditional components. In this paper, the tunneling transmission coefficient is calculated by solving the time-domain Schrödinger equation with a developed FDTD technique, called the FDTD-S method. To validate the feasibility of the method, a simple resonant tunneling diode (RTD) structure has been simulated using the proposed method. The good agreement between the numerical and analytical results proves its accuracy. The effectiveness and accuracy of this approach make it a potential method for the analysis and design of hybrid systems that include quantum structures and traditional components.
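
    The abstract does not give the discretization itself. A common textbook scheme for the one-dimensional time-domain Schrödinger equation, shown here as a generic illustration rather than the paper's FDTD-S code, staggers the real and imaginary parts of the wavefunction in time (natural units, hbar = m = 1):

```python
import numpy as np

# Generic 1D free-particle Schrödinger FDTD in natural units (hbar = m = 1),
# staggering the real and imaginary parts of psi; not the paper's FDTD-S code.
nx, dx = 400, 0.1
dt = 0.25 * dx**2                 # well inside the stability limit dt < dx^2
x = np.arange(nx) * dx
V = np.zeros(nx)                  # potential (a tunneling barrier would go here)

# Gaussian wave packet moving to the right.
k0, x0, w = 5.0, 10.0, 2.0
psi = np.exp(-((x - x0) ** 2) / (2 * w**2)) * np.exp(1j * k0 * x)
pr, pi = psi.real.copy(), psi.imag.copy()

def lap(f):
    out = np.zeros_like(f)
    out[1:-1] = f[2:] - 2 * f[1:-1] + f[:-2]
    return out / dx**2

for _ in range(1000):
    pr += dt * (-0.5 * lap(pi) + V * pi)   # d(Re psi)/dt = +H Im psi
    pi -= dt * (-0.5 * lap(pr) + V * pr)   # d(Im psi)/dt = -H Re psi

print("norm ~", round(float((pr**2 + pi**2).sum() * dx), 4))  # ~conserved
```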

  7. Efficient techniques for forced response involving linear modal components interconnected by discrete nonlinear connection elements

    NASA Astrophysics Data System (ADS)

    Avitabile, Peter; O'Callahan, John

    2009-01-01

    Generally, response analysis of systems containing discrete nonlinear connection elements, such as typical mounting connections, requires the physical finite element system matrices to be used in a direct integration algorithm to compute the nonlinear response solution. Due to the large size of these physical matrices, forced nonlinear response analysis requires significant computational resources. Usually, the individual components of the system are analyzed and tested as separate components, and their individual behavior may be essentially linear when compared to the total assembled system. However, joining these linear subsystems with highly nonlinear connection elements causes the entire system to become nonlinear. It would be advantageous if these linear modal subsystems could be utilized in the forced nonlinear response analysis, since much effort has usually been expended in fine-tuning and adjusting the analytical models to reflect the tested subsystem configuration. Several more efficient techniques have been developed to address this class of problem. Three of these techniques, the equivalent reduced model technique (ERMT), the modal modification response technique (MMRT), and the component element method (CEM), are presented in this paper and compared to traditional methods.

  8. Efficient Computation of Closed-loop Frequency Response for Large Order Flexible Systems

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Giesy, Daniel P.

    1997-01-01

    An efficient and robust computational scheme is given for the calculation of the frequency response function of a large-order flexible system implemented with a linear, time-invariant control system. Advantage is taken of the highly structured sparsity of the system matrix of the plant, based on a model of the structure using normal mode coordinates. The computational time per frequency point of the new scheme is a linear function of system size, a significant improvement over traditional full-matrix techniques whose computational times per frequency point range from quadratic to cubic functions of system size. This permits the practical frequency domain analysis of systems of much larger order than by traditional full-matrix techniques. Formulations are given for both open- and closed-loop systems. Numerical examples are presented showing the advantages of the present formulation over traditional approaches, both in speed and in accuracy. Using a model with 703 structural modes, a speed-up of almost two orders of magnitude was observed, while accuracy improved by up to 5 decimal places.
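
    The abstract attributes the speed-up to the structured sparsity of the normal-mode plant matrix. A minimal open-loop illustration of why this helps, with made-up modal data: in modal coordinates each mode contributes an independent scalar term to the frequency response, so the function can be assembled in O(n) work per frequency instead of factoring a dense (jωI - A) matrix.

```python
import numpy as np

# Made-up modal data for a flexible structure: natural frequencies (rad/s),
# damping ratios, and mode-shape values at the input and output locations.
rng = np.random.default_rng(2)
wn = np.linspace(5.0, 500.0, 703)   # 703 modes, matching the example size above
zeta = np.full_like(wn, 0.005)
phi_in, phi_out = rng.normal(size=(2, wn.size))

def frf(omega):
    """Receptance H(w) assembled mode by mode: O(n) work per frequency,
    versus O(n^3) for factoring a dense (jw*I - A) at each frequency."""
    denom = wn**2 - omega**2 + 2j * zeta * wn * omega
    return np.sum(phi_out * phi_in / denom)

for w in (10.0, 50.0, 250.0):
    print(f"|H({w:g})| = {abs(frf(w)):.3e}")
```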

  9. Applications of cluster analysis to the creation of perfectionism profiles: a comparison of two clustering approaches.

    PubMed

    Bolin, Jocelyn H; Edwards, Julianne M; Finch, W Holmes; Cassady, Jerrell C

    2014-01-01

    Although traditional clustering methods (e.g., K-means) have been shown to be useful in the social sciences it is often difficult for such methods to handle situations where clusters in the population overlap or are ambiguous. Fuzzy clustering, a method already recognized in many disciplines, provides a more flexible alternative to these traditional clustering methods. Fuzzy clustering differs from other traditional clustering methods in that it allows for a case to belong to multiple clusters simultaneously. Unfortunately, fuzzy clustering techniques remain relatively unused in the social and behavioral sciences. The purpose of this paper is to introduce fuzzy clustering to these audiences who are currently relatively unfamiliar with the technique. In order to demonstrate the advantages associated with this method, cluster solutions of a common perfectionism measure were created using both fuzzy clustering and K-means clustering, and the results compared. Results of these analyses reveal that different cluster solutions are found by the two methods, and the similarity between the different clustering solutions depends on the amount of cluster overlap allowed for in fuzzy clustering.
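
    For readers unfamiliar with the method, a compact NumPy implementation of the standard fuzzy c-means update equations (fuzzifier m = 2) on toy two-cluster data is sketched below; it is a generic textbook version, not the software used by the authors.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Textbook fuzzy c-means: every point gets a membership in every cluster."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iters):
        W = U**m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# Two overlapping toy clusters; boundary points receive split memberships.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(2.5, 1.0, (50, 2))])
centers, U = fuzzy_cmeans(X)
print("centers:\n", centers.round(2))
print("sample memberships:\n", U[:3].round(2))
```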

  10. Applications of cluster analysis to the creation of perfectionism profiles: a comparison of two clustering approaches

    PubMed Central

    Bolin, Jocelyn H.; Edwards, Julianne M.; Finch, W. Holmes; Cassady, Jerrell C.

    2014-01-01

    Although traditional clustering methods (e.g., K-means) have been shown to be useful in the social sciences it is often difficult for such methods to handle situations where clusters in the population overlap or are ambiguous. Fuzzy clustering, a method already recognized in many disciplines, provides a more flexible alternative to these traditional clustering methods. Fuzzy clustering differs from other traditional clustering methods in that it allows for a case to belong to multiple clusters simultaneously. Unfortunately, fuzzy clustering techniques remain relatively unused in the social and behavioral sciences. The purpose of this paper is to introduce fuzzy clustering to these audiences who are currently relatively unfamiliar with the technique. In order to demonstrate the advantages associated with this method, cluster solutions of a common perfectionism measure were created using both fuzzy clustering and K-means clustering, and the results compared. Results of these analyses reveal that different cluster solutions are found by the two methods, and the similarity between the different clustering solutions depends on the amount of cluster overlap allowed for in fuzzy clustering. PMID:24795683

  11. Simplified Microarray Technique for Identifying mRNA in Rare Samples

    NASA Technical Reports Server (NTRS)

    Almeida, Eduardo; Kadambi, Geeta

    2007-01-01

    Two simplified methods of identifying messenger ribonucleic acid (mRNA), and compact, low-power apparatuses to implement them, are at the proof-of-concept stage of development. These methods are related to traditional methods based on hybridization of nucleic acid, but whereas the traditional methods must be practiced in laboratory settings, these methods could be practiced in field settings. Hybridization of nucleic acid is a powerful technique for detecting specific complementary nucleic acid sequences and is increasingly being used for detecting changes in gene expression in microarrays containing thousands of gene probes. A traditional microarray study entails at least the following six steps: (1) purification of cellular RNA; (2) amplification of complementary deoxyribonucleic acid (cDNA) by polymerase chain reaction (PCR); (3) labeling of cDNA with the fluorophores Cy3 (a green cyanine dye) and Cy5 (a red cyanine dye); (4) hybridization to a microarray chip; (5) fluorescence scanning of the array(s) with dual excitation wavelengths; and (6) analysis of the resulting images. This six-step procedure must be performed in a laboratory because it requires bulky equipment.

  12. Temporomandibular joint arthroscopy technique using a single working cannula.

    PubMed

    Srouji, S; Oren, D; Zoabi, A; Ronen, O; Zraik, H

    2016-11-01

    The traditional arthroscopy technique includes the creation of three ports in order to enable visualization, operation, and arthrocentesis. The aim of this study was to assess an advanced temporomandibular joint (TMJ) arthroscopy technique that requires only a single cannula, through which a one-piece instrument containing a visualization canal, irrigation canal, and a working canal is inserted, as an alternative to the traditional double-puncture technique. This retrospective study assessed eight patients (13 TMJs) with pain and/or limited range of movement that was refractory to conservative therapy, who were treated between June 2015 and December 2015. The temporomandibular joint disorder (TMD) was diagnosed by physical examination and mouth opening measurements. The duration of surgery was recorded and compared to that documented for traditional arthroscopies performed by the same surgeon. Operative single-cannula arthroscopy (OSCA) was performed using a holmium YAG (Ho:YAG) 230μm fibre laser for ablation. The OSCA technique proved effective in improving mouth opening in all patients (mean increase 9.12±1.96mm) and in reducing pain (mean visual analogue scale decrease of 3.25±1.28). The operation time was approximately half that of the traditional technique. The OSCA technique is as efficient as the traditional technique, is simple to learn, and is simpler to execute. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Analyzing simulation-based PRA data through traditional and topological clustering: A BWR station blackout case study

    DOE PAGES

    Maljovec, D.; Liu, S.; Wang, B.; ...

    2015-07-14

    Here, dynamic probabilistic risk assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP and MELCOR) with simulation controller codes (e.g., RAVEN and ADAPT). Whereas system simulator codes model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic and operating procedures) and stochastic (e.g., component failures and parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by sampling values of a set of parameters and simulating the system behavior for that specific set of parameter values. For complex systems, a major challenge in using DPRA methodologies is to analyze the large number of scenarios generated, where clustering techniques are typically employed to better organize and interpret the data. In this paper, we focus on the analysis of two nuclear simulation datasets that are part of the risk-informed safety margin characterization (RISMC) boiling water reactor (BWR) station blackout (SBO) case study. We provide the domain experts a software tool that encodes traditional and topological clustering techniques within an interactive analysis and visualization environment for understanding the structures of such high-dimensional nuclear simulation datasets. We demonstrate through our case study that both types of clustering techniques complement each other for enhanced structural understanding of the data.

  14. Analysis of the impact of large scale seismic retrofitting strategies through the application of a vulnerability-based approach on traditional masonry buildings

    NASA Astrophysics Data System (ADS)

    Ferreira, Tiago Miguel; Maio, Rui; Vicente, Romeu

    2017-04-01

    The capacity of buildings to maintain minimum structural safety levels during natural disasters, such as earthquakes, is recognisably one of the aspects that most influence urban resilience. Moreover, public investment in risk mitigation strategies is fundamental, not only to promote social and urban resilience, but also to limit the consequent material, human, and environmental losses. Despite growing awareness of this issue, there is still a vast number of traditional masonry buildings spread throughout many European old city centres that lack adequate seismic resistance, and they therefore require urgent retrofitting interventions both to reduce their seismic vulnerability and to cope with the increased seismic requirements of recent code standards. Thus, this paper aims at contributing to mitigate the social and economic impacts of earthquake damage scenarios through a vulnerability-based comparative analysis of some of the most popular retrofitting techniques applied after the 1998 Azores earthquake. The influence of each technique is studied individually and globally, resorting to a seismic vulnerability index methodology integrated into a GIS tool, and damage and loss scenarios are constructed and critically discussed. Finally, the economic balance resulting from the implementation of these techniques is also examined.

  15. Formal methods for modeling and analysis of hybrid systems

    NASA Technical Reports Server (NTRS)

    Tiwari, Ashish (Inventor); Lincoln, Patrick D. (Inventor)

    2009-01-01

    A technique based on the use of a quantifier elimination decision procedure for real closed fields and simple theorem proving to construct a series of successively finer qualitative abstractions of hybrid automata is taught. The resulting abstractions are always discrete transition systems which can then be used by any traditional analysis tool. The constructed abstractions are conservative and can be used to establish safety properties of the original system. The technique works on linear and non-linear polynomial hybrid systems: the guards on discrete transitions and the continuous flows in all modes can be specified using arbitrary polynomial expressions over the continuous variables. An exemplar tool in the SAL environment built over the theorem prover PVS is detailed. The technique scales well to large and complex hybrid systems.

  16. An automatic step adjustment method for average power analysis technique used in fiber amplifiers

    NASA Astrophysics Data System (ADS)

    Liu, Xue-Ming

    2006-04-01

    An automatic step adjustment (ASA) method for the average power analysis (APA) technique used in fiber amplifiers is proposed in this paper for the first time. In comparison with the traditional APA technique, the proposed method offers two unique merits, higher-order accuracy and an ASA mechanism, so that it can significantly shorten the computing time and improve the solution accuracy. A test example demonstrates that, compared to the APA technique, the proposed method increases the computing speed more than a hundredfold at the same error level. In computing the model equations of erbium-doped fiber amplifiers, the numerical results show that our method can improve the solution accuracy by over two orders of magnitude at the same number of amplifying sections. The proposed method has the capacity to rapidly and effectively compute the model equations of fiber Raman amplifiers and semiconductor lasers.
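
    The abstract does not state the adjustment rule itself. The general idea of automatic step adjustment can be illustrated with a generic step-doubling error estimate on a toy amplifier power equation dP/dz = g(P)P with saturable gain, halving or growing the spatial step to hold the local error near a tolerance; this is a schematic stand-in, not the authors' algorithm.

```python
# Generic step-doubling adaptive integrator for a toy amplifier power equation
# dP/dz = g(P) * P with saturable gain; a schematic stand-in only, since the
# paper's actual adjustment rule is not given in the abstract.
def g(P, g0=2.0, Psat=1.0):
    return g0 / (1.0 + P / Psat)       # saturable gain coefficient (1/m)

def rk2_step(P, h):
    k1 = g(P) * P
    k2 = g(P + h * k1) * (P + h * k1)
    return P + 0.5 * h * (k1 + k2)

def integrate(P, z_end, h=0.1, tol=1e-8):
    z = 0.0
    while z < z_end:
        h = min(h, z_end - z)
        big = rk2_step(P, h)                          # one full step
        small = rk2_step(rk2_step(P, h / 2), h / 2)   # two half steps
        if abs(big - small) < tol:     # accept; try a larger step next time
            P, z, h = small, z + h, h * 1.5
        else:                          # reject; retry with a smaller step
            h *= 0.5
    return P

print(integrate(P=0.001, z_end=10.0))  # output power after 10 m of gain fiber
```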

  17. Bearing defect signature analysis using advanced nonlinear signal analysis in a controlled environment

    NASA Technical Reports Server (NTRS)

    Zoladz, T.; Earhart, E.; Fiorucci, T.

    1995-01-01

    Utilizing high-frequency data from a highly instrumented rotor assembly, seeded bearing defect signatures are characterized using both conventional linear approaches, such as power spectral density analysis, and recently developed nonlinear techniques such as bicoherence analysis. Traditional low-frequency (less than 20 kHz) analysis and high-frequency envelope analysis of both accelerometer and acoustic emission data are used to recover characteristic bearing distress information buried deeply in acquired data. The successful coupling of newly developed nonlinear signal analysis with recovered wideband envelope data from accelerometers and acoustic emission sensors is the innovative focus of this research.
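
    Bicoherence is not defined in the abstract. The standard segment-averaged estimator, demonstrated below on a synthetic signal with quadratic phase coupling, normalizes the bispectrum so that coupled frequency pairs approach 1; this is a generic sketch, not the authors' processing chain.

```python
import numpy as np

def bicoherence(x, nseg=64, seglen=256):
    """Segment-averaged bicoherence b(f1, f2); values near 1 indicate quadratic
    phase coupling among the components at f1, f2, and f1 + f2."""
    q = seglen // 4                      # analyze frequency pairs up to fs/4
    f1, f2 = np.meshgrid(np.arange(q), np.arange(q))
    num = np.zeros((q, q), dtype=complex)
    den1 = np.zeros((q, q))
    den2 = np.zeros((q, q))
    for s in range(nseg):
        X = np.fft.fft(x[s * seglen:(s + 1) * seglen])
        num += X[f1] * X[f2] * np.conj(X[f1 + f2])
        den1 += np.abs(X[f1] * X[f2]) ** 2
        den2 += np.abs(X[f1 + f2]) ** 2
    return np.abs(num) ** 2 / (den1 * den2 + 1e-20)

# Synthetic defect-like signal: bins 16 and 32 have independent random phases,
# while bin 48 carries their phase sum, as a quadratic nonlinearity produces.
rng = np.random.default_rng(0)
tt = np.arange(256)
segs = []
for _ in range(64):
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    segs.append(np.cos(2 * np.pi * 16 / 256 * tt + p1)
                + np.cos(2 * np.pi * 32 / 256 * tt + p2)
                + 0.5 * np.cos(2 * np.pi * 48 / 256 * tt + p1 + p2)
                + rng.normal(0, 0.1, tt.size))
b = bicoherence(np.concatenate(segs))
print("bicoherence at the coupled pair (16, 32):", round(float(b[32, 16]), 2))
```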

  18. Intraoral Digital Impressioning for Dental Implant Restorations Versus Traditional Implant Impression Techniques.

    PubMed

    Wilk, Brian L

    2015-01-01

    Over the course of the past two to three decades, intraoral digital impression systems have gained acceptance due to high accuracy and ease of use as they have been incorporated into the fabrication of dental implant restorations. The use of intraoral digital impressions enables the clinician to produce accurate restorations without the unpleasant aspects of traditional impression materials and techniques. This article discusses the various types of digital impression systems and their accuracy compared to traditional impression techniques. The cost, time, and patient satisfaction components of both techniques will also be reviewed.

  19. Evaluation of MALDI-TOF mass spectrometry for differentiation of Pichia kluyveri strains isolated from traditional fermentation processes.

    PubMed

    De la Torre González, Francisco Javier; Gutiérrez Avendaño, Daniel Oswaldo; Gschaedler Mathis, Anne Christine; Kirchmayr, Manuel Reinhart

    2018-06-06

    Non-Saccharomyces yeasts are widespread microorganisms and were, some time ago, considered contaminants in the beverage industry. Nowadays, however, they have gained importance for their ability to produce aromatic compounds, which in alcoholic beverages improves aromatic complexity and therefore overall quality. Identification and differentiation of the species involved in fermentation processes are thus vital; the available approaches can be classified into traditional methods and techniques based on molecular biology. Traditional methods, however, can be expensive, laborious, and/or unable to discriminate accurately at the strain level. In the present study, a total of 19 strains of Pichia kluyveri isolated from mezcal, tejuino, and cacao fermentations were analyzed with rep-PCR fingerprinting and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). The comparative analysis between the MS spectra and the rep-PCR patterns obtained from these strains showed a high similarity between both methods, with only minimal differences between the rep-PCR and MALDI-TOF MS clusters. The data suggest that MALDI-TOF MS is a promising alternative technique for rapid, reliable, and cost-effective differentiation of native yeast strains isolated from different traditional fermented foods and beverages. This article is protected by copyright. All rights reserved.

  20. Circular Dichroism Spectroscopy: Enhancing a Traditional Undergraduate Biochemistry Laboratory Experience

    ERIC Educational Resources Information Center

    Lewis, Russell L.; Seal, Erin L.; Lorts, Aimee R.; Stewart, Amanda L.

    2017-01-01

    The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they…

  1. Correlation Functions Aid Analyses Of Spectra

    NASA Technical Reports Server (NTRS)

    Beer, Reinhard; Norton, Robert H., Jr.

    1989-01-01

    New uses found for correlation functions in analyses of spectra. In approach combining elements of both pattern-recognition and traditional spectral-analysis techniques, spectral lines identified in data appear useless at first glance because they are dominated by noise. New approach particularly useful in measurement of concentrations of rare species of molecules in atmosphere.

  2. Price and Welfare Effects of Catastrophic Forest Damage from Southern Pine Beetle Epidemics

    Treesearch

    Thomas P. Holmes

    1991-01-01

    Southern pine beetle (Dendroctonus frontalis) epidemics are periodically responsible for catastrophic levels of mortality to southern yellow pine forests. Traditional forest damage appraisal techniques developed for site specific economic analysis are theoretically weak since they do not consider aggregate impacts across ecosystems and related markets. Because the...

  3. 3D digital image correlation methods for full-field vibration measurement

    NASA Astrophysics Data System (ADS)

    Helfrick, Mark N.; Niezrecki, Christopher; Avitabile, Peter; Schmidt, Timothy

    2011-04-01

    In the area of modal test/analysis/correlation, significant effort has been expended over the past twenty years in order to make reduced models and to expand test data for correlation and eventual updating of the finite element models. This has been restricted by vibration measurements which are traditionally limited to the location of relatively few applied sensors. Advances in computers and digital imaging technology have allowed 3D digital image correlation (DIC) methods to measure the shape and deformation of a vibrating structure. This technique allows for full-field measurement of structural response, thus providing a wealth of simultaneous test data. This paper presents some preliminary results for the test/analysis/correlation of data measured using the DIC approach along with traditional accelerometers and a scanning laser vibrometer for comparison to a finite element model. The results indicate that all three approaches correlated well with the finite element model and provide validation for the DIC approach for full-field vibration measurement. Some of the advantages and limitations of the technique are presented and discussed.

  4. Connected Text Reading and Differences in Text Reading Fluency in Adult Readers

    PubMed Central

    Wallot, Sebastian; Hollis, Geoff; van Rooij, Marieke

    2013-01-01

    The process of connected text reading has received very little attention in contemporary cognitive psychology. This lack of attention is in part due to a research tradition that emphasizes the role of basic lexical constituents, which can be studied in isolated words or sentences. However, it is also in part due to the lack of statistical analysis techniques that accommodate interdependent time series. In this study, we investigate text reading performance with traditional and nonlinear analysis techniques and show how outcomes from multiple analyses can be used to create a more detailed picture of the process of text reading. Specifically, we investigate the reading performance of groups of literate adult readers who differ in reading fluency during a self-paced text reading task. Our results indicate that classical metrics of reading (such as word frequency) do not capture text reading very well, and that classical measures of reading fluency (such as average reading time) distinguish relatively poorly between participant groups. Nonlinear analyses of distribution tails and reading time fluctuations provide more fine-grained information about the reading process and reading fluency. PMID:23977177

  5. [Research on engine remaining useful life prediction based on oil spectrum analysis and particle filtering].

    PubMed

    Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song

    2013-09-01

    Spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis, and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of finding the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, the health state degradation process cannot be simply characterized by a linear model, while particle filtering (PF) possesses obvious advantages over traditional Kalman filtering for dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting from SOA data, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the prior probability distribution is obtained from the estimate of the system's posterior probability, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecasted by the above method, and the forecasting result was compared with that of the traditional Kalman filtering method. The result fully shows the superiority and effectiveness of the proposed method.
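
    As a generic illustration of the bootstrap particle filter used for this kind of state forecasting, not the authors' engine wear model, the sketch below tracks a slowly degrading state from noisy nonlinear observations (a hypothetical stand-in for a spectrometric wear-metal reading):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                                  # number of particles

def propagate(x):
    # Assumed degradation model: slow drift plus process noise.
    return x + 0.05 + rng.normal(0, 0.02, x.shape)

def likelihood(y, x):
    # Assumed nonlinear measurement model: y ~ x**1.5 + Gaussian noise.
    return np.exp(-0.5 * ((y - x**1.5) / 0.1) ** 2)

# Simulate a "true" wear trajectory and noisy observations.
true_x, xs, ys = 1.0, [], []
for _ in range(50):
    true_x += 0.05 + rng.normal(0, 0.02)
    xs.append(true_x)
    ys.append(true_x**1.5 + rng.normal(0, 0.1))

particles = rng.normal(1.0, 0.1, N)
for y in ys:
    particles = propagate(particles)
    w = likelihood(y, particles)
    w /= w.sum()
    # Multinomial resampling concentrates particles in likely regions.
    particles = particles[rng.choice(N, size=N, p=w)]

print("true final state:", round(xs[-1], 3),
      "| filtered estimate:", round(float(particles.mean()), 3))
```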

  6. Software Safety Analysis of a Flight Guidance System

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Technical Monitor); Tribble, Alan C.; Miller, Steven P.; Lempia, David L.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified, and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model-based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of model checking, the formal methods analysis technique that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  7. Analysis of preparation of Chinese traditional medicine based on the fiber fingerprint drop trace

    NASA Astrophysics Data System (ADS)

    Zhang, Zhilin; Wang, Jialu; Sun, Weimin; Yan, Qi

    2010-11-01

    The purpose of the fiber micro-drop analyzing technique is to measure the characteristics of liquids using optical methods. The fiber fingerprint drop trace (FFDT) is a curve of light intensity vs. time that indicates the forming, growing, and dripping processes of liquid drops. A pair of fibers was used to monitor the dripping process, and the FFDTs were acquired and analyzed by a computer. Liquid samples of many kinds of traditional Chinese medicine preparations were tested using the fiber micro-drop sensor in the experiments, and the FFDTs of preparations with different concentrations were analyzed in different ways. Considering the characteristics of the FFDTs, a novel method is proposed to identify different traditional Chinese medicine preparations and their concentrations, based on the correspondence between the FFDTs and the physical and chemical parameters of the liquids.

  8. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

    Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so that issues of system safety and stability can be rigorously addressed. However, air transportation has proven itself an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  9. Applications of HPLC/MS in the analysis of traditional Chinese medicines

    PubMed Central

    Li, Miao; Hou, Xiao-Fang; Zhang, Jie; Wang, Si-Cen; Fu, Qiang; He, Lang-Chong

    2012-01-01

    In China, traditional Chinese medicines (TCMs) have been used in clinical applications for thousands of years. The successful hyphenation of high-performance liquid chromatography (HPLC) and mass spectrometry (MS) has been applied widely to the analysis of TCMs and biological samples. Undoubtedly, the HPLC/MS technique has facilitated the understanding of the treatment mechanisms of TCMs. We reviewed more than 350 papers published within the last 5 years on HPLC/MS in the analysis of TCMs. The present review focuses on the applications of HPLC/MS in component analysis, metabolite analysis, and pharmacokinetics of TCMs. Fifty percent of the literature is related to the component analysis of TCMs, showing that this field is the most popular type of research. In metabolite analysis, HPLC coupled with electrospray ionization quadrupole time-of-flight tandem mass spectrometry has been demonstrated to be a powerful tool for the characterization of structural features and fragmentation behavior patterns. This paper presents a brief overview of the applications of HPLC/MS in the analysis of TCMs; HPLC/MS in fingerprint analysis is reviewed elsewhere. PMID:29403684

  10. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods, such as first order perturbation theory or Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has been given growing emphasis in recent years due to its simple application and good performance. This paper presents new developments in the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time retaining a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These tests show consistently good performance, both in the accuracy of the resulting PC representation of quantities and in the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15–20), alleviating a well known limitation of the traditional approach. The prospect of larger scale applicability and the simplicity of implementation makes such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
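
    As a concrete illustration of the plain, non-adaptive core of NISP (a sketch, not the FANISP algorithm), the following projects a one-dimensional response of a standard normal input onto probabilists' Hermite polynomials with Gauss-Hermite quadrature; the response function, expansion order, and quadrature level are invented.

        # Sketch of 1-D non-intrusive spectral projection onto probabilists'
        # Hermite polynomials; the model response below is illustrative.
        import numpy as np
        from math import factorial, sqrt, pi
        from numpy.polynomial import hermite_e as He

        def f(xi):                        # stand-in "model response"
            return np.exp(0.3 * xi) + 0.1 * xi ** 2

        order, nquad = 6, 20
        x, w = He.hermegauss(nquad)       # nodes/weights for weight exp(-x^2/2)
        w = w / sqrt(2 * pi)              # renormalize to the N(0,1) measure

        # c_n = E[f(xi) He_n(xi)] / E[He_n(xi)^2], with E[He_n^2] = n!
        coeffs = np.array([np.sum(w * f(x) * He.hermeval(x, np.eye(order + 1)[n]))
                           / factorial(n) for n in range(order + 1)])

        # Mean and variance fall straight out of the PCE coefficients.
        mean = coeffs[0]
        var = np.sum(coeffs[1:] ** 2 * [factorial(n) for n in range(1, order + 1)])
        print(mean, var)

    The grid- and basis-adaptive methods described above essentially automate the choice of quadrature level and of which basis terms are worth keeping in higher dimensions.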

  11. Influence of Early XXth Century Plastic Culture on Russian Design Formation

    NASA Astrophysics Data System (ADS)

    Surina, L.

    2017-11-01

    The paper analyzes the experimental research of A.M. Rodchenko and Ya.G. Chernikov, founders of Soviet design, who substantiated the theory of composition and its main terminology as the basis for the professional activity of a designer. It is established that a designer working with traditional or post-traditional visual communication relies on compositional activity that lies at the intersection of science and art and therefore has a dual nature. The professional community's preference for various compositional techniques and graphic means determines the direction of development of avant-garde design.

  12. Assessment of genetic relationship in Persea spp by traditional molecular markers.

    PubMed

    Reyes-Alemán, J C; Valadez-Moctezuma, E; Barrientos-Priego, A F

    2016-04-04

    Currently, the reclassification of the genus Persea is under discussion, with molecular techniques for DNA analysis representing an alternative for inter- and intra-specific differentiation. In the present study, the traditional random-amplified polymorphic DNA (RAPD) and inter simple sequence repeat (ISSR) markers were used to determine the genomic relationship of different species and hybrids representative of the subgenera Eriodaphne and Persea in a population conserved in a germplasm bank. The data were analyzed statistically using multivariate methods. In the RAPD analysis, a total of 190 polymorphic bands were produced, with an average of 23.7 bands per primer; the percentage contribution of each primer ranged from 7.66% to 19.63%, and the polymorphic information content (PIC) ranged from 0.23 to 0.45, with an average of 0.35. In the ISSR analysis, a total of 111 polymorphic bands were considered, with an average of 18.5 bands per primer; the percentage contribution of each ranged from 11.83% to 19.57%, and the PIC ranged from 0.35 to 0.48, with an average of 0.42. The phenograms obtained with each technique showed the relationship among the accessions through the clusters formed. In general, both techniques grouped representatives of the Persea americana races (P. americana var. drymifolia, P. americana var. guatemalensis, and P. americana var. americana). However, it was not possible to separate the species of Persea used as reference into independent clades. In addition, both techniques tended to separate the representatives of subgenera Eriodaphne and Persea.
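
    For illustration only, the snippet below computes per-band PIC from a 0/1 band-presence matrix using one commonly used estimate for dominant markers such as RAPD and ISSR, PIC = 2p(1 - p) with p the band frequency; PIC formulas vary across the literature, and the toy data are random.

        # Per-band PIC for dominant markers from a random 0/1 band matrix.
        import numpy as np

        def pic_per_band(bands):
            """bands: (n_accessions, n_bands) presence/absence matrix, one primer."""
            p = bands.mean(axis=0)        # band frequency
            return 2 * p * (1 - p)        # one common dominant-marker PIC estimate

        rng = np.random.default_rng(1)
        bands = rng.integers(0, 2, (40, 24))   # toy: 40 accessions, 24 bands
        print(pic_per_band(bands).mean())      # primer-level average PIC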

  13. Femoral venous pressure waveform as indicator of phrenic nerve injury in the setting of second-generation cryoballoon ablation.

    PubMed

    Mugnai, Giacomo; de Asmundis, Carlo; Ströker, Erwin; Hünük, Burak; Moran, Darragh; Ruggiero, Diego; De Regibus, Valentina; Coutino-Moreno, Hugo Enrique; Takarada, Ken; Choudhury, Rajin; Poelaert, Jan; Verborgh, Christian; Brugada, Pedro; Chierchia, Gian-Battista

    2017-07-01

    Femoral venous pressure waveform (VPW) analysis has recently been described as a novel method to assess phrenic nerve function during atrial fibrillation ablation procedures performed with the cryoballoon technique. In this study, we sought to evaluate the feasibility and effectiveness of this technique, with respect to the incidence of phrenic nerve injury (PNI), in comparison with the traditional abdominal palpation technique alone. Consecutive patients undergoing second-generation cryoballoon ablation (CB-A) from June 2014 to June 2015 were retrospectively analyzed. Diagnosis of PNI was made if any reduced motility or paralysis of the hemidiaphragm was detected on fluoroscopy. During the study period, a total of 350 consecutive patients (male 67%, age 57.2 ± 12.9 years) were enrolled (200 using traditional phrenic nerve assessment and 150 using VPW monitoring). The incidence of PNI in the overall population was 8.0% (28/350); of these, eight were impending PNI (2.3%), 14 transient (4.0%), and six persistent (1.7%). Patients who underwent CB-A with traditional assessment experienced 18 phrenic nerve palsies (9.0%) versus two in the VPW monitoring group (1.3%; P = 0.002). Specifically, the former presented 12 transient (6.0%) and six persistent (3.0%) phrenic nerve palsies, and the latter exhibited two transient (1.3%; P = 0.03) and no persistent (0%; P = 0.04) phrenic nerve palsies. In conclusion, this novel method of assessing the VPW for predicting PNI is inexpensive and easily available, yields reproducible measurements, and appears to be more effective than traditional assessment methods.

  14. Analysis and application of intelligence network based on FTTH

    NASA Astrophysics Data System (ADS)

    Feng, Xiancheng; Yun, Xiang

    2008-12-01

    With the continued rapid growth of the Internet, new network services emerge constantly, particularly online gaming, videoconferencing, and video on demand, and bandwidth requirements increase accordingly. Network and optical device technologies are developing rapidly as well. With its enormous bandwidth, FTTH supports all present and future services, including traditional telecommunication, data, and TV services as well as future digital TV and VOD; it is widely regarded as the ultimate solution for broadband access and the end goal of optical access network development. This paper first introduces the main services FTTH supports and analyzes its key technologies, such as system composition, topological structure, multiplexing, optical cable, and devices, focusing on two realization methods: PON and P2P. It then proposes an FTTH solution supporting comprehensive access (broadband data, voice, video, and narrowband private lines). Finally, it presents an engineering application of FTTH in a district and a building, which yields substantial economic and social benefits.

  15. Multimodal Pressure Flow Analysis: Application of Hilbert Huang Transform in Cerebral Blood Flow Regulation

    PubMed Central

    Lo, Men-Tzung; Hu, Kun; Liu, Yanhui; Peng, C.-K.; Novak, Vera

    2008-01-01

    Quantification of nonlinear interactions between two nonstationary signals presents a computational challenge in different research fields, especially for assessments of physiological systems. Traditional approaches that are based on theories of stationary signals cannot resolve nonstationarity-related issues and, thus, cannot reliably assess nonlinear interactions in physiological systems. In this review we discuss a new technique, the Multi-Modal Pressure Flow method (MMPF), that utilizes Hilbert-Huang transformation to quantify dynamic cerebral autoregulation (CA) by studying the interaction between nonstationary cerebral blood flow velocity (BFV) and blood pressure (BP). CA is an important mechanism responsible for controlling cerebral blood flow in response to fluctuations in systemic BP within a few heart-beats. The influence of CA is traditionally assessed from the relationship between the well-pronounced systemic BP and BFV oscillations induced by clinical tests. Reliable noninvasive assessment of dynamic CA, however, remains a challenge in clinical and diagnostic medicine. In this brief review we: 1) present an overview of transfer function analysis (TFA), which is traditionally used to quantify CA; 2) describe the MMPF method and its modifications; 3) introduce a newly developed automatic algorithm and engineering aspects of the improved MMPF method; and 4) review clinical applications of MMPF and its sensitivity for detection of CA abnormalities in clinical studies. The MMPF analysis adaptively decomposes complex nonstationary BP and BFV signals into multiple empirical modes so that the fluctuations caused by a specific physiologic process can be represented in a corresponding empirical mode. Using this technique, we recently showed that dynamic CA can be characterized by specific phase delays between the decomposed BP and BFV oscillations, and that the phase shifts are significantly reduced in hypertensive, diabetic, and stroke subjects with impaired CA. In addition, the new technique enables reliable assessment of CA using both data collected during clinical tests and spontaneous BP/BFV fluctuations during baseline resting conditions. PMID:18725996
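
    The following is a simplified stand-in for the phase-shift idea behind MMPF, not the MMPF algorithm itself: instead of empirical mode decomposition, it band-passes synthetic BP and BFV series around the induced-oscillation frequency and compares instantaneous Hilbert phases. The sampling rate, frequency band, and signals are invented.

        # Phase-shift estimate between synthetic BP and BFV oscillations.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def phase_shift_deg(bp, bfv, fs, band=(0.05, 0.15)):
            b, a = butter(2, [f / (fs / 2) for f in band], btype="band")
            ph_bp = np.angle(hilbert(filtfilt(b, a, bp)))
            ph_bfv = np.angle(hilbert(filtfilt(b, a, bfv)))
            return np.degrees(np.median(np.unwrap(ph_bfv - ph_bp)))

        fs = 10.0                                      # resampled beat-to-beat series
        t = np.arange(0, 300, 1 / fs)
        bp = np.sin(2 * np.pi * 0.1 * t)               # induced 0.1 Hz BP oscillation
        bfv = np.sin(2 * np.pi * 0.1 * t + np.pi / 5)  # BFV leads BP with intact CA
        print(phase_shift_deg(bp, bfv, fs))            # ~36 degrees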

  16. Probing the critical zone using passive- and active-source estimates of subsurface shear-wave velocities

    NASA Astrophysics Data System (ADS)

    Callahan, R. P.; Taylor, N. J.; Pasquet, S.; Dueker, K. G.; Riebe, C. S.; Holbrook, W. S.

    2016-12-01

    Geophysical imaging is rapidly becoming popular for quantifying subsurface critical zone (CZ) architecture. However, a diverse array of measurements and measurement techniques are available, raising the question of which are appropriate for specific study goals. Here we compare two techniques for measuring S-wave velocities (Vs) in the near surface. The first approach quantifies Vs in three dimensions using a passive source and an iterative residual least-squares tomographic inversion. The second approach uses a more traditional active-source seismic survey to quantify Vs in two dimensions via a Monte Carlo surface-wave dispersion inversion. Our analysis focuses on three 0.01 km2 study plots on weathered granitic bedrock in the Southern Sierra Critical Zone Observatory. Preliminary results indicate that depth-averaged velocities from the two methods agree over the scales of resolution of the techniques. While the passive- and active-source techniques both quantify Vs, each method has distinct advantages and disadvantages during data acquisition and analysis. The passive-source method has the advantage of generating a three dimensional distribution of subsurface Vs structure across a broad area. Because this method relies on the ambient seismic field as a source, which varies unpredictably across space and time, data quality and depth of investigation are outside the control of the user. Meanwhile, traditional active-source surveys can be designed around a desired depth of investigation. However, they only generate a two dimensional image of Vs structure. Whereas traditional active-source surveys can be inverted quickly on a personal computer in the field, passive source surveys require significantly more computations, and are best conducted in a high-performance computing environment. We use data from our study sites to compare these methods across different scales and to explore how these methods can be used to better understand subsurface CZ architecture.

  17. Forest control and regulation ... a comparison of traditional methods and alternatives

    Treesearch

    LeRoy C. Hennes; Michael J. Irving; Daniel I. Navon

    1971-01-01

    Two traditional techniques of forest control and regulation, formulas and area-volume check, are compared to linear programming as used in a new computerized planning system called Timber Resource Allocation Method (Timber RAM). Inventory data from a National Forest in California illustrate how each technique is used. The traditional methods are simpler to apply and...

  18. Traditional, complementary, and alternative medicine: Focusing on research into traditional Tibetan medicine in China.

    PubMed

    Song, Peipei; Xia, Jufeng; Rezeng, Caidan; Tong, Li; Tang, Wei

    2016-07-19

    As a form of traditional, complementary, and alternative medicine (TCAM), traditional Tibetan medicine has developed into a mainstay of medical care in Tibet and has spread from there to China and then to the rest of the world. Thus far, research on traditional Tibetan medicine has focused on the study of the plant and animal sources of traditional medicines, study of the histology of those plants and animals, chemical analysis of traditional medicines, pharmacological study of those medicines, and evaluation of the clinical efficacy of those medicines. A number of papers on traditional Tibetan medicines have been published, providing some evidence of the efficacy of traditional Tibetan medicine. However, many traditional Tibetan medicines have unknown active ingredients, hampering the establishment of drug quality standards, the development of new medicines, commercial production of medicines, and market availability of those medicines. Traditional Tibetan medicine must take several steps to modernize and spread to the rest of the world: the pharmacodynamics of traditional Tibetan medicines need to be determined, the clinical efficacy of those medicines needs to be verified, criteria to evaluate the efficacy of those medicines need to be established in order to guide their clinical use, and efficacious medicines need to be acknowledged by the pharmaceutical market. The components of traditional Tibetan medicine should be studied, traditional Tibetan medicines should be screened for their active ingredients, and techniques should be devised to prepare and manufacture those medicines.

  19. Barriers to biomedical care and use of traditional medicines for treatment of cervical cancer: an exploratory qualitative study in northern Uganda.

    PubMed

    Mwaka, A D; Okello, E S; Orach, C G

    2015-07-01

    Use of traditional medicines for treatment of cancers has increased worldwide. We used a qualitative approach to explore barriers to biomedical care and reasons for use of traditional medicines for the treatment of cervical cancer in Gulu, northern Uganda. We carried out 24 focus group discussions involving men and women aged 18-59 years. We employed the content analysis technique for data analysis. Traditional medicines were used mainly due to barriers to biomedical care for cervical cancer. The barriers included health system factors, for example long distances to health facilities and unavailability of medicines; health workers' factors, for example negative attitudes towards patients and demands for bribes; individual patients' factors, for example inability to pay for medical care; and socio-cultural beliefs about the superiority of traditional medicines and perceived greater privacy in accessing traditional healers. Barriers to biomedical care and community beliefs in the effectiveness of traditional medicines encourage use of traditional medicines for treatment of cervical cancer but might hinder help-seeking at biomedical facilities. There is a need for a targeted, culturally sensitive awareness campaign to promote the effectiveness of modern medicine and to encourage cautious use of traditional medicines in the treatment of cervical cancer. © 2014 The Authors. European Journal of Cancer Care published by John Wiley & Sons Ltd.

  1. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erskine, D J; Smith, R F; Bolme, C

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image versus time; since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
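
    As a back-of-envelope illustration of how fringe shift maps to velocity in a VISAR (a sketch with invented numbers, not the authors' calibration), the apparent velocity is the fringe shift times the velocity-per-fringe (VPF) constant, v = F * lambda / (2 * tau * (1 + delta)), with tau the interferometer delay and delta a small dispersion correction:

        # Illustrative numbers only: velocity from fringe shift via the VPF constant.
        lam = 532e-9       # probe wavelength (m)
        tau = 0.1e-9       # interferometer delay (s), matching the ~0.1 ns above
        delta = 0.03       # small dispersion correction (dimensionless)

        vpf = lam / (2 * tau * (1 + delta))   # velocity per fringe, ~2.58e3 m/s
        fringe_shift = 2.4                    # measured at one pixel of the 2d image
        print(vpf * fringe_shift)             # ~6.2e3 m/s apparent velocity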

  2. Automated brainstem co-registration (ABC) for MRI.

    PubMed

    Napadow, Vitaly; Dhond, Rupali; Kennedy, David; Hui, Kathleen K S; Makris, Nikos

    2006-09-01

    Group data analysis in brainstem neuroimaging is predicated on accurate co-registration of anatomy. As the brainstem comprises many functionally heterogeneous nuclei densely situated adjacent to one another, relatively small errors in co-registration can manifest as increased variance or decreased sensitivity (or significance) in detecting activations. We have devised a 2-stage automated, reference-mask-guided registration technique (Automated Brainstem Co-registration, or ABC) for improved brainstem co-registration. Our approach utilized a brainstem mask dataset to weight an automated co-registration cost function. Our method was validated through measurement of RMS error at 12 manually defined landmarks. These landmarks were also used as guides for a secondary manual co-registration option, intended for outlier individuals who may not adequately co-register with our automated method. Our methodology was tested on 10 healthy human subjects and compared to traditional co-registration techniques (Talairach transform and automated affine transform to the MNI-152 template). We found that ABC had a significantly lower mean RMS error (1.22 +/- 0.39 mm) than the Talairach transform (2.88 +/- 1.22 mm, mu +/- sigma) and the global affine (3.26 +/- 0.81 mm) method. Improved accuracy was also found for our manual-landmark-guided option (1.51 +/- 0.43 mm). Visualizing individual brainstem borders demonstrated more consistent and uniform overlap for ABC compared to traditional global co-registration techniques. Improved robustness (lower susceptibility to outliers) was demonstrated with ABC through lower inter-subject RMS error variance compared with traditional co-registration methods. The use of easily available and validated tools (AFNI and FSL) for this method should ease adoption by other investigators interested in brainstem group data analysis.
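
    A minimal sketch of the validation metric described above, assuming landmarks are given as (N, 3) coordinate arrays in mm; the coordinates and error magnitudes below are simulated, not the study's data.

        # RMS landmark error for two simulated registration outcomes.
        import numpy as np

        def landmark_rms(registered, reference):
            d = np.linalg.norm(registered - reference, axis=1)   # per-landmark error
            return np.sqrt(np.mean(d ** 2))

        rng = np.random.default_rng(0)
        ref = rng.uniform(-40, 40, (12, 3))        # 12 landmark coordinates (mm)
        abc = ref + rng.normal(0, 0.7, ref.shape)  # tight errors (ABC-like)
        aff = ref + rng.normal(0, 1.9, ref.shape)  # looser errors (affine-like)
        print(landmark_rms(abc, ref), landmark_rms(aff, ref))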

  3. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    NASA Astrophysics Data System (ADS)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy for complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner than the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
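
    To make the collocation step concrete, here is a hedged one-parameter sketch: sample a standard normal germ, evaluate a stand-in model (not a real hydrological model), and fit the Hermite coefficients by least squares at the sampled points.

        # One-parameter probabilistic-collocation sketch with a stand-in model.
        import numpy as np
        from numpy.polynomial import hermite_e as He

        def model(theta):                 # placeholder for a hydrological model run
            return 3.0 + 0.8 * theta + 0.25 * theta ** 2

        order, nsamp = 3, 200
        rng = np.random.default_rng(11)
        xi = rng.standard_normal(nsamp)   # standard normal germ for the parameter
        Phi = np.column_stack([He.hermeval(xi, np.eye(order + 1)[n])
                               for n in range(order + 1)])
        coeffs, *_ = np.linalg.lstsq(Phi, model(xi), rcond=None)
        print(coeffs)                     # ~[3.25, 0.8, 0.25, 0], since He_2 = x^2 - 1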

  4. Utilising three-dimensional printing techniques when providing unique assistive devices: A case report.

    PubMed

    Day, Sarah Jane; Riley, Shaun Patrick

    2018-02-01

    The evolution of three-dimensional printing into prosthetics has opened conversations about the availability and cost of prostheses. This report discusses how a prosthetic team incorporated additive manufacturing techniques into the treatment of a patient with a partial hand amputation to create and test a unique assistive device which he could use to hold his French horn. Case description and methods: Using a process of shape capture, photogrammetry, computer-aided design and finite element analysis, a suitable assistive device was designed and tested. The design was fabricated using three-dimensional printing. Patient satisfaction was measured using a Pugh's Matrix™, and a cost comparison was made between the process used and traditional manufacturing. Findings and outcomes: Patient satisfaction was high. The three-dimensional printed devices were 56% cheaper to fabricate than a similar laminated device. Computer-aided design and three-dimensional printing proved to be an effective method for designing, testing and fabricating a unique assistive device. Clinical relevance: CAD and 3D printing techniques can enable devices to be designed, tested and fabricated more cheaply than with traditional techniques. This may lead to improvements in quality and accessibility.

  5. Definition of the supraclavicular and infraclavicular nodes: implications for three-dimensional CT-based conformal radiation therapy.

    PubMed

    Madu, C N; Quint, D J; Normolle, D P; Marsh, R B; Wang, E Y; Pierce, L J

    2001-11-01

    To delineate with computed tomography (CT) the anatomic regions containing the supraclavicular (SCV) and infraclavicular (IFV) nodal groups, to define the course of the brachial plexus, to estimate the actual radiation dose received by these regions in a series of patients treated in the traditional manner, and to compare these doses to those received with an optimized dosimetric technique. Twenty patients underwent contrast material-enhanced CT for the purpose of radiation therapy planning. CT scans were used to study the location of the SCV and IFV nodal regions by using outlining of readily identifiable anatomic structures that define the nodal groups. The brachial plexus was also outlined by using similar methods. Radiation therapy doses to the SCV and IFV were then estimated by using traditional dose calculations and optimized planning. A repeated measures analysis of covariance was used to compare the SCV and IFV depths and to compare the doses achieved with the traditional and optimized methods. Coverage by the 90% isodose surface was significantly decreased with traditional planning versus conformal planning as the depth to the SCV nodes increased (P < .001). Significantly decreased coverage by using the 90% isodose surface was demonstrated for traditional planning versus conformal planning with increasing IFV depth (P = .015). A linear correlation was found between brachial plexus depth and SCV depth up to 7 cm. Conformal optimized planning provided improved dosimetric coverage compared with standard techniques.

  6. An automated technique to identify potential inappropriate traditional Chinese medicine (TCM) prescriptions.

    PubMed

    Yang, Hsuan-Chia; Iqbal, Usman; Nguyen, Phung Anh; Lin, Shen-Hsien; Huang, Chih-Wei; Jian, Wen-Shan; Li, Yu-Chuan

    2016-04-01

    Medication errors such as potentially inappropriate prescriptions can induce serious adverse drug events in patients. Information technology has the ability to prevent medication errors; however, the pharmacology of traditional Chinese medicine (TCM) is not as clear as in western medicine. The aim of this study was to apply the appropriateness of prescription (AOP) model to identify potentially inappropriate TCM prescriptions. We used association rule mining techniques to analyze 14.5 million prescriptions from the Taiwan National Health Insurance Research Database. The disease-TCM (DTCM) and TCM-TCM (TCMM) associations were computed from their co-occurrence, and the strength of each association was measured as a Q-value, often referred to as interestingness or lift. By considering the Q-values, the AOP model was applied to identify inappropriate prescriptions. Afterwards, three traditional Chinese physicians evaluated 1920 prescriptions and validated the outcomes detected by the AOP model. On these 1920 prescriptions, the system showed a positive predictive value of 97.1% and a negative predictive value of 19.5% as compared with the experts. The sensitivity analysis indicated that the negative predictive value could improve up to 27.5% when the model's threshold was changed to 0.4. We successfully applied the AOP model to automatically identify potentially inappropriate TCM prescriptions. This model could serve as a TCM clinical decision support system to improve drug safety and quality of care. Copyright © 2016 John Wiley & Sons, Ltd.
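
    For illustration of the co-occurrence statistic involved (a sketch under the assumption that Q-values behave like lift, with invented prescriptions): a pair co-prescribed more often than chance has lift above 1, while a pair never seen together has lift 0 and would be a candidate flag.

        # Lift-style co-occurrence scoring over a toy prescription database.
        from collections import Counter
        from itertools import combinations

        prescriptions = [
            {"ginseng", "licorice", "atractylodes"},
            {"ginseng", "licorice"},
            {"ephedra", "apricot_kernel", "licorice"},
            {"ginseng", "astragalus"},
            {"ephedra", "cinnamon_twig"},
        ]
        n = len(prescriptions)
        item = Counter(i for p in prescriptions for i in p)
        pair = Counter(frozenset(c) for p in prescriptions
                       for c in combinations(sorted(p), 2))

        def lift(a, b):                   # P(a, b) / (P(a) * P(b))
            return (pair[frozenset((a, b))] / n) / ((item[a] / n) * (item[b] / n))

        print(lift("ginseng", "licorice"))        # > 1: commonly co-prescribed
        print(lift("ginseng", "apricot_kernel"))  # 0: never co-occur -> suspect pair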

  7. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the concern over the costs associated with a program's execution time and implementation is great. It is always desirable, and sometimes imperative, that the proper programming technique is chosen which minimizes all costs for a given application or type of application. A study is described that compared cost-related factors associated with traditional programming techniques to rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  8. Traditional versus rule-based programming techniques - Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    A traditional programming technique for controlling the display of optional flight information in a civil transport cockpit is compared to a rule-based technique for the same function. This application required complex decision logic and a frequently modified rule base. The techniques are evaluated for execution efficiency and implementation ease; the criterion used to calculate the execution efficiency is the total number of steps required to isolate hypotheses that were true and the criteria used to evaluate the implementability are ease of modification and verification and explanation capability. It is observed that the traditional program is more efficient than the rule-based program; however, the rule-based programming technique is more applicable for improving programmer productivity.

  9. A ten-week biochemistry lab project studying wild-type and mutant bacterial alkaline phosphatase.

    PubMed

    Witherow, D Scott

    2016-11-12

    This work describes a 10-week laboratory project studying wild-type and mutant bacterial alkaline phosphatase, in which students purify, quantitate, and perform kinetic assays on wild-type and selected mutants of the enzyme. Students also perform plasmid DNA purification, digestion, and gel analysis. In addition to simply learning important techniques, students acquire novel biochemical data in their kinetic analysis of mutant enzymes. The experiments are designed to build on students' work from week to week in a way that requires them to apply quantitative analysis and reasoning skills, reinforcing traditional textbook biochemical concepts. Students are assessed through lab reports focused on journal-style writing, quantitative and conceptual question sheets, and traditional exams. © 2016 The International Union of Biochemistry and Molecular Biology, 44(6):555-564, 2016.

  10. Reinventing the Ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.
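
    A toy version of the added quantitative step, with invented plate counts: reversion frequency is the mean revertant count per selection plate divided by the number of viable cells plated, the latter estimated from dilution plating.

        # Invented counts for a reversion-frequency calculation.
        import numpy as np

        revertants = np.array([32, 41, 37])   # colonies on triplicate selection plates
        # Viable-count estimate: 215 colonies from plating 0.1 mL of a 1e-6 dilution.
        titer = 215 / 0.1 * 1e6               # CFU per mL of the undiluted culture
        viable_per_plate = titer * 0.1        # 0.1 mL of culture per selection plate

        freq = revertants.mean() / viable_per_plate
        print(f"reversion frequency ~ {freq:.2e} per cell")   # ~1.7e-07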

  11. Traditional Admissions Variables as Predictors of Minority Students' Performance in Medical School — A Cause for Concern

    PubMed Central

    Johnson, Henry C.; Rosevear, G. Craig

    1977-01-01

    This study explored the relationship between traditional admissions criteria, performance in the first semester of medical school, and performance on the National Board of Medical Examiners' (NBME) Examination, Part 1 for minority medical students, non-minority medical students, and the two groups combined. Correlational analysis and step-wise multiple regression procedures were used as the analysis techniques. A different pattern of admissions variables related to National Board Part 1 performance for the two groups. The General Information section of the Medical College Admission Test (MCAT) contributed the most variance for the minority student group. MCAT-Science contributed the most variance for the non-minority student group. MCATs accounted for a substantial portion of the variance on the National Board examination. PMID:904005

  12. Classroom Activities: Simple Strategies to Incorporate Student-Centered Activities within Undergraduate Science Lectures

    PubMed Central

    Lom, Barbara

    2012-01-01

    The traditional science lecture, where an instructor delivers a carefully crafted monolog to a large audience of students who passively receive the information, has been a popular mode of instruction for centuries. Recent evidence on the science of teaching and learning indicates that learner-centered, active teaching strategies can be more effective learning tools than traditional lectures. Yet most colleges and universities retain lectures as their central instructional method. This article highlights several simple collaborative teaching techniques that can be readily deployed within traditional lecture frameworks to promote active learning. Specifically, this article briefly introduces the techniques of: reader’s theatre, think-pair-share, roundtable, jigsaw, in-class quizzes, and minute papers. Each technique is broadly applicable well beyond neuroscience courses and easily modifiable to serve an instructor’s specific pedagogical goals. The benefits of each technique are described along with specific examples of how each technique might be deployed within a traditional lecture to create more active learning experiences. PMID:23494568

  13. In Vitro Toxicity Assessment Technique for Volatile ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency is tasked with evaluating the human health, environmental, and wildlife effects of over 80,000 chemicals registered for use in the environment and commerce. The challenge is that only sparse chemical data exist; traditional toxicity testing methods are slow, costly, involve animal studies, and cannot keep up with a chemical registry that typically grows by at least 1000 chemicals every year. In recent years, High Throughput Screening (HTS) has been used to prioritize chemicals for traditional toxicity screening or to complement traditional toxicity studies. HTS is an in vitro approach of rapidly assaying a large number of chemicals for biochemical activity using robotics and automation. However, no method currently exists for screening volatile chemicals such as air pollutants in an HTS fashion. Additionally, significant uncertainty regarding in vitro to in vivo extrapolation (IVIVE) remains. One approach to bridge the IVIVE gap and the current inability to screen volatile chemicals in an HTS fashion is the probe molecule (PrM) technique. The proposed technique uses chemicals with empirical human pharmacokinetic data as PrMs to study the toxicity of molecules with no known data for gas-phase analysis. We are currently studying the xenobiotic-metabolizing enzyme CYP2A6 using a transfected BEAS-2B bronchial epithelial cell line. CYP2A6 pathway activity is studied via the formation of cotinine from nicotine.

  14. Using AVIRIS data and multiple-masking techniques to map urban forest trees species

    Treesearch

    Q. Xiao; S.L. Ustin; E.G. McPherson

    2004-01-01

    Tree type and species information are critical parameters for urban forest management, benefit-cost analysis and urban planning. Traditionally, however, these parameters have been derived from limited field samples in urban forest management practice. In this study we used high-resolution Airborne Visible Infrared Imaging Spectrometer (AVIRIS) data and multiple-...

  15. Improving Iranian High School Students' Reading Comprehension Using the Tenets of Genre Analysis

    ERIC Educational Resources Information Center

    Adelnia, Rezvan; Salehi, Hadi

    2016-01-01

    This study investigates the impact of a genre-based approach on improving the reading ability and achievement of Iranian EFL learners. An attempt was therefore made to compare the genre-based approach to teaching reading with traditional approaches. For this purpose, by administering the Oxford Quick Placement…

  16. ANALYSIS OF 209 CHLORINATED BIPHENYL CONGENERS USING COMPREHENSIVE TWO-DIMENSIONAL GAS CHROMATOGRAPHY-TIME-OF-FLIGHT MASS SPECTROMETRY IN THE 1-D MODE FOLLOWED BY THE 2-D MODE

    EPA Science Inventory

    Since the initial discovery of polychlorinated biphenyls (PCBs) in the environment, the detection and identification of certain PCB congeners using the traditional one-dimensional (1-D) chromatographic technique has been very challenging, especially separating the 46 isomeric pe...

  17. Comparative Analysis of the Effectiveness of Oral vs. Podcasting Reviewing Techniques

    ERIC Educational Resources Information Center

    Rhoads, Misty L.

    2010-01-01

    The purpose of this study was to compare the use of podcasts to traditional delivery of information in classrooms. Four podcasts were created on the topics of asthma, diabetes, seizure disorders, and acute infections to aid students in reviewing for quizzes. Knowledge retained of students using podcasts was compared to the knowledge retained of…

  18. Clustering Binary Data in the Presence of Masking Variables

    ERIC Educational Resources Information Center

    Brusco, Michael J.

    2004-01-01

    A number of important applications require the clustering of binary data sets. Traditional nonhierarchical cluster analysis techniques, such as the popular K-means algorithm, can often be successfully applied to these data sets. However, the presence of masking variables in a data set can impede the ability of the K-means algorithm to recover the…
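
    An illustrative sketch of the problem this abstract describes, with simulated binary data: K-means recovers two clusters well from informative variables, and recovery degrades once uninformative "masking" variables are appended. All data and parameters are invented.

        # K-means on simulated binary data, with and without masking variables.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import adjusted_rand_score

        rng = np.random.default_rng(7)
        truth = rng.integers(0, 2, 200)                      # two true clusters
        p = np.where(truth[:, None] == 1, 0.8, 0.2)          # per-feature 1-probability
        signal = (rng.random((200, 6)) < p).astype(int)      # informative variables
        masking = rng.integers(0, 2, (200, 6))               # pure-noise variables

        for X, name in [(signal, "signal only"),
                        (np.hstack([signal, masking]), "with masking")]:
            labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
            print(name, adjusted_rand_score(truth, labels))  # masking degrades recovery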

  19. Are Student Evaluations of Teaching Effectiveness Valid for Measuring Student Learning Outcomes in Business Related Classes? A Neural Network and Bayesian Analyses

    ERIC Educational Resources Information Center

    Galbraith, Craig S.; Merrill, Gregory B.; Kline, Doug M.

    2012-01-01

    In this study we investigate the underlying relational structure between student evaluations of teaching effectiveness (SETEs) and achievement of student learning outcomes in 116 business related courses. Utilizing traditional statistical techniques, a neural network analysis and a Bayesian data reduction and classification algorithm, we find…

  20. What does nonforest land contribute to the global carbon balance?

    Treesearch

    Jennifer C. Jenkins; Rachel Riemann

    2002-01-01

    An inventory of land traditionally called "nonforest" and therefore not sampled by the Forest Inventory and Analysis (FIA) program was implemented by the FIA unit at the Northeastern Station in 1999 for five counties in Maryland. Biomass and biomass increment were estimated from the nonforest inventory data using techniques developed for application to large-...

  1. Assessing tree and stand biomass: a review with examples and critical comparisons

    Treesearch

    Bernard R. Parresol

    1999-01-01

    There is considerable interest today in estimating the biomass of trees and forests for both practical forestry issues and scientific purposes. New techniques and procedures are brought together along with the more traditional approaches to estimating woody biomass. General model forms and weighted analysis are reviewed, along with statistics for evaluating and...

  2. Evaluation of Learning Unit Design with Use of Page Flip Information Analysis

    ERIC Educational Resources Information Center

    Horikoshi, Izumi; Noguchi, Masato; Tamura, Yasuhisa

    2016-01-01

    In this paper, the authors attempt to evaluate the design of learning units using a Learning Analytics technique applied to page-flip information. Traditional formative assessment has been carried out by giving assignments and evaluating their results. However, the information that a teacher can get from such evaluation is limited and coarse-grained. The…

  3. Numerical characterization of landing gear aeroacoustics using advanced simulation and analysis techniques

    NASA Astrophysics Data System (ADS)

    Redonnet, S.; Ben Khelil, S.; Bulté, J.; Cunha, G.

    2017-09-01

    With the objective of aircraft noise mitigation, we here address the numerical characterization of the aeroacoustic noise generated by a simplified nose landing gear (NLG), through the use of advanced simulation and signal processing techniques. To this end, the NLG noise physics is first simulated through an advanced hybrid approach, which relies on Computational Fluid Dynamics (CFD) and Computational AeroAcoustics (CAA) calculations. Compared to more traditional hybrid methods (e.g. those relying on the use of an Acoustic Analogy), and although it is used here with some approximations (e.g. the design of the CFD-CAA interface), the present approach does not rely on restrictive assumptions (e.g. equivalent noise source, homogeneous propagation medium), which allows more realism to be incorporated into the prediction. In a second step, the outputs of these CFD-CAA hybrid calculations are processed through both traditional and advanced post-processing techniques, offering further insight into the NLG's noise source mechanisms. Among other things, this work highlights how advanced computational methodologies are now mature enough not only to simulate realistic problems of airframe noise emission, but also to investigate their underlying physics.

  4. Self-Reported Alcohol Consumption and Sexual Behavior in Males and Females: Using the Unmatched-Count Technique to Examine Reporting Practices of Socially Sensitive Subjects in a Sample of University Students

    ERIC Educational Resources Information Center

    Walsh, Jeffrey A.; Braithwaite, Jeremy

    2008-01-01

    This work, drawing on the literature on alcohol consumption, sexual behavior, and researching sensitive topics, tests the efficacy of the unmatched-count technique (UCT) in establishing higher rates of truthful self-reporting when compared to traditional survey techniques. Traditional techniques grossly underestimate the scope of problems…

  5. Radio detection of extensive air showers

    NASA Astrophysics Data System (ADS)

    Huege, Tim

    2017-12-01

    Radio detection of extensive air showers initiated in the Earth's atmosphere has made tremendous progress in the last decade. Today, radio detection is routinely used in several cosmic-ray observatories. The physics of the radio emission in air showers is well-understood, and analysis techniques have been developed to determine the arrival direction, the energy and an estimate for the mass of the primary particle from the radio measurements. The achieved resolutions are competitive with those of more traditional techniques. In this article, I shortly review the most important achievements and discuss the potential for future applications.

  6. [Comparative trial between traditional cesarean section and Misgav-Ladach technique].

    PubMed

    Gutiérrez, José Gabriel Tamayo; Coló, José Antonio Sereno; Arreola, María Sandra Huape

    2008-02-01

    Cesarean section is designed to deliver the newborn when vaginal childbirth becomes difficult. Institutional obstetrical work demands long surgical times and considerable materials; therefore, simpler procedures should be implemented. To compare traditional cesarean section versus the Misgav-Ladach technique with respect to surgical time, hospital stay, and costs. Forty-eight pregnant patients at term with an obstetrical indication for cesarean delivery were randomized into two groups: 24 underwent traditional cesarean section and 24 the Misgav-Ladach technique. The outcomes included surgical time, bleeding, amount of suture material employed, pain intensity, and other adverse effects. Surgical time with the Misgav-Ladach technique was shorter than with traditional cesarean section, bleeding was consistently less, and pain was also lower. No adverse effects were registered in either group. Although the short follow-up showed a significant reduction in operative time and less bleeding, a longer follow-up is desirable to confirm the absence of abdominal adhesions.

  7. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
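
    As a hedged sketch of the "social network as noisy sensor" framing (an invented stream, not the Apollo toolkit), one can treat a per-minute keyword count as a sensor reading and flag events where it exceeds a moving baseline by several standard deviations:

        # Burst detection on a simulated per-minute mention count.
        import numpy as np

        def detect_events(counts, window=30, k=4.0):
            events = []
            for i in range(window, len(counts)):
                base = counts[i - window:i]
                mu, sd = base.mean(), base.std() + 1e-9
                if counts[i] > mu + k * sd:     # threshold detector on the "sensor"
                    events.append(i)
            return events

        rng = np.random.default_rng(3)
        counts = rng.poisson(5, 600).astype(float)   # background chatter
        counts[400:410] += 60                        # a burst: something happened
        print(detect_events(counts))                 # indices near minute 400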

  8. Nondestructive surface analysis for material research using fiber optic vibrational spectroscopy

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Natalia I.

    2001-11-01

    Advanced methods of fiber optic vibrational spectroscopy (FOVS) have been developed in conjunction with an interferometer and low-loss, flexible, nontoxic optical fibers, sensors, and probes. The combination of optical fibers and sensors with a Fourier transform (FT) spectrometer has been used in the range from 2.5 to 12 micrometers. This technique serves as an ideal diagnostic tool for surface analysis of numerous and diverse materials such as complex structured materials, fluids, coatings, implants, living cells, plants, and tissue. Such surfaces, as well as living tissue or plants, are very difficult to investigate in vivo by traditional FT infrared or Raman spectroscopy methods. The FOVS technique is nondestructive, noninvasive, fast (15 s), and capable of operating in a remote sampling regime (up to a fiber length of 3 m). Fourier transform infrared (FTIR) and Raman fiber optic spectroscopy operating with optical fibers are suggested as powerful new tools. They are highly sensitive techniques for structural studies in material research and for process analysis to determine molecular composition, chemical bonds, and molecular conformations, and could be developed as a new tool for quality control of numerous materials as well as noninvasive biopsy.

  9. Comparative study of presurgical hand hygiene with hydroalcoholic solution versus traditional presurgical hand hygiene.

    PubMed

    López Martín, M Beatriz; Erice Calvo-Sotelo, Alejo

    To compare presurgical hand hygiene with hydroalcoholic solution following the WHO protocol with traditional presurgical hand hygiene. Cultures of the hands of surgeons and surgical nurses were performed before and after presurgical hand hygiene and after removing gloves at the end of surgery. Cultures were done on 2 different days: the first day after traditional presurgical hand hygiene, and the second day after presurgical hand hygiene with hydroalcoholic solution following the WHO protocol. The duration of traditional hand hygiene was measured and compared with the duration (3 min) of the WHO protocol. The cost of the products used in the traditional technique was compared with the cost of the hydroalcoholic solution used. The variability of the traditional technique was determined by observation. Following presurgical hand hygiene with hydroalcoholic solution, colony-forming units (CFU) were detected in 5 subjects (7.3%), whereas after traditional presurgical hand hygiene CFU were detected in 14 subjects (20.5%) (p < 0.05). After glove removal, the numbers of CFU were similar. The time employed in hand hygiene with hydroalcoholic solution (3 min) was shorter than the time employed in the traditional technique (p < 0.05), its cost was less than half, and there was no variability. Compared with the traditional technique, presurgical hand hygiene with hydroalcoholic solution significantly decreases CFU, has a similar latency time, a lower cost, and saves time. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  10. [Analysis of syndrome discipline of generalized anxiety disorder using data mining techniques].

    PubMed

    Tang, Qi-sheng; Sun, Wen-jun; Qu, Miao; Guo, Dong-fang

    2012-09-01

    To study the use of data mining techniques in analyzing the syndrome discipline of generalized anxiety disorder (GAD). From August 1, 2009 to July 31, 2010, 705 patients with GAD in 10 hospitals of Beijing were investigated over one year. Data mining techniques, such as Bayes net and cluster analysis, were used to analyze the syndrome discipline of GAD. A total of 61 symptoms of GAD were screened out. By using Bayes net, nine syndromes of GAD were abstracted based on the symptoms. Eight syndromes were abstracted by cluster analysis. After screening for duplicate syndromes and combining the experts' experience and traditional Chinese medicine theory, six syndromes of GAD were defined. These included depressed liver qi transforming into fire, phlegm-heat harassing the heart, liver depression and spleen deficiency, heart-kidney non-interaction, dual deficiency of the heart and spleen, and kidney deficiency and liver yang hyperactivity. Based on the results, the draft of Syndrome Diagnostic Criteria for Generalized Anxiety Disorder was developed. Data mining techniques such as Bayes net and cluster analysis have certain future potential for establishing syndrome models and analyzing syndrome discipline, thus they are suitable for the research of syndrome differentiation.
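
    A minimal, invented-data sketch of the cluster-analysis half of such a study: hierarchical clustering of symptoms by Jaccard distance over patients, so symptoms that co-occur fall into candidate syndrome groups. The symptom matrix and cluster count are illustrative.

        # Hierarchical clustering of symptoms by co-occurrence (invented data).
        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.spatial.distance import pdist

        rng = np.random.default_rng(2)
        patients = rng.integers(0, 2, (120, 10)).astype(float)  # patients x symptoms
        d = pdist(patients.T, metric="jaccard")   # distance between symptom columns
        clusters = fcluster(linkage(d, method="average"), t=3, criterion="maxclust")
        print(clusters)                           # symptom -> candidate-syndrome label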

  11. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents a significant opportunity for Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  12. The discrimination of honey origin using melissopalynology and Raman spectroscopy techniques coupled with multivariate analysis.

    PubMed

    Corvucci, Francesca; Nobili, Lara; Melucci, Dora; Grillenzoni, Francesca-Vittoria

    2015-02-15

    Honey traceability for food quality is demanded by consumers and food control institutions. Melissopalynologists traditionally use percentages of nectariferous pollens to discriminate the botanical origin of honeys and the entire pollen spectrum (presence/absence, type, quantities, and association of some pollen types) to determine their geographical origin. To improve routine melissopalynological analysis, principal component analysis (PCA) was used. A remarkable and innovative result was that the pollens most significant for the traditional discrimination of the botanical and geographical origin of honeys were the same as those identified by the chemometric model. The reliability of the assignment of samples to honey classes was estimated through explained variance (85%). This confirms that the chemometric model properly describes the melissopalynological data. With the aim of improving honey discrimination, FT-microRaman spectroscopy and multivariate analysis were also applied. Well-performing PCA models and good agreement with known classes were achieved. Encouraging results were obtained for botanical discrimination. Copyright © 2014 Elsevier Ltd. All rights reserved.
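
    A minimal sketch of the chemometric step described above, with invented pollen spectra for two botanical origins: PCA on standardized per-sample pollen percentages separates the classes along the leading component. Pollen proportions and sample counts are illustrative.

        # PCA on invented pollen-percentage spectra for two botanical origins.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(5)
        chestnut = rng.dirichlet([20, 1, 1, 2, 2], 15) * 100  # chestnut-like spectra
        citrus = rng.dirichlet([2, 18, 1, 3, 2], 15) * 100    # citrus-like spectra
        X = np.vstack([chestnut, citrus])                     # 30 samples x 5 pollens

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        print(scores[:3])    # first class...
        print(scores[-3:])   # ...and second class separate along PC1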

  13. Robust-mode analysis of hydrodynamic flows

    NASA Astrophysics Data System (ADS)

    Roy, Sukesh; Gord, James R.; Hua, Jia-Chen; Gunaratne, Gemunu H.

    2017-04-01

    The emergence of techniques to extract high-frequency, high-resolution data introduces a new avenue for modal decomposition to assess the underlying dynamics, especially of complex flows. However, this task requires the differentiation of robust, repeatable flow constituents from noise and other irregular features of a flow. Traditional approaches involving low-pass filtering and principal component analysis have shortcomings. The approach outlined here, referred to as robust-mode analysis, is based on Koopman decomposition. Three applications to (a) a counter-rotating cellular flame state, (b) variations in financial markets, and (c) turbulent injector flows are provided.
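
    Robust-mode analysis builds on Koopman decomposition, whose most common numerical realization is dynamic mode decomposition (DMD). The sketch below implements generic exact DMD on a synthetic snapshot matrix; the authors' robustness selection across repeated runs is not shown.

      # Generic exact-DMD sketch (not the paper's robust-mode selection step).
      import numpy as np

      def dmd(snapshots, rank):
          """Exact DMD: snapshots is (n_space, n_time)."""
          X, Y = snapshots[:, :-1], snapshots[:, 1:]
          U, s, Vh = np.linalg.svd(X, full_matrices=False)
          U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
          Atilde = U.conj().T @ Y @ Vh.conj().T / s      # projected operator
          eigvals, W = np.linalg.eig(Atilde)
          modes = (Y @ Vh.conj().T / s) @ W              # exact DMD modes
          return eigvals, modes

      # synthetic flow: one decaying oscillatory mode plus noise
      t = np.linspace(0, 8 * np.pi, 200)
      x = np.linspace(0, 1, 64)[:, None]
      data = np.sin(5 * x) * np.cos(2 * t) * np.exp(-0.05 * t) \
             + 0.01 * np.random.randn(64, 200)
      eigvals, modes = dmd(data, rank=4)
      print("dominant |eigenvalue|:", np.abs(eigvals).max())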

  14. The remote supervisory and controlling experiment system of traditional Chinese medicine production based on Fieldbus

    NASA Astrophysics Data System (ADS)

    Zhan, Jinliang; Lu, Pei

    2006-11-01

    Since the quality of traditional Chinese medicine products is affected by raw materials, processing, and many other factors, it is difficult for the traditional Chinese medicine production process, especially the extraction process, to ensure steady and homogeneous quality. At the same time, there exist some quality-control blind spots due to the lack of on-line quality detection means. If infrared spectral analysis technology were used in the production process, on the basis of off-line analysis, to detect the quality of semi-manufactured goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality could be obtained. It can be seen that on-line detection of the extraction process plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a traditional Chinese medicine extraction process monitoring experiment system based on the PROFIBUS-DP fieldbus, OPC, and Internet technologies is introduced. The system integrates intelligent data-gathering nodes with a supervisory subsystem that provides graphical configuration and remote monitoring; during the production process it monitors temperature, pressure, quality, and other parameters, and it can be controlled from remote nodes over a VPN (Virtual Private Network). Experiments and applications have proved that the system fully achieves the anticipated effect, with the merits of operational stability, real-time response, reliability, and convenient, simple operation.

  15. Comparison of a new hydro-surgical technique to traditional methods for the preparation of full-thickness skin grafts from canine cadaveric skin and report of a single clinical case.

    PubMed

    Townsend, F I; Ralphs, S C; Coronado, G; Sweet, D C; Ward, J; Bloch, C P

    2012-01-01

    To compare the hydro-surgical technique with traditional techniques for removal of subcutaneous tissue in the preparation of full-thickness skin grafts. Ex vivo experimental study and a single clinical case report. Four canine cadavers and a single clinical case. Four sections of skin were harvested from the lateral flank of recently euthanatized dogs. Traditional preparation methods included blade and scissors techniques, each of which was compared with the hydro-surgical technique individually. Preparation methods were compared based on the length of time required to remove the subcutaneous tissue from the graft, histologic grading, and measurable thickness relative to an untreated sample. The hydro-surgical technique had the shortest skin graft preparation time compared with traditional techniques (p = 0.002). There was no significant difference in histological grading or measurable subcutaneous thickness between skin specimens. The hydro-surgical technique provides rapid, effective debridement of subcutaneous tissue in the preparation of full-thickness skin grafts, with no significant differences in histological grade or remaining subcutaneous tissue among treatment types. Additionally, the hydro-surgical technique was successfully used to prepare a full-thickness meshed free skin graft in the reconstruction of a traumatic medial tarsal wound in a dog.

  16. Acceleration of atmospheric Cherenkov telescope signal processing to real-time speed with the Auto-Pipe design system

    NASA Astrophysics Data System (ADS)

    Tyson, Eric J.; Buckley, James; Franklin, Mark A.; Chamberlain, Roger D.

    2008-10-01

    The imaging atmospheric Cherenkov technique for high-energy gamma-ray astronomy is emerging as an important new technique for studying the high-energy universe. Current experiments have data rates of ≈20 TB/year and duty cycles of about 10%. In the future, more sensitive experiments may produce up to 1000 TB/year. The data analysis task for these experiments requires keeping up with this data rate in close to real time. Such data analysis is a classic example of a streaming application with very high performance requirements. This class of application often benefits greatly from non-traditional approaches to computation, including special-purpose hardware (FPGAs and ASICs) or sophisticated parallel processing techniques. However, designing, debugging, and deploying to these architectures is difficult, and thus they are not widely used by the astrophysics community. This paper presents the Auto-Pipe design toolset, developed to address many of the difficulties in taking advantage of complex streaming computer architectures for such applications. Auto-Pipe incorporates a high-level coordination language, functional and performance simulation tools, and the ability to deploy applications to sophisticated architectures. Using the Auto-Pipe toolset, we have implemented the front-end portion of an imaging Cherenkov data analysis application, suitable for real-time or offline analysis. The application operates on data from the VERITAS experiment and shows how Auto-Pipe can greatly ease performance optimization and application deployment on a wide variety of platforms. We demonstrate a performance improvement over a traditional software approach of 32x using an FPGA solution and 3.6x using a multiprocessor-based solution.

  17. Quantitative Assessment of Blood Pressure Measurement Accuracy and Variability from Visual Auscultation Method by Observers without Receiving Medical Training

    PubMed Central

    Feng, Yong; Chen, Aiqing

    2017-01-01

    This study aimed to quantify blood pressure (BP) measurement accuracy and variability with different techniques. Thirty video clips of BP recordings from the BHS training database were converted to Korotkoff sound waveforms. Ten observers without medical training were asked to determine BPs using (a) the traditional manual auscultatory method and (b) a visual auscultation method based on visualizing the Korotkoff sound waveform; this was repeated three times on different days. The measurement error was calculated against the reference answers, and the measurement variability was calculated from the SD of the three repeats. Statistical analysis showed that, in comparison with the auscultatory method, the visual method significantly reduced overall variability from 2.2 to 1.1 mmHg for SBP and from 1.9 to 0.9 mmHg for DBP (both p < 0.001). It also showed that BP measurement errors were significant for both techniques (all p < 0.01, except DBP from the traditional method). Although significant, the overall mean errors were small (−1.5 and −1.2 mmHg for SBP and −0.7 and 2.6 mmHg for DBP, resp., from the traditional auscultatory and visual auscultation methods). In conclusion, the visual auscultation method achieved an acceptable degree of BP measurement accuracy, with smaller variability than the traditional auscultatory method. PMID:29423405
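
    The two summary statistics used here, mean error against the reference and within-observer variability as the SD of three repeats, are easy to express directly. The sketch below uses placeholder arrays shaped like the study (10 observers x 30 recordings x 3 repeats); the noise levels are invented.

      # Sketch of the two summary statistics; all readings are simulated.
      import numpy as np

      rng = np.random.default_rng(1)
      reference = rng.uniform(100, 160, size=30)            # reference SBP, mmHg
      readings = reference[None, :, None] + rng.normal(0, 2, size=(10, 30, 3))

      errors = readings.mean(axis=2) - reference[None, :]   # per observer/clip
      variability = readings.std(axis=2, ddof=1)            # SD of 3 repeats
      print("mean error  :", round(errors.mean(), 2), "mmHg")
      print("variability :", round(variability.mean(), 2), "mmHg")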

  18. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oesterling, Patrick; Heine, Christian; Weber, Gunther H.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  19. Implant and prosthesis movement after enucleation: a randomized controlled trial.

    PubMed

    Shome, Debraj; Honavar, Santosh G; Raizada, Kuldeep; Raizada, Deepa

    2010-08-01

    To evaluate implant and prosthesis movement after myoconjunctival enucleation and subsequent polymethyl methacrylate (PMMA) implantation, compared with the traditional enucleation with muscle imbrication using a PMMA implant and with enucleation accompanied by porous polyethylene implantation. Randomized, controlled, observer-masked, interventional study. One hundred fifty patients, equally and randomly allocated to the 3 groups. Group 1 consisted of patients in whom a PMMA implant was used after enucleation with muscle imbrication (traditional PMMA group). Group 2 consisted of patients in whom a PMMA implant was used after enucleation with a myoconjunctival technique (myoconjunctival PMMA group). Group 3 consisted of patients in whom a porous polyethylene implant was used after enucleation by the scleral cap technique (porous polyethylene group). Fifty patients were included in each group. Patients were allocated to 1 of the 3 groups using stratified randomization. Informed consent was obtained. Acrylic prostheses custom made by a trained ocularist were fitted 6 weeks after surgery in all patients. A masked observer measured implant and prosthesis movement 6 weeks after surgery using a slit-lamp device with real-time video and still photographic documentation. Analysis of implant and prosthesis movement was carried out using the Mann-Whitney U test, and a P value of ≤0.03 was considered significant. Complications, including implant displacement and exposure, also were noted. Main outcome measures were implant and prosthesis movement. Myoconjunctival PMMA implant movement was better than that of the traditional PMMA implant (P = 0.001), but was similar to that of the porous polyethylene implant. Prosthesis movement with the myoconjunctival PMMA implant was better than that with either the traditional PMMA (P = 0.001) or porous polyethylene (P = 0.002) implants. The myoconjunctival enucleation technique with a PMMA implant provides statistically and clinically significantly better implant and prosthesis movement than the traditional PMMA implant and better prosthesis movement than the porous polyethylene implant. Copyright 2010 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
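
    For readers unfamiliar with the test used, a minimal illustration of the Mann-Whitney U comparison follows; the movement values (in millimetres) are invented placeholders, not trial data.

      # Mann-Whitney U comparison of two independent groups; toy numbers only.
      import numpy as np
      from scipy.stats import mannwhitneyu

      myoconjunctival = np.array([2.1, 2.4, 1.9, 2.6, 2.2])   # movement, mm
      traditional = np.array([1.2, 1.5, 1.1, 1.6, 1.4])
      stat, p = mannwhitneyu(myoconjunctival, traditional, alternative="two-sided")
      print(f"U = {stat}, p = {p:.4f}")   # significant if p <= 0.03 per the study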

  20. Global-Local Finite Element Analysis of Bonded Single-Lap Joints

    NASA Technical Reports Server (NTRS)

    Kilic, Bahattin; Madenci, Erdogan; Ambur, Damodar R.

    2004-01-01

    Adhesively bonded lap joints involve dissimilar material junctions and sharp changes in geometry, possibly leading to premature failure. Although the finite element method is well suited to model the bonded lap joints, traditional finite elements are incapable of correctly resolving the stress state at junctions of dissimilar materials because of the unbounded nature of the stresses. In order to facilitate the use of bonded lap joints in future structures, this study presents a finite element technique utilizing a global (special) element coupled with traditional elements. The global element includes the singular behavior at the junction of dissimilar materials with or without traction-free surfaces.

  1. Use of IBA Techniques to Characterize High Velocity Thermal Spray Coatings

    NASA Astrophysics Data System (ADS)

    Trompetter, W.; Markwitz, A.; Hyland, M.

    Spray coatings are being used in an increasingly wide range of industries to improve the abrasive, erosive and sliding wear of machine components. Over the past decade industries have moved to the application of supersonic high velocity thermal spray techniques. These coating techniques produce superior coating quality in comparison to other traditional techniques such as plasma spraying. To date, knowledge of the bonding processes and the structure of the particles within thermal spray coatings is very subjective. The aim of this research is to improve our understanding of these materials through the use of IBA techniques in conjunction with other materials analysis techniques. Samples were prepared by spraying a widely used commercial NiCr powder onto substrates using a HVAF (high velocity air fuel) thermal spraying technique. Detailed analysis of the composition and structure of the powder particles revealed two distinct types of particles: the majority were NiCr, with a significant minority composed of SiO2/CrO3. When the particles were investigated both as raw powder and in the sprayed coating, it was surprising to find that the composition of the coating material remained unchanged during the coating process despite the high velocity application.

  2. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
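
    The factorial ANOVA described can be sketched in a few lines. The toy table below mimics the structure (organism and reader as crossed factors on plate counts); the counts themselves are synthetic.

      # Two-factor ANOVA with interaction on plate counts; data are synthetic.
      import pandas as pd
      import statsmodels.api as sm
      from statsmodels.formula.api import ols

      df = pd.DataFrame({
          "count":    [52, 48, 61, 57, 45, 50, 63, 59],
          "organism": ["A", "A", "B", "B", "A", "A", "B", "B"],
          "reader":   ["r1", "r2", "r1", "r2", "r1", "r2", "r1", "r2"],
      })
      model = ols("count ~ C(organism) * C(reader)", data=df).fit()
      print(sm.stats.anova_lm(model, typ=2))   # factor and interaction effects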

  3. Handling Missing Data With Multilevel Structural Equation Modeling and Full Information Maximum Likelihood Techniques.

    PubMed

    Schminkey, Donna L; von Oertzen, Timo; Bullock, Linda

    2016-08-01

    With increasing access to population-based data and electronic health records for secondary analysis, missing data are common. In the social and behavioral sciences, missing data frequently are handled with multiple imputation methods or full information maximum likelihood (FIML) techniques, but healthcare researchers have not embraced these methodologies to the same extent and more often use either traditional imputation techniques or complete case analysis, which can compromise power and introduce unintended bias. This article is a review of options for handling missing data, concluding with a case study demonstrating the utility of multilevel structural equation modeling using full information maximum likelihood (MSEM with FIML) to handle large amounts of missing data. MSEM with FIML is a parsimonious and hypothesis-driven strategy to cope with large amounts of missing data without compromising power or introducing bias. This technique is relevant for nurse researchers faced with ever-increasing amounts of electronic data and decreasing research budgets. © 2016 Wiley Periodicals, Inc.
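
    The core of FIML is that each case contributes a likelihood over only its observed variables. A bare-bones sketch of that casewise log-likelihood follows; full MSEM software additionally parameterizes the mean vector and covariance matrix through the structural model, which is omitted here.

      # Casewise FIML log-likelihood for multivariate-normal data with
      # missing entries coded as np.nan; a sketch, not an MSEM fitter.
      import numpy as np
      from scipy.stats import multivariate_normal

      def fiml_loglik(data, mu, sigma):
          total = 0.0
          for row in data:
              obs = ~np.isnan(row)               # use only observed variables
              if not obs.any():
                  continue
              total += multivariate_normal.logpdf(
                  row[obs], mean=mu[obs], cov=sigma[np.ix_(obs, obs)])
          return total

      data = np.array([[1.0, 2.0, np.nan],
                       [0.5, np.nan, 1.5],
                       [1.2, 1.8, 2.1]])
      print(fiml_loglik(data, np.zeros(3), np.eye(3)))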

  4. Principles, Techniques, and Applications of Tissue Microfluidics

    NASA Technical Reports Server (NTRS)

    Wade, Lawrence A.; Kartalov, Emil P.; Shibata, Darryl; Taylor, Clive

    2011-01-01

    The principle of tissue microfluidics and its resultant techniques have been applied to cell analysis. Building microfluidics to suit a particular tissue sample would allow the rapid, reliable, inexpensive, highly parallelized, selective extraction of chosen regions of tissue for purposes of further biochemical analysis. Furthermore, the applicability of the techniques ranges beyond the described pathology application. For example, they would also allow the posing and successful answering of new sets of questions in many areas of fundamental research. The proposed integration of microfluidic techniques and tissue slice samples is called "tissue microfluidics" because it molds the microfluidic architectures in accordance with each particular structure of each specific tissue sample. Thus, microfluidics can be built around the tissues, following the tissue structure, or alternatively, the microfluidics can be adapted to the specific geometry of particular tissues. By contrast, the traditional approach is that microfluidic devices are structured in accordance with engineering considerations, while the biological components in applied devices are forced to comply with these engineering presets.

  5. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS?s Scientific Manuscript database

    Traditional microbiological techniques for estimating populations of viable bacteria can be laborious and time consuming. The Most Probable Number (MPN) technique is especially tedious as multiple series of tubes must be inoculated at several different dilutions. Recently, an instrument (TEMPO™) ...
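
    The MPN estimate itself is a small maximum-likelihood problem over the tube outcomes. A generic sketch follows, with hypothetical tube counts and inoculum volumes; this is the textbook formulation, not the instrument's algorithm.

      # Maximum-likelihood MPN from a dilution series; counts are hypothetical.
      import numpy as np
      from scipy.optimize import minimize_scalar

      volumes = np.array([10.0, 1.0, 0.1])    # inoculum per tube (mL)
      n_tubes = np.array([3, 3, 3])
      positives = np.array([3, 2, 1])

      def neg_loglik(lam):
          # P(tube positive) = 1 - exp(-lam * v) under a Poisson model
          p_pos = 1.0 - np.exp(-lam * volumes)
          return -(positives * np.log(p_pos)
                   - (n_tubes - positives) * lam * volumes).sum()

      res = minimize_scalar(neg_loglik, bounds=(1e-6, 100), method="bounded")
      print(f"MPN estimate: {res.x:.2f} organisms/mL")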

  6. [Water quality and personal hygiene in rural areas of Senegal].

    PubMed

    Faye, A; Ndiaye, N M; Faye, D; Tal-Dia, A

    2011-02-01

    The high prevalence of diarrhea in developing countries is mostly due to poor water quality and hygiene practices. The purpose of this study was to assess water quality as well as hygiene practices and their determinants in Ngohé, a rural community (RC) in Senegal. A combined approach consisting of a cross-sectional descriptive survey and bacterial analysis of water was used. The study was conducted in 312 randomly selected households. Data were collected through individual interviews with the assistance of a guide. Water for bacteriological analysis was collected from various sources, i.e., 3 modern borehole wells, 2 protected wells, and 10 traditional wells. Study points included home water treatment, drinking water source, latrine use, hand washing habits, and bacteria identified in water. A multiple regression model was used for data analysis. The household survey population was 59% male, 61% illiterate, and 93% married. Mean age was 44.8 +/- 18.1 years. Chlorination technique was inadequate in 62% of cases. Latrines were not restricted to adult use in 76% of homes. Hand washing was not performed at critical times in 94%. Drinking water was drawn from traditional wells in 48% of households, modern borehole wells in 45% and protected wells in 7%. Escherichia coli was found in water from all three sources and Vibrio cholerae was found in two traditional wells. Level of education, average monthly income, knowledge about chlorination techniques, and source of the water consumed were the main behavioral determinants (p < 0.05). Water treatment at the source and in the home, as well as protection of water sources, is necessary to ensure water quality. This will require effective public education campaigns and financial support for improvement of sanitary facilities.

  7. Analysis of Selected Aspects of Students' Performance and Satisfaction in a Moodle-Based E-Learning System Environment

    ERIC Educational Resources Information Center

    Umek, Lan; Aristovnik, Aleksander; Tomaževic, Nina; Keržic, Damijana

    2015-01-01

    The use of e-learning techniques in higher education is becoming ever more frequent. In some institutions, e-learning has completely replaced the traditional teaching methods, while in others it supplements classical courses. The paper presents a study conducted in a member institution of the University of Ljubljana that provides public…

  8. Mining Student Data Captured from a Web-Based Tutoring Tool: Initial Exploration and Results

    ERIC Educational Resources Information Center

    Merceron, Agathe; Yacef, Kalina

    2004-01-01

    In this article we describe the initial investigations that we have conducted on student data collected from a web-based tutoring tool. We have used some data mining techniques such as association rule and symbolic data analysis, as well as traditional SQL queries to gain further insight on the students' learning and deduce information to improve…

  9. To Flip or Not to Flip? Analysis of a Flipped Classroom Pedagogy in a General Biology Course

    ERIC Educational Resources Information Center

    Heyborne, William H.; Perrett, Jamis J.

    2016-01-01

    In an attempt to better understand the flipped technique and evaluate its purported superiority in terms of student learning gains, the authors conducted an experiment comparing a flipped classroom to a traditional lecture classroom. Although the outcomes were mixed, regarding the superiority of either pedagogical approach, there does seem to be a…

  10. Monitoring urban tree cover using object-based image analysis and public domain remotely sensed data

    Treesearch

    L. Monika Moskal; Diane M. Styers; Meghan Halabisky

    2011-01-01

    Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies, however, such techniques are not generally appropriate for assessing these highly variable landscapes....

  11. An Incremental Life-cycle Assurance Strategy for Critical System Certification

    DTIC Science & Technology

    2014-11-04

    Fragmentary abstract recovered from briefing slides: embedded software systems introduce a new class of problems not addressed by traditional system modeling and analysis; latency jitter in data streams affects control behavior; and system-level failures still occur despite fault-tolerance techniques being deployed, with the embedded software system a major source of such failures.

  12. Comparative study between 2 methods of mounting models in semiadjustable articulator for orthognathic surgery.

    PubMed

    Mayrink, Gabriela; Sawazaki, Renato; Asprino, Luciana; de Moraes, Márcio; Fernandes Moreira, Roger William

    2011-11-01

    To compare the traditional method of mounting dental casts on a semiadjustable articulator with the new method suggested by Wolford and Galiano, analyzing the inclination of the maxillary occlusal plane in relation to the FHP. Two casts of 10 patients were obtained. One was used for mounting models on a traditional articulator using a face-bow transfer system, and the other for mounting models on the Occlusal Plane Indicator platform (OPI) using the SAM articulator. An analysis of the accuracy of the mounted models was then performed: the angle made by the occlusal plane and FHP on the cephalogram should equal the angle between the occlusal plane and the upper member of the articulator. The measures were tabulated in Microsoft Excel(®) and analyzed using a one-way analysis of variance. Statistically, the results did not reveal significant differences among the measures. OPI and face bow present similar results, but more studies are needed to verify the accuracy of the maxillary cant in OPI or to develop new techniques able to overcome the disadvantages of each technique. Copyright © 2011 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  13. Gas Chromatography Analysis with Olfactometric Detection (GC-O) as a Useful Methodology for Chemical Characterization of Odorous Compounds

    PubMed Central

    Brattoli, Magda; Cisternino, Ezia; Dambruoso, Paolo Rosario; de Gennaro, Gianluigi; Giungato, Pasquale; Mazzone, Antonio; Palmisani, Jolanda; Tutino, Maria

    2013-01-01

    The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor-active compounds. The GC-O technique is already widely used for the evaluation of food aromas, and its application in environmental fields is increasing, moving odor emission assessment from solely olfactometric evaluations to the characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potential of GC-O is described, highlighting the improvements this methodology offers relative to other conventional approaches used for odor detection, such as sensor-based, sensorial and traditional gas chromatographic methods. The paper also provides an examination of the different fields of application of GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities, and odorous compounds emitted by materials and medical applications. PMID:24316571

  14. In-situ spectroscopic analysis of the traditional dyeing pigment Turkey red inside textile matrix

    NASA Astrophysics Data System (ADS)

    Meyer, M.; Huthwelker, T.; Borca, C. N.; Meßlinger, K.; Bieber, M.; Fink, R. H.; Späth, A.

    2018-03-01

    Turkey red is a traditional pigment for textile dyeing, and its use has been documented across various cultures within the last three millennia. The pigment is a dye-mordant complex consisting of Al and an extract from R. tinctorum that contains mainly the anthraquinone derivative alizarin. The chemical structure of the complex has been analyzed by various spectroscopic and crystallographic techniques, either on extracts from textiles or directly in solution. We present an in-situ study of Turkey red by means of μ-XRF mapping and NEXAFS spectroscopy on textile fibres dyed according to a traditional process, to gain insight into the coordination chemistry of the pigment in a realistic matrix. We find an octahedral coordination of Al that corresponds well to the commonly accepted structure of the Al-alizarin complex derived from ex-situ studies.

  15. Trailing Ballute Aerocapture: Concept and Feasibility Assessment

    NASA Technical Reports Server (NTRS)

    Miller, Kevin L.; Gulick, Doug; Lewis, Jake; Trochman, Bill; Stein, Jim; Lyons, Daniel T.; Wilmoth, Richard G.

    2003-01-01

    Trailing Ballute Aerocapture offers the potential to obtain orbit insertion around a planetary body at a fraction of the mass of traditional methods. This allows for lower costs for launch, faster flight times and additional mass available for science payloads. The technique involves an inflated ballute (balloon-parachute) that provides aerodynamic drag area for use in the atmosphere of a planetary body to provide for orbit insertion in a relatively benign heating environment. To account for atmospheric, navigation and other uncertainties, the ballute is oversized and detached once the desired velocity change (Delta V) has been achieved. Analysis and trades have been performed for the purpose of assessing the feasibility of the technique including aerophysics, material assessments, inflation system and deployment sequence and dynamics, configuration trades, ballute separation and trajectory analysis. Outlined is the technology development required for advancing the technique to a level that would allow it to be viable for use in space exploration missions.

  16. Imaging challenges in biomaterials and tissue engineering

    PubMed Central

    Appel, Alyssa A.; Anastasio, Mark A.; Larson, Jeffery C.; Brey, Eric M.

    2013-01-01

    Biomaterials are employed in the fields of tissue engineering and regenerative medicine (TERM) in order to enhance the regeneration or replacement of tissue function and/or structure. The unique environments resulting from the presence of biomaterials, cells, and tissues result in distinct challenges with regard to monitoring and assessing the results of these interventions. Imaging technologies for three-dimensional (3D) analysis have been identified as a strategic priority in TERM research. Traditionally, histological and immunohistochemical techniques have been used to evaluate engineered tissues. However, these methods do not allow for an accurate volume assessment, are invasive, and do not provide information on functional status. Imaging techniques are needed that enable non-destructive, longitudinal, quantitative, and three-dimensional analysis of TERM strategies. This review focuses on evaluating the application of available imaging modalities for assessment of biomaterials and tissue in TERM applications. Included is a discussion of limitations of these techniques and identification of areas for further development. PMID:23768903

  17. The effects of different representations on static structure analysis of computer malware signatures.

    PubMed

    Narayanan, Ajit; Chen, Yi; Pang, Shaoning; Tao, Ban

    2013-01-01

    The continuous growth of malware presents a problem for internet computing due to increasingly sophisticated techniques for disguising malicious code through mutation and the time required to identify signatures for use by antiviral software systems (AVS). Malware modelling has focused primarily on semantics due to the intended actions and behaviours of viral and worm code. The aim of this paper is to evaluate a static structure approach to malware modelling using the growing malware signature databases now available. We show that, if malware signatures are represented as artificial protein sequences, it is possible to apply standard sequence alignment techniques in bioinformatics to improve accuracy of distinguishing between worm and virus signatures. Moreover, aligned signature sequences can be mined through traditional data mining techniques to extract metasignatures that help to distinguish between viral and worm signatures. All bioinformatics and data mining analyses were performed using publicly available tools, including Weka.
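
    The alignment step can be illustrated with a compact Needleman-Wunsch global-alignment scorer over signature-derived "protein" strings. The scoring values and the toy fragments below are arbitrary; the paper's analysis used standard bioinformatics tools rather than this hand-rolled version.

      # Minimal Needleman-Wunsch global alignment score; toy parameters.
      import numpy as np

      def global_align_score(a, b, match=1, mismatch=-1, gap=-2):
          m, n = len(a), len(b)
          F = np.zeros((m + 1, n + 1))
          F[:, 0] = gap * np.arange(m + 1)      # leading-gap penalties
          F[0, :] = gap * np.arange(n + 1)
          for i in range(1, m + 1):
              for j in range(1, n + 1):
                  diag = F[i-1, j-1] + (match if a[i-1] == b[j-1] else mismatch)
                  F[i, j] = max(diag, F[i-1, j] + gap, F[i, j-1] + gap)
          return F[m, n]

      print(global_align_score("MKVLA", "MKLA"))   # toy signature fragments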

  18. The Effects of Different Representations on Static Structure Analysis of Computer Malware Signatures

    PubMed Central

    Narayanan, Ajit; Chen, Yi; Pang, Shaoning; Tao, Ban

    2013-01-01

    The continuous growth of malware presents a problem for internet computing due to increasingly sophisticated techniques for disguising malicious code through mutation and the time required to identify signatures for use by antiviral software systems (AVS). Malware modelling has focused primarily on semantics due to the intended actions and behaviours of viral and worm code. The aim of this paper is to evaluate a static structure approach to malware modelling using the growing malware signature databases now available. We show that, if malware signatures are represented as artificial protein sequences, it is possible to apply standard sequence alignment techniques in bioinformatics to improve accuracy of distinguishing between worm and virus signatures. Moreover, aligned signature sequences can be mined through traditional data mining techniques to extract metasignatures that help to distinguish between viral and worm signatures. All bioinformatics and data mining analyses were performed using publicly available tools, including Weka. PMID:23983644

  19. [Research progress and application prospect of near infrared spectroscopy in soil nutrition analysis].

    PubMed

    Ding, Hai-quan; Lu, Qi-peng

    2012-01-01

    "Digital agriculture" or "precision agriculture" is an important direction of modern agriculture technique. It is the combination of the modern information technique and traditional agriculture and becomes a hotspot field in international agriculture research in recent years. As a nondestructive, real-time, effective and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospect in agrology and gradually gained the recognition. The present paper intends to review the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should based on portable NIR spectrograph in order to acquire qualitative or quantitative information from real-time measuring in field. In addition, NIRS could be combined with space remote sensing to macroscopically control the way crop is growing and the nutrition crops need, to change the current state of our country's agriculture radically.

  20. CRISPR/Cas9 and genome editing in Drosophila.

    PubMed

    Bassett, Andrew R; Liu, Ji-Long

    2014-01-20

    Recent advances in our ability to design DNA binding factors with specificity for desired sequences have resulted in a revolution in genetic engineering, enabling directed changes to the genome to be made relatively easily. Traditional techniques for generating genetic mutations in most organisms have relied on selection from large pools of randomly induced mutations for those of particular interest, or time-consuming gene targeting by homologous recombination. Drosophila melanogaster has always been at the forefront of genetic analysis, and application of these new genome editing techniques to this organism will revolutionise our approach to performing analysis of gene function in the future. We discuss the recent techniques that apply the CRISPR/Cas9 system to Drosophila, highlight potential uses for this technology and speculate upon the future of genome engineering in this model organism. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Histology image analysis for carcinoma detection and grading

    PubMed Central

    He, Lei; Long, L. Rodney; Antani, Sameer; Thoma, George R.

    2012-01-01

    This paper presents an overview of the image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research, attempting to significantly reduce the labor and subjectivity of traditional manual intervention with histology images. The task of automated histology image analysis is usually not simple, due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas, of the cervix, prostate, breast, and lung, are selected to illustrate the functions and capabilities of existing CAD systems. PMID:22436890

  2. Automated measurement of birefringence - Development and experimental evaluation of the techniques

    NASA Technical Reports Server (NTRS)

    Voloshin, A. S.; Redner, A. S.

    1989-01-01

    Traditional photoelasticity has started to lose its appeal since it requires a well-trained specialist to acquire and interpret results. A spectral-contents-analysis approach may help to revive this old, but still useful technique. Light intensity of the beam passed through the stressed specimen contains all the information necessary to automatically extract the value of retardation. This is done by using a photodiode array to investigate the spectral contents of the light beam. Three different techniques to extract the value of retardation from the spectral contents of the light are discussed and evaluated. An experimental system was built which demonstrates the ability to evaluate retardation values in real time.
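
    The spectral-contents idea can be sketched numerically: for a birefringent specimen between crossed polarizers, transmitted intensity varies roughly as sin^2(pi*R/lambda), so scanning candidate retardations R for the best fit to the measured spectrum recovers R. The numbers below are illustrative, and the idealized model ignores dispersion and the source spectrum.

      # Recover retardation R from a (simulated) transmission spectrum.
      import numpy as np

      wavelengths = np.linspace(400e-9, 700e-9, 300)        # visible band, m
      true_R = 1.2e-6                                       # retardation, m
      measured = np.sin(np.pi * true_R / wavelengths) ** 2  # ideal spectrum

      candidates = np.linspace(0.1e-6, 3e-6, 3000)
      errs = [np.sum((np.sin(np.pi * R / wavelengths) ** 2 - measured) ** 2)
              for R in candidates]
      best = candidates[int(np.argmin(errs))]
      print(f"estimated R = {best * 1e6:.2f} micrometres")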

  3. Headspace techniques in foods, fragrances and flavors: an overview.

    PubMed

    Rouseff, R; Cadwallader, K

    2001-01-01

    Headspace techniques have traditionally involved the collection of volatiles in the vapor state under either dynamic or static conditions as a means of determining concentrations in the product of interest. A brief overview of contemporary headspace applications and recent innovations is presented from the literature and chapters in this book. New approaches used to concentrate volatiles under static conditions, such as solid-phase microextraction (SPME), are examined. Advances in purge-and-trap applications and automation are also presented. Innovative methods of evaluating headspace volatiles using solid-state sensor arrays (electronic noses) or mass spectrometers without prior separation are referenced. Numerous food and beverage headspace techniques are also reviewed. Advantages, limitations and alternatives to headspace analysis are presented.

  4. Decorin content and near infrared spectroscopy analysis of dried collagenous biomaterial samples.

    PubMed

    Aldema-Ramos, Mila L; Castell, Joan Carles; Muir, Zerlina E; Adzet, Jose Maria; Sabe, Rosa; Schreyer, Suzanne

    2012-12-14

    The efficient removal of proteoglycans, such as decorin, from the hide when processing it into leather by traditional means is generally acceptable and beneficial for leather quality, especially for softness and flexibility. A patented waterless (acetone dehydration) method was developed that can generate a leather-like product called Dried Collagenous Biomaterial (known as BCD), but this method has no effect on decorin removal efficiency. The Alcian Blue colorimetric technique was used to assay the sulfated glycosaminoglycan (sGAG) portion of decorin. The corresponding residual decorin content was correlated to the mechanical properties of the BCD samples and was comparable to that of the control leather made traditionally. The waterless dehydration and instantaneous chrome tanning process is a good eco-friendly alternative for transforming hides into leather, because no additional effects were observed after examination using NIR spectroscopy and additional chemometric analysis.

  5. Mini-open lateral retroperitoneal lumbar spine approach using psoas muscle retraction technique. Technical report and initial results on six patients.

    PubMed

    Aghayev, Kamran; Vrionis, Frank D

    2013-09-01

    The main aim of this paper was to report a reproducible method of lumbar spine access via a lateral retroperitoneal route. The authors conducted a retrospective analysis of the technical aspects and clinical outcomes of six patients who underwent lateral multilevel retroperitoneal interbody fusion with the psoas muscle retraction technique. The main goal was to develop a simple and reproducible technique to avoid injury to the lumbar plexus. Six patients were operated on at 15 levels using the psoas muscle retraction technique. All patients reported improvement in back pain and radiculopathy after the surgery. The only procedure-related transient complication was weakness and pain on hip flexion that resolved by the first follow-up visit. The psoas retraction technique is a reliable technique for lateral access to the lumbar spine and may avoid some of the complications related to the traditional minimally invasive transpsoas approach.

  6. Segmental Refinement: A Multigrid Technique for Data Locality

    DOE PAGES

    Adams, Mark F.; Brown, Jed; Knepley, Matt; ...

    2016-08-04

    In this paper, we investigate a domain decomposed multigrid technique, termed segmental refinement, for solving general nonlinear elliptic boundary value problems. We extend the method first proposed in 1994 by analytically and experimentally investigating its complexity. We confirm that communication of traditional parallel multigrid is eliminated on fine grids, with modest amounts of extra work and storage, while maintaining the asymptotic exactness of full multigrid. We observe an accuracy dependence on the segmental refinement subdomain size, which was not considered in the original analysis. Finally, we present a communication complexity analysis that quantifies the communication costs ameliorated by segmental refinement and report performance results with up to 64K cores on a Cray XC30.
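
    For context, the sketch below is a plain V-cycle for the 1D Poisson problem, the standard multigrid baseline that segmental refinement modifies to avoid fine-grid communication; it is not the paper's algorithm.

      # Textbook multigrid V-cycle for -u'' = f on [0,1], u(0)=u(1)=0.
      import numpy as np

      def smooth(u, f, h2, sweeps=3):
          for _ in range(sweeps):                 # Gauss-Seidel sweeps
              for i in range(1, u.size - 1):
                  u[i] = 0.5 * (u[i-1] + u[i+1] + h2 * f[i])

      def vcycle(u, f):
          n = u.size - 1
          h2 = 1.0 / (n * n)
          smooth(u, f, h2)                        # pre-smoothing
          if n > 2:
              r = np.zeros_like(u)                # residual r = f - A u
              r[1:-1] = f[1:-1] - (2*u[1:-1] - u[:-2] - u[2:]) / h2
              rc = np.zeros(n // 2 + 1)           # full-weighting restriction
              rc[1:-1] = 0.25*r[1:-2:2] + 0.5*r[2:-1:2] + 0.25*r[3::2]
              ec = vcycle(np.zeros(n // 2 + 1), rc)  # coarse-grid correction
              e = np.zeros_like(u)                # linear prolongation
              e[::2] = ec
              e[1::2] = 0.5 * (ec[:-1] + ec[1:])
              u += e
          smooth(u, f, h2)                        # post-smoothing
          return u

      n = 64
      x = np.linspace(0.0, 1.0, n + 1)
      u = np.zeros(n + 1)
      for _ in range(8):
          u = vcycle(u, np.sin(np.pi * x))
      print("max error vs exact:", np.abs(u - np.sin(np.pi * x) / np.pi**2).max())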

  7. Meteor tracking via local pattern clustering in spatio-temporal domain

    NASA Astrophysics Data System (ADS)

    Kukal, Jaromír; Klimt, Martin; Švihlík, Jan; Fliegel, Karel

    2016-09-01

    Reliable meteor detection is one of the crucial disciplines in astronomy. A variety of imaging systems is used for meteor path reconstruction. The traditional approach is based on analysis of 2D image sequences obtained from a double-station video observation system. Precise localization of the meteor path is difficult due to atmospheric turbulence and other factors causing spatio-temporal fluctuations of the image background. The proposed technique performs non-linear preprocessing of image intensity using the Box-Cox transform, as recommended in our previous work. Both symmetric and asymmetric spatio-temporal differences are designed to be robust in the statistical sense. The resulting local patterns are processed by a data-whitening technique, and the obtained vectors are classified via cluster analysis and a Self-Organizing Map (SOM).
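
    The preprocessing chain (Box-Cox transform, local temporal differences, whitening, clustering) can be sketched as follows. The image stack is random placeholder data, and k-means stands in for the paper's SOM classifier.

      # Box-Cox transform, per-pixel temporal differences, whitening, clustering.
      import numpy as np
      from scipy.stats import boxcox
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      frames = rng.gamma(2.0, 1.0, size=(8, 32, 32)) + 1e-3  # fake image stack

      flat, lam = boxcox(frames.ravel())        # variance-stabilizing transform
      frames = flat.reshape(frames.shape)

      vecs = np.diff(frames, axis=0).reshape(7, -1).T  # 7-dim vector per pixel
      vecs = vecs - vecs.mean(axis=0)

      evals, evecs = np.linalg.eigh(np.cov(vecs, rowvar=False))
      white = vecs @ evecs / np.sqrt(evals)     # decorrelated, unit variance

      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(white)
      print(np.bincount(labels))                # cluster sizes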

  8. Application of Deep Learning in Automated Analysis of Molecular Images in Cancer: A Survey

    PubMed Central

    Xue, Yong; Chen, Shihui; Liu, Yong

    2017-01-01

    Molecular imaging enables the visualization and quantitative analysis of the alterations of biological procedures at molecular and/or cellular level, which is of great significance for early detection of cancer. In recent years, deep learning has been widely used in medical imaging analysis, as it overcomes the limitations of visual assessment and traditional machine learning techniques by extracting hierarchical features with powerful representation capability. Research on cancer molecular images using deep learning techniques is also increasing dynamically. Hence, in this paper, we review the applications of deep learning in molecular imaging in terms of tumor lesion segmentation, tumor classification, and survival prediction. We also outline some future directions in which researchers may develop more powerful deep learning models for better performance in the applications in cancer molecular imaging. PMID:29114182

  9. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Z.; Gilli, L.; Lathouwers, D.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years however polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is proved to be advantageous both from an accuracy and a computational point of view. As a demonstration the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
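
    A minimal, non-adaptive one-dimensional polynomial chaos sketch may clarify the idea: project a model response f(xi), with xi standard normal, onto probabilists' Hermite polynomials by Gauss-Hermite quadrature. The exponential response below is a placeholder; the paper's adaptive sparse grids generalize this construction to many dimensions.

      # 1D Hermite polynomial chaos by quadrature; the model f is a stand-in.
      import numpy as np
      from numpy.polynomial.hermite_e import hermegauss, hermeval
      from math import factorial, sqrt, pi

      f = lambda x: np.exp(0.3 * x)          # placeholder "model response"
      x, w = hermegauss(20)                  # nodes/weights for exp(-x^2/2)

      order = 5
      coeffs = []
      for m in range(order + 1):
          basis = hermeval(x, [0]*m + [1])   # He_m evaluated at the nodes
          # c_m = E[f(X) He_m(X)] / m!  (weights integrate against exp(-x^2/2))
          coeffs.append((w * f(x) * basis).sum() / (factorial(m) * sqrt(2 * pi)))

      mean = coeffs[0]
      var = sum(factorial(m) * coeffs[m]**2 for m in range(1, order + 1))
      print(f"PCE mean {mean:.4f}, variance {var:.4f}")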

  10. Acetic acid bacteria from biofilm of strawberry vinegar visualized by microscopy and detected by complementing culture-dependent and culture-independent techniques.

    PubMed

    Valera, Maria José; Torija, Maria Jesús; Mas, Albert; Mateo, Estibaliz

    2015-04-01

    Acetic acid bacteria (AAB) usually develop a biofilm on the air-liquid interface of vinegar elaborated by the traditional method. This is the first study in which the AAB microbiota present in a biofilm of vinegar obtained by the traditional method was detected by pyrosequencing. Direct genomic DNA extraction from the biofilm was set up to obtain DNA of suitable quality for culture-independent molecular techniques. The set of primers and the TaqMan-MGB probe designed in this study to enumerate the total AAB population by real-time PCR detected between 8 × 10(5) and 1.2 × 10(6) cells/g in the biofilm. The pyrosequencing approach identified up to 10 AAB genera. The combination of culture-dependent and culture-independent molecular techniques provided a broader view of the AAB microbiota of the strawberry biofilm, which was dominated by the Ameyamaea, Gluconacetobacter, and Komagataeibacter genera. Culture-dependent techniques allowed isolating only one genotype, which was assigned to the genus Ameyamaea and required further analysis for correct species identification. Furthermore, biofilm visualization by laser confocal microscopy and scanning electron microscopy showed different dispositions and cell morphologies in the strawberry vinegar biofilm compared with a grape vinegar biofilm. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Assessment of three-dimensional high-definition visualization technology to perform microvascular anastomosis.

    PubMed

    Wong, Alex K; Davis, Gabrielle B; Nguyen, T JoAnna; Hui, Kenneth J W S; Hwang, Brian H; Chan, Linda S; Zhou, Zhao; Schooler, Wesley G; Chandrasekhar, Bala S; Urata, Mark M

    2014-07-01

    Traditional visualization techniques in microsurgery require strict positioning in order to maintain the field of visualization. However, static posturing over time may lead to musculoskeletal strain and injury. Three-dimensional high-definition (3DHD) visualization technology may be a useful adjunct to limiting static posturing and improving ergonomics in microsurgery. In this study, we aimed to investigate the benefits of using the 3DHD technology over traditional techniques. A total of 14 volunteers consisting of novice and experienced microsurgeons performed femoral anastomoses on male Sprague-Dawley retired breeder rats using traditional techniques as well as the 3DHD technology and compared the two techniques. Participants subsequently completed a questionnaire regarding their preference in terms of operational parameters, ergonomics, overall quality, and educational benefits. Efficiency was also evaluated by mean times to complete the anastomosis with each technique. A total of 27 anastomoses were performed, 14 of 14 using the traditional microscope and 13 of 14 using the 3DHD technology. Preference toward the traditional modality was noted with respect to the parameters of precision, field adjustments, zoom and focus, depth perception, and overall quality. The 3DHD technique was preferred for improved stamina and less back and eye strain. Participants believed that the 3DHD technique was the better method for learning microsurgery. Longer mean time of anastomosis completion was noted in participants utilizing the 3DHD technique. The 3DHD technology may prove to be valuable in improving proper ergonomics in microsurgery. In addition, it may be useful in medical education when applied to the learning of new microsurgical skills. More studies are warranted to determine its efficacy and safety in a clinical setting. Copyright © 2014 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  12. Alternative and traditional assessments: Their comparative impact on students' attitudes and science learning outcomes. An exploratory study

    NASA Astrophysics Data System (ADS)

    Century, Daisy Nelson

    This probing study focused on alternative and traditional assessments and their comparative impacts on students' attitudes and science learning outcomes. Four basic questions were asked: What type of science learning stemming from the instruction can best be assessed by the use of a traditional paper-and-pencil test? What type of science learning stemming from the instruction can best be assessed by the use of alternative assessment? What are the differences in the types of learning outcomes that can be assessed by the use of paper-and-pencil tests and alternative assessments? Is there a difference in students' attitude towards learning science when assessment of outcomes is by alternative means compared to traditional means? A mixed methodology involving quantitative and qualitative techniques was utilized; however, the study was essentially a case study. Quantitative data analysis included content achievement and attitude results, to which non-parametric statistics were applied. Analysis of qualitative data was done as a case study utilizing pre-set protocols, resulting in a narrative-summary style of report. These outcomes were combined in order to produce conclusions. This study revealed that the traditional method yielded more concrete cognitive content learning than did the alternative assessment. The alternative assessment yielded more psychomotor, cooperative learning and critical thinking skills. In both the alternative and the traditional methods the students' attitudes toward science were positive, with no significant differences favoring either group. The quantitative finding of no statistically significant differences suggests that, at a minimum, there is no loss in the use of alternative assessment methods, in this instance performance testing. Adding the results from the qualitative analysis suggests (1) that class groups were more satisfied when alternative methods were employed, and (2) that the two assessment methodologies are complementary to each other, and thus should probably be used together to produce maximum benefit.

  13. DNA Barcoding for the Identification and Authentication of Animal Species in Traditional Medicine.

    PubMed

    Yang, Fan; Ding, Fei; Chen, Hong; He, Mingqi; Zhu, Shixin; Ma, Xin; Jiang, Li; Li, Haifeng

    2018-01-01

    Animal-based traditional medicine not only plays a significant role in therapeutic practices worldwide but also provides a potential compound library for drug discovery. However, persistent hunting and illegal trade markedly threaten numerous medicinal animal species, and increasing demand further provokes the emergence of various adulterants. As the conventional methods are difficult and time-consuming to detect processed products or identify animal species with similar morphology, developing novel authentication methods for animal-based traditional medicine represents an urgent need. During the last decade, DNA barcoding offers an accurate and efficient strategy that can identify existing species and discover unknown species via analysis of sequence variation in a standardized region of DNA. Recent studies have shown that DNA barcoding as well as minibarcoding and metabarcoding is capable of identifying animal species and discriminating the authentics from the adulterants in various types of traditional medicines, including raw materials, processed products, and complex preparations. These techniques can also be used to detect the unlabelled and threatened animal species in traditional medicine. Here, we review the recent progress of DNA barcoding for the identification and authentication of animal species used in traditional medicine, which provides a reference for quality control and trade supervision of animal-based traditional medicine.
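
    As a toy illustration of barcode matching, the sketch below assigns a query fragment to the reference with the highest k-mer (Jaccard) similarity. Real workflows use BLAST or distance/tree-based methods, and the sequences here are made up.

      # Toy barcode assignment by k-mer Jaccard similarity; sequences invented.
      def kmers(seq, k=8):
          return {seq[i:i+k] for i in range(len(seq) - k + 1)}

      def jaccard(a, b):
          return len(a & b) / len(a | b)

      references = {
          "species_A": "ATGGCATTACTAGGAACTGCAT" * 3,
          "species_B": "ATGGCCCTGCTAGGCACAGCTT" * 3,
      }
      query = "ATGGCATTACTAGGAACTGCAT" * 2 + "ATGGCATTACTAGGAACTGCTT"

      q = kmers(query)
      best = max(references, key=lambda name: jaccard(q, kmers(references[name])))
      print("assigned to", best)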

  14. DNA Barcoding for the Identification and Authentication of Animal Species in Traditional Medicine

    PubMed Central

    Yang, Fan; Ding, Fei; Chen, Hong; He, Mingqi; Zhu, Shixin; Ma, Xin; Jiang, Li

    2018-01-01

    Animal-based traditional medicine not only plays a significant role in therapeutic practices worldwide but also provides a potential compound library for drug discovery. However, persistent hunting and illegal trade markedly threaten numerous medicinal animal species, and increasing demand further provokes the emergence of various adulterants. As the conventional methods are difficult and time-consuming to detect processed products or identify animal species with similar morphology, developing novel authentication methods for animal-based traditional medicine represents an urgent need. During the last decade, DNA barcoding offers an accurate and efficient strategy that can identify existing species and discover unknown species via analysis of sequence variation in a standardized region of DNA. Recent studies have shown that DNA barcoding as well as minibarcoding and metabarcoding is capable of identifying animal species and discriminating the authentics from the adulterants in various types of traditional medicines, including raw materials, processed products, and complex preparations. These techniques can also be used to detect the unlabelled and threatened animal species in traditional medicine. Here, we review the recent progress of DNA barcoding for the identification and authentication of animal species used in traditional medicine, which provides a reference for quality control and trade supervision of animal-based traditional medicine. PMID:29849709

  15. Soy-enhanced lunch acceptance by preschoolers.

    PubMed

    Endres, Jeannette; Barter, Sharon; Theodora, Perseli; Welch, Patricia

    2003-03-01

    To evaluate acceptance of soy-enhanced compared with traditional menus by preschool children. Soy-enhanced foods were substituted on a traditional cycle menu, and the amount eaten, energy, and nutrient values for traditional and soy-enhanced lunches were compared. A traditional three-week cycle menu, using the Child and Adult Care Food Program (CACFP) meal pattern guidelines, was used to develop a comparable soy-enhanced menu. Traditional and soy-enhanced lunches were randomly assigned to respective days. Foods were portioned onto individual plates using standardized measuring utensils. Individual plate waste techniques were used to collect food waste. Subjects/setting: Participants were preschool children, three to six years of age and of white and Hispanic origin, attending a part-day Head Start program. Statistical analyses performed: Analysis of covariance was used to adjust lunch and food intakes for differences in average amounts of foods served. The Nutrient Data System was used to calculate energy and nutrient content of lunches. Analysis of variance was used to calculate differences in amounts eaten, energy values, and nutrient values of traditional and soy-enhanced lunches and foods. Data analyses were performed with the Statistical Analysis Software (version 8.0, 1999, SAS Institute, Cary, NC). Soy-enhanced foods were successfully substituted for 23 traditional foods included in the cycle menus. Soy-enhanced foods tended to be higher in energy, protein, and iron. Traditional lunches tended to be higher in fat, saturated fat, and vitamin A. Consumption of energy, protein, fiber, and iron was significantly lower from traditional than from soy-enhanced lunch menus. Applications/conclusions: Acceptance of soy-enhanced lunches was shown because there were no significant differences in the average amount eaten (grams per meal) between traditional and soy-enhanced lunches. Preschool programs can substitute soy-enhanced for traditional foods, which will add variety to the diet without sacrificing taste, energy, or nutrient value. The fat and energy content of the lunches was higher than recommended, and soy-enhanced foods were not always lower in fat. There is a need for the food industry and foodservice personnel to address the energy and fat content of all foods served in lunches to preschool children, because a few extra calories added to daily intakes can contribute to weight gain.

  16. Use Hierarchical Storage and Analysis to Exploit Intrinsic Parallelism

    NASA Astrophysics Data System (ADS)

    Zender, C. S.; Wang, W.; Vicente, P.

    2013-12-01

    Big Data is an ugly name for the scientific opportunities and challenges created by the growing wealth of geoscience data. How do we weave large, disparate datasets together to best reveal their underlying properties, to exploit their strengths and minimize their weaknesses, and to continually aggregate more information than the world knew yesterday and less than we will learn tomorrow? Data analytics techniques (statistics, data mining, machine learning, etc.) can accelerate pattern recognition and discovery. However, researchers must often, prior to analysis, organize multiple related datasets into a coherent framework. Hierarchical organization permits entire datasets to be stored in nested groups that reflect their intrinsic relationships and similarities. Hierarchical data can be simpler and faster to analyze because operators can be coded to automatically parallelize processing over isomorphic storage units, i.e., groups. The newest generation of netCDF Operators (NCO) embodies this hierarchical approach, while still supporting traditional analysis approaches. We will use NCO to demonstrate the trade-offs involved in processing a prototypical Big Data application (analysis of CMIP5 datasets) using hierarchical and traditional analysis approaches.
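
    The group-per-dataset idea is easy to picture with the netCDF4 Python library, which exposes the same hierarchical data model that NCO operates on. The file layout and group names below are illustrative, not the CMIP5 demonstration described in the abstract.

    ```python
    # Sketch of hierarchical storage: related datasets (e.g., climate models)
    # become sibling groups in one netCDF file, so analysis can loop over
    # isomorphic groups -- a natural unit of parallelism.
    from netCDF4 import Dataset
    import numpy as np

    with Dataset("cmip5_demo.nc", "w") as root:
        for model in ("ModelA", "ModelB"):        # one group per climate model
            grp = root.createGroup(model)
            grp.createDimension("time", 12)
            tas = grp.createVariable("tas", "f4", ("time",))
            tas[:] = 288.0 + np.random.randn(12)  # placeholder temperatures

    # Analysis iterates over groups; each iteration could run in parallel:
    with Dataset("cmip5_demo.nc") as root:
        for name, grp in root.groups.items():
            print(name, float(grp.variables["tas"][:].mean()))
    ```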

  17. Decision Support System Requirements Definition for Human Extravehicular Activity Based on Cognitive Work Analysis

    PubMed Central

    Miller, Matthew James; McGuire, Kerry M.; Feigh, Karen M.

    2016-01-01

    The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design. PMID:28491008

  18. Decision Support System Requirements Definition for Human Extravehicular Activity Based on Cognitive Work Analysis.

    PubMed

    Miller, Matthew James; McGuire, Kerry M; Feigh, Karen M

    2017-06-01

    The design and adoption of decision support systems within complex work domains is a challenge for cognitive systems engineering (CSE) practitioners, particularly at the onset of project development. This article presents an example of applying CSE techniques to derive design requirements compatible with traditional systems engineering to guide decision support system development. Specifically, it demonstrates the requirements derivation process based on cognitive work analysis for a subset of human spaceflight operations known as extravehicular activity. The results are presented in two phases. First, a work domain analysis revealed a comprehensive set of work functions and constraints that exist in the extravehicular activity work domain. Second, a control task analysis was performed on a subset of the work functions identified by the work domain analysis to articulate the translation of subject matter states of knowledge to high-level decision support system requirements. This work emphasizes an incremental requirements specification process as a critical component of CSE analyses to better situate CSE perspectives within the early phases of traditional systems engineering design.

  19. Comparison of an automated Most Probable Number (MPN) technique to traditional plating methods for estimating populations of total aerobes, coliforms and E. coli associated with freshly processed broiler chickens

    USDA-ARS?s Scientific Manuscript database

    Recently, an instrument (TEMPOTM) has been developed to automate the Most Probable Number (MPN) technique and reduce the effort required to estimate some bacterial populations. We compared the automated MPN technique to traditional microbiological plating methods or PetrifilmTM for estimating the t...

  20. Minimizing Alteration of Posterior Tibial Slope During Opening Wedge High Tibial Osteotomy: a Protocol with Experimental Validation in Paired Cadaveric Knees

    PubMed Central

    Westermann, Robert W; DeBerardino, Thomas; Amendola, Annunziato

    2014-01-01

    Introduction: High tibial osteotomy (HTO) is a reliable procedure for addressing unicompartmental arthritis with associated coronal deformities. With osteotomy of the proximal tibia, there is a risk of altering the tibial slope in the sagittal plane. Surgical techniques continue to evolve, with trends towards procedure reproducibility and simplification. We evaluated a modification of the Arthrex iBalance technique in 18 paired cadaveric knees with the goals of maintaining sagittal slope, increasing procedure efficiency, and decreasing use of intraoperative fluoroscopy. Methods: Nine paired cadaveric knees (18 legs) underwent iBalance medial opening wedge high tibial osteotomies. In each pair, the right knee underwent an HTO using the modified technique, while all left knees underwent the traditional technique. Independent observers evaluated postoperative factors including tibial slope, placement of hinge pin, and implant placement. Specimens were then dissected to evaluate for any gross muscle, nerve or vessel injury. Results: Changes to posterior tibial slope were similar with each technique: -0.3° ± 2.3° with the traditional iBalance technique and -0.4° ± 2.3° with the modified iBalance technique (p=0.29). Furthermore, we detected no differences in posterior tibial slope between preoperative and postoperative specimens (p=0.74 traditional, p=0.75 modified), and no differences in implant placement between traditional and modified techniques (p=0.85). No intraoperative iatrogenic complications (i.e. lateral cortex fracture, blood vessel or nerve injury) were observed in either group after gross dissection. Discussion & Conclusions: Alterations in posterior tibial slope are associated with HTOs. Both traditional and modified iBalance techniques appear reliable in coronal plane corrections without changing posterior tibial slope. The present modification of the Arthrex iBalance technique may increase the efficiency of the operation and decrease radiation exposure to patients without compromising implant placement or global knee alignment. PMID:25328454

  1. GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS – WHAT’S WHAT?

    EPA Science Inventory

    Recent studies have been conducted to evaluate different sampling techniques for determining VOC concentrations in groundwater. Samples were obtained using multi-level and traditional sampling techniques in three monitoring wells at the Raymark Superfund site in Stratford, CT. Ve...

  2. Comparative analysis of long-term outcomes of Misgav Ladach technique cesarean section and traditional cesarean section.

    PubMed

    Ghahiry, Ata; Rezaei, Farimah; Karimi Khouzani, Reza; Ashrafinia, Mansoor

    2012-10-01

    The aim of the present study was to evaluate pelvic adhesions, dehiscence and chronic pelvic pain in two groups of patients who underwent different cesarean section (CS) operations. One hundred and twelve eligible patients who met our criteria were randomly divided into two groups. Group 1 consisted of 52 women whose first CS had been performed with the Misgav Ladach technique and who now underwent a second CS. Group 2 consisted of 60 women whose first CS had been performed with the traditional (Pfannenstiel) technique and who now underwent a second CS. The two groups were compared for long-term outcomes, including adhesion, pelvic pain and wound dehiscence. The rate of adhesion in group 2 was 50% filmy type and 1.7% dense type, whereas in group 1 the adhesion rate was 50% filmy with no dense type (P = 0.12). The locations of adhesions were significantly different (P = 0.04). Dehiscence of the uterine incision was seen in three patients in the second group but in none in the first group (P = 0.012). The rate of chronic pelvic pain was 17.2% in the Misgav Ladach group (group 1) versus 35% with the traditional method (P = 0.01). The present results support the method of single-layer suturing of the uterus and leaving the peritoneum intact in CS. © 2012 The Authors. Journal of Obstetrics and Gynaecology Research © 2012 Japan Society of Obstetrics and Gynecology.

  3. Comparison of repair techniques in small and medium-sized rotator cuff tears in cadaveric sheep shoulders.

    PubMed

    Onay, Ulaş; Akpınar, Sercan; Akgün, Rahmi Can; Balçık, Cenk; Tuncay, Ismail Cengiz

    2013-01-01

    The aim of this study was to compare new knotless single-row and double-row suture anchor techniques with traditional transosseous suture techniques for different sized rotator cuff tears in an animal model. The study included 56 cadaveric sheep shoulders. Supraspinatus cuff tears of 1 cm repaired with the new knotless single-row suture anchor technique, and supraspinatus and infraspinatus rotator cuff tears of 3 cm repaired with the double-row suture anchor technique, were compared to traditional transosseous suture techniques and control groups. The repaired tendons were loaded at a static velocity of 5 mm/min with a 2.5 kN load cell in an Instron 8874 machine until repair failure. The 1 cm transosseous group was statistically superior to the 1 cm control group (p=0.021, p<0.05) and the 3 cm SpeedBridge group was statistically superior to the 1 cm SpeedFix group (p=0.012, p<0.05). The differences between the other groups were not statistically significant. No significant difference was found between the new knotless suture anchor techniques and traditional transosseous suture techniques.

  4. Radiation dose reduction using a neck detection algorithm for single spiral brain and cervical spine CT acquisition in the trauma setting.

    PubMed

    Ardley, Nicholas D; Lau, Ken K; Buchan, Kevin

    2013-12-01

    Cervical spine injuries occur in 4-8 % of adults with head trauma. A dual acquisition technique has traditionally been used for CT scanning of the brain and cervical spine. The purpose of this study was to determine the efficacy of radiation dose reduction by using a single acquisition technique that incorporated both anatomical regions with a dedicated neck detection algorithm. Thirty trauma patients referred for brain and cervical spine CT were included and were scanned with the single acquisition technique. The radiation doses from the single CT acquisition technique with the neck detection algorithm, which allowed appropriate independent dose administration for the brain and cervical spine regions, were recorded. Comparison was made both to doses calculated by simulating the traditional dual acquisitions with matching parameters, and to doses from the legacy dual acquisition technique in a retrospective cohort of the same sample size. The mean simulated dose for the traditional dual acquisition technique was 3.99 mSv, comparable to the average dose of 4.2 mSv from 30 previous patients who had CT of the brain and cervical spine as dual acquisitions. The mean dose from the single acquisition technique was 3.35 mSv, a 16 % overall dose reduction. The images from the single acquisition technique were of excellent diagnostic quality. The new single acquisition CT technique incorporating the neck detection algorithm for the brain and cervical spine significantly reduces the overall radiation dose by eliminating the unavoidable overlap between the two anatomical regions that occurs with the traditional dual acquisition technique.

  5. Coronal Axis Measurement of the Optic Nerve Sheath Diameter Using a Linear Transducer.

    PubMed

    Amini, Richard; Stolz, Lori A; Patanwala, Asad E; Adhikari, Srikar

    2015-09-01

    The true optic nerve sheath diameter cutoff value for detecting elevated intracranial pressure is variable. The variability may stem from the technique used to acquire sonographic measurements of the optic nerve sheath diameter as well as sonographic artifacts inherent to the technique. The purpose of this study was to compare the traditional visual axis technique to an infraorbital coronal axis technique for assessing the optic nerve sheath diameter using a high-frequency linear array transducer. We conducted a cross-sectional study at an academic medical center. Timed optic nerve sheath diameter measurements were obtained on both eyes of healthy adult volunteers with a 10-5-MHz broadband linear array transducer using both traditional visual axis and coronal axis techniques. Optic nerve sheath diameter measurements were obtained by 2 sonologists who graded the difficulty of each technique and were blinded to each other's measurements for each participant. A total of 42 volunteers were enrolled, yielding 84 optic nerve sheath diameter measurements. There were no significant differences in the measurements between the techniques on either eye (P = .23 [right]; P = .99 [left]). Additionally, there was no difference in the degree of difficulty obtaining the measurements between the techniques (P = .16). There was a statistically significant difference in the time required to obtain the measurements between the traditional and coronal techniques (P < .05). Infraorbital coronal axis measurements are similar to measurements obtained in the traditional visual axis. The infraorbital coronal axis technique is slightly faster to perform and is not technically challenging. © 2015 by the American Institute of Ultrasound in Medicine.

  6. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially-resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits-of-detection (ng g⁻¹); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  7. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    NASA Technical Reports Server (NTRS)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.
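
    Two of the object properties named above can be computed with standard computational-geometry tools. This sketch uses the shapely library with area-ratio definitions of rectangularity and convexity, which are common choices but our own assumption; the study's exact formulations may differ.

    ```python
    # Shape metrics for object-based damage analysis: intact buildings score
    # near 1.0 on both measures, while rubble footprints score lower.
    from shapely.geometry import Polygon

    def rectangularity(poly: Polygon) -> float:
        """Area ratio of the object to its minimum rotated bounding rectangle."""
        return poly.area / poly.minimum_rotated_rectangle.area

    def convexity(poly: Polygon) -> float:
        """Area ratio of the object to its convex hull (1.0 = fully convex)."""
        return poly.area / poly.convex_hull.area

    building = Polygon([(0, 0), (4, 0), (4, 3), (0, 3)])          # intact: ~1.0
    rubble = Polygon([(0, 0), (4, 0), (4, 3), (2, 1), (0, 3)])    # notched outline
    for p in (building, rubble):
        print(round(rectangularity(p), 2), round(convexity(p), 2))
    ```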

  8. Computational intelligence techniques for biological data mining: An overview

    NASA Astrophysics Data System (ADS)

    Faye, Ibrahima; Iqbal, Muhammad Javed; Said, Abas Md; Samir, Brahim Belhaouari

    2014-10-01

    Computational techniques have been successfully utilized for highly accurate analysis and modeling of the multifaceted, raw biological data gathered from various genome sequencing projects. These techniques are proving much more effective in overcoming the limitations of traditional in-vitro experiments on constantly increasing sequence data. The most critical problems that have caught researchers' attention include, but are not limited to: accurate structure and function prediction of unknown proteins, protein subcellular localization prediction, finding protein-protein interactions, protein fold recognition, and analysis of microarray gene expression data. To solve these problems, various classification and clustering techniques using machine learning have been extensively used in the published literature. These techniques include neural network algorithms, genetic algorithms, fuzzy ARTMAP, K-Means, K-NN, SVM, rough set classifiers, decision trees and HMM-based algorithms. Major difficulties in applying the above algorithms include the limitations of previous feature encoding and selection methods in extracting the best features, increasing classification accuracy and decreasing the running time overheads of the learning algorithms. This research is potentially useful in drug design and in the diagnosis of some diseases. This paper presents a concise overview of the well-known protein classification techniques.

  9. On the Use of Statistics in Design and the Implications for Deterministic Computer Experiments

    NASA Technical Reports Server (NTRS)

    Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.

    1997-01-01

    Perhaps the most prevalent use of statistics in engineering design is through Taguchi's parameter and robust design -- using orthogonal arrays to compute signal-to-noise ratios in a process of design improvement. In our view, however, there is an equally exciting use of statistics in design that could become just as prevalent: it is the concept of metamodeling whereby statistical models are built to approximate detailed computer analysis codes. Although computers continue to get faster, analysis codes always seem to keep pace so that their computational time remains non-trivial. Through metamodeling, approximations of these codes are built that are orders of magnitude cheaper to run. These metamodels can then be linked to optimization routines for fast analysis, or they can serve as a bridge for integrating analysis codes across different domains. In this paper we first review metamodeling techniques that encompass design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We discuss their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of metamodeling techniques in given situations and how common pitfalls can be avoided.
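
    As a concrete example of the metamodeling idea, the sketch below fits a kriging-style Gaussian process surrogate to a handful of runs of an "expensive" analysis code (here a cheap stand-in function) and then predicts, with uncertainty, at new design points. The library and kernel choices are illustrative assumptions.

    ```python
    # Kriging-style metamodel: a few deliberate runs of the detailed code
    # train a surrogate that is orders of magnitude cheaper to evaluate.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    def expensive_code(x):                # stand-in for a detailed analysis code
        return np.sin(3 * x) + 0.5 * x

    X_train = np.linspace(0, 2, 8).reshape(-1, 1)   # the "design of experiments"
    y_train = expensive_code(X_train).ravel()

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5)).fit(X_train, y_train)

    X_new = np.array([[0.35], [1.7]])
    y_pred, y_std = gp.predict(X_new, return_std=True)
    print(y_pred, y_std)   # cheap surrogate predictions with uncertainty
    ```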

  10. Sediment-generated noise (SGN): Comparison with physical bedload measurements in a small semi-arid watershed

    USDA-ARS?s Scientific Manuscript database

    Passive acoustic techniques for the measurement of Sediment-Generated Noise (SGN) in gravel-bed rivers present a promising alternative to traditional bedload measurement techniques. Where traditional methods are often prohibitively costly, particularly in labor requirements, and produce point-scale ...

  11. Nondestructive Analysis of Tumor-Associated Membrane Protein Integrating Imaging and Amplified Detection in situ Based on Dual-Labeled DNAzyme.

    PubMed

    Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi

    2018-01-01

    Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e. ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to the advantage of being two-in-one, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in only 100 cells. Moreover, the method is nondestructive: cells retain their physiological activity after analysis and can be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs, with potential application as a toolbox in the future.

  12. Hypopharyngeal perforation near-miss during transesophageal echocardiography.

    PubMed

    Aviv, Jonathan E; Di Tullio, Marco R; Homma, Shunichi; Storper, Ian S; Zschommler, Anne; Ma, Guoguang; Petkova, Eva; Murphy, Mark; Desloge, Rosemary; Shaw, Gary; Benjamin, Stanley; Corwin, Steven

    2004-05-01

    The traditional blind passage of a transesophageal echocardiography probe transorally through the hypopharynx is considered safe. Yet, severe hypopharyngeal complications during transesophageal echocardiography at several institutions led the authors to investigate whether traditional probe passage results in a greater incidence of hypopharyngeal injuries when compared with probe passage under direct visualization. Randomized, prospective clinical study. In 159 consciously sedated adults referred for transesophageal echocardiography, the authors performed transesophageal echocardiography with concomitant transnasal videoendoscopic monitoring of the hypopharynx. Subjects were randomly assigned to receive traditional (blind) or experimental (optical) transesophageal echocardiography. The primary outcome measure was frequency of hypopharyngeal injuries (hypopharyngeal lacerations or hematomas), and the secondary outcome measure was number of hypopharyngeal contacts. No perforation occurred with either technique. However, hypopharyngeal lacerations or hematomas occurred in 19 of 80 (23.8%) patients with the traditional technique (11 superficial lacerations of pyriform sinus, 1 laceration of pharynx, 12 arytenoid hematomas, 2 vocal fold hematomas, and 1 pyriform hematoma) and in 1 of 79 patients (1.3%) with the optical technique (superficial pyriform laceration) (P =.001). All traumatized patients underwent flexible laryngoscopy, but none required additional intervention. Respectively, hypopharyngeal contacts were more frequent with the traditional than with the optical technique at the pyriform sinus (70.0% vs. 10.1% [P =.001]), arytenoid (55.0% vs. 3.8% [P =.001]), and vocal fold (15.0% vs. 3.86% [P =.016]). Optically guided trans-esophageal echocardiography results in significantly fewer hypopharyngeal injuries and fewer contacts than traditional, blind transesophageal echocardiography. The optically guided technique may result in decreased frequency of potentially significant complications and therefore in improved patient safety.

  13. Multiparty Quantum English Auction Scheme Using Single Photons as Message Carrier

    NASA Astrophysics Data System (ADS)

    Liu, Ge; Zhang, Jian-Zhong; Xie, Shu-Cui

    2018-03-01

    In this paper, a secure and economical multiparty English auction protocol using single photons as the message carriers of bids is proposed. In order to achieve unconditional security, fairness, undeniability and so on, we adopt the decoy photon checking technique and a quantum encryption algorithm. Analysis shows that our protocol satisfies all the characteristics of a traditional English auction; meanwhile, it can resist malicious attacks.

  14. A Simplified Shuttle Payload Thermal Analyzer /SSPTA/ program

    NASA Technical Reports Server (NTRS)

    Bartoszek, J. T.; Huckins, B.; Coyle, M.

    1979-01-01

    A simple thermal analysis program for Space Shuttle payloads has been developed to accommodate the user who requires an easily understood but dependable analytical tool. The thermal analysis program includes several thermal subprograms traditionally employed in spacecraft thermal studies, a data management system for data generated by the subprograms, and a master program to coordinate the data files and thermal subprograms. The language and logic used to run the thermal analysis program are designed for the small user. In addition, analytical and storage techniques which conserve computer time and minimize core requirements are incorporated into the program.

  15. Single cell analysis of normal and leukemic hematopoiesis.

    PubMed

    Povinelli, Benjamin J; Rodriguez-Meira, Alba; Mead, Adam J

    2018-02-01

    The hematopoietic system is well established as a paradigm for the study of cellular hierarchies, their disruption in disease and therapeutic use in regenerative medicine. Traditional approaches to study hematopoiesis involve purification of cell populations based on a small number of surface markers. However, such population-based analysis obscures underlying heterogeneity contained within any phenotypically defined cell population. This heterogeneity can only be resolved through single cell analysis. Recent advances in single cell techniques allow analysis of the genome, transcriptome, epigenome and proteome in single cells at an unprecedented scale. The application of these new single cell methods to investigate the hematopoietic system has led to paradigm shifts in our understanding of cellular heterogeneity in hematopoiesis and how this is disrupted in disease. In this review, we summarize how single cell techniques have been applied to the analysis of hematopoietic stem/progenitor cells in normal and malignant hematopoiesis, with a particular focus on recent advances in single-cell genomics, including how these might be utilized for clinical application. Copyright © 2017. Published by Elsevier Ltd.

  16. Factors Which Influence The Fish Purchasing Decision: A study on Traditional Market in Riau Mainland

    NASA Astrophysics Data System (ADS)

    Siswati, Latifa; Putri, Asgami

    2018-05-01

    The purpose of this research is to analyze and assess the factors that influence fish purchasing by the community of Tenayan Raya district, Pekanbaru. The research methodology is a survey, specifically interviews and direct observation at markets located in Tenayan Raya district. Sampling locations were determined by purposive sampling, and respondents were selected by accidental sampling. Factor analysis was applied to data derived from respondents' opinions on various fish-related variables. The results show that the factors influencing fish purchasing decisions in a traditional market in Tenayan Raya district are product, price, social, and individual factors. Product factors that influence the purchasing decision include the condition of the eyes, the nutrition of fresh fish, and the diversity of fish sold. Price factors include the price of fresh fish, a convincing price, and the match between the price and the benefits of the fresh fish. Individual factors include education and income levels. Social factors include family, colleagues, and fish-eating habits.

  17. Optimization of segmented thermoelectric generator using Taguchi and ANOVA techniques.

    PubMed

    Kishore, Ravi Anant; Sanghadasa, Mohan; Priya, Shashank

    2017-12-01

    Recent studies have demonstrated that segmented thermoelectric generators (TEGs) can operate over large thermal gradients and thus provide better performance (reported efficiency up to 11%) compared to traditional TEGs comprising a single thermoelectric (TE) material. However, segmented TEGs are still in the early stages of development due to the inherent complexity of their design optimization and manufacturability. In this study, we demonstrate physics-based numerical techniques along with analysis of variance (ANOVA) and the Taguchi optimization method for optimizing the performance of segmented TEGs. We considered a comprehensive set of design parameters, such as geometrical dimensions of the p-n legs, height of segmentation, hot-side temperature, and load resistance, in order to optimize the output power and efficiency of segmented TEGs. Using state-of-the-art TE material properties and appropriate statistical tools, we provide a near-optimum TEG configuration with only 25 experiments, as compared to the 3125 experiments needed by conventional optimization methods. The effect of environmental factors on the optimization of segmented TEGs is also studied. The Taguchi results are validated against results obtained using the traditional full factorial optimization technique, and a TEG configuration for simultaneous optimization of power and efficiency is obtained.
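
    The Taguchi post-processing step can be illustrated compactly: convert replicate responses into larger-the-better signal-to-noise ratios, then average S/N by factor level to rank settings. The L4 array and power values below are invented for illustration; the study used a larger design (25 runs) with five factors.

    ```python
    # Taguchi-style analysis sketch: larger-the-better S/N per run, then
    # main effects as the mean S/N at each factor level.
    import numpy as np
    import pandas as pd

    # Standard L4 orthogonal array: 3 two-level factors in 4 runs
    runs = pd.DataFrame({
        "leg_height": [1, 1, 2, 2],
        "leg_area":   [1, 2, 1, 2],
        "load_R":     [1, 2, 2, 1],
    })
    power = np.array([[3.1, 3.0], [3.6, 3.5], [2.9, 3.0], [3.8, 3.9]])  # replicates

    # Larger-the-better S/N ratio per run: -10*log10(mean(1/y^2))
    runs["sn"] = -10 * np.log10(np.mean(1.0 / power**2, axis=1))

    for factor in ("leg_height", "leg_area", "load_R"):
        print(factor, runs.groupby(factor)["sn"].mean().to_dict())
    ```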

  18. Habits and customs of crab catchers in southern Bahia, Brazil.

    PubMed

    Firmo, Angélica M S; Tognella, Mônica M P; Tenório, Gabrielle D; Barboza, Raynner R D; Alves, Rômulo R N

    2017-08-23

    Brazilian mangrove forests are widely distributed along the coast and exploited by groups of people whose customs and habits are as diverse as the biology of the mangrove ecosystems. This study identifies different methods of extracting crabs that inhabit the mangrove belts; some of these practices, such as catching individual crabs by hand, help maintain natural stocks of the species in Mucuri (southern Bahia), Brazil. In the studied community, illegal hunting activities that violate Brazilian legislation limiting the use of tangle-netting in mangrove ecosystems were observed. According to our observations, the fishermen who use the tangle-netting technique, rather than catching crabs individually by hand, do so seeking to increase income, and they come from families that have no tradition of crab extraction. This analysis leads us to conclude that catchers from economically marginalised social groups enter mangroves for purposes of survival rather than subsistence, because tangle-netting is a predatory technique: it increases the catch but also increases crab mortality. We emphasise that the traditional catching methods are unique to Brazil and that manual capture of crabs should be preserved through public policies aimed at maintaining the crab population.

  19. Identification of modal parameters including unmeasured forces and transient effects

    NASA Astrophysics Data System (ADS)

    Cauberghe, B.; Guillaume, P.; Verboven, P.; Parloo, E.

    2003-08-01

    In this paper, a frequency-domain method to estimate modal parameters from short data records with known (measured) input forces and unknown input forces is presented. The method can be used for an experimental modal analysis, an operational modal analysis (output-only data) and the combination of both. Traditional experimental and operational modal analyses in the frequency domain start, respectively, from frequency response functions and spectral density functions; to estimate these functions accurately, sufficient data have to be available. The technique developed in this paper estimates the modal parameters directly from the Fourier spectra of the outputs and the known input. Instead of applying Hanning windows to these short data records, the transient effects are estimated simultaneously with the modal parameters. The method is illustrated, tested and validated by Monte Carlo simulations and experiments. The presented method for processing short data sequences leads to unbiased estimates with a small variance in comparison to the more traditional approaches.
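
    For contrast with the paper's short-record method, the traditional starting point it mentions, estimating a frequency response function from long records, looks like the following sketch. The simulated single-resonance system and the Welch/H1 estimator choices are our own illustrative assumptions.

    ```python
    # Traditional FRF estimation (H1 = Sxy / Sxx) from long input/output
    # records with Welch averaging; this is what short records cannot feed.
    import numpy as np
    from scipy import signal

    fs = 1024.0
    t = np.arange(0, 60, 1 / fs)
    x = np.random.randn(t.size)                    # broadband input force

    # Simulated SDOF-like response (resonance near 50 Hz) via a peaking filter
    b, a = signal.iirpeak(w0=50.0, Q=20.0, fs=fs)
    y = signal.lfilter(b, a, x) + 0.01 * np.random.randn(t.size)

    f, Sxx = signal.welch(x, fs=fs, nperseg=4096)
    _, Sxy = signal.csd(x, y, fs=fs, nperseg=4096)
    H1 = Sxy / Sxx                                  # FRF estimate
    print(f[np.argmax(np.abs(H1))], "Hz peak")      # ~50 Hz
    ```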

  20. Analyzing coastal environments by means of functional data analysis

    NASA Astrophysics Data System (ADS)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful for describing variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with Cluster Analysis (CA). The first method, based on the point density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each density function by its mean, sorting, skewness and kurtosis. The second applied a centered log-ratio (clr) transform to the original data; PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, as a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
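
    Both transforms discussed above are short to write down. The sketch below applies a centered log-ratio (clr) to discretized grain-size curves and then ordinary PCA on the curves, a discretized stand-in for FPCA; the four synthetic log-normal PSDs are illustrative only.

    ```python
    # clr transform for compositional PSDs, then PCA on the curves as a
    # discretized stand-in for functional PCA.
    import numpy as np
    from sklearn.decomposition import PCA

    sizes = np.geomspace(1, 2000, 64)              # grain sizes in microns
    def psd(mode, spread):                         # synthetic log-normal PSD
        p = np.exp(-((np.log(sizes) - np.log(mode)) ** 2) / (2 * spread**2))
        return p / p.sum()

    curves = np.array([psd(200, 0.4), psd(220, 0.5), psd(600, 0.4), psd(650, 0.6)])

    # clr: log of each composition, centered by its own mean log
    clr = np.log(curves) - np.log(curves).mean(axis=1, keepdims=True)

    pca = PCA(n_components=2)
    scores = pca.fit_transform(clr)
    print(pca.explained_variance_ratio_, scores.round(2))
    ```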

  1. Autologous Fat Grafting to the Breast Using REVOLVE System to Reduce Clinical Costs.

    PubMed

    Brzezienski, Mark A; Jarrell, John A

    2016-09-01

    With the increasing popularity of fat grafting over the past decade, the techniques for harvest, processing and preparation, and transfer of the fat cells have evolved to improve efficiency and consistency. The REVOLVE System is a fat processing device used in autologous fat grafting which eliminates much of the specialized equipment as well as the labor-intensive and time-consuming efforts of the original Coleman technique of fat processing. This retrospective study evaluates the economics of fat grafting, comparing traditional Coleman processing to the REVOLVE System. From June 2013 through December 2013, 88 fat grafting cases by a single surgeon were reviewed. Timed procedures using either the REVOLVE System or the Coleman technique were extracted from the group. Data including fat grafting procedure time, harvested volume, harvest and recipient sites, and concurrent procedures were gathered. Cost and utilization assessments were performed comparing the economics of the two groups using standard operating room costs provided by the study hospital. Thirty-seven patients with timed procedures were identified, 13 of whom were Coleman technique patients and 24 of whom were REVOLVE System patients. The average rate of fat transfer was 1.77 mL/minute for the Coleman technique and 4.69 mL/minute for the REVOLVE System, a statistically significant difference (P < 0.0001) between the 2 groups. Cost analysis comparing the REVOLVE System and Coleman techniques demonstrates a dramatic divergence in the price per mL of transferred fat at 75 mL when using the previously calculated rates for each group. This single surgeon's experience with the REVOLVE System for fat processing establishes economic support for its use in specific high-volume fat grafting cases. Cost analysis suggests that in cases of planned fat transfer of 75 mL or more, using the REVOLVE System for fat processing is more economically beneficial. This study may serve as a guide to plastic surgeons in deciding which cases might be appropriate for the use of the REVOLVE System and is the first report comparing the economics of fat grafting with the traditional Coleman technique and the REVOLVE System.

  2. ALIF: A New Promising Technique for the Decomposition and Analysis of Nonlinear and Nonstationary Signals

    NASA Astrophysics Data System (ADS)

    Cicone, A.; Zhou, H.; Piersanti, M.; Materassi, M.; Spogli, L.

    2017-12-01

    Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades, new kinds of nonlinear methods have been developed that are able to unravel hidden features of these kinds of signals. In this poster we present a new method, called Adaptive Local Iterative Filtering (ALIF). This technique, originally developed to study mono-dimensional signals, can, unlike any other algorithm proposed so far, be easily generalized to study two or higher dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the technique can be applied as-is to any kind of signal. Applications of the ALIF algorithm to real-life signal analysis will be presented: for instance, the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, pressure measured at ground level on a global grid, and radio power scintillation from GNSS signals.
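
    The flavor of iterative filtering can be conveyed with a toy loop that repeatedly subtracts a local moving average until only an oscillatory component remains. Real ALIF adapts the filter length locally to the signal; this fixed-length version is a deliberate simplification, not the authors' algorithm.

    ```python
    # Toy iterative-filtering step: each pass removes the local trend
    # (moving average), leaving the fast oscillatory component.
    import numpy as np

    def extract_component(sig, win=51, iters=10):
        kernel = np.ones(win) / win               # fixed moving-average filter
        comp = sig.copy()
        for _ in range(iters):
            trend = np.convolve(comp, kernel, mode="same")
            comp = comp - trend                   # keep the fluctuating part
        return comp

    t = np.linspace(0, 1, 2000)
    sig = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 3 * t)
    fast = extract_component(sig)                 # ~ the 40 Hz component
    print(np.corrcoef(fast, np.sin(2 * np.pi * 40 * t))[0, 1])  # close to 1
    ```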

  3. Pharmaceutical drug marketing strategies and tactics: a comparative analysis of attitudes held by pharmaceutical representatives and physicians.

    PubMed

    Parker, R Stephen; Pettijohn, Charles E

    2005-01-01

    A variety of promotional strategies have been used to stimulate sales of pharmaceutical drugs. Traditionally, push techniques have been the predominant means of encouraging physicians to prescribe drugs and thus increase sales. Recently, the traditional push strategy has been supplemented by a pull strategy: direct-to-consumer advertising is increasingly used to encourage consumers to request advertised drugs from their physicians. This research compares the attitudes of two of the most affected participants in the prescription sales process: physicians and pharmaceutical sales representatives. The findings indicate differences between physicians and pharmaceutical sales representatives regarding the efficacy and ethical considerations of various promotional strategies.

  4. Detection and characterization of uranium-humic complexes during 1D transport studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lesher, Emily K.; Honeyman, Bruce D.; Ranville, James F.

    2013-05-01

    The speciation and transport of uranium (VI) through porous media is highly dependent on solution conditions, the presence of complexing ligands, and the nature of the porous media. The dependency on many variables makes prediction of U transport in bench-scale experiments and in the field difficult. In particular, the identification of colloidal U phases poses a technical challenge. Transport of U in the presence and absence of natural organic matter (Suwannee River humic acid, SRHA) through silica sand and hematite coated silica sand was tested at pH 4 and 5 using static columns, where flow is controlled by gravity and residence time between advective pore volume exchanges can be strictly controlled. The column effluents were characterized by traditional techniques including ICPMS quantification of total [U] and [Fe], TOC analysis of [DOC], and pH analysis, and also by non-traditional techniques: flow field flow fractionation with online ICPMS detection (FlFFF-ICPMS) and specific UV absorbance (SUVA) characterization of effluent fractions. Key results include that the transport of U through the columns was enhanced by pre-equilibration with SRHA, and previously deposited U was remobilized by the addition of SRHA. The advanced techniques yielded important insights on the mechanisms of transport: FlFFF-ICPMS identified a U-SRHA complex as the mobile U species and directly quantified relative amounts of the complex, while specific UV absorbance (SUVA) measurements indicated a composition-based fractionation onto the porous media.

  5. Pre-Nursing Students Perceptions of Traditional and Inquiry Based Chemistry Laboratories

    NASA Astrophysics Data System (ADS)

    Rogers, Jessica

    This paper describes a process that attempted to meet the needs of undergraduate students in a pre-nursing chemistry class. The laboratory was taught in a traditional verification style, and students were surveyed to assess their perceptions of the educational goals of the laboratory. A literature review motivated an inquiry-based method, and an analysis of the needs of nurses motivated more application-based activities. This new inquiry format was implemented the next semester, students were surveyed at the end of the semester, and the results were compared to those from the previous method. Student and instructor response to the change in format was positive. Students in the traditional format placed goals concerning technique above critical thinking and felt the lab was easy to understand and carry out. Students in the inquiry-based lab felt they learned more critical thinking skills and enjoyed the independence of designing experiments and answering their own questions.

  6. Culture, ritual, and errors of repudiation: some implications for the assessment of alternative medical traditions.

    PubMed

    Trotter, G

    2000-07-01

    In this article, sources of error that are likely involved when alternative medical traditions are assessed from the standpoint of orthodox biomedicine are discussed. These sources include (1) biomedicine's implicit reductive materialism (manifested in its negative orientation toward placebo effects), (2) a related bias against ritual, and (3) cultural barriers to the construction of externally valid protocols. To overcome these biases, investigators must attend to ritualistic elements in alternative treatments and should recruit patients from appropriate cultural groups. Collaborative research may be the key. Benefits of collaborative research include (1) increased mutual respect and integration between culturally distinct groups and practices, (2) increased understanding and use of sophisticated techniques of empirical analysis among practitioners from the alternative traditions, (3) increased appropriation of the therapeutic benefits of ritual, and (4) enhanced overall benefit for patients of all cultural backgrounds.

  7. Simple lock-in detection technique utilizing multiple harmonics for digital PGC demodulators.

    PubMed

    Duan, Fajie; Huang, Tingting; Jiang, Jiajia; Fu, Xiao; Ma, Ling

    2017-06-01

    A simple lock-in detection technique especially suited for digital phase-generated carrier (PGC) demodulators is proposed in this paper. It mixes the interference signal with rectangular waves whose Fourier expansions contain multiple odd or multiple even harmonics of the carrier to recover the quadrature components needed for interference phase demodulation. In this way, the use of a multiplier is avoided and the efficiency of the algorithm is improved. Noise performance with regard to light intensity variation and circuit noise is analyzed theoretically for both the proposed technique and the traditional lock-in technique, and results show that the former provides a better signal-to-noise ratio than the latter with proper modulation depth and average interference phase. Detailed simulations were conducted and the theoretical analysis was verified. A fiber-optic Michelson interferometer was constructed and the feasibility of the proposed technique is demonstrated.
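
    The square-wave mixing idea is straightforward to simulate: multiplying by a ±1 rectangular reference replaces the multiplier, because the rectangular waves already contain the required odd or even carrier harmonics. In this sketch the simulated interferogram, filter design and modulation depth are illustrative, and the Bessel-function scale factors between the two quadratures are ignored.

    ```python
    # Square-wave lock-in for PGC demodulation: sign() references carry the
    # odd/even harmonics, so no multiplier is needed; low-pass filtering
    # then yields the two quadratures for phase recovery.
    import numpy as np
    from scipy import signal

    fs, f0 = 100_000.0, 1_000.0              # sample rate, carrier frequency
    t = np.arange(0, 0.2, 1 / fs)
    phi = 1.5 * np.sin(2 * np.pi * 20 * t)   # slow interference phase to recover
    s = 1 + np.cos(2.63 * np.cos(2 * np.pi * f0 * t) + phi)  # PGC interferogram

    ref1 = np.sign(np.cos(2 * np.pi * f0 * t))      # odd-harmonic rectangular ref
    ref2 = np.sign(np.cos(2 * np.pi * 2 * f0 * t))  # even-harmonic rectangular ref

    b, a = signal.butter(4, 200, fs=fs)             # low-pass below the carrier
    q1 = signal.filtfilt(b, a, s * ref1)            # proportional to sin(phi)
    q2 = signal.filtfilt(b, a, s * ref2)            # proportional to cos(phi)

    phi_hat = np.unwrap(np.arctan2(q1, q2))
    print(np.corrcoef(phi_hat, phi)[0, 1])          # ~1 up to scale and offset
    ```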

  8. Analysis of Proportional Integral and Optimized Proportional Integral Controllers for Resistance Spot Welding System (RSWS) - A Performance Perspective

    NASA Astrophysics Data System (ADS)

    Rama Subbanna, S.; Suryakalavathi, M., Dr.

    2017-08-01

    This paper presents a performance analysis of different control techniques for spike reduction in a medium-frequency transformer-based DC spot welding system. Spike reduction is an important consideration in spot welding systems. During normal RSWS operation, the welding transformer's magnetic core can become saturated due to unbalanced resistances of the two transformer secondary windings and different characteristics of the output rectifier diodes, which causes current spikes and over-current protection switch-off of the entire system. The current control technique is a piecewise linear approach inspired by DC-DC converter control algorithms, yielding a novel spike reduction method for MFDC spot welding applications. Two controllers were used for the spike reduction portion of the application: the traditional PI controller and an optimized PI controller. Care is taken that the current control technique maintains reduced spikes in the transformer primary current while also reducing the total harmonic distortion (THD). The performance parameters considered are the THD and the percentage of current spike reduction for both techniques. A Matlab/Simulink™ simulation is carried out for the MFDC RSWS, and results are tabulated for the PI and optimized PI controllers, with a tradeoff analysis.
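
    For reference, the traditional PI controller compared in the paper reduces to a few lines when written discretely. The sketch below regulates a crude first-order R-L model of the transformer primary current; all gains and plant constants are invented for illustration, and the paper's optimized variant tunes such gains against spike and THD criteria.

    ```python
    # Minimal discrete PI current controller with actuator saturation,
    # driving a simple R-L plant model: L di/dt = u - R i.
    class PI:
        def __init__(self, kp, ki, dt, limit):
            self.kp, self.ki, self.dt, self.limit = kp, ki, dt, limit
            self.integral = 0.0

        def step(self, setpoint, measured):
            error = setpoint - measured
            self.integral += error * self.dt
            u = self.kp * error + self.ki * self.integral
            return max(-self.limit, min(self.limit, u))  # voltage saturation

    dt, i, ctrl = 1e-4, 0.0, PI(kp=2.0, ki=400.0, dt=1e-4, limit=100.0)
    for _ in range(2000):              # plant: R = 0.5 ohm, L = 0.01 H
        u = ctrl.step(50.0, i)         # track a 50 A reference current
        i += dt * (u - 0.5 * i) / 0.01
    print(round(i, 2))                 # settles near 50 A
    ```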

  9. Online differentiation of mineral phase in aerosol particles by ion formation mechanism using a LAAP-TOF single-particle mass spectrometer

    NASA Astrophysics Data System (ADS)

    Marsden, Nicholas A.; Flynn, Michael J.; Allan, James D.; Coe, Hugh

    2018-01-01

    Mineralogy of silicate mineral dust has a strong influence on climate and ecosystems due to variation in physiochemical properties that result from differences in composition and crystal structure (mineral phase). Traditional offline methods of analysing mineral phase are labour intensive and the temporal resolution of the data is much longer than many atmospheric processes. Single-particle mass spectrometry (SPMS) is an established technique for the online size-resolved measurement of particle composition by laser desorption ionisation (LDI) followed by time-of-flight mass spectrometry (TOF-MS). Although non-quantitative, the technique is able to identify the presence of silicate minerals in airborne dust particles from markers of alkali metals and silicate molecular ions in the mass spectra. However, the differentiation of mineral phase in silicate particles by traditional mass spectral peak area measurements is not possible. This is because instrument function and matrix effects in the ionisation process result in variations in instrument response that are greater than the differences in composition between common mineral phases. In this study, we introduce a novel technique that enables the differentiation of mineral phase in silicate mineral particles by ion formation mechanism measured from subtle changes in ion arrival times at the TOF-MS detector. Using a combination of peak area and peak centroid measurements, we show that the arrangement of the interstitial alkali metals in the crystal structure, an important property in silicate mineralogy, influences the ion arrival times of elemental and molecular ion species in the negative ion mass spectra. A classification scheme is presented that allowed for the differentiation of illite-smectite, kaolinite and feldspar minerals on a single-particle basis. Online analysis of mineral dust aerosol generated from clay mineral standards produced mineral fractions that are in agreement with bulk measurements reported by traditional XRD (X-ray diffraction) analysis.

  10. Electronic surveys: how to maximise success.

    PubMed

    McPeake, Joanne; Bateson, Meghan; O'Neill, Anna

    2014-01-01

    To draw on the researchers' experience of developing and distributing a UK-wide electronic survey. The evolution of electronic surveys in healthcare research will be discussed, as well as simple techniques that can be used to improve response rates for this type of data collection. There is an increasing use of electronic survey methods in healthcare research. However, in recent published research, electronic surveys have had lower response rates than traditional survey methods, such as postal and telephone surveys. This is a methodology paper. Electronic surveys have many advantages over traditional surveys, including a reduction in cost and ease of analysis. Drawbacks to this type of data collection include the potential for selection bias and poorer response rates. However, research teams can use a range of simple strategies to boost response rates. These approaches target the different stages of achieving a complete response: initial attraction through personalisation, engagement by having an easily accessible link to the survey, transparency of survey length, and completion through targeting the correct, and thereby interested, population. The fast, efficient and often 'free' electronic survey has many advantages over the traditional postal data collection method, including ease of analysis for what can be vast amounts of data. However, to capitalise on these benefits, researchers must carefully consider techniques to maximise response rates and minimise selection bias for their target population. Researchers can use a range of strategies to improve responses from electronic surveys, including sending up to three reminders, personalising each email, adding the updated response rate to reminder emails, and stating the average time it would take to complete the survey in the title of the email.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, W. James; Albertson, R Craig; Jacob, Rick E.

    Here we present a re-description of Abudefduf luridus and reassign it to the genus Similiparma. We supplement traditional diagnoses and descriptions of this species with quantitative anatomical data collected from a family-wide geometric morphometric analysis of head morphology (44 species representing all 30 damselfish genera) and data from cranial micro-CT scans of fishes in the genus Similiparma. The use of geometric morphometric analyses (and other methods of shape analysis) permits detailed comparisons between the morphology of specific taxa and the anatomical diversity that has arisen in an entire lineage. This provides a particularly useful supplement to traditional description methods, and we recommend the use of such techniques by systematists. Similiparma and its close relatives constitute a branch of the damselfish phylogenetic tree that predominantly inhabits rocky reefs in the Atlantic and Eastern Pacific, as opposed to the more commonly studied damselfishes that constitute a large portion of the ichthyofauna on all coral-reef communities.

  12. Decorin Content and Near Infrared Spectroscopy Analysis of Dried Collagenous Biomaterial Samples

    PubMed Central

    Aldema-Ramos, Mila L.; Castell, Joan Carles; Muir, Zerlina E.; Adzet, Jose Maria; Sabe, Rosa; Schreyer, Suzanne

    2012-01-01

    The efficient removal of proteoglycans, such as decorin, from the hide when processing it into leather by traditional means is generally accepted as beneficial for leather quality, especially softness and flexibility. A patented waterless (acetone) dehydration method was developed that can generate a leather-like product called Dried Collagenous Biomaterial (known as BCD), but it has no effect on decorin removal efficiency. The Alcian Blue colorimetric technique was used to assay the sulfated glycosaminoglycan (sGAG) portion of decorin. The residual decorin content was correlated to the mechanical properties of the BCD samples, which were comparable to those of control leather made traditionally. The waterless dehydration and instantaneous chrome tanning process is a good eco-friendly alternative for transforming hides into leather, because no additional effects were observed on examination using NIR spectroscopy and chemometric analysis. PMID:24970152

  13. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA

    NASA Astrophysics Data System (ADS)

    Coughlan, Michael R.

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.
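
    A simpler relative of the space-time permutation statistic, the classic Knox test, captures the same intuition: count event pairs that are close in both space and time and compare the count with permutations of the time labels. The coordinates, planted cluster and thresholds below are illustrative.

    ```python
    # Knox-style space-time clustering test: close pairs in space AND time,
    # with a permutation null obtained by shuffling the time labels.
    import numpy as np

    rng = np.random.default_rng(0)
    xy = rng.uniform(0, 100, size=(200, 2))      # fire locations (km)
    days = rng.uniform(0, 365, size=200)         # ignition dates
    days[:20] = rng.uniform(10, 20, size=20)     # planted cluster in time...
    xy[:20] = rng.uniform(40, 45, size=(20, 2))  # ...and in space

    def knox(xy, t, d_max=10.0, t_max=14.0):
        dist = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)
        dt = np.abs(t[:, None] - t[None, :])
        close = (dist < d_max) & (dt < t_max)
        return np.triu(close, k=1).sum()         # count each pair once

    observed = knox(xy, days)
    null = [knox(xy, rng.permutation(days)) for _ in range(999)]
    p = (1 + sum(n >= observed for n in null)) / 1000
    print(observed, p)   # small p suggests space-time clustering
    ```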

  14. Wildland Arson as Clandestine Resource Management: A Space-Time Permutation Analysis and Classification of Informal Fire Management Regimes in Georgia, USA.

    PubMed

    Coughlan, Michael R

    2016-05-01

    Forest managers are increasingly recognizing the value of disturbance-based land management techniques such as prescribed burning. Unauthorized, "arson" fires are common in the southeastern United States where a legacy of agrarian cultural heritage persists amidst an increasingly forest-dominated landscape. This paper reexamines unauthorized fire-setting in the state of Georgia, USA from a historical ecology perspective that aims to contribute to historically informed, disturbance-based land management. A space-time permutation analysis is employed to discriminate systematic, management-oriented unauthorized fires from more arbitrary or socially deviant fire-setting behaviors. This paper argues that statistically significant space-time clusters of unauthorized fire occurrence represent informal management regimes linked to the legacy of traditional land management practices. Recent scholarship has pointed out that traditional management has actively promoted sustainable resource use and, in some cases, enhanced biodiversity often through the use of fire. Despite broad-scale displacement of traditional management during the 20th century, informal management practices may locally circumvent more formal and regionally dominant management regimes. Space-time permutation analysis identified 29 statistically significant fire regimes for the state of Georgia. The identified regimes are classified by region and land cover type and their implications for historically informed disturbance-based resource management are discussed.

  15. Meta-analysis of the clinical application on gasless laparoscopic cholecystectomy in China

    PubMed Central

    Liu, Qian; Zhang, Guangyong; Zhong, Yong; Duan, Chongyang; Hu, Sanyuan

    2015-01-01

Objective: We aimed to systematically review the clinical effects of the abdominal wall suspension technique in laparoscopic cholecystectomy in China. Methods: We searched literature databases for randomized controlled trials involving abdominal wall suspension laparoscopic cholecystectomy, then screened the studies, extracted data, and performed quality assessment and meta-analysis. Results: We analyzed 611 patients. Compared with the traditional group, the abdominal wall suspension group had a shorter hospital stay (SMD = -0.91, 95% CI = -1.76~-0.06, P = 0.04), a shorter time to first postoperative passage of flatus (SMD = -0.65, 95% CI = -1.11~-0.20, P = 0.005), and a lower incidence of postoperative complications (P < 0.001), which reduced the cost of hospitalization. Conclusions: Application of the abdominal wall suspension endoscopic technique can significantly speed the rehabilitation of laparoscopic cholecystectomy patients; it is therefore worthy of further research and clinical application. PMID:25932097

  16. Advances in Analysis of Human Milk Oligosaccharides

    PubMed Central

    Ruhaak, L. Renee; Lebrilla, Carlito B.

    2012-01-01

    Oligosaccharides in human milk strongly influence the composition of the gut microflora of neonates. Because it is now clear that the microflora play important roles in the development of the infant immune system, human milk oligosaccharides (HMO) are studied frequently. Milk samples contain complex mixtures of HMO, usually comprising several isomeric structures that can be either linear or branched. Traditionally, HMO profiling was performed using HPLC with fluorescence or UV detection. By using porous graphitic carbon liquid chromatography MS, it is now possible to separate and identify most of the isomers, facilitating linkage-specific analysis. Matrix-assisted laser desorption ionization time-of-flight analysis allows fast profiling, but does not allow isomer separation. Novel MS fragmentation techniques have facilitated structural characterization of HMO that are present at lower concentrations. These techniques now facilitate more accurate studies of HMO consumption as well as Lewis blood group determinations. PMID:22585919

  17. Exaggerated heart rate oscillations during two meditation techniques.

    PubMed

    Peng, C K; Mietus, J E; Liu, Y; Khalsa, G; Douglas, P S; Benson, H; Goldberger, A L

    1999-07-31

    We report extremely prominent heart rate oscillations associated with slow breathing during specific traditional forms of Chinese Chi and Kundalini Yoga meditation techniques in healthy young adults. We applied both spectral analysis and a novel analytic technique based on the Hilbert transform to quantify these heart rate dynamics. The amplitude of these oscillations during meditation was significantly greater than in the pre-meditation control state and also in three non-meditation control groups: i) elite athletes during sleep, ii) healthy young adults during metronomic breathing, and iii) healthy young adults during spontaneous nocturnal breathing. This finding, along with the marked variability of the beat-to-beat heart rate dynamics during such profound meditative states, challenges the notion of meditation as only an autonomically quiescent state.
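
    A minimal sketch of the standard Hilbert-transform (analytic signal) recipe for quantifying the amplitude of slow heart-rate oscillations, on synthetic data. The paper's own analytic technique is only described as Hilbert-transform based; the band-pass settings, sampling rate, and signal model below are assumptions for illustration.

      # Analytic-signal amplitude of a slow heart-rate oscillation (sketch).
      import numpy as np
      from scipy.signal import hilbert, butter, filtfilt

      rng = np.random.default_rng(1)
      fs = 4.0                        # resampled heart-rate series, Hz
      tt = np.arange(0, 300, 1 / fs)
      # Synthetic HR: 0.05 Hz oscillation (slow breathing) plus noise
      hr = 65 + 8 * np.sin(2 * np.pi * 0.05 * tt) + rng.standard_normal(tt.size)

      # Band-pass around the breathing band before forming the analytic signal
      b, a = butter(3, [0.02, 0.1], btype="band", fs=fs)
      osc = filtfilt(b, a, hr - hr.mean())

      analytic = hilbert(osc)
      amplitude = np.abs(analytic)    # instantaneous oscillation amplitude
      print(f"median oscillation amplitude: {np.median(amplitude):.1f} bpm")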

  18. Noise-band factor analysis of cancer Fourier transform infrared evanescent-wave fiber optical (FTIR-FEW) spectra

    NASA Astrophysics Data System (ADS)

    Sukuta, Sydney; Bruch, Reinhard F.

    2002-05-01

The goal of this study is to test the feasibility of using noise factor/eigenvector bands as general clinical analytical tools for diagnosis. We developed a new technique, Noise Band Factor Cluster Analysis (NBFCA), to diagnose benign tumors via their Fourier transform IR fiber optic evanescent wave spectral data for the first time. The middle IR region of normal human skin tissue and of benign and melanoma tumors was analyzed using this new diagnostic technique. Our results are not in full agreement with pathological classifications; hence, there is a possibility that our approach could complement or improve these traditional classification schemes. Moreover, the use of NBFCA makes it much easier to delineate class boundaries, so this method provides results with much higher certainty.

  19. Interferogram conditioning for improved Fourier analysis and application to X-ray phase imaging by grating interferometry.

    PubMed

    Montaux-Lambert, Antoine; Mercère, Pascal; Primot, Jérôme

    2015-11-02

    An interferogram conditioning procedure, for subsequent phase retrieval by Fourier demodulation, is presented here as a fast iterative approach aiming at fulfilling the classical boundary conditions imposed by Fourier transform techniques. Interference fringe patterns with typical edge discontinuities were simulated in order to reveal the edge artifacts that classically appear in traditional Fourier analysis, and were consecutively used to demonstrate the correction efficiency of the proposed conditioning technique. Optimization of the algorithm parameters is also presented and discussed. Finally, the procedure was applied to grating-based interferometric measurements performed in the hard X-ray regime. The proposed algorithm enables nearly edge-artifact-free retrieval of the phase derivatives. A similar enhancement of the retrieved absorption and fringe visibility images is also achieved.
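
    A minimal sketch of the classical Fourier demodulation step that the paper's conditioning procedure prepares for (Takeda-style fringe analysis): isolate the carrier sideband in the 2D spectrum, shift it to the origin, and take the phase. The paper's iterative conditioning algorithm is not reproduced; simple mirror padding stands in for it here, and all sizes, carrier frequency, and the toy phase are assumptions.

      # Fourier demodulation of a synthetic fringe pattern (sketch).
      import numpy as np

      ny, nx, f0 = 128, 128, 10          # f0: carrier fringes across the field
      y, x = np.mgrid[0:ny, 0:nx]
      phase = 2 * np.pi * 2 * (x / nx) ** 2          # toy phase to recover
      fringes = 1 + np.cos(2 * np.pi * f0 * x / nx + phase)

      padded = np.pad(fringes, nx // 2, mode="reflect")   # crude conditioning
      F = np.fft.fftshift(np.fft.fft2(padded))

      # Isolate the +f0 sideband, shift it to the origin, inverse transform
      cy, cx = np.array(F.shape) // 2
      carrier = round(f0 * padded.shape[1] / nx)
      side = np.zeros_like(F)
      side[:, cx + carrier - 12: cx + carrier + 12] = \
          F[:, cx + carrier - 12: cx + carrier + 12]
      demod = np.fft.ifft2(np.fft.ifftshift(np.roll(side, -carrier, axis=1)))

      recovered = np.unwrap(np.angle(demod), axis=1)[nx // 2: -nx // 2,
                                                     nx // 2: -nx // 2]
      err = (recovered - recovered.mean()) - (phase - phase.mean())
      print(f"phase RMS error: {np.std(err):.3f} rad")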

  20. Measurement of Galactic Logarithmic Spiral Arm Pitch Angle Using Two-dimensional Fast Fourier Transform Decomposition

    NASA Astrophysics Data System (ADS)

    Davis, Benjamin L.; Berrier, Joel C.; Shields, Douglas W.; Kennefick, Julia; Kennefick, Daniel; Seigar, Marc S.; Lacy, Claud H. S.; Puerari, Ivânio

    2012-04-01

    A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical quotes of pitch angle of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing two-dimensional fast Fourier transformations of images of spiral galaxies, in order to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature. This will allow comparison of spiral galaxy pitch angle to other galactic parameters and test spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques.
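
    A minimal sketch of the core of the 2D FFT approach on a synthetic two-armed spiral: resample the image onto a (ln r, theta) grid, where a logarithmic spiral of pitch angle phi and harmonic mode m becomes a plane wave of radial frequency p = m / tan(phi), then read the pitch angle off the dominant frequency. The paper's pipeline adds deprojection, radius selection, and stability checks not shown here; all grid sizes are assumptions, and the finite ln r span quantizes the recovered angle.

      # Pitch angle of a synthetic logarithmic spiral via 2D FFT (sketch).
      import numpy as np
      from scipy.ndimage import map_coordinates

      m, pitch_true = 2, np.deg2rad(22.0)           # arm count, pitch angle
      n = 256
      y, x = np.mgrid[0:n, 0:n] - n / 2.0
      r = np.hypot(x, y) + 1e-9
      theta = np.arctan2(y, x)
      # Constant phase along theta + ln(r)/tan(pitch) defines the spiral
      image = 1 + np.cos(m * (theta + np.log(r) / np.tan(pitch_true)))
      image *= np.exp(-r / 60.0)                    # disk-like envelope

      # Resample onto a regular (u = ln r, theta) grid
      n_u, n_t = 128, 256
      u = np.linspace(np.log(8), np.log(100), n_u)
      tg = np.linspace(-np.pi, np.pi, n_t, endpoint=False)
      uu, tt = np.meshgrid(u, tg, indexing="ij")
      coords = [n / 2.0 + np.exp(uu) * np.sin(tt),    # row (y) coordinates
                n / 2.0 + np.exp(uu) * np.cos(tt)]    # col (x) coordinates
      polar = map_coordinates(image, coords, order=1)

      # For azimuthal mode m, find the dominant frequency p along ln r
      F = np.fft.fft2(polar)
      p = 2 * np.pi * np.fft.fftfreq(n_u, d=u[1] - u[0])
      spectrum = np.abs(F[:, m])
      p_max = p[1:][np.argmax(spectrum[1:])]        # skip the DC bin
      pitch_est = np.arctan(m / abs(p_max))
      print(f"recovered pitch angle: {np.rad2deg(pitch_est):.1f} deg")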

  1. Trace elements content in the selected medicinal plants traditionally used for curing skin diseases by the natives of Mizoram, India.

    PubMed

    Rajan, Jay Prakash; Singh, Kshetrimayum Birla; Kumar, Sanjiv; Mishra, Raj Kumar

    2014-09-01

    To determine the trace elements content in the selected medicinal plants, namely, Eryngium foetidum L., Mimosa pudica L., Polygonum plebeium, and Prunus cerasoides D. Don traditionally used by the natives of the Mizoram, one of the north eastern states in India as their folklore medicines for curing skin diseases like eczema, leg and fingers infection, swelling and wound. A 3 MeV proton beam of proton induced X-ray emission technique, one of the most powerful techniques for its quick multi elemental trace analysis capability and high sensitivity was used to detect and characterized for trace elements. The studies revealed that six trace elements, namely, Fe, Zn, Cu, Mn, V, and Co detected in mg/L unit were present in varying concentrations in the selected medicinal plants with high and notable concentration of Fe, Zn, Mn and appreciable amount of the Cu, Co and V in all the plants. The results of the present study support the therapeutic usage of these medicinal plants in the traditional practices for curing skin diseases since they are found to contain appreciable amount of the Fe, Zn, Cu, Mn, V and Co. Copyright © 2014 Hainan Medical College. Published by Elsevier B.V. All rights reserved.

  2. Human Fear Chemosignaling: Evidence from a Meta-Analysis.

    PubMed

    de Groot, Jasper H B; Smeets, Monique A M

    2017-10-01

Alarm pheromones are widely used in the animal kingdom. Notably, there are 26 published studies (N = 1652) highlighting a human capacity to communicate fear, stress, and anxiety via body odor from one person (66% males) to another (69% females). The question is whether the findings of this literature reflect a true effect, and what the average effect size is. These questions were answered by combining traditional meta-analysis with novel meta-analytical tools, p-curve analysis and p-uniform, techniques that could indicate whether findings are likely to reflect a true effect based on the distribution of P-values. A traditional random-effects meta-analysis yielded a small-to-moderate effect size (Hedges' g: 0.36, 95% CI: 0.31-0.41), p-curve analysis showed evidence diagnostic of a true effect (ps < 0.0001), and there was no evidence for publication bias. This meta-analysis did not assess the internal validity of the current studies; yet, the combined results illustrate the statistical robustness of a field in human olfaction dealing with the human capacity to communicate certain emotions (fear, stress, anxiety) via body odor. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
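
    A minimal sketch of the traditional random-effects pooling step mentioned above, using the DerSimonian-Laird estimator on toy per-study effect sizes. The p-curve and p-uniform analyses, which are the abstract's novel tools, are not reproduced here; all inputs are synthetic.

      # DerSimonian-Laird random-effects pooling of Hedges' g (sketch).
      import numpy as np

      g = np.array([0.42, 0.31, 0.55, 0.18, 0.40])   # per-study g (toy)
      v = np.array([0.02, 0.03, 0.05, 0.02, 0.04])   # per-study variances (toy)

      w = 1.0 / v                                    # fixed-effect weights
      g_fe = np.sum(w * g) / np.sum(w)
      Q = np.sum(w * (g - g_fe) ** 2)                # heterogeneity statistic
      df = len(g) - 1
      c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
      tau2 = max(0.0, (Q - df) / c)                  # between-study variance

      w_re = 1.0 / (v + tau2)                        # random-effects weights
      g_re = np.sum(w_re * g) / np.sum(w_re)
      se = np.sqrt(1.0 / np.sum(w_re))
      print(f"pooled g = {g_re:.2f}, 95% CI = "
            f"({g_re - 1.96 * se:.2f}, {g_re + 1.96 * se:.2f}), "
            f"tau^2 = {tau2:.3f}")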

  3. Industrial application of green chromatography--I. Separation and analysis of niacinamide in skincare creams using pure water as the mobile phase.

    PubMed

    Yang, Yu; Strickland, Zackary; Kapalavavi, Brahmam; Marple, Ronita; Gamsky, Chris

    2011-03-15

    In this work, chromatographic separation of niacin and niacinamide using pure water as the sole component in the mobile phase has been investigated. The separation and analysis of niacinamide have been optimized using three columns at different temperatures and various flow rates. Our results clearly demonstrate that separation and analysis of niacinamide from skincare products can be achieved using pure water as the eluent at 60°C on a Waters XTerra MS C18 column, a Waters XBridge C18 column, or at 80°C on a Hamilton PRP-1 column. The separation efficiency, quantification quality, and analysis time of this new method are at least comparable with those of the traditional HPLC methods. Compared with traditional HPLC, the major advantage of this newly developed green chromatography technique is the elimination of organic solvents required in the HPLC mobile phase. In addition, the pure water chromatography separations described in this work can be directly applied in industrial plant settings without further modification of the existing HPLC equipment. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Traditional Agriculture and Permaculture.

    ERIC Educational Resources Information Center

    Pierce, Dick

    1997-01-01

Discusses the benefits of combining traditional agricultural techniques with the concepts of "permaculture," a framework for revitalizing traditions, culture, and spirituality. Describes school, college, and community projects that have assisted American Indian communities in revitalizing sustainable agricultural practices that incorporate…

  5. Spectroscopic analysis technique for arc-welding process control

    NASA Astrophysics Data System (ADS)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. Particularly, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
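
    A minimal sketch of the Boltzmann-plot estimate of plasma electronic temperature that this kind of monitoring tracks: ln(I*lambda/gA) is linear in the upper-level energy with slope -1/(k_B*T). The line data below (wavelengths, gA values, upper-level energies) are synthetic illustrative numbers, not a vetted spectroscopic database, and the paper's peak-extraction pipeline is not reproduced.

      # Boltzmann-plot electron temperature from line intensities (sketch).
      import numpy as np

      rng = np.random.default_rng(2)
      k_B = 8.617e-5                  # Boltzmann constant, eV/K

      # Per line: upper-level energy E_k (eV), degeneracy*A (s^-1),
      # wavelength (nm). Intensities simulated for a 7000 K plasma.
      E_k = np.array([3.0, 4.2, 5.1, 6.4])
      gA = np.array([2.0e8, 5.0e7, 1.2e8, 3.0e7])
      lam = np.array([500.0, 480.0, 460.0, 430.0])
      T_true = 7000.0
      I = gA / lam * np.exp(-E_k / (k_B * T_true)) \
          * (1 + 0.02 * rng.standard_normal(4))

      # ln(I*lam/gA) is linear in E_k with slope -1/(k_B * T)
      yv = np.log(I * lam / gA)
      slope, _ = np.polyfit(E_k, yv, 1)
      print(f"estimated T_e = {-1.0 / (k_B * slope):.0f} K")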

  6. Acquiring, recording, and analyzing pathology data from experimental mice: an overview.

    PubMed

    Scudamore, Cheryl L

    2014-03-21

    Pathology is often underutilized as an end point in mouse studies in academic research because of a lack of experience and expertise. The use of traditional pathology techniques including necropsy and microscopic analysis can be useful in identifying the basic processes underlying a phenotype and facilitating comparison with equivalent human diseases. This overview aims to provide a guide and reference to the acquisition, recording, and analysis of high-quality pathology data from experimental mice in an academic research setting. Copyright © 2014 John Wiley & Sons, Inc.

  7. Spectral analysis of bacanora (agave-derived liquor) by using FT-Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Ortega Clavero, Valentin; Weber, Andreas; Schröder, Werner; Curticapean, Dan

    2016-04-01

The industry of the agave-derived bacanora, in the northern Mexican state of Sonora, has grown substantially in recent years. However, this higher demand remains subject to a variety of social, legal, cultural, ecological and economic influences. The governmental institutions of the state have tried to encourage sustainable development and a degree of standardization in the production of bacanora by applying different economic and legal strategies. However, a large portion of this alcoholic beverage is still produced in a traditional and rudimentary fashion. Beyond the quality of the beverage, the lack of proper control using adequate instrumental methods may represent a health risk, as traditionally distilled beverages can in several cases contain elevated levels of harmful compounds. The present article describes the qualitative spectral analysis of samples of the traditionally produced distilled beverage bacanora in the range from 0 cm-1 to 3500 cm-1 using a Fourier transform Raman spectrometer. This particular technique has not previously been explored for the analysis of bacanora, as it has for other beverages, including tequila. The proposed instrumental arrangement for the spectral analysis was built by combining conventional hardware (Michelson interferometer, photodiodes, visible laser, etc.) with a set of self-developed evaluation algorithms. The resulting spectral information was compared with that of pure ethanol samples and with spectra from different samples of the alcoholic beverage tequila. The proposed instrumental arrangement can be used for the analysis of bacanora.

  8. Portable Wireless LAN Device and Two-Way Radio Threat Assessment for Aircraft VHF Communication Radio Band

    NASA Technical Reports Server (NTRS)

    Nguyen, Truong X.; Koppen, Sandra V.; Ely, Jay J.; Williams, Reuben A.; Smith, Laura J.; Salud, Maria Theresa P.

    2004-01-01

    This document summarizes the safety analysis performed on a Flight Guidance System (FGS) requirements model. In particular, the safety properties desired of the FGS model are identified and the presence of the safety properties in the model is formally verified. Chapter 1 provides an introduction to the entire project, while Chapter 2 gives a brief overview of the problem domain, the nature of accidents, model based development, and the four-variable model. Chapter 3 outlines the approach. Chapter 4 presents the results of the traditional safety analysis techniques and illustrates how the hazardous conditions associated with the system trace into specific safety properties. Chapter 5 presents the results of the formal methods analysis technique model checking that was used to verify the presence of the safety properties in the requirements model. Finally, Chapter 6 summarizes the main conclusions of the study, first and foremost that model checking is a very effective verification technique to use on discrete models with reasonable state spaces. Additional supporting details are provided in the appendices.

  9. Fast algorithm for spectral processing with application to on-line welding quality assurance

    NASA Astrophysics Data System (ADS)

    Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.

    2006-10-01

    A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate in real-time the plasma electronic temperature. The estimation of the electronic temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms will be analysed and compared, and it will be shown that the LPO (linear phase operator) sub-pixel algorithm is a better solution within the proposed system. Experimental tests during TIG-welding using a fibre optic to capture the arc light, together with a low cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
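
    For context, a common three-point parabolic sub-pixel peak estimator is sketched below as a simple baseline on a synthetic emission line. The paper's LPO (linear phase operator) algorithm is a different estimator whose formula is not given in the abstract and is not reproduced here.

      # Three-point parabolic sub-pixel peak refinement (baseline sketch).
      import numpy as np

      def subpixel_peak(y):
          """Refine the argmax by fitting a parabola through the maximum
          sample and its two neighbours; vertex offset is
          0.5*(a - c)/(a - 2b + c)."""
          i = int(np.argmax(y))
          if i == 0 or i == len(y) - 1:
              return float(i)               # no neighbours to interpolate
          a, b, c = y[i - 1], y[i], y[i + 1]
          return i + 0.5 * (a - c) / (a - 2 * b + c)

      # Gaussian emission line sampled on a coarse pixel grid
      x = np.arange(64)
      true_center = 31.37
      line = np.exp(-0.5 * ((x - true_center) / 2.5) ** 2)
      print(f"integer argmax: {np.argmax(line)}, "
            f"sub-pixel estimate: {subpixel_peak(line):.3f}")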

  10. Classroom Assessment Techniques: Checking for Student Understanding in an Introductory University Success Course

    ERIC Educational Resources Information Center

    Holbeck, Rick; Bergquist, Emily; Lees, Sheila

    2014-01-01

    Classroom Assessment Techniques (CATs) have been used in traditional university classrooms as a strategy to check for student understanding (Angelo & Cross, 1993). With the emergence of online learning and its popularity for non-traditional students, it is equally important that instructors in the online environment check for student…

  11. Which is the preferred revision technique for loosened iliac screw? A novel technique of boring cement injection from the outer cortical shell.

    PubMed

    Yu, Bin-Sheng; Yang, Zhan-Kun; Li, Ze-Min; Zeng, Li-Wen; Wang, Li-Bing; Lu, William Weijia

    2011-08-01

An in vitro biomechanical cadaver study. To evaluate the pull-out strength after 5000 cycles of loading among 4 revision techniques for the loosened iliac screw, using corticocancellous bone, a longer screw, traditional cement augmentation, and boring cement augmentation. Iliac screw loosening is still a clinical problem in lumbo-iliac fusion. Although many revision techniques using corticocancellous bone, larger screws, and polymethylmethacrylate (PMMA) augmentation have been applied in repairing pedicle screw loosening, their biomechanical effects on the loosened iliac screw remain undetermined. Eight fresh human cadaver pelvises with bone mineral density values ranging from 0.83 to 0.97 g/cm² were used in this study. After testing the primary screw of 7.5 mm diameter and 70 mm length, 4 revision techniques were sequentially established and tested on the same pelvis as follows: corticocancellous bone, a longer screw of 100 mm length, traditional PMMA augmentation, and boring PMMA augmentation. The boring technique differs from traditional PMMA augmentation in that PMMA is injected into the screw tract through 3 boring holes in the outer cortical shell without removing the screw. On an MTS machine, after 5000 cycles of compressive loading of -200∼-500 N applied to the screw head, the axial maximum pull-out strengths of the 5 screws were measured and analyzed. The pull-out strengths of the primary screw and the 4 revised screws with corticocancellous bone, longer screw, and traditional and boring PMMA augmentation were 1167 N, 361 N, 854 N, 1954 N, and 1820 N, respectively. Although the longer-screw method obtained significantly higher pull-out strength than corticocancellous bone (P<0.05), the screws revised using these 2 techniques exhibited notably lower pull-out strength than the primary screw and the 2 PMMA-augmented screws (P<0.05). Both the traditional and the boring PMMA screws showed markedly higher pull-out strength than the primary screw (P<0.05); however, no significant difference in pull-out strength was detected between the 2 PMMA screws (P>0.05). Wadding corticocancellous bone and increasing screw length failed to provide sufficient anchoring strength for a loosened iliac screw; however, both traditional and boring PMMA-augmented techniques effectively increased the fixation strength. From the viewpoint of minimal invasion, boring PMMA augmentation may serve as a suitable salvage technique for iliac screw loosening.

  12. Traditional Chinese food technology and cuisine.

    PubMed

    Li, Jian-rong; Hsieh, Yun-Hwa P

    2004-01-01

    From ancient wisdom to modern science and technology, Chinese cuisine has been established from a long history of the country and gained a global reputation of its sophistication. Traditional Chinese foods and cuisine that exhibit Chinese culture, art and reality play an essential role in Chinese people's everyday lives. Recently, traditional Chinese foods have drawn a great degree of attention from food scientists and technologists, the food industry, and health promotion institutions worldwide due to the extensive values they offer beyond being merely another ethnic food. These traditional foods comprise a wide variety of products, such as pickled vegetables, salted fish and jellyfish, tofu and tofu derived products, rice and rice snack foods, fermented sauces, fish balls and thousand-year-old eggs. An overview of selected popular traditional Chinese foods and their processing techniques are included in this paper. Further development of the traditional techniques for formulation and production of these foods is expected to produce economic, social and health benefits.

  13. Introduction to bioengineering: melding of engineering and biological sciences.

    PubMed

    Shoureshi, Rahmat A

    2005-04-01

    Engineering has traditionally focused on the external extensions of organisms, such as transportation systems, high-rise buildings, and entertainment systems. In contrast, bioengineering is concerned with inward processes of biologic organisms. Utilization of engineering principles and techniques in the analysis and solution of problems in medicine and biology is the basis for bioengineering. This article discusses subspecialties in bioengineering and presents examples of projects in this discipline.

  14. Insights into the microbial diversity and community dynamics of Chinese traditional fermented foods from using high-throughput sequencing approaches

    PubMed Central

    He, Guo-qing; Liu, Tong-jie; Sadiq, Faizan A.; Gu, Jing-si; Zhang, Guo-hua

    2017-01-01

    Chinese traditional fermented foods have a very long history dating back thousands of years and have become an indispensable part of Chinese dietary culture. A plethora of research has been conducted to unravel the composition and dynamics of microbial consortia associated with Chinese traditional fermented foods using culture-dependent as well as culture-independent methods, like different high-throughput sequencing (HTS) techniques. These HTS techniques enable us to understand the relationship between a food product and its microbes to a greater extent than ever before. Considering the importance of Chinese traditional fermented products, the objective of this paper is to review the diversity and dynamics of microbiota in Chinese traditional fermented foods revealed by HTS approaches. PMID:28378567

  15. How to Define the Mean Square Amplitude of Solar Wind Fluctuations With Respect to the Local Mean Magnetic Field

    NASA Astrophysics Data System (ADS)

    Podesta, John J.

    2017-12-01

    Over the last decade it has become popular to analyze turbulent solar wind fluctuations with respect to a coordinate system aligned with the local mean magnetic field. This useful analysis technique has provided new information and new insights about the nature of solar wind fluctuations and provided some support for phenomenological theories of MHD turbulence based on the ideas of Goldreich and Sridhar. At the same time it has drawn criticism suggesting that the use of a scale-dependent local mean field is somehow inconsistent or irreconcilable with traditional analysis techniques based on second-order structure functions and power spectra that, for stationary time series, are defined with respect to the constant (scale-independent) ensemble average magnetic field. Here it is shown that for fluctuations with power law spectra, such as those observed in solar wind turbulence, it is possible to define the local mean magnetic field in a special way such that the total mean square amplitude (trace amplitude) of turbulent fluctuations is approximately the same, scale by scale, as that obtained using traditional second-order structure functions or power spectra. This fact should dispel criticism concerning the physical validity or practical usefulness of the local mean magnetic field in these applications.
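
    A minimal sketch of the two traditional amplitude measures the abstract contrasts, on a synthetic signal with a solar-wind-like power-law spectrum: the second-order structure function (mean square increment) and the spectrum it corresponds to. The paper's local-mean-field construction itself is not reproduced here, and the -5/3 spectral index and signal length are assumptions.

      # Second-order structure function of a synthetic power-law signal.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 2 ** 14
      # Signal with power spectrum ~ f^(-5/3), like solar wind fluctuations
      f = np.fft.rfftfreq(n, d=1.0)
      amp = np.zeros_like(f)
      amp[1:] = f[1:] ** (-5.0 / 6.0)            # sqrt of f^(-5/3)
      phases = np.exp(2j * np.pi * rng.random(f.size))
      b = np.fft.irfft(amp * phases, n)

      def structure_function(sig, lag):
          """Mean square increment <|b(t + lag) - b(t)|^2>."""
          return np.mean((sig[lag:] - sig[:-lag]) ** 2)

      # For a -5/3 spectrum, S2(lag) should scale roughly as lag^(2/3)
      for lag in (4, 16, 64, 256):
          print(f"lag {lag:4d}: S2 = {structure_function(b, lag):.4g}")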

  16. Microbial composition of the Korean traditional food "kochujang" analyzed by a massive sequencing technique.

    PubMed

    Nam, Young-Do; Park, So-lim; Lim, Seong-Il

    2012-04-01

    Kochujang is a traditional Korean fermented food that is made with red pepper, glutinous rice, salt, and soybean. Kochujang is fermented by naturally occurring microorganisms through which it obtains various health-promoting properties. In this study, the bacterial diversities of 9 local and 2 commercial brands of kochujang were analyzed with a barcoded pyrosequencing technique targeting the hyper-variable regions V1/V2 of the 16S rRNA gene. Through the analysis of 13524 bacterial pyrosequences, 223 bacterial species were identified, most of which converged on the phylum Firmicutes (average 93.1%). All of the kochujang samples were largely populated (>90.9% of abundance) by 12 bacterial families, and Bacillaceae showed the highest abundance in all but one sample. Bacillus subtilis and B. licheniformis were the most dominant bacterial species and were broadly distributed among the kochujang samples. Each sample contained a high abundance of region-specific bacterial species, such as B. sonorensis, B. pumilus, Weissella salipiscis, and diverse unidentified Bacillus species. Phylotype- and phylogeny-based community comparison analysis showed that the microbial communities of the two commercial brands were different from those of the local brands. Moreover, each local brand kochujang sample had region-specific microbial community reflecting the manufacturing environment. © 2012 Institute of Food Technologists®

  17. A depth-of-field limited particle image velocimetry technique applied to oscillatory boundary layer flow over a porous bed

    NASA Astrophysics Data System (ADS)

    Lara, J. L.; Cowen, E. A.; Sou, I. M.

    2002-06-01

    Boundary layer flows are ubiquitous in the environment, but their study is often complicated by their thinness, geometric irregularity and boundary porosity. In this paper, we present an approach to making laboratory-based particle image velocimetry (PIV) measurements in these complex flow environments. Clear polycarbonate spheres were used to model a porous and rough bed. The strong curvature of the spheres results in a diffuse volume illuminated region instead of the more traditional finite and thin light sheet illuminated region, resulting in the imaging of both in-focus and significantly out-of-focus particles. Results of a traditional cross-correlation-based PIV-type analysis of these images demonstrate that the mean and turbulent features of an oscillatory boundary layer driven by a free-surface wave over an irregular-shaped porous bed can be robustly measured. Measurements of the mean flow, turbulent intensities, viscous and turbulent stresses are presented and discussed. Velocity spectra have been calculated showing an inertial subrange confirming that the PIV analysis is sufficiently robust to extract turbulence. The presented technique is particularly well suited for the study of highly dynamic free-surface flows that prevent the delivery of the light sheet from above the bed, such as swash flows.

  18. Towards scar-free surgery: An analysis of the increasing complexity from laparoscopic surgery to NOTES

    PubMed Central

    Chellali, Amine; Schwaitzberg, Steven D.; Jones, Daniel B.; Romanelli, John; Miller, Amie; Rattner, David; Roberts, Kurt E.; Cao, Caroline G.L.

    2014-01-01

    Background NOTES is an emerging technique for performing surgical procedures, such as cholecystectomy. Debate about its real benefit over the traditional laparoscopic technique is on-going. There have been several clinical studies comparing NOTES to conventional laparoscopic surgery. However, no work has been done to compare these techniques from a Human Factors perspective. This study presents a systematic analysis describing and comparing different existing NOTES methods to laparoscopic cholecystectomy. Methods Videos of endoscopic/laparoscopic views from fifteen live cholecystectomies were analyzed to conduct a detailed task analysis of the NOTES technique. A hierarchical task analysis of laparoscopic cholecystectomy and several hybrid transvaginal NOTES cholecystectomies was performed and validated by expert surgeons. To identify similarities and differences between these techniques, their hierarchical decomposition trees were compared. Finally, a timeline analysis was conducted to compare the steps and substeps. Results At least three variations of the NOTES technique were used for cholecystectomy. Differences between the observed techniques at the substep level of hierarchy and on the instruments being used were found. The timeline analysis showed an increase in time to perform some surgical steps and substeps in NOTES compared to laparoscopic cholecystectomy. Conclusion As pure NOTES is extremely difficult given the current state of development in instrumentation design, most surgeons utilize different hybrid methods – combination of endoscopic and laparoscopic instruments/optics. Results of our hierarchical task analysis yielded an identification of three different hybrid methods to perform cholecystectomy with significant variability amongst them. The varying degrees to which laparoscopic instruments are utilized to assist in NOTES methods appear to introduce different technical issues and additional tasks leading to an increase in the surgical time. The NOTES continuum of invasiveness is proposed here as a classification scheme for these methods, which was used to construct a clear roadmap for training and technology development. PMID:24902811

  19. Analysis of calibration data for the uranium active neutron coincidence counting collar with attention to errors in the measured neutron coincidence rate

    DOE PAGES

    Croft, Stephen; Burr, Thomas Lee; Favalli, Andrea; ...

    2015-12-10

We report that the declared linear density of 238U and 235U in fresh low enriched uranium light water reactor fuel assemblies can be verified for nuclear safeguards purposes using a neutron coincidence counter collar in passive and active mode, respectively. The active mode calibration of the Uranium Neutron Collar – Light water reactor fuel (UNCL) instrument is normally performed using a non-linear fitting technique. The fitting technique relates the measured neutron coincidence rate (the predictor) to the linear density of 235U (the response) in order to estimate model parameters of the nonlinear Padé equation, which traditionally is used to model the calibration data. Alternatively, following a simple data transformation, the fitting can also be performed using standard linear fitting methods. This paper compares performance of the nonlinear technique to the linear technique, using a range of possible error variance magnitudes in the measured neutron coincidence rate. We develop the required formalism and then apply the traditional (nonlinear) and alternative (linear) approaches to the same experimental and corresponding simulated representative datasets. Lastly, we find that, in this context, because of the magnitude of the errors in the predictor, it is preferable not to transform to a linear model, and it is preferable not to adjust for the errors in the predictor when inferring the model parameters.
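
    A minimal sketch of the two calibration strategies being compared, using an illustrative Padé-type form D = a*rho/(1 + b*rho) relating 235U linear density rho to coincidence rate D. The actual UNCL calibration function, data, and error treatment are in the paper, not here; all numbers below are synthetic. Note that the reciprocal transformation that linearizes the model also transforms the error structure, which is the crux of the comparison.

      # Nonlinear Pade fit vs. linearized fit of a toy calibration (sketch).
      import numpy as np
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(4)
      a_true, b_true = 50.0, 0.8
      rho = np.linspace(5, 25, 8)                 # g 235U per cm (toy)
      D = a_true * rho / (1 + b_true * rho) \
          * (1 + 0.01 * rng.standard_normal(8))

      # 1) Direct nonlinear fit of the Pade form
      def pade(r, a, b):
          return a * r / (1 + b * r)
      (a_nl, b_nl), _ = curve_fit(pade, rho, D, p0=(10.0, 0.1))

      # 2) Linearized fit: 1/D = 1/(a*rho) + b/a, a line in 1/rho
      slope, intercept = np.polyfit(1.0 / rho, 1.0 / D, 1)
      a_lin = 1.0 / slope
      b_lin = intercept * a_lin
      print(f"nonlinear: a={a_nl:.1f}, b={b_nl:.2f}; "
            f"linearized: a={a_lin:.1f}, b={b_lin:.2f}")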

  20. Line identification studies using traditional techniques and wavelength coincidence statistics

    NASA Technical Reports Server (NTRS)

    Cowley, Charles R.; Adelman, Saul J.

    1990-01-01

Traditional line identification techniques result in the assignment of individual lines to an atomic or ionic species. These methods may be supplemented by wavelength coincidence statistics (WCS). The strengths and weaknesses of these methods are discussed using spectra of a number of normal and peculiar B and A stars that have been studied independently by both methods. The present results support the overall findings of some earlier studies. WCS would be most useful in a first survey, before traditional methods have been applied. WCS can quickly make a global search for all species and in this way may enable identification of an unexpected spectrum that could easily be omitted entirely from a traditional study. This is illustrated by O I. WCS is subject to the well-known weaknesses of any statistical technique; for example, a predictable number of spurious results is to be expected. The dangers of small-number statistics are illustrated. WCS is at its best relative to traditional methods in finding a line-rich atomic species that is only weakly present in a complicated stellar spectrum.
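
    A minimal Monte Carlo sketch of the coincidence-counting idea behind WCS: count observed stellar lines falling within a tolerance of a species' laboratory line list, then compare against counts for random line lists. Real WCS uses shifted control windows rather than this simple null, and all wavelengths below are synthetic, not a real line list.

      # Wavelength coincidence counting with a Monte Carlo null (sketch).
      import numpy as np

      rng = np.random.default_rng(5)
      lab = np.sort(rng.uniform(4000, 4500, 150))       # species lines (A)
      observed = rng.uniform(4000, 4500, 300)           # stellar lines (A)
      observed[:60] = lab[:60] + rng.normal(0, 0.01, 60)  # planted matches

      def hits(obs, lab, tol=0.05):
          """Observed lines within tol Angstrom of any lab line."""
          idx = np.clip(np.searchsorted(lab, obs), 1, len(lab) - 1)
          nearest = np.minimum(np.abs(obs - lab[idx - 1]),
                               np.abs(obs - lab[idx]))
          return int(np.sum(nearest < tol))

      n_obs = hits(observed, lab)
      null = [hits(rng.uniform(4000, 4500, 300), lab) for _ in range(500)]
      z = (n_obs - np.mean(null)) / np.std(null)
      print(f"hits: {n_obs}, null mean: {np.mean(null):.1f}, z = {z:.1f}")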

  1. Direct Detection of Pharmaceuticals and Personal Care Products from Aqueous Samples with Thermally-Assisted Desorption Electrospray Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Campbell, Ian S.; Ton, Alain T.; Mulligan, Christopher C.

    2011-07-01

    An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.

  2. Direct detection of pharmaceuticals and personal care products from aqueous samples with thermally-assisted desorption electrospray ionization mass spectrometry.

    PubMed

    Campbell, Ian S; Ton, Alain T; Mulligan, Christopher C

    2011-07-01

    An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.
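
    A minimal sketch of the external-standard quantitation described above (no internal standard): fit a linear calibration of signal versus concentration, report linearity, and estimate a detection limit as 3*sigma/slope. All concentrations and responses are synthetic illustrative numbers.

      # External-standard calibration line and LOD estimate (sketch).
      import numpy as np

      rng = np.random.default_rng(6)
      conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])   # ng/L (toy)
      signal = 230.0 * conc + rng.normal(0, 15, conc.size)

      slope, intercept = np.polyfit(conc, signal, 1)
      resid = signal - (slope * conc + intercept)
      sigma = resid.std(ddof=2)               # residual SD, 2 params fitted
      lod = 3 * sigma / slope                 # common 3*sigma/slope rule
      r = np.corrcoef(conc, signal)[0, 1]
      print(f"slope {slope:.0f}, r^2 = {r**2:.4f}, LOD ~ {lod:.3f} ng/L")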

  3. Quick detection of traditional Chinese medicine ‘Atractylodis Macrocephalae Rhizoma’ pieces by surface-enhanced Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Hao; Shi, Hong; Feng, Shangyuan; Lin, Juqiang; Chen, Weiwei; Yu, Yun; Lin, Duo; Xu, Qian; Chen, Rong

    2013-01-01

A surface-enhanced Raman spectroscopy (SERS) method was developed for the analysis of traditional Chinese medicine ‘Atractylodis Macrocephalae Rhizoma’ pieces (AMRP) for the first time, with the aim of developing a quick method for traditional Chinese medicine detection. Both Raman spectra and SERS spectra were obtained from AMRP, and tentative assignments of the Raman bands in the measured spectra suggested that only a few weak Raman peaks could be observed in the regular Raman spectra, while primary Raman peaks at around 536, 555, 619, 648, 691, 733, 790, 958, 1004, 1031, 1112, 1244, 1324, 1395, 1469, 1574 and 1632 cm-1 could be observed in the SERS spectra, with the strongest signals at 619, 733, 958, 1324, 1395 and 1469 cm-1. This is due to a strong interaction between the silver colloids and the AMRP, which leads to an extraordinary enhancement in the intensity of the Raman scattering from AMRP. This exploratory study suggests that the SERS technique has great potential to provide a novel non-destructive method for effectively and accurately detecting traditional Chinese medicine without complicated separation and extraction.

  4. Network meta-analysis: a technique to gather evidence from direct and indirect comparisons

    PubMed Central

    2017-01-01

Systematic reviews and pairwise meta-analyses of randomized controlled trials, at the intersection of clinical medicine, epidemiology and statistics, are positioned at the top of the evidence-based practice hierarchy. These are important tools for drug approval, for formulating clinical protocols and guidelines, and for decision-making. However, this traditional technique only partially yields the information that clinicians, patients and policy-makers need to make informed decisions, since it usually compares only two interventions at a time. In practice, regardless of the clinical condition under evaluation, many interventions are usually available, and few of them have been compared in head-to-head studies. This scenario precludes drawing conclusions about the full profile of all interventions (e.g. efficacy and safety). The recent development and introduction of a new technique, usually referred to as network meta-analysis, indirect meta-analysis, or multiple or mixed treatment comparisons, has allowed the estimation of metrics for all possible comparisons in the same model, simultaneously gathering direct and indirect evidence. Over recent years this statistical tool has matured as a technique, with models available for all types of raw data, producing different pooled effect measures, using both Frequentist and Bayesian frameworks, with different software packages. However, the conduct, reporting and interpretation of network meta-analysis still pose multiple challenges that should be carefully considered, especially because this technique inherits all the assumptions of pairwise meta-analysis but with increased complexity. Thus, we aim to provide a basic explanation of how network meta-analysis is conducted, highlighting its risks and benefits for evidence-based practice, including information on the evolution of the statistical methods, the assumptions involved, and the steps for performing the analysis. PMID:28503228
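
    A minimal sketch of the indirect-comparison step at the heart of network meta-analysis (Bucher's adjusted indirect method): with pooled estimates of A vs B and B vs C, an A vs C estimate is obtained by differencing, with variances adding. Full network models generalize this to entire treatment networks; the effect sizes below are hypothetical.

      # Bucher-style indirect comparison on the log odds ratio scale (sketch).
      import math

      # Toy pooled log ORs and standard errors from pairwise meta-analyses
      d_AB, se_AB = -0.40, 0.12      # A vs B (hypothetical)
      d_BC, se_BC = -0.25, 0.15      # B vs C (hypothetical)

      d_AC = d_AB + d_BC             # indirect A vs C estimate
      se_AC = math.sqrt(se_AB ** 2 + se_BC ** 2)   # variances add
      lo, hi = d_AC - 1.96 * se_AC, d_AC + 1.96 * se_AC
      print(f"indirect log OR (A vs C): {d_AC:.2f}, "
            f"95% CI ({lo:.2f}, {hi:.2f})")
      print(f"as odds ratio: {math.exp(d_AC):.2f} "
            f"({math.exp(lo):.2f}, {math.exp(hi):.2f})")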

  5. Optimal focal-plane restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1989-01-01

    Image restoration can be implemented efficiently by calculating the convolution of the digital image and a small kernel during image acquisition. Processing the image in the focal-plane in this way requires less computation than traditional Fourier-transform-based techniques such as the Wiener filter and constrained least-squares filter. Here, the values of the convolution kernel that yield the restoration with minimum expected mean-square error are determined using a frequency analysis of the end-to-end imaging system. This development accounts for constraints on the size and shape of the spatial kernel and all the components of the imaging system. Simulation results indicate the technique is effective and efficient.
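
    A minimal sketch of the central idea: restoration as convolution with a small spatial kernel rather than a full Fourier-domain filter. Here the 3x3 kernel that minimizes mean-square error is found by least squares on a sample image pair; the paper instead derives the optimal kernel from a frequency analysis of the end-to-end system. The scene, blur model, and kernel size are assumptions.

      # Least-squares estimate of a small MSE-optimal restoration kernel.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(7)
      scene = gaussian_filter(rng.random((128, 128)), 0.7)   # synthetic scene
      blurred = gaussian_filter(scene, 1.0)                  # stand-in blur

      # Design matrix: each column is a shifted copy of the blurred image,
      # so A @ k is convolution of `blurred` with a 3x3 kernel k.
      rows = []
      for dy in (-1, 0, 1):
          for dx in (-1, 0, 1):
              rows.append(np.roll(blurred, (dy, dx), axis=(0, 1)).ravel())
      A = np.stack(rows, axis=1)
      k, *_ = np.linalg.lstsq(A, scene.ravel(), rcond=None)
      restored = (A @ k).reshape(scene.shape)

      mse = lambda u: np.mean((u - scene) ** 2)
      print(f"MSE blurred {mse(blurred):.2e} -> restored {mse(restored):.2e}")
      print("kernel:\n", k.reshape(3, 3).round(3))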

  6. Using Movies to Analyse Gene Circuit Dynamics in Single Cells

    PubMed Central

    Locke, James CW; Elowitz, Michael B

    2010-01-01

Many bacterial systems rely on dynamic genetic circuits to control critical processes. A major goal of systems biology is to understand these behaviours in terms of individual genes and their interactions. However, traditional techniques based on population averages wash out critical dynamics that are either unsynchronized between cells or driven by fluctuations, or ‘noise,’ in cellular components. Recently, the combination of time-lapse microscopy, quantitative image analysis, and fluorescent protein reporters has enabled direct observation of multiple cellular components over time in individual cells. In conjunction with mathematical modelling, these techniques are now providing powerful insights into genetic circuit behaviour in diverse microbial systems. PMID:19369953

  7. Whole-Body Human Inverse Dynamics with Distributed Micro-Accelerometers, Gyros and Force Sensing

    PubMed Central

    Latella, Claudia; Kuppuswamy, Naveen; Romano, Francesco; Traversaro, Silvio; Nori, Francesco

    2016-01-01

    Human motion tracking is a powerful tool used in a large range of applications that require human movement analysis. Although it is a well-established technique, its main limitation is the lack of estimation of real-time kinetics information such as forces and torques during the motion capture. In this paper, we present a novel approach for a human soft wearable force tracking for the simultaneous estimation of whole-body forces along with the motion. The early stage of our framework encompasses traditional passive marker based methods, inertial and contact force sensor modalities and harnesses a probabilistic computational technique for estimating dynamic quantities, originally proposed in the domain of humanoid robot control. We present experimental analysis on subjects performing a two degrees-of-freedom bowing task, and we estimate the motion and kinetics quantities. The results demonstrate the validity of the proposed method. We discuss the possible use of this technique in the design of a novel soft wearable force tracking device and its potential applications. PMID:27213394

  8. "The Poison That Ruined the Nation": Native American Men-Alcohol, Identity, and Traditional Healing.

    PubMed

    Matamonasa-Bennett, Arieahn

    2017-07-01

    Alcoholism and destructive drinking patterns are serious social problems in many Native American reservation and urban communities. This qualitative study of men from a single Great Lakes reservation community examined the social, cultural, and psychological aspects of their alcohol problems through their life stories. The men were in various stages of recovery and sobriety, and data collection consisted of open-ended interviews and analysis utilizing principles and techniques from grounded theory and ethnographic content analysis. Alcoholism and other serious social problems facing Native American communities need to be understood in the sociocultural and historical contexts of colonization and historical grief and trauma. This study suggests that for Native American men, there are culturally specific perspectives on alcohol that have important implications for prevention and treatment of alcohol abuse. The participants' narratives provided insight into the ways reconnecting with traditional cultural values (retraditionalization) helped them achieve sobriety. For these men, alcohol was highly symbolic of colonization as well as a protest to it. Alcohol was a means for affirming "Indian" identity and sobriety a means for reaffirming traditional tribal identity. Their narratives suggested the ways in which elements of traditional cultural values and practices facilitate healing in syncretic models and Nativized treatment. Understanding the ways in which specific Native cultural groups perceive their problems with drinking and sobriety can create more culturally congruent, culturally sensitive, and effective treatment approaches and inform future research.

  9. Traditional and Constructivist Teaching Techniques: Comparing Two Groups of Undergraduate Nonscience Majors in a Biology Lab

    ERIC Educational Resources Information Center

    Travis, Holly; Lord, Thomas

    2004-01-01

    Constructivist teaching techniques work well in various instructional settings, but many teachers remain skeptical because there is a lack of quantitative data supporting this model. This study compared an undergraduate nonmajors biology lab section taught in a traditional teacher-centered style to a similar section taught as a constructivist…

  10. Electromyographic evaluation in children orthodontically treated for skeletal Class II malocclusion: Comparison of two treatment techniques.

    PubMed

    Ortu, Eleonora; Pietropaoli, Davide; Adib, Fray; Masci, Chiara; Giannoni, Mario; Monaco, Annalisa

    2017-11-16

Objective: To compare the clinical efficacy of two techniques for fabricating a Bimler device by assessing the patient's surface electromyography (sEMG) activity at rest before treatment and six months after treatment. Methods: Twenty-four patients undergoing orthodontic treatment were enrolled in the study; 12 formed the test group and wore a Bimler device fabricated with a Myoprint impression using the neuromuscular orthodontic technique, and 12 formed the control group and were treated by the traditional orthodontic technique with a wax bite in protrusion. The "rest" sEMG of each patient was recorded prior to treatment and six months after treatment. Results: The neuromuscular-designed Bimler device was more comfortable and provided better treatment results than the traditional Bimler device. Conclusion: This study suggests that the patient group subjected to neuromuscular orthodontic treatment had a treatment outcome with more relaxed masticatory muscles and better function versus the traditional orthodontic treatment.

  11. Meta-Analyses of Diagnostic Accuracy in Imaging Journals: Analysis of Pooling Techniques and Their Effect on Summary Estimates of Diagnostic Accuracy.

    PubMed

    McGrath, Trevor A; McInnes, Matthew D F; Korevaar, Daniël A; Bossuyt, Patrick M M

    2016-10-01

    Purpose To determine whether authors of systematic reviews of diagnostic accuracy studies published in imaging journals used recommended methods for meta-analysis, and to evaluate the effect of traditional methods on summary estimates of sensitivity and specificity. Materials and Methods Medline was searched for published systematic reviews that included meta-analysis of test accuracy data limited to imaging journals published from January 2005 to May 2015. Two reviewers independently extracted study data and classified methods for meta-analysis as traditional (univariate fixed- or random-effects pooling or summary receiver operating characteristic curve) or recommended (bivariate model or hierarchic summary receiver operating characteristic curve). Use of methods was analyzed for variation with time, geographical location, subspecialty, and journal. Results from reviews in which study authors used traditional univariate pooling methods were recalculated with a bivariate model. Results Three hundred reviews met the inclusion criteria, and in 118 (39%) of those, authors used recommended meta-analysis methods. No change in the method used was observed with time (r = 0.54, P = .09); however, there was geographic (χ(2) = 15.7, P = .001), subspecialty (χ(2) = 46.7, P < .001), and journal (χ(2) = 27.6, P < .001) heterogeneity. Fifty-one univariate random-effects meta-analyses were reanalyzed with the bivariate model; the average change in the summary estimate was -1.4% (P < .001) for sensitivity and -2.5% (P < .001) for specificity. The average change in width of the confidence interval was 7.7% (P < .001) for sensitivity and 9.9% (P ≤ .001) for specificity. Conclusion Recommended methods for meta-analysis of diagnostic accuracy in imaging journals are used in a minority of reviews; this has not changed significantly with time. Traditional (univariate) methods allow overestimation of diagnostic accuracy and provide narrower confidence intervals than do recommended (bivariate) methods. (©) RSNA, 2016 Online supplemental material is available for this article.

  12. Finding Multi-scale Connectivity in Our Geospace Observational System: A New Perspective for Total Electron Content Data Through Network Analysis

    NASA Astrophysics Data System (ADS)

    McGranaghan, R. M.; Mannucci, A. J.; Verkhoglyadova, O. P.; Malik, N.

    2017-12-01

    How do we evolve beyond current traditional methods in order to innovate into the future? In what disruptive innovations will the next frontier of space physics and aeronomy (SPA) be grounded? We believe the answer to these compelling, yet equally challenging, questions lies in a shift of focus: from a narrow, field-specific view to a radically inclusive, interdisciplinary new modus operandi at the intersection of SPA and the information and data sciences. Concretely addressing these broader themes, we present results from a novel technique for knowledge discovery in the magnetosphere-ionosphere-thermosphere (MIT) system: complex network analysis (NA). We share findings from the first NA of ionospheric total electron content (TEC) data, including hemispheric and interplanetary magnetic field clock angle dependencies [1]. Our work shows that NA complements more traditional approaches for the investigation of TEC structure and dynamics, by both reaffirming well-established understanding, giving credence to the method, and identifying new connections, illustrating the exciting potential. We contextualize these new results through a discussion of the potential of data-driven discovery in the MIT system when innovative data science techniques are embraced. We address implications and potentially disruptive data analysis approaches for SPA in terms of: 1) the future of the geospace observational system; 2) understanding multi-scale phenomena; and 3) machine learning. [1] McGranaghan, R. M., A. J. Mannucci, O. Verkhoglyadova, and N. Malik (2017), Finding multiscale connectivity in our geospace observational system: Network analysis of total electron content, J. Geophys. Res. Space Physics, 122, doi:10.1002/2017JA024202.
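
    A minimal sketch of the network-analysis idea applied to TEC-like data: treat grid cells as nodes, connect cells whose time series correlate above a threshold, and inspect node degree to map connectivity. The paper's construction (thresholding, network metrics, clock-angle conditioning) is far richer; the data, threshold, and block structure below are synthetic.

      # Correlation network from synthetic TEC-like time series (sketch).
      import numpy as np

      rng = np.random.default_rng(8)
      n_cells, n_times = 40, 500
      common = rng.standard_normal(n_times)          # shared driver signal
      tec = rng.standard_normal((n_cells, n_times))
      tec[:15] += 1.5 * common                       # correlated sub-region

      corr = np.corrcoef(tec)                        # cell-by-cell correlation
      adj = (np.abs(corr) > 0.5) & ~np.eye(n_cells, dtype=bool)
      degree = adj.sum(axis=1)
      print(f"edges: {adj.sum() // 2}, "
            f"mean degree in coupled block: {degree[:15].mean():.1f}, "
            f"elsewhere: {degree[15:].mean():.1f}")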

  13. A review of costing methodologies in critical care studies.

    PubMed

    Pines, Jesse M; Fager, Samuel S; Milzman, David P

    2002-09-01

    Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.

  14. Multiscale adaptive analysis of circadian rhythms and intradaily variability: Application to actigraphy time series in acute insomnia subjects

    PubMed Central

    Rivera, Ana Leonor; Toledo-Roy, Juan C.; Ellis, Jason; Angelova, Maia

    2017-01-01

Circadian rhythms become less dominant and less regular with chronic-degenerative disease, such that to accurately assess these pathological conditions it is important to quantify not only periodic characteristics but also more irregular aspects of the corresponding time series. Novel data-adaptive techniques, such as singular spectrum analysis (SSA), allow for the decomposition of experimental time series, in a model-free way, into a trend, quasiperiodic components and noise fluctuations. We compared SSA with the traditional techniques of cosinor analysis and intradaily variability using 1-week continuous actigraphy data in young adults with acute insomnia and healthy age-matched controls. The findings suggest a small but significant delay in circadian components in the subjects with acute insomnia, i.e. a larger acrophase, and alterations in the day-to-day variability of acrophase and amplitude. The power of the ultradian components follows a fractal 1/f power law for controls, whereas for those with acute insomnia this power law breaks down because of an increased variability at the 90 min time scale, reminiscent of Kleitman’s basic rest-activity (BRAC) cycles. This suggests that for healthy sleepers attention and activity can be sustained at whatever time scale required by circumstances, whereas for those with acute insomnia this capacity may be impaired and these individuals need to rest or switch activities in order to stay focused. Traditional methods of circadian rhythm analysis are unable to detect the more subtle effects of day-to-day variability and ultradian rhythm fragmentation at the specific 90 min time scale. PMID:28753669
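
    A minimal sketch of the traditional cosinor step mentioned above: a least-squares fit of a 24 h cosine to an activity series, returning the MESOR, amplitude, and acrophase whose day-to-day variability the paper examines. The SSA decomposition itself is not reproduced; the data below are synthetic.

      # Cosinor fit of a 24 h rhythm to a synthetic activity series (sketch).
      import numpy as np

      rng = np.random.default_rng(9)
      t = np.arange(0, 7 * 24, 0.25)                 # one week, hours
      activity = (10 + 4 * np.cos(2 * np.pi * (t - 14) / 24)
                  + rng.normal(0, 1, t.size))        # peak at 14:00

      # Linear model: y = M + A*cos(w t) + B*sin(w t), w = 2*pi/24
      w = 2 * np.pi / 24
      X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
      M, A, B = np.linalg.lstsq(X, activity, rcond=None)[0]
      amplitude = np.hypot(A, B)
      acrophase_h = (np.arctan2(B, A) / w) % 24      # peak time, hours
      print(f"MESOR {M:.1f}, amplitude {amplitude:.1f}, "
            f"acrophase {acrophase_h:.1f} h")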

  15. Microfluidics for the analysis of membrane proteins: how do we get there?

    PubMed

    Battle, Katrina N; Uba, Franklin I; Soper, Steven A

    2014-08-01

    The development of fully automated and high-throughput systems for proteomics is now in demand because of the need to generate new protein-based disease biomarkers. Unfortunately, it is difficult to identify protein biomarkers that are of low abundance in the presence of highly abundant proteins, especially in complex biological samples such as serum, cell lysates, and other biological fluids. Membrane proteins, which are in many cases of low abundance compared to cytosolic proteins, have various functions, can provide insight into the state of a disease, and can serve as targets for new drugs, making them attractive biomarker candidates. Traditionally, proteins are identified through the use of gel electrophoretic techniques, which are not always suitable for particular protein samples such as membrane proteins. Microfluidics offers the potential to serve as a fully automated platform for the efficient and high-throughput analysis of complex samples, such as membrane proteins, and to do so with performance metrics that exceed those of its bench-top counterparts. In recent years, there have been various improvements to microfluidics and its use for proteomic analysis, as reported in the literature. Consequently, this review presents an overview of the traditional proteomic-processing pipelines for membrane proteins and insights into new technological developments, with a focus on the applicability of microfluidics to the analysis of membrane proteins. Sample preparation techniques will be discussed in detail, and novel interfacing strategies as they relate to MS will be highlighted. Lastly, some general conclusions and future perspectives are presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Non-Destructive Spectroscopic Techniques and Multivariate Analysis for Assessment of Fat Quality in Pork and Pork Products: A Review

    PubMed Central

    Kucha, Christopher T.; Liu, Li; Ngadi, Michael O.

    2018-01-01

    Fat is one of the most important traits determining the quality of pork. The composition of the fat greatly influences the quality of pork and its processed products and contributes to defining the overall carcass value. However, establishing an efficient method for assessing fat quality parameters such as fatty acid composition, solid fat content, oxidative stability, iodine value, and fat color remains a challenge that must be addressed. Conventional methods such as visual inspection, mechanical methods, and chemical methods are used off the production line, which often results in an inaccurate representation of the process because the dynamics are lost due to the time required to perform the analysis. Consequently, rapid and non-destructive alternative methods are needed. In this paper, the traditional fat quality assessment techniques are discussed, with emphasis on spectroscopic techniques as an alternative. Potential spectroscopic techniques include infrared spectroscopy, nuclear magnetic resonance, and Raman spectroscopy. Hyperspectral imaging, an emerging advanced spectroscopy-based technology, is introduced and discussed in light of recent developments in the assessment of fat quality attributes. All techniques are described in terms of their operating principles and the research advances involving their application to pork fat quality parameters. Future trends for the non-destructive spectroscopic techniques are also discussed. PMID:29382092

  17. A review of recent developments in parametric based acoustic emission techniques applied to concrete structures

    NASA Astrophysics Data System (ADS)

    Vidya Sagar, R.; Raghu Prasad, B. K.

    2012-03-01

    This article presents a review of recent developments in parametric based acoustic emission (AE) techniques applied to concrete structures. It recapitulates the significant milestones achieved by previous researchers, including the various methods and models developed in AE testing of concrete structures. The aim is to provide an overview of the specific features of parametric based AE techniques for concrete structures carried out over the years. Emphasis is given to traditional parameter-based AE techniques applied to concrete structures. A significant amount of research on AE techniques applied to concrete structures has already been published, and considerable attention has been given to those publications. Some recent studies, such as AE energy analysis and b-value analysis used to assess damage in concrete bridge beams, have also been discussed. The formation of the fracture process zone and the AE energy released during the fracture process in concrete beam specimens have been summarised. A large body of experimental data on AE characteristics of concrete has accumulated over the last three decades. This review of parametric based AE techniques applied to concrete structures may help researchers and engineers to better understand the failure mechanism of concrete and to evolve more useful methods and approaches for diagnostic inspection of structural elements and failure prediction/prevention of concrete structures.

  18. Bioinformatics/biostatistics: microarray analysis.

    PubMed

    Eichler, Gabriel S

    2012-01-01

    The quantity and complexity of the molecular-level data generated in both research and clinical settings require the use of sophisticated, powerful computational interpretation techniques. It is for this reason that bioinformatic analysis of complex molecular profiling data has become a fundamental technology in the development of personalized medicine. This chapter provides a high-level overview of the field of bioinformatics and outlines several classic bioinformatic approaches. The highlighted approaches can be aptly applied to nearly any sort of high-dimensional genomic, proteomic, or metabolomic experiment. Reviewed technologies in this chapter include traditional clustering analysis, the Gene Expression Dynamics Inspector (GEDI), GoMiner, Gene Set Enrichment Analysis (GSEA), and the Learner of Functional Enrichment (LeFE).

  19. Enhanced orbit determination filter: Inclusion of ground system errors as filter parameters

    NASA Technical Reports Server (NTRS)

    Masters, W. C.; Scheeres, D. J.; Thurman, S. W.

    1994-01-01

    The theoretical aspects of an orbit determination filter that incorporates ground-system error sources as model parameters for use in interplanetary navigation are presented in this article. This filter, which is derived from sequential filtering theory, allows a systematic treatment of errors in calibrations of transmission media, station locations, and earth orientation models associated with ground-based radio metric data, in addition to the modeling of the spacecraft dynamics. The discussion includes a mathematical description of the filter and an analytical comparison of its characteristics with more traditional filtering techniques used in this application. The analysis in this article shows that this filter has the potential to generate navigation products of substantially greater accuracy than more traditional filtering procedures.
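
    The filter described is a sequential (Kalman-type) estimator whose state vector is augmented with ground-system error parameters. The toy sketch below illustrates only that augmentation idea on a synthetic one-dimensional tracking problem with a constant station bias; it is not the article's filter, and all numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n_obs = 1.0, 50
F = np.array([[1.0, dt, 0.0],      # state: [position, velocity, station bias]
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 1.0]])    # measurement = position + bias + noise
Q = np.diag([1e-4, 1e-4, 0.0])     # bias modeled as a constant parameter
R = np.array([[0.25]])

x_true = np.array([0.0, 1.0, 3.0])            # true bias = 3.0
x = np.array([0.0, 1.0, 0.0])                 # good prior on pos/vel, none on bias
P = np.diag([0.01, 0.01, 25.0])               # ...so the bias is observable here
for _ in range(n_obs):
    x_true = F @ x_true
    z = H @ x_true + rng.normal(0.0, 0.5, 1)
    x, P = F @ x, F @ P @ F.T + Q             # predict
    S = H @ P @ H.T + R                       # update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(3) - K @ H) @ P
print(f"estimated bias = {x[2]:.2f} (true 3.0)")
```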

  20. Sex preference and third birth intervals in a traditional Indian society.

    PubMed

    Nath, D C; Land, K C

    1994-07-01

    The traditional preference for sons may be the main hindrance to India's current population policy of two children per family. In this study, the effects of various sociodemographic covariates, particularly sex preference, on the length of the third birth interval are examined for the scheduled caste population in Assam, India. Life table and hazards regression techniques are applied to retrospective sample data. The analysis shows that couples having two surviving sons are less likely to have a third child than those without a surviving son and those with only one surviving son. Age at first marriage, length of preceding birth intervals, age of mother, and household income have strong effects on the length of the third birth interval.
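
    Hazards-regression analyses of birth-interval data like those described are typically fitted with a proportional-hazards model. The sketch below shows such a setup with the lifelines library; the column names and the synthetic data are hypothetical stand-ins, not the Assam survey data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "interval_months": rng.exponential(30, n).round(1) + 9,  # time since 2nd birth
    "had_third_birth": rng.integers(0, 2, n),                # 1 = event, 0 = censored
    "two_sons": rng.integers(0, 2, n),                       # sex-preference covariate
    "mother_age": rng.normal(26, 4, n).round(1),
    "age_at_marriage": rng.normal(18, 2, n).round(1),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="interval_months", event_col="had_third_birth")
# A hazard ratio below 1 for `two_sons` would correspond to the reported finding
# that couples with two surviving sons are slower to have a third child.
cph.print_summary()
```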

  1. Morphometric Identification of Queens, Workers and Intermediates in In Vitro Reared Honey Bees (Apis mellifera).

    PubMed

    De Souza, Daiana A; Wang, Ying; Kaftanoglu, Osman; De Jong, David; Amdam, Gro V; Gonçalves, Lionel S; Francoy, Tiago M

    2015-01-01

    In vitro rearing is an important and useful tool for honey bee (Apis mellifera L.) studies. However, it often results in intercastes between queens and workers, which normally are not seen in hive-reared bees, except when larvae older than three days are grafted for queen rearing. Morphological classification (queen versus worker or intercastes) of bees produced by this method can be subjective and generally depends on size differences. Here, we propose an alternative method for caste classification of female honey bees reared in vitro, based on weight at emergence, ovariole number, spermatheca size, and the size and shape of features of the head, mandible and basitarsus. Morphological measurements were made with both traditional morphometric and geometric morphometric techniques. The classifications were performed by principal component analysis, using naturally developed queens and workers as controls. First, the analysis included all the characters. Subsequently, a new analysis was made without the information about ovariole number and spermatheca size. Geometric morphometrics was less dependent on ovariole number and spermatheca information for caste and intercaste identification. This is useful, since acquiring information concerning these reproductive structures requires time-consuming dissection and they are not accessible when abdomens have been removed for molecular assays or in dried specimens. Additionally, geometric morphometrics divided intercastes into more discrete phenotype subsets. We conclude that geometric morphometrics is superior to traditional morphometric techniques for identification and classification of honey bee castes and intermediates.
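
    A minimal sketch of the principal component analysis step described above, using scikit-learn; the three measurements and their cluster means are invented placeholders for the morphometric characters.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Columns: hypothetical emergence weight (mg), head width (mm), basitarsus length (mm)
queens      = rng.normal([200.0, 3.8, 2.0], 0.1, (20, 3))
workers     = rng.normal([110.0, 3.2, 2.6], 0.1, (20, 3))
intercastes = rng.normal([150.0, 3.5, 2.3], 0.2, (10, 3))

controls = np.vstack([queens, workers])            # naturally developed controls
scaler = StandardScaler().fit(controls)
pca = PCA(n_components=2).fit(scaler.transform(controls))

scores_controls = pca.transform(scaler.transform(controls))
scores_unknown = pca.transform(scaler.transform(intercastes))
# In vitro bees scoring between the two control clusters on PC1 are candidate intercastes
print(scores_controls.mean(axis=0).round(2), scores_unknown.mean(axis=0).round(2))
```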

  2. Detection of Low Molecular Weight Adulterants in Beverages by Direct Analysis in Real Time Mass Spectrometry.

    PubMed

    Sisco, Edward; Dake, Jeffrey

    2016-04-14

    Direct Analysis in Real Time Mass Spectrometry (DART-MS) has been used to detect the presence of non-narcotic adulterants in beverages. The non-narcotic adulterants examined in this work comprised a number of low molecular weight alcohols, acetone, ammonium hydroxide, and sodium hypochlorite. Analysis of the adulterants was completed by pipetting 1 µL deposits onto glass microcapillaries along with an appropriate dopant species, followed by introduction into the DART gas stream. It was found that detection of these compounds in the complex matrices of common beverages (soda, energy drinks, etc.) was simplified through the use of a dopant species to allow for adduct formation with the desired compound(s) of interest. Other parameters that were investigated included DART gas stream temperature, in-source collision induced dissociation, ion polarity, and DART needle voltage. Sensitivities of the technique were found to range from 0.001% volume fraction to 0.1% volume fraction, comparable to traditional analyses completed using headspace gas chromatography mass spectrometry (HS-GC/MS). Once a method was established using aqueous solutions, fifteen beverages were spiked with each of the nine adulterants to simulate real-world detection, and in nearly all cases the adulterant could be detected either in pure form or complexed with the added dopant species. This technique provides a rapid way to directly analyze beverages believed to be contaminated with non-narcotic adulterants at sensitivities similar to or exceeding those of traditional confirmatory analyses.

  3. [Analysis of triterpenoids in Ganoderma lucidum by microwave-assisted continuous extraction].

    PubMed

    Lu, Yan-fang; An, Jing; Jiang, Ye

    2015-04-01

    To further improve the extraction efficiency of microwave extraction, a microwave-assisted continuous extraction (MACE) device was designed and utilized. By contrast with traditional methods, the characteristics and extraction efficiency of MACE were also studied. The method was validated by the analysis of the triterpenoids in Ganoderma lucidum. The extraction conditions of MACE were: 95% ethanol as solvent, microwave power 200 W and radiation time 14.5 min (5 cycles). The extraction results were subsequently compared with traditional heat reflux extraction (HRE), Soxhlet extraction (SE), ultrasonic extraction (UE) and conventional microwave extraction (ME). For the triterpenoids, the two microwave-based methods (ME and MACE) were in general capable of finishing the extraction in 10 and 14.5 min, respectively, while the other methods required 60 min or even more than 100 min. Additionally, ME produced extraction results comparable to the classical HRE and a higher extraction yield than both SE and UE, but a notably lower extraction yield than MACE. More importantly, the purity of the crude extract obtained by MACE was far better than with the other methods. MACE effectively combines the advantages of microwave extraction and Soxhlet extraction, thus enabling a more complete extraction of the analytes of TCMs in comparison with ME, and therefore makes the analytical result more accurate. It provides a novel, highly efficient, rapid and reliable pretreatment technique for the analysis of TCMs, and it could potentially be extended to ingredient preparation or extraction techniques for TCMs.

  4. Morphometric Identification of Queens, Workers and Intermediates in In Vitro Reared Honey Bees (Apis mellifera)

    PubMed Central

    A. De Souza, Daiana; Wang, Ying; Kaftanoglu, Osman; De Jong, David; V. Amdam, Gro; S. Gonçalves, Lionel; M. Francoy, Tiago

    2015-01-01

    In vitro rearing is an important and useful tool for honey bee (Apis mellifera L.) studies. However, it often results in intercastes between queens and workers, which normally are not seen in hive-reared bees, except when larvae older than three days are grafted for queen rearing. Morphological classification (queen versus worker or intercastes) of bees produced by this method can be subjective and generally depends on size differences. Here, we propose an alternative method for caste classification of female honey bees reared in vitro, based on weight at emergence, ovariole number, spermatheca size, and the size and shape of features of the head, mandible and basitarsus. Morphological measurements were made with both traditional morphometric and geometric morphometric techniques. The classifications were performed by principal component analysis, using naturally developed queens and workers as controls. First, the analysis included all the characters. Subsequently, a new analysis was made without the information about ovariole number and spermatheca size. Geometric morphometrics was less dependent on ovariole number and spermatheca information for caste and intercaste identification. This is useful, since acquiring information concerning these reproductive structures requires time-consuming dissection and they are not accessible when abdomens have been removed for molecular assays or in dried specimens. Additionally, geometric morphometrics divided intercastes into more discrete phenotype subsets. We conclude that geometric morphometrics is superior to traditional morphometric techniques for identification and classification of honey bee castes and intermediates. PMID:25894528

  5. Contact thermal shock test of ceramics

    NASA Technical Reports Server (NTRS)

    Rogers, W. P.; Emery, A. F.

    1992-01-01

    A novel quantitative thermal shock test of ceramics is described. The technique employs contact between a metal cooling rod and a hot disk-shaped specimen. In contrast with traditional techniques, the well-defined thermal boundary condition allows for accurate analyses of heat transfer, stress, and fracture. Uniform equibiaxial tensile stresses are induced in the center of the test specimen. Transient specimen temperature and acoustic emission are monitored continuously during the thermal stress cycle. The technique is demonstrated with soda-lime glass specimens. Experimental results are compared with theoretical predictions based on a finite-element method thermal stress analysis combined with a statistical model of fracture. Material strength parameters are determined using concentric ring flexure tests. Good agreement is found between experimental results and theoretical predictions of failure probability as a function of time and initial specimen temperature.

  6. Analysis of multiple instructional techniques on the understanding and retention of select mechanical topics

    NASA Astrophysics Data System (ADS)

    Fetsco, Sara Elizabeth

    There are several topics that introductory physics students typically have difficulty understanding. The purpose of this thesis is to investigate if multiple instructional techniques will help students to better understand and retain the material. The three units analyzed in this study are graphing motion, projectile motion, and conservation of momentum. For each unit students were taught using new or altered instructional methods including online laboratory simulations, inquiry labs, and interactive demonstrations. Additionally, traditional instructional methods such as lecture and problem sets were retained. Effectiveness was measured through pre- and post-tests and student opinion surveys. Results suggest that incorporating multiple instructional techniques into teaching will improve student understanding and retention. Students stated that they learned well from all of the instructional methods used except the online simulations.

  7. Free Flow Zonal Electrophoresis for Fractionation of Plant Membrane Compartments Prior to Proteomic Analysis.

    PubMed

    Barkla, Bronwyn J

    2018-01-01

    Free flow zonal electrophoresis (FFZE) is a versatile, reproducible, and potentially high-throughput technique for the separation of plant organelles and membranes by differences in membrane surface charge. It offers considerable benefits over traditional fractionation techniques, such as density gradient centrifugation and two-phase partitioning, as it is relatively fast, sample recovery is high, and the method provides unparalleled sample purity. It has been used to successfully purify chloroplasts and mitochondria from plants, but also to obtain highly pure fractions of plasma membrane, tonoplast, ER, Golgi, and thylakoid membranes. Application of the technique can significantly improve protein coverage in large-scale proteomics studies by decreasing sample complexity. Here, we describe the method for the fractionation of plant cellular membranes from leaves by FFZE.

  8. Improvement of Soybean Products Through the Response Mechanism Analysis Using Proteomic Technique.

    PubMed

    Wang, Xin; Komatsu, Setsuko

    Soybean is rich in protein and vegetable oil and contains several phytochemicals such as isoflavones and phenolic compounds. Because of this outstanding nutritional value, soybean is considered a traditional health-promoting food. Soybean is a widely cultivated crop; however, its growth and yield are markedly affected by adverse environmental conditions. Proteomic techniques make it feasible to map protein profiles both during soybean growth and under unfavorable conditions. The stress-responsive mechanisms during soybean growth have been uncovered with the help of proteomic studies. In this review, the history of soybean as food and the morphology/physiology of soybean are described. The utilization of proteomics during soybean germination and development is summarized. In addition, the stress-responsive mechanisms explored using proteomic techniques in soybean are reviewed. © 2017 Elsevier Inc. All rights reserved.

  9. Approach for gait analysis in persons with limb loss including residuum and prosthesis socket dynamics.

    PubMed

    LaPrè, A K; Price, M A; Wedge, R D; Umberger, B R; Sup, Frank C

    2018-04-01

    Musculoskeletal modeling and marker-based motion capture techniques are commonly used to quantify the motions of body segments, and the forces acting on them during human gait. However, when these techniques are applied to analyze the gait of people with lower limb loss, the clinically relevant interaction between the residual limb and prosthesis socket is typically overlooked. It is known that there is considerable motion and loading at the residuum-socket interface, yet traditional gait analysis techniques do not account for these factors due to the inability to place tracking markers on the residual limb inside of the socket. In the present work, we used a global optimization technique and anatomical constraints to estimate the motion and loading at the residuum-socket interface as part of standard gait analysis procedures. We systematically evaluated a range of parameters related to the residuum-socket interface, such as the number of degrees of freedom, and determined the configuration that yields the best compromise between faithfully tracking experimental marker positions while yielding anatomically realistic residuum-socket kinematics and loads that agree with data from the literature. Application of the present model to gait analysis for people with lower limb loss will deepen our understanding of the biomechanics of walking with a prosthesis, which should facilitate the development of enhanced rehabilitation protocols and improved assistive devices. Copyright © 2017 John Wiley & Sons, Ltd.

  10. Recent patents of nanopore DNA sequencing technology: progress and challenges.

    PubMed

    Zhou, Jianfeng; Xu, Bingqian

    2010-11-01

    DNA sequencing techniques have seen rapid development in recent decades, driven primarily by the Human Genome Project. Among the proposed new techniques, the nanopore was considered a suitable candidate for single-molecule DNA sequencing with ultrahigh speed and very low cost. Several fabrication and modification techniques have been developed to produce robust and well-defined nanopore devices. Many efforts have also been made to apply nanopores to the analysis of the properties of DNA molecules. Compared with traditional sequencing techniques, the nanopore has demonstrated distinctive advantages on the main practical issues, such as sample preparation, sequencing speed, cost-effectiveness, and read length. Although challenges remain, recent research on improving the capabilities of nanopores has shed light on the path to the ultimate goal: sequencing an individual DNA strand at the single-nucleotide level. This patent review briefly highlights recent developments and technological achievements in DNA analysis and sequencing at the single-molecule level, focusing on nanopore-based methods.

  11. Efficient computational nonlinear dynamic analysis using modal modification response technique

    NASA Astrophysics Data System (ADS)

    Marinone, Timothy; Avitabile, Peter; Foley, Jason; Wolfson, Janet

    2012-08-01

    Structural systems generally contain nonlinear characteristics. These nonlinear systems require significant computational resources for solution of the equations of motion. Much of the model, however, is linear, with the nonlinearity resulting from discrete local elements connecting different components together. Using a component mode synthesis approach, a nonlinear model can be developed by interconnecting these linear components with highly nonlinear connection elements. The approach presented in this paper, the Modal Modification Response Technique (MMRT), is a very efficient technique created to address this specific class of nonlinear problem. By utilizing a Structural Dynamics Modification (SDM) approach in conjunction with mode superposition, a significantly smaller set of matrices is required for use in the direct integration of the equations of motion. The approach is compared to traditional analytical approaches to make evident the usefulness of the technique for a variety of test cases.
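
    As a conceptual sketch (not the authors' MMRT implementation), the following integrates a small set of modal equations in which the only nonlinearity is a discrete connection element, here an illustrative cubic spring, projected through the mode shapes of the linear component.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Linear component: undamped 2-DOF mass-spring system (unit masses)
M = np.eye(2)
K = np.array([[200.0, -100.0],
              [-100.0, 100.0]])
wn2, Phi = np.linalg.eigh(K)          # squared natural frequencies, mass-normalized modes

k3, dof = 5.0e4, 1                    # illustrative cubic connection spring at DOF 1

def rhs(t, y):
    q, qd = y[:2], y[2:]
    x = Phi @ q                       # modal -> physical coordinates
    f_nl = np.zeros(2)
    f_nl[dof] = -k3 * x[dof] ** 3     # local nonlinear connection force
    qdd = -wn2 * q + Phi.T @ f_nl     # modal equations plus projected force
    return np.concatenate([qd, qdd])

sol = solve_ivp(rhs, (0.0, 2.0), [0.0, 0.0, 0.0, 1.0], max_step=1e-3)
print(sol.y[:2, -1])                  # modal displacements at t = 2 s
```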

  12. Reirradiation of head and neck cancer using modern highly conformal techniques.

    PubMed

    Ho, Jennifer C; Phan, Jack

    2018-04-23

    Locoregional disease recurrence or development of a second primary cancer after definitive radiotherapy for head and neck cancers remains a treatment challenge. Reirradiation utilizing traditional techniques has been limited by concern for serious toxicity. With the advent of newer, more precise radiotherapy techniques, such as intensity-modulated radiotherapy (IMRT), proton radiotherapy, and stereotactic body radiotherapy (SBRT), there has been renewed interest in curative-intent head and neck reirradiation. However, as most studies were retrospective, single-institutional experiences, the optimal modality is not clear. We provide a comprehensive review of the outcomes of relevant studies using these 3 head and neck reirradiation techniques, followed by an analysis and comparison of the toxicity, tumor control, concurrent systemic therapy, and prognostic factors. Overall, there is evidence that IMRT, proton therapy, and SBRT reirradiation are feasible treatment options that offer a chance for durable local control and survival. Prospective studies, particularly randomized trials, are needed. © 2018 Wiley Periodicals, Inc.

  13. Component Pin Recognition Using Algorithms Based on Machine Learning

    NASA Astrophysics Data System (ADS)

    Xiao, Yang; Hu, Hong; Liu, Ze; Xu, Jiangchang

    2018-04-01

    The purpose of machine vision for a plug-in machine is to improve the machine’s stability and accuracy, and recognition of the component pin is an important part of the vision. This paper focuses on component pin recognition using three different techniques. The first technique involves traditional image processing, with binary large object (BLOB) analysis as the core algorithm. The second technique uses histogram of oriented gradients (HOG) features to experimentally compare the support vector machine (SVM) and adaptive boosting (AdaBoost) meta-algorithm classifiers. The third technique uses a deep learning method, the convolutional neural network (CNN), which identifies the pin by comparing a sample to its training data. The main purpose of the research presented in this paper is to increase the knowledge of learning methods used in the plug-in machine industry in order to achieve better results.
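
    A minimal sketch of the second technique's HOG-plus-SVM pipeline, assuming scikit-image and scikit-learn; the image patches and labels below are random placeholders rather than pin images.

```python
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
# Placeholder 64x64 grayscale patches: label 1 = pin present, 0 = background
images = rng.random((40, 64, 64))
labels = rng.integers(0, 2, 40)

features = np.array([
    hog(img, orientations=9, pixels_per_cell=(8, 8), cells_per_block=(2, 2))
    for img in images
])
clf = LinearSVC(C=1.0).fit(features, labels)
print("training accuracy:", clf.score(features, labels))
```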

  14. A novel pretreatment method combining sealing technique with direct injection technique applied for improving biosafety.

    PubMed

    Wang, Xinyu; Gao, Jing-Lin; Du, Chaohui; An, Jing; Li, MengJiao; Ma, Haiyan; Zhang, Lina; Jiang, Ye

    2017-01-01

    Biosafety risks in clinical bioanalysis are attracting growing attention, and a safe, simple, effective sample preparation method is urgently needed. To improve the biosafety of clinical analysis, we used the antiviral drugs adefovir and tenofovir as model drugs and developed a safe pretreatment method combining a sealing technique with a direct injection technique. The inter- and intraday precision (RSD %) of the method were <4%, and the extraction recoveries ranged from 99.4 to 100.7%. Meanwhile, the results showed that a standard solution could be used to prepare the calibration curve instead of spiked plasma, yielding more accurate results. Compared with traditional methods, the novel method not only improved the biosafety of the pretreatment significantly, but also achieved several advantages including higher precision, favorable sensitivity and satisfactory recovery. With these highly practical and desirable characteristics, the novel method may become a feasible platform in bioanalysis.

  15. Techniques for measurement of thoracoabdominal asynchrony

    NASA Technical Reports Server (NTRS)

    Prisk, G. Kim; Hammer, J.; Newth, Christopher J L.

    2002-01-01

    Respiratory motion measured by respiratory inductance plethysmography often deviates from the sinusoidal pattern assumed in the traditional Lissajous figure (loop) analysis used to determine thoraco-abdominal asynchrony, or phase angle phi. We investigated six different time-domain methods of measuring phi, using simulated data with sinusoidal and triangular waveforms, phase shifts of 0-135 degrees, and 10% noise. The techniques were then used on data from 11 lightly anesthetized rhesus monkeys (Macaca mulatta; 7.6 +/- 0.8 kg; 5.7 +/- 0.5 years old), instrumented with a respiratory inductive plethysmograph and subjected to increasing levels of inspiratory resistive loading ranging from 5 to 1,000 cmH2O·L(-1)·sec(-1). The best results were obtained from cross-correlation and maximum linear correlation, with errors less than approximately 5 degrees from the actual phase angle in the simulated data. The worst performance was produced by the loop analysis, which in some cases was in error by more than 30 degrees. Compared to correlation, the other analysis techniques performed at an intermediate level. Maximum linear correlation and cross-correlation produced similar results on the data collected from monkeys (SD of the difference, 4.1 degrees), but all other techniques had a high SD of the difference compared to the correlation techniques. We conclude that phase angles are best measured using cross-correlation or maximum linear correlation, techniques that are independent of waveform shape and robust in the presence of noise. Copyright 2002 Wiley-Liss, Inc.
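
    The cross-correlation estimate the study favors can be sketched in a few lines: find the lag that maximizes the correlation between the two compartment signals and convert it to degrees of the breath period. The signals below are synthetic sinusoids with a known 40 degree shift, not plethysmography data.

```python
import numpy as np

fs, f_breath, true_phase = 50.0, 0.5, 40.0     # Hz, Hz, degrees
t = np.arange(0.0, 30.0, 1.0 / fs)
ribcage = np.sin(2 * np.pi * f_breath * t)
abdomen = np.sin(2 * np.pi * f_breath * t - np.deg2rad(true_phase))

rc = ribcage - ribcage.mean()
ab = abdomen - abdomen.mean()
xcorr = np.correlate(ab, rc, mode="full")
lag = np.argmax(xcorr) - (len(rc) - 1)         # samples by which abdomen lags ribcage
phi = lag / fs * f_breath * 360.0              # lag as a fraction of the breath period
print(f"estimated phase angle: {phi:.1f} deg (true {true_phase})")
```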

  16. Kinematic Labs with Mobile Devices

    NASA Astrophysics Data System (ADS)

    Kinser, Jason M.

    2015-07-01

    This book provides 13 labs spanning the common topics in the first semester of university-level physics. Each lab is designed to use only the student's smartphone, laptop and items easily found in big-box stores or a hobby shop. Each lab contains theory, set-up instructions and basic analysis techniques. All of these labs can be performed outside of the traditional university lab setting, with initial costs averaging less than $8 per student, per lab.

  17. Space station advanced automation

    NASA Technical Reports Server (NTRS)

    Woods, Donald

    1990-01-01

    In the development of a safe, productive and maintainable space station, Automation and Robotics (A and R) has been identified as an enabling technology which will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The usage of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. To use AA technology for the augmentation of system management functions requires a development model which consists of well defined phases of: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V and V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well designed and documented KBS software.

  18. Pay-off-biased social learning underlies the diffusion of novel extractive foraging traditions in a wild primate

    PubMed Central

    2017-01-01

    The type and variety of learning strategies used by individuals to acquire behaviours in the wild are poorly understood, despite the presence of behavioural traditions in diverse taxa. Social learning strategies such as conformity can be broadly adaptive, but may also retard the spread of adaptive innovations. Strategies like pay-off-biased learning, by contrast, are effective at diffusing new behaviour but may perform poorly when adaptive behaviour is common. We present a field experiment in a wild primate, Cebus capucinus, that introduced a novel food item and documented the innovation and diffusion of successful extraction techniques. We develop a multilevel, Bayesian statistical analysis that allows us to quantify individual-level evidence for different social and individual learning strategies. We find that pay-off-biased and age-biased social learning are primarily responsible for the diffusion of new techniques. We find no evidence of conformity; instead rare techniques receive slightly increased attention. We also find substantial and important variation in individual learning strategies that is patterned by age, with younger individuals being more influenced by both social information and their own individual experience. The aggregate cultural dynamics in turn depend upon the variation in learning strategies and the age structure of the wild population. PMID:28592681

  19. A hierarchical network-based algorithm for multi-scale watershed delineation

    NASA Astrophysics Data System (ADS)

    Castronova, Anthony M.; Goodall, Jonathan L.

    2014-11-01

    Watershed delineation is a process for defining a land area that contributes surface water flow to a single outlet point. It is commonly used in water resources analysis to define the domain in which hydrologic process calculations are applied. There has been a growing effort over the past decade to improve surface elevation measurements in the U.S., which has had a significant impact on the accuracy of hydrologic calculations. Traditional watershed processing on these elevation rasters, however, becomes more burdensome as data resolution increases. As a result, processing of these datasets can be troublesome on standard desktop computers. This challenge has resulted in numerous works that aim to provide high performance computing solutions to large data, high resolution data, or both. This work proposes an efficient watershed delineation algorithm for use in desktop computing environments that leverages existing data, the U.S. Geological Survey (USGS) National Hydrography Dataset Plus (NHD+), and open source software tools to construct watershed boundaries. This approach makes use of U.S. national-level hydrography data that has been precomputed using raster processing algorithms coupled with quality control routines. Our approach uses carefully arranged data and mathematical graph theory to traverse river networks and identify catchment boundaries. We demonstrate this new watershed delineation technique, compare its accuracy with traditional algorithms that derive watersheds solely from digital elevation models, and then extend our approach to address subwatershed delineation. Our findings suggest that the open-source hierarchical network-based delineation procedure presented in this work is a promising approach to watershed delineation that can be used to summarize publicly available datasets for hydrologic model input pre-processing. Through our analysis, we explore the benefits of reusing the NHD+ datasets for watershed delineation, and find that our technique offers greater flexibility and extendability than traditional raster algorithms.
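
    A conceptual sketch of the graph-traversal step on NHD+-style flow relations, under the assumption that each flowline carries a precomputed catchment polygon: walking the flow graph upstream from an outlet collects the catchments whose union forms the watershed. The COMID values are invented.

```python
import networkx as nx

# Hypothetical NHD+-style flow table: edge (a, b) means reach a drains into reach b
flow_pairs = [(101, 102), (102, 104), (103, 104), (104, 105)]
G = nx.DiGraph(flow_pairs)

def upstream_catchments(outlet_comid):
    """COMIDs whose catchment polygons union into the watershed of the outlet."""
    return {outlet_comid} | nx.ancestors(G, outlet_comid)

print(sorted(upstream_catchments(104)))   # -> [101, 102, 103, 104]
# The watershed boundary is then the dissolved union of these catchment polygons.
```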

  20. Review on Microstructure Analysis of Metals and Alloys Using Image Analysis Techniques

    NASA Astrophysics Data System (ADS)

    Rekha, Suganthini; Bupesh Raja, V. K.

    2017-05-01

    Metals and alloys find vast application in engineering and domestic sectors. The mechanical properties of metals and alloys are influenced by their microstructure; hence microstructural investigation is critical. Traditionally, the microstructure is studied using an optical microscope after suitable metallurgical preparation. Over the past few decades, computers have been applied to the capture and analysis of optical micrographs. The advent of digital image processing software and computer vision technologies has been a boon to the analysis of microstructure. This paper surveys the literature on the various developments in microstructural analysis. The conventional optical microscope is complemented by the scanning electron microscope (SEM) and other high-end equipment.

  1. Traditional living and cultural ways as protective factors against suicide: perceptions of Alaska Native university students.

    PubMed

    DeCou, Christopher R; Skewes, Monica C; López, Ellen D S

    2013-01-01

    Native peoples living in Alaska have one of the highest rates of suicide in the world. This represents a significant health disparity for indigenous populations living in Alaska. This research was part of a larger study that explored qualitatively the perceptions of Alaska Native university students from rural communities regarding suicide. This analysis explored the resilience that arose from participants' experiences of traditional ways, including subsistence activities. Previous research has indicated the importance of traditional ways in preventing suicide and strengthening communities. Semi-structured interviews were conducted with 25 university students who had migrated to Fairbanks, Alaska, from rural Alaskan communities. An interview protocol was developed in collaboration with cultural and community advisors. Interviews were audio-recorded and transcribed. Participants were asked specific questions concerning the strengthening of traditional practices towards the prevention of suicide. Transcripts were analysed using the techniques of grounded theory. Participants identified several resilience factors against suicide, including traditional practices and subsistence activities, meaningful community involvement and an active lifestyle. Traditional practices and subsistence activities were perceived to create the context for important relationships, promote healthy living to prevent suicide, contrast with current challenges and transmit important cultural values. Participants considered the strengthening of these traditional ways as important in suicide prevention efforts. However, subsistence and traditional practices were viewed as a diminishing aspect of daily living in rural Alaska. Many college students from rural Alaska have been affected by suicide but are strong enough to cope with such tragic events. Subsistence living and traditional practices were perceived as important social and cultural processes with meaningful lifelong benefits for participants. Future research should continue to explore the ways in which traditional practices can contribute towards suicide prevention, as well as the far-reaching benefits of subsistence living.

  2. ConceFT for Time-Varying Heart Rate Variability Analysis as a Measure of Noxious Stimulation During General Anesthesia.

    PubMed

    Lin, Yu-Ting; Wu, Hau-Tieng

    2017-01-01

    Heart rate variability (HRV) offers a noninvasive way to peek into the physiological status of the human body. When this physiological status is dynamic, traditional HRV indices calculated from the power spectrum do not resolve the dynamic situation due to the issue of nonstationarity. Clinical anesthesia is a typically dynamic situation that calls for time-varying HRV analysis. Concentration of frequency and time (ConceFT) is a nonlinear time-frequency (TF) analysis generalizing the multitaper technique and the synchrosqueezing transform. The result is a sharp TF representation capturing the dynamics inside HRV. Companion indices of the commonly applied HRV indices, including time-varying low-frequency power (tvLF), time-varying high-frequency power, and time-varying low-high ratio, are considered as measures of noxious stimulation. To evaluate the feasibility of the proposed indices, we apply these indices to study two different types of noxious stimulation, endotracheal intubation and surgical skin incision, under general anesthesia. The performance was compared with traditional HRV indices, the heart rate reading, and indices from electroencephalography. The results indicate that the tvLF index performs best, outperforming not only the traditional HRV index but also the commonly used heart rate reading. With the help of ConceFT, the proposed HRV indices have the potential to provide a better quantification of the dynamic change of the autonomic nervous system. Our proposed scheme of time-varying HRV analysis could contribute to the clinical assessment of analgesia under general anesthesia.
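
    ConceFT itself combines multitaper and synchrosqueezing ideas; as a rough stand-in, the sketch below computes a time-varying LF power from an evenly resampled synthetic RR series with a plain spectrogram, just to make the windowed band-power idea concrete.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 4.0                                  # Hz, a typical HRV resampling rate
t = np.arange(0.0, 600.0, 1.0 / fs)       # 10 min of synthetic RR data
rr = (0.8 + 0.03 * np.sin(2 * np.pi * 0.1 * t)
      + 0.01 * np.random.default_rng(0).normal(size=t.size))

f, tt, Sxx = spectrogram(rr - rr.mean(), fs=fs, nperseg=256, noverlap=192)
lf_band = (f >= 0.04) & (f < 0.15)        # standard LF band
tvLF = Sxx[lf_band].sum(axis=0)           # LF power as a function of time
print(tvLF[:5].round(6))
```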

  3. Rapid detection of Escherichia coli and enterococci in recreational water using an immunomagnetic separation/adenosine triphosphate technique

    USGS Publications Warehouse

    Bushon, R.N.; Brady, A.M.; Likirdopulos, C.A.; Cireddu, J.V.

    2009-01-01

    Aims: The aim of this study was to examine a rapid method for detecting Escherichia coli and enterococci in recreational water. Methods and Results: Water samples were assayed for E. coli and enterococci by traditional and immunomagnetic separation/adenosine triphosphate (IMS/ATP) methods. Three sample treatments were evaluated for the IMS/ATP method: double filtration, single filtration, and direct analysis. Pearson's correlation analysis showed strong, significant, linear relations between IMS/ATP and traditional methods for all sample treatments; strongest linear correlations were with the direct analysis (r = 0.62 and 0.77 for E. coli and enterococci, respectively). Additionally, simple linear regression was used to estimate bacteria concentrations as a function of IMS/ATP results. The correct classification of water-quality criteria was 67% for E. coli and 80% for enterococci. Conclusions: The IMS/ATP method is a viable alternative to traditional methods for faecal-indicator bacteria. Significance and Impact of the Study: The IMS/ATP method addresses critical public health needs for the rapid detection of faecal-indicator contamination and has potential for satisfying US legislative mandates requiring methods to detect bathing water contamination in 2 h or less. Moreover, IMS/ATP equipment is considerably less costly and more portable than that for molecular methods, making the method suitable for field applications. © 2009 The Authors.
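
    The calibration step described (estimating bacterial concentration as a function of IMS/ATP signal by simple linear regression) might look like the following on log-transformed data; all values are synthetic illustrations.

```python
import numpy as np
from scipy.stats import linregress

rlu = np.array([120.0, 340.0, 900.0, 2100.0, 5400.0, 14000.0])   # IMS/ATP signal
cfu = np.array([15.0, 60.0, 180.0, 420.0, 1300.0, 3600.0])       # culture counts

fit = linregress(np.log10(rlu), np.log10(cfu))
print(f"r = {fit.rvalue:.2f}")            # strength of the linear relation

def predict_cfu(signal):
    return 10.0 ** (fit.intercept + fit.slope * np.log10(signal))

print(f"estimated concentration at a signal of 1000: {predict_cfu(1000.0):.0f}")
```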

  4. How we process trephine biopsy specimens: epoxy resin embedded bone marrow biopsies

    PubMed Central

    Krenacs, T; Bagdi, E; Stelkovics, E; Bereczki, L; Krenacs, L

    2005-01-01

    Improved cytomorphology of semithin resin sections over paraffin wax embedded sections may be important in diagnostic haematopathology. However, resin embedding can make immunohistochemical antigen detection or DNA isolation for clonal gene rearrangement assays difficult. This review describes the processing of bone marrow biopsies using buffered formaldehyde-based fixation and epoxy resin embedding, with or without EDTA decalcification. Traditional semithin resin sections are completely rehydrated after etching in homemade sodium methoxide solution. Resin elimination allows high resolution staining of tissue components with common histological stains. Efficient antigen retrieval and the Envision-HRP system permit the immunohistological detection of many antigens of diagnostic relevance, with retention of high quality cytomorphology. Furthermore, DNA can be extracted for clonality analysis. The technique can be completed within a similar time period to that of paraffin wax processing with only ∼30% increase in cost. This technique has been used for diagnosis in over 4000 bone marrow biopsies over the past 14 years. By meeting traditional and contemporary demands on the haematopathologist, it offers a powerful alternative to paraffin wax processing for diagnosis and research. PMID:16126867

  5. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected for all possible circumstances, the fact that FLCs cannot be tested to achieve such requirements poses limitations on the applications for such technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of a FLC is proposed. Main research challenges include the specification of requirements for a complex system, the conversion of a traditional FLC to a piecewise polynomial representation, and the use of a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.
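
    As a toy illustration of the SMT-based idea (not the study's controller or tooling), the sketch below encodes a piecewise-linear stand-in controller in Z3 and asks whether any input in the operating region violates the negative-feedback property; an unsat answer constitutes a proof.

```python
from z3 import Real, Solver, Or, And, unsat

e, u = Real("e"), Real("u")                 # error input, controller output
s = Solver()
s.add(e > 0, e <= 10)                       # operating region under test
# Piecewise-linear stand-in controller: steeper gain for larger errors
s.add(Or(And(e <= 5, u == -2 * e),
         And(e > 5, u == -10 - 4 * (e - 5))))
s.add(u >= 0)                               # negate the property "u < 0"
if s.check() == unsat:
    print("negative-feedback property proven on (0, 10]")
else:
    print("counterexample:", s.model())
```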

  6. Rapid detection of illegal colorants on traditional Chinese pastries through mass spectrometry with an interchangeable thermal desorption electrospray ionization source.

    PubMed

    Chao, Yu-Ying; Chen, Yen-Ling; Chen, Wei-Chu; Chen, Bai-Hsiun; Huang, Yeou-Lih

    2018-06-30

    Ambient mass spectrometry using an interchangeable thermal desorption/electrospray ionization source (TD-ESI) is a relatively new technique that has had only a limited number of applications to date. Nevertheless, this direct-analysis technique has potential for wider use in analytical chemistry (e.g., in the rapid direct detection of contaminants, residues, and adulterants on and in food) when operated in dual-working mode (pretreatment-free qualitative screening and conventional quantitative confirmation) after switching to a TD-ESI source from a conventional ESI source. Herein, we describe the benefits and challenges associated with the use of a TD-ESI source to detect adulterants on traditional Chinese pastries (TCPs), as a proof-of-concept for the detection of illegal colorants. While TD-ESI can offer direct (i.e., without any sample preparation) qualitative screening analyses for TCPs with adequate sensitivity within 30 s, the use of TD-ESI for semi-quantification is applicable only for homogeneous matrices (e.g., tang yuan). Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Fiber laser welding of dual-phase galvanized sheet steel (DP590): traditional analysis and new quality assessment techniques

    NASA Astrophysics Data System (ADS)

    Miller, Stephanie; Pfeif, Erik; Kazakov, Andrei; Baumann, Esther; Dowell, Marla

    2016-03-01

    Laser welding has many advantages over traditional joining methods, yet remains underutilized. NIST has undertaken an ambitious initiative to improve predictions of weldability, reliability, and performance of laser welds. This study investigates butt welding of galvanized and ungalvanized dual-phase automotive sheet steels (DP 590) using a 10 kW commercial fiber laser system. Parameter development work, hardness profiles, microstructural characterization, and optical profilometry results are presented. Sound welding was accomplished in a laser power range of 2.0 kW to 4.5 kW and travel speed of 2000 mm/min to 5000 mm/min. Vickers hardness ranged from approximately 2 GPa to 4 GPa across the welds, with limited evidence of heat affected zone softening. Decreased hardness across the heat affected zone directly correlated to the appearance of ferrite. A technique was developed to non-destructively evaluate weld quality based on geometrical criteria. Weld face profilometry data were compared between light optical, metallographic sample, and frequency-modulated continuous-wave laser detection and ranging (FMCW LADAR) methods.

  8. Residential roof condition assessment system using deep learning

    NASA Astrophysics Data System (ADS)

    Wang, Fan; Kerekes, John P.; Xu, Zhuoyi; Wang, Yandong

    2018-01-01

    The emergence of high resolution (HR) and ultra high resolution (UHR) airborne remote sensing imagery is enabling humans to move beyond traditional land cover analysis applications to the detailed characterization of surface objects. A residential roof condition assessment method using techniques from deep learning is presented. The proposed method operates on individual roofs and divides the task into two stages: (1) roof segmentation, followed by (2) condition classification of the segmented roof regions. As the first step in this process, a self-tuning method is proposed to segment the images into small homogeneous areas. The segmentation is initialized with simple linear iterative clustering followed by deep learned feature extraction and region merging, with the optimal result selected by an unsupervised index, Q. After the segmentation, a pretrained residual network is fine-tuned on the augmented roof segments using a proposed k-pixel extension technique for classification. The effectiveness of the proposed algorithm was demonstrated on both HR and UHR imagery collected by EagleView over different study sites. The proposed algorithm has yielded promising results and has outperformed traditional machine learning methods using hand-crafted features.
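
    A simplified sketch of the segmentation initialization, assuming scikit-image: SLIC superpixels followed by a naive per-region mean-color feature of the kind a merge step would compare. The deep-feature merging and the unsupervised Q-index selection are beyond this illustration, and a stock test image stands in for roof imagery.

```python
import numpy as np
from skimage.data import astronaut            # stock image standing in for a roof
from skimage.segmentation import slic
from skimage.measure import regionprops

img = astronaut()
segments = slic(img, n_segments=200, compactness=10, start_label=1)
print("superpixels:", segments.max())

# Mean color per superpixel: the kind of per-region feature a merge step compares
means = {r.label: img[segments == r.label].mean(axis=0) for r in regionprops(segments)}
print("region 1 mean RGB:", np.round(means[1], 1))
```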

  9. Ultrasound Guidance for Botulinum Neurotoxin Chemodenervation Procedures.

    PubMed

    Alter, Katharine E; Karp, Barbara I

    2017-12-28

    Injections of botulinum neurotoxins (BoNTs) are prescribed by clinicians for a variety of disorders that cause over-activity of muscles, glands, pain, and other structures. Accurately targeting the structure for injection is one of the principal goals when performing BoNT procedures. Traditionally, injections have been guided by anatomic landmarks, palpation, range of motion, electromyography or electrical stimulation. Ultrasound (US) imaging-based guidance overcomes some of the limitations of traditional techniques. US, alone or combined with traditional guidance techniques, is utilized and/or recommended by many expert clinicians and authors, and in practice guidelines from professional academies. This article reviews the advantages and disadvantages of the available guidance techniques, including US, as well as technical aspects of US guidance, and provides a focused literature review related to US guidance for chemodenervation procedures, including BoNT injection.

  10. Yeast microbiota associated with spontaneous sourdough fermentations in the production of traditional wheat sourdough breads of the Abruzzo region (Italy).

    PubMed

    Valmorri, Sara; Tofalo, Rosanna; Settanni, Luca; Corsetti, Aldo; Suzzi, Giovanna

    2010-02-01

    The aims of this study were to describe the yeast community of 20 sourdoughs collected from central Italy and to characterize the sourdoughs based on chemical properties. A polyphasic approach consisting of traditional culture-based tests (spore-forming and physiological tests), molecular techniques (PCR-RFLP, RAPD-PCR, PCR-DGGE) and chemical analysis (total acidity, acids, and sugar contents) was utilized to describe the yeast population and to investigate the chemical composition of the doughs. PCR-RFLP analysis identified 85% of the isolates as Saccharomyces cerevisiae, with the other dominant species being Candida milleri (11%), Candida krusei (2.5%), and Torulaspora delbrueckii (1%). RAPD-PCR analysis, performed with primers M13 and LA1, highlighted intraspecific polymorphism among the S. cerevisiae strains. The diversity of the sourdoughs from the Abruzzo region is reflected in the chemical composition, yeast species, and strain polymorphism. Our approach using a combination of phenotypic and genotypic methods identified the yeast species in the 20 sourdough samples and provided a complete overview of the yeast populations found in sourdoughs from the Abruzzo region.

  11. Measuring mercury and other elemental components in tree rings

    USGS Publications Warehouse

    Gillan, C.; Hollerman, W.A.; Doyle, T.W.; Lewis, T.E.

    2004-01-01

    There has been considerable interest in measuring heavy metal pollution, such as mercury, using tree ring analysis. Since 1970, this method has provided a historical snapshot of pollutant concentrations near hazardous waste sites. Traditional methods of analysis have long been used with heavy metal pollutants such as mercury. These methods, such as atomic fluorescence and laser ablation, are sometimes time consuming and expensive to implement. In recent years, ion beam techniques, such as Particle Induced X-Ray Emission (PIXE), have been used to measure large numbers of elements. Most of the existing research in this area has been completed for low to medium atomic number pollutants, such as titanium, cobalt, nickel, and copper. Due to reduced sensitivity, it is often difficult or impossible to use traditional low energy (few MeV) PIXE analysis for pollutants with large atomic numbers. For example, the PIXE detection limit for mercury was recently measured to be about 1 ppm for a spiked Southern Magnolia wood sample [ref. 1]. This presentation will compare PIXE and standard chemical concentration results for a variety of wood samples.

  12. The application analysis of the multi-angle polarization technique for ocean color remote sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli

    2017-02-01

    The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for earth observation. This method provides not only multi-angle light-intensity data but also multi-angle information on polarized radiation, so the technique may solve problems that cannot be solved with traditional remote sensing methods. Nowadays, the multi-angle polarization technique has become one of the hot topics in international quantitative remote sensing research. In this paper, we first introduce the principles of the multi-angle polarization technique; the state of basic research and engineering applications is then summarized and analysed for 1) the polarization-based method for removing sun glitter, 2) ocean color remote sensing based on polarization, 3) oil spill detection using the polarization technique, and 4) ocean aerosol monitoring based on polarization. Finally, based on the previous work, we briefly present the problems and prospects of the multi-angle polarization technique as applied to China's ocean color remote sensing.

  13. Risk Management of NASA Projects

    NASA Technical Reports Server (NTRS)

    Sarper, Hueseyin

    1997-01-01

    An analysis of various NASA Langley Research Center and other center projects was attempted, to obtain historical data comparing the pre-phase A study with the final outcome of each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and usage of PRA techniques were reviewed for large projects. It was found that both the civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed since the end of the cold war and the Challenger disaster, the ingrained anti-PRA culture has proved hard to change. Examples of skepticism toward risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability of risk management and risk assessment techniques extends well beyond the traditional safety-related areas of application. The time has come to apply these techniques uniformly. A risk-based approach can maximize the 'return on investment' that the public demands. It would also be very useful if all project documents of NASA Langley Research Center, from pre-phase A through final report, were carefully stored in a central repository, preferably in electronic format.

  15. Augmented reality telementoring (ART) platform: a randomized controlled trial to assess the efficacy of a new surgical education technology.

    PubMed

    Vera, Angelina M; Russo, Michael; Mohsin, Adnan; Tsuda, Shawn

    2014-12-01

    Laparoscopic skills training has evolved over recent years. However, conveying a mentor's directions using conventional methods, without realistic on-screen visual cues, can be difficult and confusing. To facilitate laparoscopic skill transfer, an augmented reality telementoring (ART) platform was designed to overlay the instruments of a mentor onto the trainee's laparoscopic monitor. The aim of this study was to compare the effectiveness of this new teaching modality with traditional methods in novices performing an intracorporeal suturing task. Nineteen pre-medical and medical students were randomized into traditional mentoring (n = 9) and ART (n = 10) groups for a laparoscopic suturing and knot-tying task. Subjects received either traditional mentoring or ART for 1 h on the validated fundamentals of laparoscopic surgery intracorporeal suturing task. Suturing tasks were recorded and scored for time and errors. Results were analyzed using means, standard deviations, power regression analysis, correlation coefficients, analysis of variance, and Student's t test. Using Wright's cumulative average model (Y = aX^b), the learning curve slope was significantly steeper, demonstrating faster skill acquisition, for the ART group (b = -0.567, r^2 = 0.92) than for the control group (b = -0.453, r^2 = 0.74). At the end of 10 repetitions or 1 h of practice, the ART group was faster than the traditional group (mean 167.4 vs. 242.4 s, p = 0.014). The ART group also had fewer failed attempts (8) than the traditional group (13). The ART platform may be a more effective technique for teaching laparoscopic skills to novices than traditional methods. ART conferred a shorter learning curve, which was most pronounced in the first 4 trials. ART reduced the number of failed attempts and resulted in faster suture times by the end of the training session. ART may be a more effective training tool than traditional methods for complex tasks in laparoscopic surgical training.
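
    Wright's model is a simple power law, so learning-curve parameters of the kind reported above can be recovered by a log-log least-squares fit. The sketch below illustrates this on made-up task times, not the study's data; the fitting approach is an assumption for illustration, not the paper's code.

```python
# Hedged sketch: fitting Wright's cumulative average model Y = a * X**b
# to task-completion times, as used to compare learning curves.
# The timing data below are invented for illustration.
import numpy as np

trials = np.arange(1, 11)                      # repetition number X
times = np.array([420, 350, 310, 280, 260,     # hypothetical task times (s)
                  250, 235, 220, 210, 200], dtype=float)

# Linearize: log Y = log a + b log X, then solve by least squares.
A = np.vstack([np.ones_like(trials, dtype=float), np.log(trials)]).T
coef, *_ = np.linalg.lstsq(A, np.log(times), rcond=None)
a, b = np.exp(coef[0]), coef[1]

# r^2 of the log-log fit measures how well the power law explains the data.
pred = A @ coef
ss_res = np.sum((np.log(times) - pred) ** 2)
ss_tot = np.sum((np.log(times) - np.log(times).mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

# A steeper (more negative) b indicates faster skill acquisition.
print(f"Y = {a:.1f} * X^{b:.3f}, r^2 = {r2:.2f}")
```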

  16. Kernel canonical-correlation Granger causality for multiple time series

    NASA Astrophysics Data System (ADS)

    Wu, Guorong; Duan, Xujun; Liao, Wei; Gao, Qing; Chen, Huafu

    2011-04-01

    Canonical-correlation analysis as a multivariate statistical technique has been applied to multivariate Granger causality analysis to infer information flow in complex systems. It shows unique appeal and great superiority over the traditional vector autoregressive method, due to the simplified procedure that detects causal interaction between multiple time series, and the avoidance of potential model estimation problems. However, it is limited to the linear case. Here, we extend the framework of canonical correlation to include the estimation of multivariate nonlinear Granger causality for drawing inference about directed interaction. Its feasibility and effectiveness are verified on simulated data.
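
    For readers unfamiliar with the baseline being generalized here, the sketch below implements the traditional linear, bivariate Granger test that canonical-correlation methods extend: x is said to Granger-cause y if adding lagged x to a regression on lagged y reduces the residual variance. The synthetic data, lag order, and index definition are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a linear bivariate Granger causality test on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 2                      # series length, lag order
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):               # y depends on lagged x by construction
    y[t] = 0.4 * y[t - 1] + 0.5 * x[t - 1] + 0.1 * rng.standard_normal()

def lagmat(*series, p):
    """Stack p lags of each series into a regressor matrix."""
    n = len(series[0])
    cols = [s[p - k - 1:n - k - 1] for s in series for k in range(p)]
    return np.column_stack(cols)

target = y[p:]
restricted = lagmat(y, p=p)         # y's own past only
full = lagmat(y, x, p=p)            # y's past plus x's past

def rss(X, t):
    beta, *_ = np.linalg.lstsq(X, t, rcond=None)
    r = t - X @ beta
    return r @ r

# Log ratio of residual sums of squares: > 0 suggests x helps predict y.
gc_index = np.log(rss(restricted, target) / rss(full, target))
print(f"Granger causality index x -> y: {gc_index:.3f}")
```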

  17. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    PubMed Central

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, a new feature extraction technique for identification purposes. The heart sound identification system comprises signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients yield a recognition rate of 94.40%, a significant increase over the traditional Fourier spectrum (84.32%), on a database of 280 heart sounds from 40 participants. PMID:23429515
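
    As a rough illustration of what a marginal spectrum is, the sketch below computes instantaneous amplitude and frequency from the analytic signal and accumulates amplitude over frequency bins. This is a deliberate simplification: the full Hilbert-Huang procedure first decomposes the signal into intrinsic mode functions, and nothing here reproduces the paper's heart sound pipeline.

```python
# Hedged sketch of a marginal (Hilbert) spectrum on a single synthetic component.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
sig = np.sin(2 * np.pi * (50 + 20 * t) * t)     # chirp-like test component

analytic = hilbert(sig)
amp = np.abs(analytic)                          # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(phase) / (2 * np.pi) * fs   # instantaneous frequency (Hz)

# Marginal spectrum: total amplitude contributed at each frequency over time.
bins = np.linspace(0, 200, 101)
marginal, _ = np.histogram(inst_freq, bins=bins, weights=amp[:-1])
peak = bins[np.argmax(marginal)]
print(f"dominant instantaneous frequency ~ {peak:.0f} Hz")
```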

  18. Protected Designation of Origin (PDO), Protected Geographical Indication (PGI) and Traditional Speciality Guaranteed (TSG): A bibliometric analysis.

    PubMed

    Dias, Claudia; Mendes, Luís

    2018-01-01

    Despite the importance of the literature on food quality labels in the European Union (PDO, PGI and TSG), our search did not find any review joining the various research topics on this subject. This study therefore aims to consolidate the state of academic research in this field, and so the methodological choice was to carry out a bibliometric analysis using the term co-occurrence technique. A total of 501 articles in the ISI Web of Science database were analysed, covering publications up to 2016. The results of the bibliometric analysis allowed identification of four clusters: "Protected Geographical Indication", "Certification of Olive Oil and Cultivars", "Certification of Cheese and Milk" and "Certification and Chemical Composition". Unlike the other clusters, where the PDO label predominates, the "Protected Geographical Indication" cluster covers the study of PGI products, highlighting analysis of consumer behaviour in relation to this type of product. The focus of studies in the "Certification of Olive Oil and Cultivars" and "Certification of Cheese and Milk" clusters is the development of authentication methods for certified traditional products. In the "Certification and Chemical Composition" cluster, the analysis of the fatty acid profiles present in this type of product stands out. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. A Survey and Proposed Framework on the Soft Biometrics Technique for Human Identification in Intelligent Video Surveillance System

    PubMed Central

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

    Biometric verification can be used efficiently for intrusion detection and intruder identification in video surveillance systems. Biometric techniques can be broadly divided into traditional and so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as facial features, the eye iris, and fingerprints, soft biometrics is concerned with information such as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to collect traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be used much more freely. Recently, much research has been conducted on human identification using soft biometric data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for addressing problems such as lighting, occlusion, and shadowing. PMID:22919273

  20. A survey and proposed framework on the soft biometrics technique for human identification in intelligent video surveillance system.

    PubMed

    Kim, Min-Gu; Moon, Hae-Min; Chung, Yongwha; Pan, Sung Bum

    2012-01-01

    Biometric verification can be used efficiently for intrusion detection and intruder identification in video surveillance systems. Biometric techniques can be broadly divided into traditional and so-called soft biometrics. Whereas traditional biometrics deals with physical characteristics such as facial features, the eye iris, and fingerprints, soft biometrics is concerned with information such as gender, national origin, and height. Traditional biometrics is versatile and highly accurate, but it is very difficult to collect traditional biometric data from a distance and without personal cooperation. Soft biometrics, although less accurate, can be used much more freely. Recently, much research has been conducted on human identification using soft biometric data collected from a distance. In this paper, we use both traditional and soft biometrics for human identification and propose a framework for addressing problems such as lighting, occlusion, and shadowing.

  1. Status of Thermal NDT of Space Shuttle Materials at NASA

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.; Hodges, Kenneth; Koshti, Ajay; Ryan, Daniel; Reinhardt, Walter W.

    2006-01-01

    Since the Space Shuttle Columbia accident, NASA has focused on improving advanced nondestructive evaluation (NDE) techniques for the Reinforced Carbon-Carbon (RCC) panels that comprise the orbiter's wing leading edge and nose cap. Various nondestructive inspection techniques have been used in the examination of the RCC, but thermography has emerged as an effective inspection alternative to more traditional methods. Thermography is a non-contact inspection method, in contrast to ultrasonic techniques, which typically require a coupling medium between the transducer and the material. Like radiographic techniques, thermography can inspect large areas, but it has the advantages of minimal safety concerns and the ability to make single-sided measurements. Details of the analysis technique that has been developed to allow in situ inspection of a majority of shuttle RCC components are discussed. Additionally, validation testing, performed to quantify the performance of the system, is discussed. Finally, the results of applying this technology to the Space Shuttle Discovery after its return from the STS-114 mission in July 2005 are discussed.

  2. Status of Thermal NDT of Space Shuttle Materials at NASA

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.; Hodges, Kenneth; Koshti, Ajay; Ryan, Daniel; Reinhardt, Walter W.

    2007-01-01

    Since the Space Shuttle Columbia accident, NASA has focused on improving advanced NDE techniques for the Reinforced Carbon-Carbon (RCC) panels that comprise the orbiter's wing leading edge and nose cap. Various nondestructive inspection techniques have been used in the examination of the RCC, but thermography has emerged as an effective inspection alternative to more traditional methods. Thermography is a non-contact inspection method, in contrast to ultrasonic techniques, which typically require a coupling medium between the transducer and the material. Like radiographic techniques, thermography can inspect large areas, but it has the advantages of minimal safety concerns and the ability to make single-sided measurements. Details of the analysis technique that has been developed to allow in situ inspection of a majority of shuttle RCC components are discussed. Additionally, validation testing, performed to quantify the performance of the system, is discussed. Finally, the results of applying this technology to the Space Shuttle Discovery after its return from the STS-114 mission in July 2005 are discussed.

  3. Status of Thermal NDT of Space Shuttle Materials at NASA

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.; Hodges, Kenneth; Koshti, Ajay; Ryan, Daniel; Reinhardt, Walter W.

    2006-01-01

    Since the Space Shuttle Columbia accident, NASA has focused on improving advanced NDE techniques for the Reinforced Carbon-Carbon (RCC) panels that comprise the orbiter's wing leading edge and nose cap. Various nondestructive inspection techniques have been used in the examination of the RCC, but thermography has emerged as an effective inspection alternative to more traditional methods. Thermography is a non-contact inspection method, in contrast to ultrasonic techniques, which typically require a coupling medium between the transducer and the material. Like radiographic techniques, thermography can inspect large areas, but it has the advantages of minimal safety concerns and the ability to make single-sided measurements. Details of the analysis technique that has been developed to allow in situ inspection of a majority of shuttle RCC components are discussed. Additionally, validation testing, performed to quantify the performance of the system, is discussed. Finally, the results of applying this technology to the Space Shuttle Discovery after its return from the STS-114 mission in July 2005 are discussed.

  4. A retrospective study of a modified 1-minute formocresol pulpotomy technique part 1: clinical and radiographic findings.

    PubMed

    Kurji, Zahra A; Sigal, Michael J; Andrews, Paul; Titley, Keith

    2011-01-01

    The purpose of this study was to assess the clinical and radiographic outcomes of a 1-minute application of full-strength Buckley's formocresol with concurrent hemostasis using the medicated cotton pledget in human primary teeth. Using a retrospective chart review, clinical and radiographic data were available for 557 primary molars in 320 patients. Descriptive statistics and survival analysis were used to assess outcomes. Overall clinical success, radiographic success, and cumulative 5-year survival rates were approximately 99%, 90%, and 87%, respectively. Internal root resorption (~5%) and pulp canal obliteration (~2%) were the most frequently observed radiographic failures. Thirty-nine teeth were extracted due to clinical and/or radiographic failure. Mandibular molars were 6 times more prone to radiographic failure than maxillary molars. Success rates for the modified technique are comparable to those of techniques using the 5-minute diluted or full-strength solutions reported in the literature. This 1-minute full-strength formocresol technique is an acceptable alternative to published traditional techniques.

  5. Mechanical Characterization of Nanoporous Thin Films by Nanoindentation and Laser-induced Surface Acoustic Waves

    NASA Astrophysics Data System (ADS)

    Chow, Gabriel

    Thin films represent a critical sector of modern engineering that strives to produce functional coatings at the smallest possible length scales. They appear most commonly in semiconductors, where they form the foundation of all electronic circuits, but exist in many other areas to provide mechanical, electrical, chemical, and optical properties. The mechanical characterization of thin films has been a continuing challenge, due foremost to the length scales involved. However, emerging thin films based on materials with significant porosity, complex morphologies, and nanostructured surfaces pose additional difficulties for mechanical analysis. Nanoindentation has been the dominant thin film mechanical characterization technique for the last decade because of its quick results, wide range of sample applicability, and ease of sample preparation. However, the traditional nanoindentation technique encounters difficulties for thin porous films. For such materials, alternative means of analysis are desirable, and the lesser-known laser-induced surface acoustic wave (LiSAW) technique shows great potential in this area. This dissertation focuses on studying thin, porous, and nanostructured films by nanoindentation and LiSAW techniques in an effort to directly correlate the two methodologies and to test the limits and applicability of each technique on challenging media. The LiSAW technique is particularly useful for thin porous films because, unlike indentation, the substrate is properly accounted for in the wave motion analysis and no plastic deformation is necessary. Additionally, the use of lasers for surface acoustic wave generation and detection allows the technique to be fully non-contact. This is desirable in the measurement of thin, delicate, and porous films where physical sample probing may not be feasible. The LiSAW technique is also valuable in overcoming nanoscale roughness, particularly for films that cannot be mechanically polished, since typical SAW wavelengths are micrometers in scale whereas indentation depths are usually confined to the nanometer scale. This dissertation demonstrates the effectiveness of LiSAW on both thin porous layers and rough surfaces and shows the challenges faced by nanoindentation on the same films. Zeolite thin films are studied extensively in this work as a model system because of their porous crystalline framework and enormous economic market. Many types of zeolite exist, and their widely varying structures and levels of porosity present a unique opportunity for mechanical characterization. For a fully dense ZSM-5 type zeolite with wear and corrosion resistance properties, nanoindentation was used to compare its mechanical properties to industrial chromium and cadmium films. Through tribological and indentation tests, it was shown that the zeolite film possesses exceptional resilience and hardness, thereby demonstrating wear resistance superior to chromium and cadmium. This also highlighted the quality of nanoindentation measurements on thick dense layers, where traditional nanoindentation excels. Nanoindentation was then performed on porous and non-porous MFI zeolite films with low-k (low dielectric constant) properties. These films were softer and much thinner than the ZSM-5 coatings, resulting in significant substrate effects during indentation, evidenced by inflation of the measured values by the hard silicon substrate.
Such effects were avoided with the LiSAW technique on the same films, where properties were readily extracted without complications. An alternative indentation analysis method was demonstrated to produce accurate mechanical measurements in line with the LiSAW results, but this non-traditional technique requires substantial computational effort. Thus, LiSAW was proven to be an accurate and efficient means of mechanical characterization for thin porous layers. The case for LiSAW was further supported by applying the technique to a porous nanostructured V2O5 electrode film. The surface roughness, on the same scale as indentation depths, created difficulty in obtaining consistent nanoindentation results. Since the film was too delicate for mechanical polishing, the nanoindentation results carried a high level of uncertainty. It was demonstrated that the LiSAW technique could extract the mechanical properties from such layers without substrate effects and with higher accuracy than nanoindentation. The research in this dissertation directly demonstrates the areas where nanoindentation excels and the areas where it encounters difficulty. It is shown how the LiSAW technique can be an efficient alternative in the challenging areas through its dependence on bulk dispersive wave motion rather than localized deformation. Thus, LiSAW opens up many avenues toward the mechanical characterization of thin, porous, soft, or rough films. Nanoindentation remains an extremely useful technique for thin film characterization, especially with the alternative analysis adaptation. However, as films continue trending toward smaller length scales, more complex porous morphologies, and engineered nanoscale surfaces, LiSAW may well become an equally valuable and indispensable technique.
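
    For context, the sketch below shows the standard Oliver-Pharr reduction used in traditional nanoindentation analysis: fit the unloading curve, take the contact stiffness at maximum load, and derive hardness and reduced modulus. The synthetic data, the fixed residual depth, and the Berkovich area function are illustrative assumptions; this is not the dissertation's LiSAW analysis.

```python
# Hedged sketch of Oliver-Pharr nanoindentation analysis on synthetic data.
import numpy as np

# Synthetic unloading segment: P = alpha * (h - hf)^m with noise.
rng = np.random.default_rng(1)
hf, alpha, m = 120.0, 0.004, 1.4            # residual depth (nm), prefactor, exponent
h = np.linspace(200.0, 140.0, 50)           # depth during unload (nm)
P = alpha * (h - hf) ** m * (1 + 0.01 * rng.standard_normal(h.size))  # load (mN)

# Fit log P = log alpha + m log(h - hf); hf is assumed known here for
# brevity (in practice hf is fitted as well).
coef = np.polyfit(np.log(h - hf), np.log(P), 1)
m_fit, alpha_fit = coef[0], np.exp(coef[1])

h_max, P_max = h[0], P[0]
S = alpha_fit * m_fit * (h_max - hf) ** (m_fit - 1)   # contact stiffness dP/dh
h_c = h_max - 0.75 * P_max / S                        # contact depth (epsilon = 0.75)
A = 24.5 * h_c ** 2                                   # Berkovich area function (nm^2)
H = P_max / A * 1e6                                   # hardness, mN/nm^2 -> GPa
E_r = np.sqrt(np.pi) * S / (2 * np.sqrt(A)) * 1e6     # reduced modulus in GPa

print(f"H ~ {H:.2f} GPa, E_r ~ {E_r:.1f} GPa")
```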

  6. Reconciling traditional knowledge, food security, and climate change: experience from Old Crow, YT, Canada.

    PubMed

    Douglas, Vasiliki; Chan, Hing Man; Wesche, Sonia; Dickson, Cindy; Kassi, Norma; Netro, Lorraine; Williams, Megan

    2014-01-01

    Because of a lack of transportation infrastructure, Old Crow has the highest food costs and the greatest reliance on traditional food species for sustenance of any community in Canada's Yukon Territory. Environmental, cultural, and economic change is driving an increased perception of food insecurity in Old Crow. The objective of this study was to address community concerns regarding food security and supply in Old Crow and to develop adaptation strategies to ameliorate their impact on the community. A community adaptation workshop was held on October 13, 2009, in which representatives of different stakeholders in the community discussed a variety of food security issues facing Old Crow and how they could be addressed. Workshop data were analyzed using keyword, subject, and narrative analysis techniques to determine community priorities in food security and adaptation. Community concern is high, and favored adaptation options include agriculture, improved food storage, and conservation supported by increased traditional education. These results were presented to the community for review and revision, after which the Vuntut Gwitchin Government will integrate them into its ongoing adaptation planning measures.

  7. Spatial analysis of alcohol-related motor vehicle crash injuries in southeastern Michigan.

    PubMed

    Meliker, Jaymie R; Maio, Ronald F; Zimmerman, Marc A; Kim, Hyungjin Myra; Smith, Sarah C; Wilson, Mark L

    2004-11-01

    Temporal, behavioral and social risk factors that affect injuries resulting from alcohol-related motor vehicle crashes have been characterized in previous research. Much less is known about spatial patterns and environmental associations of alcohol-related motor vehicle crashes. The aim of this study was to evaluate geographic patterns of alcohol-related motor vehicle crashes and to determine if locations of alcohol outlets are associated with those crashes. In addition, we sought to demonstrate the value of integrating spatial and traditional statistical techniques in the analysis of this preventable public health risk. The study design was a cross-sectional analysis of individual-level blood alcohol content, traffic report information, census block group data, and alcohol distribution outlets. Besag and Newell's spatial analysis and traditional logistic regression both indicated that areas of low population density had more alcohol-related motor vehicle crashes than expected (P < 0.05). There was no significant association between alcohol outlets and alcohol-related motor vehicle crashes using distance analyses, logistic regression, and Chi-square. Differences in environmental or behavioral factors characteristic of areas of low population density may be responsible for the higher proportion of alcohol-related crashes occurring in these areas.

  8. Modified McCash Technique for Management of Dupuytren Contracture.

    PubMed

    Lesiak, Alex C; Jarrett, Nicole J; Imbriglia, Joseph E

    2017-05-01

    Despite recent advancements in the nonsurgical treatment of Dupuytren contracture, a number of patients remain poor nonsurgical candidates or elect surgical management. The traditional McCash technique releases contractures while leaving open palmar wounds. Although successful in alleviating contractures, the technique traditionally leaves large, transverse wounds across the palm. A modification of this technique permits the surgeon to use smaller wounds while still eliminating debilitating contractures. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  9. Quality assurance paradigms for artificial intelligence in modelling and simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oren, T.I.

    1987-04-01

    New classes of quality assurance concepts and techniques are required for advanced knowledge-processing paradigms (such as artificial intelligence, expert systems, or knowledge-based systems) and for the complex problems that only simulative systems can cope with. A systematization of quality assurance problems is given, along with examples of traditional and cognizant quality assurance techniques in traditional and cognizant modelling and simulation.

  10. Application of gray level mapping in computed tomographic colonography: a pilot study to compare with traditional surface rendering method for identification and differentiation of endoluminal lesions

    PubMed Central

    Chen, Lih-Shyang; Hsu, Ta-Wen; Chang, Shu-Han; Lin, Chih-Wen; Chen, Yu-Ruei; Hsieh, Chin-Chiang; Han, Shu-Chen; Chang, Ku-Yaw; Hou, Chun-Ju

    2017-01-01

    Objective: In traditional surface rendering (SR) computed tomographic endoscopy, only the shape of an endoluminal lesion is depicted, without gray-level information, unless the volume rendering technique is used. However, the volume rendering technique is relatively slow and complex in terms of computation time and parameter setting. We use computed tomographic colonography (CTC) images as examples and report a new visualization technique based on three-dimensional gray level mapping (GM) to better identify and differentiate endoluminal lesions. Methods: 33 endoluminal cases from 30 patients were evaluated in this clinical study. These cases were segmented using a gray-level threshold. The marching cubes algorithm was used to detect isosurfaces in the volumetric data sets. GM is applied using the surface gray level of the CTC. Radiologists conducted the clinical evaluation of the SR and GM images. The Wilcoxon signed-rank test was used for data analysis. Results: Clinical evaluation confirms that GM is significantly superior to SR in terms of gray-level pattern and spatial shape presentation of endoluminal cases (p < 0.01) and significantly improves the confidence of identification and clinical classification of endoluminal lesions (p < 0.01). The specificity and diagnostic accuracy of GM are significantly better than those of SR in the diagnostic performance evaluation (p < 0.01). Conclusion: GM can reduce confusion in three-dimensional CTC and correlates CTC well with sectional images by location as well as gray-level value. Hence, GM improves the identification and differentiation of endoluminal lesions and facilitates the diagnostic process. Advances in knowledge: GM significantly improves the traditional SR method by providing reliable gray-level information for the surface points and is helpful in identifying and differentiating endoluminal lesions according to their shape and density. PMID:27925483
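
    The isosurface-extraction step named above is the classic marching cubes algorithm. The sketch below runs scikit-image's implementation on a synthetic gray-level volume rather than CT data; the volume and threshold are illustrative assumptions.

```python
# Hedged sketch of isosurface extraction with marching cubes on synthetic data.
import numpy as np
from skimage import measure

# Synthetic 64^3 volume: gray levels fall off from the center (a "sphere").
grid = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = 1.0 - np.sqrt((grid ** 2).sum(axis=0))

# Extract the isosurface at a chosen gray-level threshold.
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)

# `values` gives a measure of the local gray level near each vertex -- the
# quantity a gray-level mapping (GM) display would color the surface with.
print(f"{len(verts)} vertices, {len(faces)} triangles")
print(f"vertex gray levels: min {values.min():.2f}, max {values.max():.2f}")
```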

  11. Direct cost comparison of minimally invasive punch technique versus traditional approaches for percutaneous bone anchored hearing devices.

    PubMed

    Sardiwalla, Yaeesh; Jufas, Nicholas; Morris, David P

    2017-06-12

    Minimally Invasive Ponto Surgery (MIPS) was recently described as a new technique to facilitate the placement of percutaneous bone anchored hearing devices. The procedure simplifies the surgical steps and dramatically reduces surgical time while maintaining excellent patient outcomes. Given these developments, our group sought to move the procedure out of the main operating suite where it has traditionally been performed. This study tests the null hypothesis that MIPS and open approaches have the same direct costs for the implantation of percutaneous bone anchored hearing devices in a Canadian public hospital setting. A retrospective direct cost comparison of MIPS and open approaches for the implantation of bone conduction implants was conducted. Indirect and future costs were not included in the fiscal analysis. A simple cost comparison of the two approaches was made, considering time, staffing, and equipment needs. All 12 operations were performed on adult patients from 2013 to 2016 by the same surgeon at a single hospital site. MIPS yielded a mean total cost reduction of CAD$456.83 per operation from the hospital perspective when compared to open approaches. The average duration of the MIPS operation was 7 min, on average 61 min shorter than the open approaches. The MIPS technique was more cost effective than traditional open approaches, primarily as a direct consequence of the reduction in surgical time, with further contributions from reduced staffing and equipment costs. This simple, quick intervention proved feasible when performed outside the main operating room. A blister pack of required equipment could prove convenient and further reduce costs.

  12. Hydrogeology from 10,000 ft below: lessons learned in applying pulse testing for leakage detection in a carbon sequestration formation

    NASA Astrophysics Data System (ADS)

    Sun, A. Y.; Lu, J.; Hovorka, S. D.; Freifeld, B. M.; Islam, A.

    2015-12-01

    Monitoring techniques capable of deep subsurface detection are desirable for early warning and leakage pathway identification in geologic carbon storage formations. This work investigates the feasibility of a leakage detection technique based on pulse testing, a traditional hydrogeological characterization tool. In pulse testing, the monitored reservoir is stimulated at a fixed frequency, and the acquired pressure perturbation signals are analyzed in the frequency domain to detect potential deviations in the reservoir's frequency-domain response function. Unlike traditional time-domain analyses, the frequency-domain analysis aims to minimize the interference of reservoir noise by imposing coded injection patterns such that the reservoir responses to injection can be uniquely determined. We established the theoretical basis of the approach in previous work. Recently, field validation of this pressure-based leakage detection technique was conducted at a CO2-EOR site located in Mississippi, USA. During the demonstration, two sets of experiments were performed using 90-min and 150-min pulsing periods, for scenarios both with and without a leak. Because no pre-existing leakage pathways were present, a CO2 leak was simulated by rate-controlled venting from one of the monitoring wells. Our results show that leakage events caused a significant deviation in the amplitude of the frequency response function, indicating that pulse testing may be used as a cost-effective monitoring technique with strong potential for automation.
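
    The core frequency-domain idea can be illustrated compactly: pulse at a known period, estimate the response amplitude at that frequency, and flag a deviation. The sketch below does this on synthetic pressure records; the sampling rate, noise level, and leak attenuation factor are assumptions, not field values.

```python
# Hedged sketch of frequency-domain leak detection on synthetic pressure data.
import numpy as np

fs = 1.0 / 60.0                      # one pressure sample per minute (Hz)
period_s = 90 * 60                   # 90-minute pulsing period
f_pulse = 1.0 / period_s
t = np.arange(0, 48 * 3600, 1 / fs)  # 48 hours of monitoring data

rng = np.random.default_rng(2)

def pressure(gain):
    # Reservoir response at the pulsing frequency plus background noise.
    return gain * np.sin(2 * np.pi * f_pulse * t) + 0.3 * rng.standard_normal(t.size)

def amplitude_at(x, f):
    spec = np.fft.rfft(x) / (x.size / 2)           # single-sided amplitude
    freqs = np.fft.rfftfreq(x.size, d=1 / fs)
    return np.abs(spec[np.argmin(np.abs(freqs - f))])

baseline = amplitude_at(pressure(1.00), f_pulse)   # no-leak scenario
leaking = amplitude_at(pressure(0.55), f_pulse)    # leak attenuates the response

print(f"response amplitude: baseline {baseline:.2f}, leak {leaking:.2f}")
print(f"deviation: {100 * (baseline - leaking) / baseline:.0f}%")
```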

  13. P. falciparum in vitro killing rates allow to discriminate between different antimalarial mode-of-action.

    PubMed

    Sanz, Laura M; Crespo, Benigno; De-Cózar, Cristina; Ding, Xavier C; Llergo, Jose L; Burrows, Jeremy N; García-Bustos, Jose F; Gamo, Francisco-Javier

    2012-01-01

    Chemotherapy is still the cornerstone of malaria control. Developing drugs against Plasmodium parasites and monitoring their efficacy requires methods to accurately determine the parasite killing rate in response to treatment. Commonly used techniques essentially measure metabolic activity as a proxy for parasite viability. However, these approaches are susceptible to artefacts, as viability and metabolism are two parameters that are coupled during the parasite life cycle but can be differentially affected in response to drug actions. Moreover, traditional techniques do not measure the speed of action of compounds on parasite viability, which is an essential determinant of efficacy. We present here a comprehensive methodology to measure in vitro the direct effect of antimalarial compounds on parasite viability, based on limiting serial dilution of treated parasites and re-growth monitoring. This methodology allows precise determination of the killing rate of antimalarial compounds, quantified by the parasite reduction ratio and parasite clearance time, which are key mode-of-action parameters. Importantly, we demonstrate that this technique readily detects compound killing activities that might otherwise be missed by traditional, metabolism-based techniques. The analysis of a large set of antimalarial drugs reveals that this viability-based assay can discriminate compounds based on their antimalarial mode of action. The approach has been adapted to medium-throughput screening, facilitating the identification of fast-acting antimalarial compounds, which are crucially needed for the control and possibly the eradication of malaria.

  14. P. falciparum In Vitro Killing Rates Allow to Discriminate between Different Antimalarial Mode-of-Action

    PubMed Central

    Sanz, Laura M.; Crespo, Benigno; De-Cózar, Cristina; Ding, Xavier C.; Llergo, Jose L.; Burrows, Jeremy N.; García-Bustos, Jose F.; Gamo, Francisco-Javier

    2012-01-01

    Chemotherapy is still the cornerstone of malaria control. Developing drugs against Plasmodium parasites and monitoring their efficacy requires methods to accurately determine the parasite killing rate in response to treatment. Commonly used techniques essentially measure metabolic activity as a proxy for parasite viability. However, these approaches are susceptible to artefacts, as viability and metabolism are two parameters that are coupled during the parasite life cycle but can be differentially affected in response to drug actions. Moreover, traditional techniques do not measure the speed of action of compounds on parasite viability, which is an essential determinant of efficacy. We present here a comprehensive methodology to measure in vitro the direct effect of antimalarial compounds on parasite viability, based on limiting serial dilution of treated parasites and re-growth monitoring. This methodology allows precise determination of the killing rate of antimalarial compounds, quantified by the parasite reduction ratio and parasite clearance time, which are key mode-of-action parameters. Importantly, we demonstrate that this technique readily detects compound killing activities that might otherwise be missed by traditional, metabolism-based techniques. The analysis of a large set of antimalarial drugs reveals that this viability-based assay can discriminate compounds based on their antimalarial mode of action. The approach has been adapted to medium-throughput screening, facilitating the identification of fast-acting antimalarial compounds, which are crucially needed for the control and possibly the eradication of malaria. PMID:22383983
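
    The two killing-rate parameters named in the abstract can be illustrated with a log-linear fit to viable-parasite counts over time. The sketch below uses invented counts; the parasite reduction ratio (PRR) over 48 h and the clearance time fall out of the fitted slope and intercept. This is a schematic of the quantities, not the authors' assay protocol.

```python
# Hedged sketch: PRR and clearance time from hypothetical viability counts.
import numpy as np

hours = np.array([0, 24, 48, 72, 96], dtype=float)
viable = np.array([1e5, 2.1e4, 4.3e3, 9.0e2, 2.0e2])   # invented viable counts

# log10(viable) = b0 + b1 * t  =>  slope b1 in log10 units per hour.
b1, b0 = np.polyfit(hours, np.log10(viable), 1)

prr_48h = 10 ** (-b1 * 48)          # fold-reduction in viability per 48 h
pct = -b0 / b1                      # hours until < 1 viable parasite remains

print(f"PRR (48 h) ~ {prr_48h:.0f}x, clearance time ~ {pct:.0f} h")
```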

  15. ND2AV: N-dimensional data analysis and visualization analysis for the National Ignition Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, Peer -Timo; Maljovec, Dan; Saha, Avishek

    Here, one of the biggest challenges in high-energy physics is to analyze a complex mix of experimental and simulation data to gain new insights into the underlying physics. Currently, this analysis relies primarily on the intuition of trained experts, often using nothing more sophisticated than default scatter plots. Many advanced analysis techniques are not easily accessible to scientists and not flexible enough to explore potentially interesting hypotheses in an intuitive manner. Furthermore, results from individual techniques are often difficult to integrate, leading to a confusing patchwork of analysis snippets too cumbersome for data exploration. This paper presents a case study on how a combination of techniques from statistics, machine learning, topology, and visualization can have a significant impact in the field of inertial confinement fusion. We present ND2AV: the N-dimensional data analysis and visualization framework, a user-friendly tool aimed at exploiting the intuition and current workflow of the target users. The system integrates traditional analysis approaches such as dimension reduction and clustering with state-of-the-art techniques such as neighborhood graphs and topological analysis, and custom capabilities such as defining combined metrics on the fly. All components are linked into an interactive environment that enables an intuitive exploration of a wide variety of hypotheses while relating the results to concepts familiar to the users, such as scatter plots. ND2AV uses a modular design providing easy extensibility and customization for different applications. ND2AV is being actively used in the National Ignition Campaign and has already led to a number of unexpected discoveries.

  16. ND2AV: N-dimensional data analysis and visualization analysis for the National Ignition Campaign

    DOE PAGES

    Bremer, Peer -Timo; Maljovec, Dan; Saha, Avishek; ...

    2015-07-01

    Here, one of the biggest challenges in high-energy physics is to analyze a complex mix of experimental and simulation data to gain new insights into the underlying physics. Currently, this analysis relies primarily on the intuition of trained experts, often using nothing more sophisticated than default scatter plots. Many advanced analysis techniques are not easily accessible to scientists and not flexible enough to explore potentially interesting hypotheses in an intuitive manner. Furthermore, results from individual techniques are often difficult to integrate, leading to a confusing patchwork of analysis snippets too cumbersome for data exploration. This paper presents a case study on how a combination of techniques from statistics, machine learning, topology, and visualization can have a significant impact in the field of inertial confinement fusion. We present ND2AV: the N-dimensional data analysis and visualization framework, a user-friendly tool aimed at exploiting the intuition and current workflow of the target users. The system integrates traditional analysis approaches such as dimension reduction and clustering with state-of-the-art techniques such as neighborhood graphs and topological analysis, and custom capabilities such as defining combined metrics on the fly. All components are linked into an interactive environment that enables an intuitive exploration of a wide variety of hypotheses while relating the results to concepts familiar to the users, such as scatter plots. ND2AV uses a modular design providing easy extensibility and customization for different applications. ND2AV is being actively used in the National Ignition Campaign and has already led to a number of unexpected discoveries.
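
    As a minimal illustration of the kind of linked pipeline such a framework integrates, the sketch below reduces a synthetic high-dimensional ensemble with PCA and clusters it in the reduced space, keeping the record indices for drill-down. The data and the choice of PCA plus k-means are assumptions for illustration; ND2AV itself integrates far richer components.

```python
# Hedged sketch: dimension reduction plus clustering on a synthetic ensemble.
import numpy as np

rng = np.random.default_rng(4)
# Two "regimes" of simulation outputs in 10-D.
X = np.vstack([rng.normal(0.0, 1.0, (100, 10)),
               rng.normal(3.0, 1.0, (100, 10))])

# PCA via SVD of the centered data; keep two components for scatter views.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Minimal k-means (k = 2) in the reduced space.
centers = Z[rng.choice(len(Z), 2, replace=False)]
for _ in range(20):
    labels = np.linalg.norm(Z[:, None] - centers[None], axis=2).argmin(axis=1)
    centers = np.array([Z[labels == k].mean(axis=0) for k in range(2)])

# The link back: cluster membership indexed by original record, for drill-down.
print("cluster sizes:", np.bincount(labels))
```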

  17. Modeling Woven Polymer Matrix Composites with MAC/GMC

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Arnold, Steven M. (Technical Monitor)

    2000-01-01

    NASA's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) is used to predict the elastic properties of plain weave polymer matrix composites (PMCs). The traditional one-step three-dimensional homogenization procedure that has been used in conjunction with MAC/GMC for modeling woven composites in the past is inaccurate due to the lack of shear coupling inherent to the model. However, by performing a two-step homogenization procedure, in which the woven composite repeating unit cell is homogenized independently in the through-thickness direction prior to homogenization in the plane of the weave, MAC/GMC can now accurately model woven PMCs. This two-step procedure is outlined and implemented, and predictions are compared with results from the traditional one-step approach and with other models and experiments from the literature. Full coupling of this two-step technique with MAC/GMC will result in a widely applicable, efficient, and accurate tool for the design and analysis of woven composite materials and structures.

  18. Laser ablation-laser induced breakdown spectroscopy for the measurement of total elemental concentration in soils.

    PubMed

    Pareja, Jhon; López, Sebastian; Jaramillo, Daniel; Hahn, David W; Molina, Alejandro

    2013-04-10

    The performance of traditional laser-induced breakdown spectroscopy (LIBS) and laser ablation-LIBS (LA-LIBS) was compared by quantifying the total elemental concentration of potassium in highly heterogeneous solid samples, namely soils. Calibration curves for a set of fifteen samples with a wide range of potassium concentrations were generated. The LA-LIBS approach produced a markedly more linear response than the traditional LIBS scheme. The analytical response of LA-LIBS was tested on a large set of different soil samples for the quantification of the total concentration of Fe, Mn, Mg, Ca, Na, and K. Results showed an acceptable linear response for Ca, Fe, Mg, and K, while poor signal responses were found for Na and Mn. Signs of residual matrix effects for the LA-LIBS approach in the case of soil analysis were found and are discussed. Finally, some improvements and possibilities for future studies toward quantitative soil analysis with the LA-LIBS technique are suggested.

  19. ALIF: a new promising technique for the decomposition and analysis of nonlinear and nonstationary signals

    NASA Astrophysics Data System (ADS)

    Cicone, Antonio; Zhou, Haomin; Piersanti, Mirko; Materassi, Massimo; Spogli, Luca

    2017-04-01

    Nonlinear and nonstationary signals are ubiquitous in real life. Their decomposition and analysis is of crucial importance in many research fields. Traditional techniques, like the Fourier and wavelet transforms, have proved to be limited in this context. In the last two decades, new kinds of nonlinear methods have been developed that are able to unravel hidden features of these kinds of signals. In this talk we review the state of the art and present a new method, called Adaptive Local Iterative Filtering (ALIF). This method, developed originally to study one-dimensional signals, can, unlike any other technique proposed so far, be easily generalized to study two- or higher-dimensional signals. Furthermore, unlike most similar methods, it does not require any a priori assumption on the signal itself, so the method can be applied as-is to any kind of signal. Applications of the ALIF algorithm to the analysis of real-life signals will be presented: for instance, the behavior of the water level near the coastline in the presence of a tsunami, the length-of-day signal, the temperature and pressure measured at ground level on a global grid, and the radio power scintillation from GNSS signals.
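
    The basic iterative-filtering idea can be sketched in a few lines: repeatedly subtract a smoothed local mean until a zero-mean oscillatory component remains. The version below uses a fixed moving-average mask, whereas ALIF proper adapts the mask length locally to the signal; the test signal and mask width are illustrative assumptions.

```python
# Hedged, simplified sketch of iterative filtering (not full ALIF).
import numpy as np

def moving_average(x, width):
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def extract_mode(x, width, n_iter=30):
    mode = x.copy()
    for _ in range(n_iter):
        mode = mode - moving_average(mode, width)   # peel off the slow local mean
    return mode

# Synthetic nonstationary signal: fast oscillation riding on a slow trend.
t = np.linspace(0, 1, 2000)
signal = np.sin(2 * np.pi * 40 * t) + 0.5 * np.sin(2 * np.pi * 3 * t) + t

fast = extract_mode(signal, width=51)      # IMF-like fast component
residual = signal - fast                   # slow component plus trend
print(f"fast-mode mean ~ {fast.mean():.4f} (should be near zero)")
```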

  20. Improving the limits of detection of low background alpha emission measurements

    NASA Astrophysics Data System (ADS)

    McNally, Brendan D.; Coleman, Stuart; Harris, Jack T.; Warburton, William K.

    2018-01-01

    Alpha particle emission - even at extremely low levels - is a significant issue in the search for rare events (e.g., double beta decay, dark matter detection). Traditional measurement techniques require long counting times to measure low sample rates in the presence of much larger instrumental backgrounds. To address this, a commercially available instrument developed by XIA uses pulse shape analysis to discriminate alpha emissions produced by the sample from those produced by other surfaces of the instrument itself. Experience with this system has uncovered two residual sources of background: cosmogenics and radon emanation from internal components. An R&D program is underway to enhance the system and extend the pulse shape analysis technique further, so that these residual sources can be identified and rejected as well. In this paper, we review the theory of operation and pulse shape analysis techniques used in XIA's alpha counter, and briefly explore data suggesting the origin of the residual background terms. We will then present our approach to enhance the system's ability to identify and reject these terms. Finally, we will describe a prototype system that incorporates our concepts and demonstrates their feasibility.

  1. Analysis of CL-20 in environmental matrices: water and soil.

    PubMed

    Larson, Steven L; Felt, Deborah R; Davis, Jeffrey L; Escalon, Lynn

    2002-04-01

    Analytical techniques for the detection of 2,4,6,8,10,12-hexanitro-2,4,6,8,10,12-hexaazatetracyclo(5.5.0.05,9.03,11)dodecane (CL-20) in water and soil are developed by adapting methods traditionally used for the analysis of nitroaromatics. CL-20 (a new explosives compound) is thermally labile, exhibits high polarity, and has low solubility in water. These constraints make the use of specialized sample handling, preparation, extraction, and analysis necessary. The ability to determine the concentrations of this new explosive compound in environmental matrices is helpful in understanding the environmental fate and effects of CL-20; understanding the physical, chemical, and biological fate of CL-20; and can be used in developing remediation technologies and determining their efficiency. The toxicity and mobility of new explosives in soil and groundwater are also of interest, and analytical techniques for quantitating CL-20 and its degradation products in soil and natural waters make these investigations possible.

  2. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    NASA Astrophysics Data System (ADS)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) offer substantial potential economic benefits and enable prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive train CMS, enabling the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient at diagnosing machine faults under time-varying conditions. Current research trends in CMS for drive trains focus on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.
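
    One standard non-stationary tool in this line of work is the short-time Fourier transform, which tracks how spectral content drifts as shaft speed varies. The sketch below applies scipy's STFT to a synthetic run-up signal; the sampling rate, speed ramp, and window length are illustrative assumptions.

```python
# Hedged sketch: STFT ridge tracking on a synthetic variable-speed vibration.
import numpy as np
from scipy.signal import stft

fs = 2048.0
t = np.arange(0, 10, 1 / fs)
# Shaft speed ramps 20 -> 60 Hz; instantaneous phase is its running integral.
f_inst = 20 + 4 * t
vib = (np.sin(2 * np.pi * np.cumsum(f_inst) / fs)
       + 0.2 * np.random.default_rng(3).standard_normal(t.size))

f, tt, Z = stft(vib, fs=fs, nperseg=1024)
ridge = f[np.abs(Z).argmax(axis=0)]        # dominant frequency per time slice

print(f"tracked shaft frequency: {ridge[1]:.0f} Hz -> {ridge[-2]:.0f} Hz")
```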

  3. Proteomic data analysis of glioma cancer stem-cell lines based on novel nonlinear dimensional data reduction techniques

    NASA Astrophysics Data System (ADS)

    Lespinats, Sylvain; Pinker-Domenig, Katja; Wengert, Georg; Houben, Ivo; Lobbes, Marc; Stadlbauer, Andreas; Meyer-Bäse, Anke

    2016-05-01

    Glioma-derived cancer stem cells (GSCs) are tumor-initiating cells that may be refractory to radiation and chemotherapy, and thus have important implications for tumor biology and therapeutics. The analysis and interpretation of large proteomic data sets requires the development of new data mining and visualization approaches. Traditional techniques are insufficient to interpret and visualize the resulting experimental data. The emphasis of this paper lies in the application of novel approaches for visualization, clustering, and projection representation to unveil hidden data structures relevant for the accurate interpretation of biological experiments. These qualitative and quantitative methods are applied to the proteomic analysis of data sets derived from the GSCs. The achieved clustering and visualization results provide a more detailed insight into the protein-level fold changes and putative upstream regulators for the GSCs. However, the extracted molecular information is insufficient for classifying GSCs and paving the way to improved therapeutics for heterogeneous glioma.

  4. Analysis of polymeric phenolics in red wines using different techniques combined with gel permeation chromatography fractionation.

    PubMed

    Guadalupe, Zenaida; Soldevilla, Alberto; Sáenz-Navajas, María-Pilar; Ayestarán, Belén

    2006-04-21

    A multiple-step analytical method was developed to improve the analysis of polymeric phenolics in red wines. With a common initial step based on the fractionation of wine phenolics by gel permeation chromatography (GPC), different analytical techniques were used: high-performance liquid chromatography-diode array detection (HPLC-DAD), HPLC-mass spectrometry (MS), capillary zone electrophoresis (CZE) and spectrophotometry. This method proved to be valid for analyzing different families of phenolic compounds, such as monomeric phenolics and their derivatives, polymeric pigments and proanthocyanidins. The analytical characteristics of fractionation by GPC were studied and the method was fully validated, yielding satisfactory statistical results. GPC fractionation substantially improved the analysis of polymeric pigments by CZE, in terms of response, repeatability and reproducibility. It also represented an improvement in the traditional vanillin assay used for proanthocyanidin (PA) quantification. Astringent proanthocyanidins were also analyzed using a simple combined method that allowed these compounds, for which only general indexes were available, to be quantified.

  5. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites.

    PubMed

    Lietz, Christopher B; Gemperline, Erin; Li, Lingjun

    2013-07-01

    Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites

    PubMed Central

    Lietz, Christopher B.; Gemperline, Erin; Li, Lingjun

    2013-01-01

    Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. PMID:23603211

  7. Requirements and principles for the implementation and construction of large-scale geographic information systems

    NASA Technical Reports Server (NTRS)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
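
    One classic data structure for the spatially-indexed data discussed above is the point quadtree. The sketch below supports insertion and rectangular range queries; the capacity, coordinates, and API are illustrative assumptions rather than a proposal from the paper.

```python
# Hedged sketch: a point quadtree for spatial storage and range retrieval.
from dataclasses import dataclass, field

@dataclass
class QuadTree:
    x0: float; y0: float; x1: float; y1: float      # bounding box
    capacity: int = 4
    points: list = field(default_factory=list)
    children: list = field(default_factory=list)    # empty until split

    def insert(self, p):
        x, y = p
        if not (self.x0 <= x < self.x1 and self.y0 <= y < self.y1):
            return False                             # point outside this node
        if not self.children and len(self.points) < self.capacity:
            self.points.append(p)
            return True
        if not self.children:
            self._split()
        return any(c.insert(p) for c in self.children)

    def _split(self):
        mx, my = (self.x0 + self.x1) / 2, (self.y0 + self.y1) / 2
        self.children = [QuadTree(self.x0, self.y0, mx, my),
                         QuadTree(mx, self.y0, self.x1, my),
                         QuadTree(self.x0, my, mx, self.y1),
                         QuadTree(mx, my, self.x1, self.y1)]
        for q in self.points:                        # redistribute stored points
            any(c.insert(q) for c in self.children)
        self.points = []

    def query(self, qx0, qy0, qx1, qy1):
        if qx1 < self.x0 or qx0 >= self.x1 or qy1 < self.y0 or qy0 >= self.y1:
            return []                                # query box misses this node
        hits = [p for p in self.points
                if qx0 <= p[0] <= qx1 and qy0 <= p[1] <= qy1]
        for c in self.children:
            hits += c.query(qx0, qy0, qx1, qy1)
        return hits

tree = QuadTree(0, 0, 100, 100)
for p in [(10, 10), (12, 14), (80, 75), (55, 55), (11, 13), (13, 12)]:
    tree.insert(p)
print(tree.query(5, 5, 20, 20))   # -> the points clustered near (10, 10)
```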

  8. IGA: A Simplified Introduction and Implementation Details for Finite Element Users

    NASA Astrophysics Data System (ADS)

    Agrawal, Vishal; Gautam, Sachin S.

    2018-05-01

    Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-Uniform Rational B-Splines (NURBS) to bridge the substantial gap between the CAD and finite element analysis (FEA) fields. The direct use of exact CAD models in the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison to the traditional FEA technique. Since its origination, research in the field of IGA has been accelerating, and the technique has been applied to various problems. However, the employment of CAD tools in the area of FEA requires adapting the existing implementation procedures to the framework of IGA. Also, the use of IGA requires in-depth knowledge of both the CAD and FEA fields, which can be overwhelming for a beginner in IGA. Hence, in this paper, a simplified introduction and implementation details for incorporating the NURBS-based IGA technique within an existing FEA code are presented. It is shown that, with few modifications, the available standard code structure of FEA can be adapted for IGA. For a clear and concise explanation of these modifications, a step-by-step implementation of a benchmark plate with a circular hole under the action of in-plane tension is included.
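
    The NURBS machinery that IGA borrows from CAD rests on B-spline basis functions defined by the Cox-de Boor recursion. The sketch below evaluates these basis functions on a clamped knot vector; the example knot vector and degree are illustrative assumptions.

```python
# Hedged sketch: Cox-de Boor recursion for B-spline basis functions.
import numpy as np

def bspline_basis(i, p, knots, u):
    """Value of the i-th degree-p B-spline basis function at parameter u."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, knots, u))
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, knots, u))
    return left + right

# Quadratic basis on an open (clamped) knot vector with 4 control points.
knots = [0, 0, 0, 0.5, 1, 1, 1]
u = 0.3
vals = [bspline_basis(i, 2, knots, u) for i in range(4)]
print(vals, "sum =", sum(vals))   # partition of unity: the values sum to 1
```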

  9. Geometric morphometrics in primatology: craniofacial variation in Homo sapiens and Pan troglodytes.

    PubMed

    Lynch, J M; Wood, C G; Luboga, S A

    1996-01-01

    Traditionally, morphometric studies have relied on statistical analysis of distances, angles or ratios to investigate morphometric variation among taxa. Recently, geometric techniques have been developed for the direct analysis of landmark data. In this paper, we offer a summary (with examples) of three of these newer techniques, namely shape coordinate, thin-plate spline and relative warp analyses. Shape coordinate analysis detected significant craniofacial variation between 4 modern human populations, with African and Australian Aboriginal specimens being relatively prognathous compared with their Eurasian counterparts. In addition, the Australian specimens exhibited greater basicranial flexion than all other samples. The observed relationships between size and craniofacial shape were weak. The decomposition of shape variation into affine and non-affine components is illustrated via a thin-plate spline analysis of Homo and Pan cranial landmarks. We note differences between Homo and Pan in the degree of prognathism and basicranial flexion and the position and orientation of the foramen magnum. We compare these results with previous studies of these features in higher primates and discuss the utility of geometric morphometrics as a tool in primatology and physical anthropology. We conclude that many studies of morphological variation, both within and between taxa, would benefit from the graphical nature of these techniques.
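
    The shape-coordinate analyses mentioned above start from a superimposition step that removes position, scale, and rotation so that only shape differences remain. The sketch below performs an ordinary Procrustes alignment via SVD on toy 2-D landmark configurations; the landmarks are invented, and reflections are not treated specially, so this is only a schematic of the alignment step.

```python
# Hedged sketch: ordinary Procrustes superimposition of 2-D landmarks.
import numpy as np

def procrustes_align(ref, shape):
    """Align `shape` onto `ref` (k x 2 landmark arrays); return aligned copy."""
    ref_c = ref - ref.mean(axis=0)             # center both configurations
    sh_c = shape - shape.mean(axis=0)
    ref_c /= np.linalg.norm(ref_c)             # scale to unit centroid size
    sh_c /= np.linalg.norm(sh_c)
    u, _, vt = np.linalg.svd(sh_c.T @ ref_c)   # optimal rotation via SVD
    return sh_c @ (u @ vt)

ref = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
theta = np.pi / 6                              # same square: rotated, scaled, shifted
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
shape = 3.0 * ref @ R.T + 5.0

aligned = procrustes_align(ref, shape)
ref_n = (ref - ref.mean(axis=0)) / np.linalg.norm(ref - ref.mean(axis=0))
print(f"Procrustes distance ~ {np.linalg.norm(aligned - ref_n):.2e}")  # ~0
```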

  10. Association mining of dependency between time series

    NASA Astrophysics Data System (ADS)

    Hafez, Alaaeldin

    2001-03-01

    Time series analysis is considered a crucial component of strategic control across a broad variety of disciplines in business, science, and engineering. Time series data is a sequence of observations collected over intervals of time. Each time series describes a phenomenon as a function of time. Analysis of time series data includes discovering trends (or patterns) in a time series sequence. In the last few years, data mining has emerged and been recognized as a new technology for data analysis. Data mining is the process of discovering potentially valuable patterns, associations, trends, sequences, and dependencies in data. Data mining techniques can discover information that many traditional business analysis and statistical techniques fail to deliver. In this paper, we adapt data mining techniques to analyze time series data. Using data mining techniques, maximal frequent patterns are discovered and used in predicting future sequences or trends, where trends describe the behavior of a sequence. In order to include different types of time series (e.g., irregular and non-systematic), we consider past frequent patterns of the same time sequences (local patterns) and of other dependent time sequences (global patterns). We use the word 'dependent' instead of the word 'similar' to emphasize real-life time series where two time series sequences could be completely different (in values, shapes, etc.) but still react to the same conditions in a dependent way. In this paper, we propose the Dependence Mining Technique, which can be used to predict time series sequences. The proposed technique consists of three phases: (a) for all time series sequences, generate their trend sequences; (b) discover maximal frequent trend patterns and generate pattern vectors (to retain information about frequent trend patterns); and (c) use the trend pattern vectors to predict future time series sequences.
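
    A minimal sketch of phases (a) and (b) follows, under assumed symbol definitions (U/D/F for up/down/flat), an invented flatness threshold, and simple substring counting in place of the paper's pattern vectors.

```python
# Hedged sketch: trend symbolization and frequent trend-pattern counting.
from collections import Counter

def trend_sequence(series, eps=0.5):
    """Phase (a): symbolize consecutive differences as Up/Down/Flat."""
    symbols = []
    for a, b in zip(series, series[1:]):
        d = b - a
        symbols.append("U" if d > eps else "D" if d < -eps else "F")
    return "".join(symbols)

def frequent_patterns(trends, length, min_support):
    """Phase (b): count fixed-length trend patterns across all sequences."""
    counts = Counter()
    for t in trends:
        for i in range(len(t) - length + 1):
            counts[t[i:i + length]] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

data = [[1, 2, 3, 3, 2, 1, 2, 3, 3, 2],
        [5, 6, 7, 7, 6, 5, 6, 7, 7, 6],   # dependent: same trend behavior
        [9, 8, 7, 7, 8, 9, 8, 7, 7, 8]]   # mirrored trends

trends = [trend_sequence(s) for s in data]
print(trends)
print(frequent_patterns(trends, length=3, min_support=2))
```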

  11. Potentials, challenges and limitations of the application of advanced surveying techniques in the Geosciences - Where are we and where we want to go?

    NASA Astrophysics Data System (ADS)

    Wagner, Bianca; Leiss, Bernd

    2017-04-01

    For some years now, an extreme rise in the development and application of new surveying techniques has been taking place in the Geosciences. Traditional field work has been altered massively by, e.g., terrestrial laser scanning, Unmanned Aerial Vehicles (UAVs), hyperspectral mapping and Structure-from-Motion (SfM). The next impetus for innovation is the presentation and analysis of the resulting digital models by means of Virtual Reality (VR) or Augmented Reality (AR). The market offers many new field tools and devices, numerous free or commercial software packages, and diverse visualization solutions. Because of the attractive affordability and ease of learning of some methods, the number of users is increasing steadily. However, what are the real scientific outcomes? Which methods genuinely make sense compared with traditional field work and can be incorporated into everyday workflows or teaching? Which standards does the community have, and which does it need? What will the challenges and trends be in the upcoming years? Which accuracy and resolution do we need? What are the requirements in terms of sustainable (open) data management, presentation and advanced analysis of such data formats? Our contribution presents some answers as well as impulses to stimulate discussion in the 3D survey and modeling community.

  12. How "Flipping" the Classroom Can Improve the Traditional Lecture

    ERIC Educational Resources Information Center

    Berrett, Dan

    2012-01-01

    In this article, the author discusses a teaching technique called "flipping" and describes how "flipping" the classroom can improve the traditional lecture. As its name suggests, flipping describes the inversion of expectations in the traditional college lecture. It takes many forms, including interactive engagement, just-in-time teaching (in…

  13. Circular dichroism spectroscopy: Enhancing a traditional undergraduate biochemistry laboratory experience.

    PubMed

    Lewis, Russell L; Seal, Erin L; Lorts, Aimee R; Stewart, Amanda L

    2017-11-01

    The undergraduate biochemistry laboratory curriculum is designed to provide students with experience in protein isolation and purification protocols as well as various data analysis techniques, which enhance the biochemistry lecture course and give students a broad range of tools upon which to build in graduate level laboratories or once they begin their careers. One of the most common biochemistry protein purification experiments is the isolation and characterization of cytochrome c. Students across the country purify cytochrome c, lysozyme, or some other well-known protein to learn these common purification techniques. What this series of experiments lacks is the use of sophisticated instrumentation that is rarely available to undergraduate students. To give students a broader background in biochemical spectroscopy techniques, a new circular dichroism (CD) laboratory experiment was introduced into the biochemistry laboratory curriculum. This CD experiment provides students with a means of conceptualizing the secondary structure of their purified protein, and assessments indicate that students' understanding of the technique increased significantly. Students conducted this experiment with ease and in a short time frame, so this laboratory is conducive to merging with other data analysis techniques within a single laboratory period. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(6):515-520, 2017.

  14. Advances in three-dimensional field analysis and evaluation of performance parameters of electrical machines

    NASA Astrophysics Data System (ADS)

    Sivasubramaniam, Kiruba

    This thesis makes advances in three dimensional finite element analysis of electrical machines and the quantification of their parameters and performance. The principal objectives of the thesis are: (1) the development of a stable and accurate method of nonlinear three-dimensional field computation and application to electrical machinery and devices; and (2) improvement in the accuracy of determination of performance parameters, particularly forces and torque computed from finite elements. Contributions are made in two general areas: a more efficient formulation for three dimensional finite element analysis which saves time and improves accuracy, and new post-processing techniques to calculate flux density values from a given finite element solution. A novel three-dimensional magnetostatic solution based on a modified scalar potential method is implemented. This method has significant advantages over the traditional total scalar, reduced scalar or vector potential methods. The new method is applied to a 3D geometry of an iron core inductor and a permanent magnet motor. The results obtained are compared with those obtained from traditional methods, in terms of accuracy and speed of computation. A technique which has been observed to improve force computation in two dimensional analysis using a local solution of Laplace's equation in the airgap of machines is investigated and a similar method is implemented in the three dimensional analysis of electromagnetic devices. A new integral formulation to improve force calculation from a smoother flux-density profile is also explored and implemented. Comparisons are made and conclusions drawn as to how much improvement is obtained and at what cost. This thesis also demonstrates the use of finite element analysis to analyze torque ripples due to rotor eccentricity in permanent magnet BLDC motors. A new method for analyzing torque harmonics based on data obtained from a time stepping finite element analysis of the machine is explored and implemented.
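
    While the abstract does not reproduce the thesis's formulas, force computation from a finite element field solution is classically done by integrating the Maxwell stress tensor over a surface enclosing the part; the standard expression below is given for orientation only and is not necessarily the thesis's exact formulation.

        \mathbf{F} \;=\; \oint_{S} \mathsf{T}\,\mathbf{n}\,\mathrm{d}S,
        \qquad
        T_{ij} \;=\; \mu_0\!\left( H_i H_j - \tfrac{1}{2}\,\delta_{ij}\,H^2 \right),

    where S is any closed surface in air surrounding the moving part and n its outward normal; the smoother flux-density profile mentioned above improves the accuracy of exactly this kind of integral.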

  15. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    USGS Publications Warehouse

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

    This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r² = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r² ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r² ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than adequate for the majority of sedimentological applications, especially considering that the autocorrelation technique is estimated to be at least 100 times faster than traditional methods.
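
    The core of the autocorrelation idea is easy to sketch: coarse sediment stays correlated over larger pixel lags than fine sediment, and a calibration maps correlation curves to grain size. The Python toy below uses synthetic images and a simple regression in place of Rubin's curve-interpolation calibration, so all names and numbers are illustrative only.

        import numpy as np

        def autocorrelation_curve(image, max_lag=20):
            """Mean normalized autocorrelation of pixel intensity vs. horizontal
            lag; coarser grains stay correlated over larger lags (Rubin, 2004)."""
            img = (image - image.mean()) / image.std()
            return np.array([(img[:, :-lag] * img[:, lag:]).mean()
                             for lag in range(1, max_lag + 1)])

        # calibration: regress known grain sizes on the correlation curves
        # (a simplification of Rubin's look-up between calibration images)
        rng = np.random.default_rng(0)
        def fake_sediment(scale):          # synthetic image, grain size ~ scale
            img = rng.normal(size=(64, 256))
            kernel = np.ones(scale) / scale
            return np.apply_along_axis(
                lambda row: np.convolve(row, kernel, 'same'), 1, img)

        sizes = np.array([2, 4, 6, 8])
        curves = np.array([autocorrelation_curve(fake_sediment(s)) for s in sizes])
        coef, *_ = np.linalg.lstsq(np.c_[curves, np.ones(len(sizes))],
                                   sizes, rcond=None)

        test = autocorrelation_curve(fake_sediment(5))
        # estimated size; should land between the calibration sizes
        print(np.r_[test, 1.0] @ coef)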

  16. A Comparison of Jump Height, Takeoff Velocities, and Blocking Coverage in the Swing and Traditional Volleyball Blocking Techniques

    PubMed Central

    Ficklin, Travis; Lund, Robin; Schipper, Megan

    2014-01-01

    The purpose of this study was to compare traditional and swing blocking techniques on center of mass (COM) projectile motion and effective blocking area in nine healthy Division I female volleyball players. Two high-definition (1080 p) video cameras (60 Hz) were used to collect two-dimensional variables from two separate views. One was placed perpendicular to the plane of the net and the other was directed along the top of the net, and were used to estimate COM locations and blocking area in a plane parallel to the net and hand penetration through the plane of the net respectively. Video of both the traditional and swing techniques were digitized and kinematic variables were calculated. Paired samples t-tests indicated that the swing technique resulted in greater (p < 0.05) vertical and horizontal takeoff velocities (vy and vx), jump height (H), duration of the block (tBLOCK), blocking coverage during the block (C) as well as hand penetration above and through the net’s plane (YPEN, ZPEN). The traditional technique had significantly greater approach time (tAPP). The results of this study suggest that the swing technique results in both greater jump height and effective blocking area. However, the shorter tAPP that occurs with swing is associated with longer times in the air during the block which may reduce the ability of the athlete to make adjustments to attacks designed to misdirect the defense. Key Points Swing blocking technique has greater jump height, effective blocking area, hand penetration, horizontal and vertical takeoff velocity, and has a shorter time of approach. Despite these advantages, there may be more potential for mistiming blocks and having erratic deflections of the ball after contact when using the swing technique. Coaches should take more than simple jump height and hand penetration into account when deciding which technique to employ. PMID:24570609

  17. A comparison of jump height, takeoff velocities, and blocking coverage in the swing and traditional volleyball blocking techniques.

    PubMed

    Ficklin, Travis; Lund, Robin; Schipper, Megan

    2014-01-01

    The purpose of this study was to compare traditional and swing blocking techniques on center of mass (COM) projectile motion and effective blocking area in nine healthy Division I female volleyball players. Two high-definition (1080 p) video cameras (60 Hz) were used to collect two-dimensional variables from two separate views. One was placed perpendicular to the plane of the net and the other was directed along the top of the net, and were used to estimate COM locations and blocking area in a plane parallel to the net and hand penetration through the plane of the net respectively. Video of both the traditional and swing techniques were digitized and kinematic variables were calculated. Paired samples t-tests indicated that the swing technique resulted in greater (p < 0.05) vertical and horizontal takeoff velocities (vy and vx), jump height (H), duration of the block (tBLOCK), blocking coverage during the block (C) as well as hand penetration above and through the net's plane (YPEN, ZPEN). The traditional technique had significantly greater approach time (tAPP). The results of this study suggest that the swing technique results in both greater jump height and effective blocking area. However, the shorter tAPP that occurs with swing is associated with longer times in the air during the block which may reduce the ability of the athlete to make adjustments to attacks designed to misdirect the defense. Key Points: Swing blocking technique has greater jump height, effective blocking area, hand penetration, horizontal and vertical takeoff velocity, and has a shorter time of approach. Despite these advantages, there may be more potential for mistiming blocks and having erratic deflections of the ball after contact when using the swing technique. Coaches should take more than simple jump height and hand penetration into account when deciding which technique to employ.

  18. Use of Mitomycin C to reduce the incidence of encapsulated cysts following Ahmed glaucoma valve implantation in refractory glaucoma patients: a new technique.

    PubMed

    Zhou, Minwen; Wang, Wei; Huang, Wenbin; Zhang, Xiulan

    2014-09-06

    To evaluate the surgical outcome of Ahmed glaucoma valve (AGV) implantation with a new technique of mitomycin C (MMC) application. This is a retrospective study. All patients with refractory glaucoma underwent FP-7 AGV implantation. Two methods of MMC application were used. In the traditional technique, a 6 × 4 mm piece of cotton soaked with MMC (0.25-0.33 mg/ml) was placed in the implantation area for 2-5 min; in the new technique, the valve plate was first wrapped in a thin layer of cotton soaked with MMC and then inserted into the same area. A 200 ml balanced salt solution was applied for irrigation of the MMC. The surgical success rate, intraocular pressure (IOP), number of anti-glaucoma medications used, and postoperative complications were compared between the groups. The new technique group had only one case (2.6%) of encapsulated cyst formation out of 38 eyes, whereas there were eight cases (19.5%) out of 41 eyes in the traditional group; the difference was statistically significant (P = 0.030). The success rate at the follow-up end point was 89.5% in the new technique group and 70.7% in the traditional group, a significant difference between the two groups (P = 0.035). Mean IOP in the new technique group was significantly lower than that of the traditional group at 3 and 6 months (P < 0.05). By using a thin layer of cotton soaked with MMC to wrap the valve plate, the new MMC application technique can greatly decrease the incidence of encapsulated cysts and increase the success rate following AGV implantation.

  19. Laser-induced fluorescence spectroscopy for improved chemical analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gelbwachs, J.A.

    1983-09-01

    This report summarizes the progress achieved over the past five years in the laser-induced fluorescence spectroscopy (LIFS) for improved chemical analysis program. Our initial efforts yielded significantly lower detection limits for trace elemental analysis by the use of both cw and pulsed laser excitations. New methods of LIFS were developed that were shown to overcome many of the traditional limitations of LIFS techniques. LIFS methods have been applied to yield fundamental scientific data that further the understanding of forces between atoms and other atoms and molecules. In recent work, two-photon ionization was combined with LIFS and applied, for the first time, to the study of energy transfer in ions.

  20. Single-phase power distribution system power flow and fault analysis

    NASA Technical Reports Server (NTRS)

    Halpin, S. M.; Grigsby, L. L.

    1992-01-01

    Alternative methods for power flow and fault analysis of single-phase distribution systems are presented. The algorithms for both power flow and fault analysis utilize a generalized approach to network modeling. The generalized admittance matrix, formed using elements of linear graph theory, is an accurate network model for all possible single-phase network configurations. Unlike the standard nodal admittance matrix formulation algorithms, the generalized approach uses generalized component models for the transmission line and transformer. The standard assumption of a common node voltage reference point is not required to construct the generalized admittance matrix. Therefore, truly accurate simulation results can be obtained for networks that cannot be modeled using traditional techniques.
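
    The nodal admittance construction at the heart of such solvers is compact. The sketch below builds a standard Y-bus for a three-bus toy network and solves for node voltages; the paper's generalized formulation additionally removes the common-reference-node assumption that this simple sketch still makes.

        import numpy as np

        lines = [  # (from_bus, to_bus, series admittance in siemens)
            (0, 1, 4.0),
            (1, 2, 2.0),
            (0, 2, 1.0),
        ]
        n = 3
        Y = np.zeros((n, n))
        for i, j, y in lines:
            # each line adds to the diagonal of its end buses and
            # subtracts from the corresponding off-diagonal entries
            Y[i, i] += y; Y[j, j] += y
            Y[i, j] -= y; Y[j, i] -= y

        # with bus 0 as the grounded reference, solve the reduced system
        # Y_reduced * V = I for the remaining node voltages
        I = np.array([1.0, -0.5])          # current injections at buses 1 and 2
        V = np.linalg.solve(Y[1:, 1:], I)
        print(V)                           # node voltages relative to bus 0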

  1. Exploring Surface Analysis Techniques for the Detection of Molecular Contaminants on Spacecraft

    NASA Technical Reports Server (NTRS)

    Rutherford, Gugu N.; Seasly, Elaine; Thornblom, Mark; Baughman, James

    2016-01-01

    Molecular contamination is a known area of concern for spacecraft. To mitigate this risk, projects involving space flight hardware set requirements in a contamination control plan that establishes an allocation budget for the exposure of non-volatile residues (NVR) onto critical surfaces. This work focuses on non-contact surface analysis and in situ monitoring to mitigate molecular contamination on space flight hardware. By using Scanning Electron Microscopy and Energy Dispersive Spectroscopy (SEM-EDS) with Raman Spectroscopy, an unlikely contaminant was identified on space flight hardware. Using traditional and surface analysis methods together provided a broader view of the contamination sources, allowing for best-fit solutions to prevent future exposure.

  2. Nonlinear, non-stationary image processing technique for eddy current NDE

    NASA Astrophysics Data System (ADS)

    Yang, Guang; Dib, Gerges; Kim, Jaejoon; Zhang, Lu; Xin, Junjun; Udpa, Lalita

    2012-05-01

    Automatic analysis of eddy current (EC) data has facilitated the analysis of large volumes of data generated in the inspection of steam generator tubes in nuclear power plants. The traditional procedure for analysis of EC data includes data calibration, pre-processing, region of interest (ROI) detection, feature extraction and classification. Accurate ROI detection has been enhanced by pre-processing, which involves reducing noise and other undesirable components as well as enhancing defect indications in the raw measurement. This paper presents the Hilbert-Huang Transform (HHT) for feature extraction and the support vector machine (SVM) for classification. The performance is shown to be significantly better than that of the existing rule-based classification approach used in industry.
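
    As a rough illustration of the processing chain (envelope and instantaneous-frequency features feeding an SVM), here is a self-contained Python sketch. It uses only the Hilbert-transform stage of the HHT (a full implementation would first perform empirical mode decomposition) and synthetic waveforms in place of real EC measurements.

        import numpy as np
        from scipy.signal import hilbert
        from sklearn.svm import SVC

        def features(signal, fs=1000.0):
            """Summary statistics of the Hilbert envelope and
            instantaneous frequency of a 1-D signal."""
            analytic = hilbert(signal)
            env = np.abs(analytic)                      # instantaneous amplitude
            inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
            return [env.mean(), env.std(), inst_freq.mean(), inst_freq.std()]

        rng = np.random.default_rng(1)
        t = np.linspace(0, 1, 1000)
        good = [np.sin(2*np.pi*50*t) + 0.1*rng.normal(size=t.size)
                for _ in range(40)]
        flawed = [np.sin(2*np.pi*50*t) * (1 + np.exp(-((t-0.5)/0.02)**2)) +
                  0.1*rng.normal(size=t.size) for _ in range(40)]  # local "defect"
        X = np.array([features(s) for s in good + flawed])
        y = np.array([0]*40 + [1]*40)

        clf = SVC(kernel='rbf').fit(X[::2], y[::2])     # train on half the data
        print(clf.score(X[1::2], y[1::2]))              # held-out accuracy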

  3. [Advances of studies on new technology and method for identifying traditional Chinese medicinal materials].

    PubMed

    Chen, Shilin; Guo, Baolin; Zhang, Guijun; Yan, Zhuyun; Luo, Guangming; Sun, Suqin; Wu, Hezhen; Huang, Linfang; Pang, Xiaohui; Chen, Jianbo

    2012-04-01

    In this review, the authors summarized the new technologies and methods for identifying traditional Chinese medicinal materials, including molecular identification, chemical identification, morphological identification, microscopic identification and identification based on biological effects. The authors introduced the principle, characteristics, application and prospects of each new technology or method and compared their advantages and disadvantages. In general, the new methods make the results more objective and accurate. The DNA barcoding technique and spectroscopic identification have their own obvious strengths in universality and digitalization. In the near future, these two techniques are promising candidates to become the main trend for identifying traditional Chinese medicinal materials. Identification techniques based on microscopy, liquid chromatography, PCR, biological effects and DNA chips will be indispensable supplements. However, bionic identification technology is still at an early stage of development.

  4. Parametric methods for characterizing myocardial tissue by magnetic resonance imaging (part 2): T2 mapping.

    PubMed

    Perea Palazón, R J; Solé Arqués, M; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Ortiz Pérez, J T

    2015-01-01

    Cardiac magnetic resonance imaging is considered the reference technique for characterizing myocardial tissue; for example, T2-weighted sequences make it possible to evaluate areas of edema or myocardial inflammation. However, traditional sequences have many limitations and provide only qualitative information. Moreover, traditional sequences depend on the reference to remote myocardium or skeletal muscle, which limits their ability to detect and quantify diffuse myocardial damage. Recently developed magnetic resonance myocardial mapping techniques enable quantitative assessment of parameters indicative of edema. These techniques have proven better than traditional sequences both in acute cardiomyopathy and in acute ischemic heart disease. This article synthesizes current developments in T2 mapping as well as their clinical applications and limitations.
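
    The quantitative step that distinguishes mapping from traditional T2-weighted imaging is a pixelwise fit of a monoexponential decay across echo times. A minimal sketch with illustrative numbers (not clinical values from the article):

        import numpy as np
        from scipy.optimize import curve_fit

        # pixelwise T2 mapping: fit S(TE) = S0 * exp(-TE / T2) to the signal
        # measured at several echo times (TE)
        def decay(te, s0, t2):
            return s0 * np.exp(-te / t2)

        te = np.array([10., 20., 30., 40., 50., 60.])   # echo times in ms
        true_t2 = 55.0                                  # e.g. edematous tissue
        signal = decay(te, 1000.0, true_t2) \
                 + np.random.default_rng(2).normal(0, 5, te.size)

        (s0, t2), _ = curve_fit(decay, te, signal, p0=(signal[0], 40.0))
        print(f"fitted T2 = {t2:.1f} ms")               # close to 55 ms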

  5. Practice of traditional Chinese medicine for psycho-behavioral intervention improves quality of life in cancer patients: A systematic review and meta-analysis.

    PubMed

    Tao, Weiwei; Luo, Xi; Cui, Bai; Liang, Dapeng; Wang, Chunli; Duan, Yangyang; Li, Xiaofen; Zhou, Shiyu; Zhao, Mingjie; Li, Yi; He, Yumin; Wang, Shaowu; Kelley, Keith W; Jiang, Ping; Liu, Quentin

    2015-11-24

    Cancer patients suffer from diverse symptoms, including depression, anxiety, pain, and fatigue and lower quality of life (QoL) during disease progression. This study aimed to evaluate the benefits of Traditional Chinese Medicine psycho-behavioral interventions (TCM PBIs) on improving QoL by meta-analysis. Electronic literature databases (PubMed, CNKI, VIP, and Wanfang) were searched for randomized, controlled trials conducted in China. The primary intervention was TCM PBIs. The main outcome was health-related QoL (HR QoL) post-treatment. We applied standard meta-analytic techniques to analyze data from papers that reached acceptable criteria. The six TCM PBIs analyzed were acupuncture, Chinese massage, Traditional Chinese Medicine five elements musical intervention (TCM FEMI), Traditional Chinese Medicine dietary supplement (TCM DS), Qigong and Tai Chi. Although both TCM PBIs and non-TCM PBIs reduced functional impairments in cancer patients and led to pain relief, depression remission, reduced time to flatulence following surgery and sleep improvement, TCM PBIs showed more beneficial effects as assessed by reducing both fatigue and gastrointestinal distress. In particular, acupuncture relieved fatigue, reduced diarrhea and decreased time to flatulence after surgery in cancer patients, while therapeutic Chinese massage reduced time to flatulence and time to peristaltic sound. These findings demonstrate the efficacy of TCM PBIs in improving QoL in cancer patients and establish that TCM PBIs represent beneficial adjunctive therapies for cancer patients.

  6. “The Poison That Ruined the Nation”: Native American Men—Alcohol, Identity, and Traditional Healing

    PubMed Central

    Matamonasa-Bennett, Arieahn

    2015-01-01

    Alcoholism and destructive drinking patterns are serious social problems in many Native American reservation and urban communities. This qualitative study of men from a single Great Lakes reservation community examined the social, cultural, and psychological aspects of their alcohol problems through their life stories. The men were in various stages of recovery and sobriety, and data collection consisted of open-ended interviews and analysis utilizing principles and techniques from grounded theory and ethnographic content analysis. Alcoholism and other serious social problems facing Native American communities need to be understood in the sociocultural and historical contexts of colonization and historical grief and trauma. This study suggests that for Native American men, there are culturally specific perspectives on alcohol that have important implications for prevention and treatment of alcohol abuse. The participants’ narratives provided insight into the ways reconnecting with traditional cultural values (retraditionalization) helped them achieve sobriety. For these men, alcohol was highly symbolic of colonization as well as a protest to it. Alcohol was a means for affirming “Indian” identity and sobriety a means for reaffirming traditional tribal identity. Their narratives suggested the ways in which elements of traditional cultural values and practices facilitate healing in syncretic models and Nativized treatment. Understanding the ways in which specific Native cultural groups perceive their problems with drinking and sobriety can create more culturally congruent, culturally sensitive, and effective treatment approaches and inform future research. PMID:25812975

  7. Practice of traditional Chinese medicine for psycho-behavioral intervention improves quality of life in cancer patients: A systematic review and meta-analysis

    PubMed Central

    Liang, Dapeng; Wang, Chunli; Duan, Yangyang; Li, Xiaofen; Zhou, Shiyu; Zhao, Mingjie; Li, Yi; He, Yumin; Wang, Shaowu; Kelley, Keith W.; Jiang, Ping; Liu, Quentin

    2015-01-01

    Background Cancer patients suffer from diverse symptoms, including depression, anxiety, pain, and fatigue and lower quality of life (QoL) during disease progression. This study aimed to evaluate the benefits of Traditional Chinese Medicine psycho-behavioral interventions (TCM PBIs) on improving QoL by meta-analysis. Methods Electronic literature databases (PubMed, CNKI, VIP, and Wanfang) were searched for randomized, controlled trials conducted in China. The primary intervention was TCM PBIs. The main outcome was health-related QoL (HR QoL) post-treatment. We applied standard meta analytic techniques to analyze data from papers that reached acceptable criteria. Results The six TCM PBIs analyzed were acupuncture, Chinese massage, Traditional Chinese Medicine five elements musical intervention (TCM FEMI), Traditional Chinese Medicine dietary supplement (TCM DS), Qigong and Tai Chi. Although both TCM PBIs and non-TCM PBIs reduced functional impairments in cancer patients and led to pain relief, depression remission, reduced time to flatulence following surgery and sleep improvement, TCM PBIs showed more beneficial effects as assessed by reducing both fatigue and gastrointestinal distress. In particular, acupuncture relieved fatigue, reduced diarrhea and decreased time to flatulence after surgery in cancer patients, while therapeutic Chinese massage reduced time to flatulence and time to peristaltic sound. Conclusion These findings demonstrate the efficacy of TCM PBIs in improving QoL in cancer patients and establish that TCM PBIs represent beneficial adjunctive therapies for cancer patients. PMID:26498685

  8. Protein Sequencing with Tandem Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Ziady, Assem G.; Kinter, Michael

    The recent introduction of electrospray ionization techniques that are suitable for peptides and whole proteins has allowed for the design of mass spectrometric protocols that provide accurate sequence information for proteins. The advantages gained by these approaches over traditional Edman Degradation sequencing include faster analysis and femtomole, sometimes attomole, sensitivity. The ability to efficiently identify proteins has allowed investigators to conduct studies on their differential expression or modification in response to various treatments or disease states. In this chapter, we discuss the use of electrospray tandem mass spectrometry, a technique whereby protein-derived peptides are subjected to fragmentation in the gas phase, revealing sequence information for the protein. This powerful technique has been instrumental for the study of proteins and markers associated with various disorders, including heart disease, cancer, and cystic fibrosis. We use the study of protein expression in cystic fibrosis as an example.
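
    The sequence information comes from the mass differences between successive fragment ions. The sketch below computes singly charged b- and y-ion m/z values for a toy peptide; the residue mass table is truncated for brevity, and the values used are the standard monoisotopic masses.

        # monoisotopic residue masses (Da) for a few amino acids; a real
        # tool would cover all twenty plus modifications
        RESIDUE = {'G': 57.02146, 'A': 71.03711, 'S': 87.03203,
                   'P': 97.05276, 'V': 99.06841, 'L': 113.08406}
        PROTON, WATER = 1.007276, 18.010565

        def fragment_ions(peptide):
            """Singly charged b- and y-ion m/z values; the spacing between
            consecutive ions reveals the residue sequence in a tandem
            mass spectrum."""
            masses = [RESIDUE[aa] for aa in peptide]
            b = [sum(masses[:i]) + PROTON for i in range(1, len(masses))]
            y = [sum(masses[i:]) + WATER + PROTON for i in range(1, len(masses))]
            return b, y

        b, y = fragment_ions('GASP')
        print(b)   # consecutive b-ion differences equal residue masses
        print(y)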

  9. A Simple Method for Principal Strata Effects When the Outcome Has Been Truncated Due to Death

    PubMed Central

    Chiba, Yasutaka; VanderWeele, Tyler J.

    2011-01-01

    In randomized trials with follow-up, outcomes such as quality of life may be undefined for individuals who die before the follow-up is complete. In such settings, restricting analysis to those who survive can give rise to biased outcome comparisons. An alternative approach is to consider the “principal strata effect” or “survivor average causal effect” (SACE), defined as the effect of treatment on the outcome among the subpopulation that would have survived under either treatment arm. The authors describe a very simple technique that can be used to assess the SACE. They give both a sensitivity analysis technique and conditions under which a crude comparison provides a conservative estimate of the SACE. The method is illustrated using data from the ARDSnet (Acute Respiratory Distress Syndrome Network) clinical trial comparing low-volume ventilation and traditional ventilation methods for individuals with acute respiratory distress syndrome. PMID:21354986
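
    In potential-outcome notation, with Y(a) the outcome and S(a) the survival indicator under treatment arm a, the estimand discussed above can be written as

        \text{SACE} \;=\; E\!\left[\,Y(1) - Y(0) \;\middle|\; S(1) = S(0) = 1\,\right],

    i.e., the treatment effect within the principal stratum of "always-survivors"; the crude comparison among observed survivors is conservative for this quantity under the conditions the authors describe.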

  10. A simple method for principal strata effects when the outcome has been truncated due to death.

    PubMed

    Chiba, Yasutaka; VanderWeele, Tyler J

    2011-04-01

    In randomized trials with follow-up, outcomes such as quality of life may be undefined for individuals who die before the follow-up is complete. In such settings, restricting analysis to those who survive can give rise to biased outcome comparisons. An alternative approach is to consider the "principal strata effect" or "survivor average causal effect" (SACE), defined as the effect of treatment on the outcome among the subpopulation that would have survived under either treatment arm. The authors describe a very simple technique that can be used to assess the SACE. They give both a sensitivity analysis technique and conditions under which a crude comparison provides a conservative estimate of the SACE. The method is illustrated using data from the ARDSnet (Acute Respiratory Distress Syndrome Network) clinical trial comparing low-volume ventilation and traditional ventilation methods for individuals with acute respiratory distress syndrome.

  11. Japanese migration in contemporary Japan: economic segmentation and interprefectural migration.

    PubMed

    Fukurai, H

    1991-01-01

    This paper examines the economic segmentation model in explaining 1985-86 Japanese interregional migration. The analysis takes advantage of statistical graphic techniques to address two substantive issues of interregional migration: (1) whether economic segmentation significantly influences Japanese regional migration and (2) the socioeconomic characteristics of prefectures associated with both in- and out-migration. Analytic techniques include a latent structural equation (LISREL) methodology and statistical residual mapping. The residual dispersion patterns, for instance, suggest the extent to which socioeconomic and geopolitical variables explain migration differences by showing unique clusters of unexplained residuals. The analysis further points out that extraneous factors such as high residential land values, significant commuting populations, and region-specific cultures and traditions need to be incorporated into the economic segmentation model in order to assess the extent of the model's reliability in explaining the pattern of interprefectural migration.

  12. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  13. A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis

    NASA Astrophysics Data System (ADS)

    Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui

    2015-07-01

    Auscultation of heart sound (HS) signals has served as an important primary approach to diagnosing cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction techniques has witnessed explosive development. Yet most existing HS feature extraction methods adopt acoustic or time-frequency features that exhibit a poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling this bottleneck problem, this paper proposes a novel murmur-based HS feature extraction method, since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences at the heart valves. Adapting the discrete wavelet transform (DWT) and the Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS from five kinds of abnormal HS signals with the extracted features, the proposed method provides an attractive candidate for automatic HS auscultation.
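
    Of the two ingredients named above, the Shannon envelope is simple to sketch. The Python toy below omits the DWT denoising stage and uses a synthetic heart-sound-like waveform, so it illustrates the envelope step only, not the authors' full method.

        import numpy as np

        def shannon_envelope(x, frame=40):
            """Normalized Shannon energy envelope, E = -x^2 * log(x^2),
            smoothed with a moving average; emphasizes medium-intensity
            murmur components over both low-level noise and high spikes."""
            x = x / np.max(np.abs(x))                  # amplitude normalization
            e = -x**2 * np.log(x**2 + 1e-12)           # Shannon energy
            kernel = np.ones(frame) / frame
            env = np.convolve(e, kernel, mode='same')  # smooth into an envelope
            return (env - env.mean()) / env.std()      # zero-mean, unit variance

        # toy signal: two "valve" bursts plus a weak murmur between them
        fs = 2000
        t = np.arange(0, 1, 1/fs)
        s = (np.exp(-((t-0.1)/0.01)**2) + np.exp(-((t-0.4)/0.01)**2)) \
            * np.sin(2*np.pi*60*t)
        s += 0.15 * np.sin(2*np.pi*180*t) * ((t > 0.15) & (t < 0.35))  # murmur
        env = shannon_envelope(s)
        print(env.argmax() / fs)   # envelope peaks near a heart-sound burst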

  14. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  15. Measurement of Galactic Logarithmic Spiral Arm Pitch Angle Using Two-Dimensional Fast Fourier Transform Decomposition

    NASA Astrophysics Data System (ADS)

    Davis, Benjamin L.; Berrier, J. C.; Shields, D. W.; Kennefick, J.; Kennefick, D.; Seigar, M. S.; Lacy, C. H. S.; Puerari, I.

    2012-01-01

    A logarithmic spiral is a prominent feature appearing in a majority of observed galaxies. This feature has long been associated with the traditional Hubble classification scheme, but historical quotes of pitch angle of spiral galaxies have been almost exclusively qualitative. We have developed a methodology, utilizing Two-Dimensional Fast Fourier Transformations of images of spiral galaxies, in order to isolate and measure the pitch angles of their spiral arms. Our technique provides a quantitative way to measure this morphological feature. This will allow the precise comparison of spiral galaxy evolution to other galactic parameters and test spiral arm genesis theories. In this work, we detail our image processing and analysis of spiral galaxy images and discuss the robustness of our analysis techniques. The authors gratefully acknowledge support for this work from NASA Grant NNX08AW03A.
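
    A minimal version of the measurement can be sketched in a few lines: resample the image onto a (ln r, θ) grid, take a 2-D FFT, and read the pitch of harmonic mode m from the dominant radial frequency p via tan(φ) = m/p. The code below is a toy reconstruction under those assumptions, not the authors' pipeline.

        import numpy as np

        def pitch_angle(image, m=2, n_u=128, n_theta=256, r_min=5, r_max=None):
            """Estimate spiral pitch angle: resample a face-on galaxy image
            onto a (ln r, theta) grid, 2-D FFT, and locate the dominant
            radial frequency p of angular mode m; tan(phi) = m / p."""
            cy, cx = np.array(image.shape) / 2.0
            r_max = r_max or min(cx, cy) - 1
            u = np.linspace(np.log(r_min), np.log(r_max), n_u)
            theta = np.linspace(0, 2*np.pi, n_theta, endpoint=False)
            rr = np.exp(u)[:, None]
            xs = (cx + rr*np.cos(theta)).astype(int)
            ys = (cy + rr*np.sin(theta)).astype(int)
            polar = image[ys, xs]                       # log-polar resampling
            spec = np.fft.fft2(polar)                   # axes: (u, theta)
            p_freqs = np.fft.fftfreq(n_u, d=u[1]-u[0]) * 2*np.pi
            col = np.abs(spec[:, m])                    # fix angular mode m
            p = abs(p_freqs[col.argmax()])              # dominant radial freq.
            return np.degrees(np.arctan2(m, p))

        # synthetic two-armed logarithmic spiral with known 20-degree pitch
        N = 256
        yy, xx = np.mgrid[:N, :N] - N/2.0
        r = np.hypot(xx, yy) + 1e-9
        th = np.arctan2(yy, xx)
        phi = np.radians(20.0)
        img = np.cos(2*th - (2/np.tan(phi))*np.log(r)) * (r > 5) * (r < N/2 - 1)
        print(pitch_angle(img))    # should recover roughly 20 degrees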

  16. Practical aspects of NMR signal assignment in larger and challenging proteins

    PubMed Central

    Frueh, Dominique P.

    2014-01-01

    NMR has matured into a technique routinely employed for studying proteins in near physiological conditions. However, applications to larger proteins are impeded by the complexity of the various correlation maps necessary to assign NMR signals. This article reviews the data analysis techniques traditionally employed for resonance assignment and describes alternative protocols necessary for overcoming challenges in large protein spectra. In particular, simultaneous analysis of multiple spectra may help overcome ambiguities or may reveal correlations in an indirect manner. Similarly, visualization of orthogonal planes in a multidimensional spectrum can provide alternative assignment procedures. We describe examples of such strategies for assignment of backbone, methyl, and nOe resonances. We describe experimental aspects of data acquisition for the related experiments and provide guidelines for preliminary studies. Focus is placed on large folded monomeric proteins and examples are provided for 37, 48, 53, and 81 kDa proteins. PMID:24534088

  17. UMA/GAN network architecture analysis

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Li, Wensheng; Deng, Chunjian; Lv, Yi

    2009-07-01

    This paper critically analyzes the architecture of UMA, one of the Fixed Mobile Convergence (FMC) solutions, which is also included in the Third Generation Partnership Project (3GPP) specifications. In the UMA/GAN network architecture, the UMA Network Controller (UNC) is the key equipment connecting the cellular core network with the mobile station (MS). A UMA network can be easily integrated into existing cellular networks without affecting the mobile core network, and can provide high-quality mobile services with preferentially priced indoor voice and data usage. This helps to improve the subscriber's experience. On the other hand, the UMA/GAN architecture helps to integrate other radio techniques into the cellular network, including WiFi, Bluetooth, WiMAX and so on. This offers traditional mobile operators an opportunity to integrate WiMAX techniques into the cellular network. At the end of this article, we also give an analysis of the potential influence that UMA networks exert on the cellular core network.

  18. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
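
    The patented search is far more efficient than exhaustive enumeration, but the object it computes is easy to show. The sketch below brute-forces the minimal link-failure cut sets of a small undirected network (all-terminal criterion), skipping supersets of cuts already found; the toy network and names are illustrative.

        from itertools import combinations

        def connected(nodes, edges):
            """DFS connectivity test over an undirected edge list."""
            if not nodes:
                return True
            adj = {n: set() for n in nodes}
            for a, b in edges:
                adj[a].add(b); adj[b].add(a)
            seen, stack = set(), [next(iter(nodes))]
            while stack:
                n = stack.pop()
                if n not in seen:
                    seen.add(n)
                    stack.extend(adj[n] - seen)
            return seen == set(nodes)

        def minimal_cut_sets(nodes, edges):
            """All minimal sets of link failures that disconnect the network,
            found by brute force in increasing size so supersets of
            already-found cuts can be skipped."""
            cuts = []
            for k in range(1, len(edges) + 1):
                for combo in combinations(edges, k):
                    s = set(combo)
                    if any(c <= s for c in cuts):
                        continue                 # superset of a known cut
                    if not connected(nodes, [e for e in edges if e not in s]):
                        cuts.append(s)
            return cuts

        # a small "bridge" network: two triangles joined by one link
        nodes = {1, 2, 3, 4, 5, 6}
        edges = [(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (5, 6), (4, 6)]
        for cut in minimal_cut_sets(nodes, edges):
            print(sorted(cut))       # the bridge (3,4) is the single-link cut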

  19. Enhanced Higgs boson to τ⁺τ⁻ search with deep learning.

    PubMed

    Baldi, P; Sadowski, P; Whiteson, D

    2015-03-20

    The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.
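
    As a schematic of the approach (and nothing like the scale of the actual analysis), the sketch below trains a several-layer network on synthetic data whose classes are separable only through a nonlinear combination of inputs; the Bayesian hyperparameter optimization used in the paper is omitted.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        # stand-in for low-level kinematic features: the label depends on a
        # nonlinear combination of inputs, which a linear model cannot exploit
        rng = np.random.default_rng(3)
        X = rng.normal(size=(5000, 10))
        y = ((X[:, 0] * X[:, 1] + np.sin(3 * X[:, 2])) > 0).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        deep = MLPClassifier(hidden_layer_sizes=(64,) * 5, max_iter=400,
                             random_state=0).fit(X_tr, y_tr)
        print("deep net accuracy:", deep.score(X_te, y_te))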

  20. Comparison of outcomes for single-incision laparoscopic inguinal herniorrhaphy and traditional three-port laparoscopic herniorrhaphy at a single institution.

    PubMed

    Buckley, F Paul; Vassaur, Hannah; Monsivais, Sharon; Sharp, Nicole E; Jupiter, Daniel; Watson, Rob; Eckford, John

    2014-01-01

    Evidence in the literature regarding the potential of single-incision laparoscopic (SILS) inguinal herniorrhaphy currently is limited. A retrospective comparison of SILS and traditional multiport laparoscopic (MP) inguinal hernia repair was conducted to assess the safety and feasibility of the minimally invasive laparoscopic technique. All laparoscopic inguinal hernia repairs performed by three surgeons at a single institution during 4 years were reviewed. Statistical evaluation included descriptive analysis of demographics including age, gender, body mass index (BMI), and hernia location (uni- or bilateral), in addition to bivariate and multivariate analyses of surgical technique and outcomes including operative times, conversions, and complications. The study compared 129 patients who underwent SILS inguinal hernia repair and 76 patients who underwent MP inguinal hernia repair. The cases included 190 men (92.68%) with a mean age of 55.36 ± 18.01 years (range, 8-86 years) and a mean BMI of 26.49 ± 4.33 kg/m² (range, 17.3-41.7 kg/m²). These variables did not differ significantly between the SILS and MP cohorts. The average operative times for the SILS and MP unilateral cases were respectively 57.51 and 66.96 min. For the bilateral cases, the average operative times were 81.07 min for SILS and 81.38 min for MP. A multivariate analysis using surgical approach, BMI, case complexity, and laterality as the covariates demonstrated noninferiority of the SILS technique in terms of operative time (p = 0.031). No conversions from SILS to MP occurred, and the rates of conversion to open procedure did not differ significantly between the cohorts (p = 1.00, Fisher's exact test), nor did the complication rates (p = 0.65, χ² test). As shown by the findings, SILS inguinal herniorrhaphy is a safe and feasible alternative to traditional MP inguinal hernia repair and can be performed successfully with similar operative times, conversion rates, and complication rates. Prospective trials are essential to confirm equivalence in these areas and to detect differences in patient-centered outcomes.

  1. A new technique to prepare hard fruits and seeds for anatomical studies1

    PubMed Central

    Benedict, John C.

    2015-01-01

    Premise of the study: A novel preparation technique was developed to examine fruits and seeds of plants with exceptionally hard or brittle tissues that are very difficult to prepare using standard histological techniques. Methods and Results: The method introduced here was modified from a technique employed on fossil material and has been adapted for use on fruits and seeds of extant plants. A variety of fruits and seeds have been prepared with great success, and the technique will be useful for any excessively hard fruits or seeds that are not able to be prepared using traditional embedding or sectioning methods. Conclusions: When compared to existing techniques for obtaining anatomical features of fruits and seeds, the protocol described here has the potential to create high-quality thin sections of materials that are not able to be sectioned using traditional histological techniques, which can be produced quickly and without the need for harmful chemicals. PMID:26504684

  2. An exploration of tutors' experiences of facilitating problem-based learning. Part 1--an educational research methodology combining innovation and philosophical tradition.

    PubMed

    Haith-Cooper, Melanie

    2003-01-01

    The use of problem-based learning (PBL) in health professional curricula is becoming more widespread. Although the way in which the tutor facilitates PBL can have a major impact on students' learning (Andrews and Jones 1996), the literature provides little consistency as to how the tutor can effectively facilitate PBL (Haith-Cooper 2000). It is therefore important to examine the facilitation role to promote effective learning through the use of PBL. This article is the first of two parts exploring a study that was undertaken to investigate tutors' experiences of facilitating PBL. This part focuses on the methodology and the combining of innovative processes with traditional philosophical traditions to develop a systematic educational research methodology. The study was undertaken respecting the philosophy of hermeneutic phenomenology but utilised alternative data collection and analysis techniques. Video conferencing and e-mail were used in conjunction with more traditional processes to access a worldwide sample. This paper explores some of the issues that arose when undertaking such a study. The second article then focuses on exploring the findings of the study and their implications for the facilitation of PBL.

  3. Perceptions of Playing-Related Musculoskeletal Disorders (PRMDs) in Irish traditional musicians: a focus group study.

    PubMed

    Wilson, Iseult M; Doherty, Liz; McKeown, Laura

    2014-01-01

    Playing-related musculoskeletal disorders (PRMDs) are common in musicians and interfere with the ability to play an instrument at the accustomed level. There is limited research into injuries affecting folk musicians. To explore the Irish traditional musicians' experience of PRMDs. Focus group interviews were conducted in 2011 and 2012, in two venues in Ireland. Data were recorded and transcribed verbatim. Data collection ended when no new findings emerged from the analysis of interviews. The inclusion criteria were: males or females aged 18 and above, and who taught or played Irish traditional music on any instrument. The data were analysed using the interpretative phenomenological method. All participants (n=22) believed there was a link between playing music and musculoskeletal problems. The main body areas affected were the back, shoulders, arms and hands. The main theme that emerged was: 'PRMDs are an integral part of being a traditional musician', and that the musical experience was generally prioritised over the health of the musician. There were sub-themes of 'fear' and 'stresses that contributed to PRMDs'. PRMDs are an occupational hazard for Irish musicians. There is an awareness of PRMDs, but changes (technique, environment) may threaten identity.

  4. Automatic classification of animal vocalizations

    NASA Astrophysics Data System (ADS)

    Clemins, Patrick J.

    2005-11-01

    Bioacoustics, the study of animal vocalizations, has begun to use increasingly sophisticated analysis techniques in recent years. Some common tasks in bioacoustics are repertoire determination, call detection, individual identification, stress detection, and behavior correlation. Each research study, however, uses a wide variety of different measured variables, called features, and classification systems to accomplish these tasks. The well-established field of human speech processing has developed a number of different techniques to perform many of the aforementioned bioacoustics tasks. Mel-frequency cepstral coefficients (MFCCs) and perceptual linear prediction (PLP) coefficients are two popular feature sets. The hidden Markov model (HMM), a statistical model similar to a finite automaton, is the most commonly used supervised classification model and is capable of modeling both temporal and spectral variations. This research designs a framework that applies models from human speech processing for bioacoustic analysis tasks. The development of the generalized perceptual linear prediction (gPLP) feature extraction model is one of the more important novel contributions of the framework. Perceptual information from the species under study can be incorporated into the gPLP feature extraction model to represent the vocalizations as the animals might perceive them. By including this perceptual information and modifying parameters of the HMM classification system, this framework can be applied to a wide range of species. The effectiveness of the framework is shown by analyzing African elephant and beluga whale vocalizations. The features extracted from the African elephant data are used as input to a supervised classification system and compared to results from traditional statistical tests. The gPLP features extracted from the beluga whale data are used in an unsupervised classification system and the results are compared to labels assigned by experts. The development of a framework from which to build animal vocalization classifiers will provide bioacoustics researchers with a consistent platform to analyze and classify vocalizations. A common framework will also allow studies to compare results across species and institutions. In addition, the use of automated classification techniques can speed analysis and uncover behavioral correlations not readily apparent using traditional techniques.
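
    The classification structure described here, one generative model per call type with classification by maximum likelihood, can be sketched compactly, assuming the third-party hmmlearn package is available. Synthetic 13-dimensional "cepstral" sequences stand in for gPLP features, and the call-type names are invented.

        import numpy as np
        from hmmlearn import hmm

        rng = np.random.default_rng(4)

        def fake_calls(offset, n=20, length=60, dim=13):
            """Feature sequences whose means differ by `offset` per call type."""
            return [rng.normal(offset, 1.0, size=(length, dim)) for _ in range(n)]

        def train(seqs):
            """Fit one Gaussian HMM on a set of variable-length sequences."""
            X = np.vstack(seqs)
            lengths = [len(s) for s in seqs]
            model = hmm.GaussianHMM(n_components=3, covariance_type='diag',
                                    random_state=0)
            return model.fit(X, lengths)

        models = {'rumble': train(fake_calls(0.0)),
                  'trumpet': train(fake_calls(1.5))}
        test = fake_calls(1.5, n=1)[0]
        # classify by whichever model assigns the highest log-likelihood
        print(max(models, key=lambda k: models[k].score(test)))   # 'trumpet'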

  5. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    PubMed

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.
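
    For orientation, tie-point extraction with SIFT plus Lowe's ratio test looks as follows in OpenCV's Python bindings; the image file names are placeholders, and this is the generic operator, not the authors' auto-adaptive A² variant.

        import cv2
        import numpy as np

        # placeholder file names for two overlapping aerial images
        img1 = cv2.imread('left.tif', cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread('right.tif', cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()                     # OpenCV >= 4.4
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)

        matcher = cv2.BFMatcher(cv2.NORM_L2)
        matches = matcher.knnMatch(des1, des2, k=2)
        # Lowe's ratio test rejects ambiguous correspondences
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]

        # tie-point coordinates usable for orientation / DSM generation
        pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
        print(len(good), "tie points")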

  6. Machine learning techniques applied to the determination of road suitability for the transportation of dangerous substances.

    PubMed

    Matías, J M; Taboada, J; Ordóñez, C; Nieto, P G

    2007-08-17

    This article describes a methodology to model the degree of remedial action required to make short stretches of a roadway suitable for dangerous goods transport (DGT), particularly pollutant substances, using different variables associated with the characteristics of each segment. Thirty-one factors determining the impact of an accident on a particular stretch of road were identified and subdivided into two major groups: accident probability factors and accident severity factors. Given the number of factors determining the state of a particular road segment, the only viable statistical methods for implementing the model were machine learning techniques, such as multilayer perceptron networks (MLPs), classification trees (CARTs) and support vector machines (SVMs). The results produced by these techniques on a test sample were more favourable than those produced by traditional discriminant analysis, irrespective of whether dimensionality reduction techniques were applied. The best results were obtained using SVMs specifically adapted to ordinal data. This technique takes advantage of the ordinal information contained in the data without penalising the computational load. Furthermore, the technique permits the estimation of the utility function that is latent in expert knowledge.
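
    One common way to make SVMs "ordinal-aware", threshold decomposition in the style of Frank and Hall (2001), is sketched below on synthetic data; the authors' exact ordinal SVM variant may differ, so treat this as an illustration of the idea rather than their method.

        import numpy as np
        from sklearn.svm import SVC

        class OrdinalSVM:
            """One binary SVM per threshold 'remediation level > k';
            prediction sums the exceedance probabilities."""
            def fit(self, X, y):
                self.levels = np.sort(np.unique(y))
                self.models = [SVC(probability=True, random_state=0)
                               .fit(X, (y > k).astype(int))
                               for k in self.levels[:-1]]
                return self

            def predict(self, X):
                # P(y > k) for each threshold; rounding the sum gives the level
                p_gt = np.column_stack([m.predict_proba(X)[:, 1]
                                        for m in self.models])
                return self.levels[np.rint(p_gt.sum(axis=1)).astype(int)]

        rng = np.random.default_rng(5)
        X = rng.normal(size=(300, 4))
        y = np.digitize(X @ np.array([1., .5, -.3, .2]), [-1, 0, 1])  # levels 0..3
        clf = OrdinalSVM().fit(X[:200], y[:200])
        print((clf.predict(X[200:]) == y[200:]).mean())   # held-out accuracy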

  7. A method for analyzing temporal patterns of variability of a time series from Poincare plots.

    PubMed

    Fishman, Mikkel; Jacono, Frank J; Park, Soojin; Jamasebi, Reza; Thungtong, Anurak; Loparo, Kenneth A; Dick, Thomas E

    2012-07-01

    The Poincaré plot is a popular two-dimensional, time series analysis tool because of its intuitive display of dynamic system behavior. Poincaré plots have been used to visualize heart rate and respiratory pattern variabilities. However, conventional quantitative analysis relies primarily on statistical measurements of the cumulative distribution of points, making it difficult to interpret irregular or complex plots. Moreover, the plots are constructed to reflect highly correlated regions of the time series, reducing the amount of nonlinear information that is presented and thereby hiding potentially relevant features. We propose temporal Poincaré variability (TPV), a novel analysis methodology that uses standard techniques to quantify the temporal distribution of points and to detect nonlinear sources responsible for physiological variability. In addition, the analysis is applied across multiple time delays, yielding a richer insight into system dynamics than the traditional circle return plot. The method is applied to data sets of R-R intervals and to synthetic point process data extracted from the Lorenz time series. The results demonstrate that TPV complements the traditional analysis and can be applied more generally, including Poincaré plots with multiple clusters, and more consistently than the conventional measures and can address questions regarding potential structure underlying the variability of a data set.
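
    The underlying construction is simple: plot x_n against x_{n+τ} and summarize dispersion perpendicular to and along the identity line (the standard SD1/SD2 descriptors). The sketch below computes these descriptors over several lags τ, in the spirit of the multi-delay analysis proposed here; it does not implement the TPV statistic itself.

        import numpy as np

        def poincare_descriptors(x, lag=1):
            """SD1/SD2 of the Poincare plot (x_n, x_{n+lag}): dispersion
            perpendicular to and along the line of identity."""
            a, b = x[:-lag], x[lag:]
            sd1 = np.std((b - a) / np.sqrt(2))     # short-term variability
            sd2 = np.std((b + a) / np.sqrt(2))     # long-term variability
            return sd1, sd2

        # evaluating descriptors across multiple lags exposes structure that
        # a single circle-return (lag-1) plot can hide
        rng = np.random.default_rng(6)
        rr = 800 + 50*np.sin(np.arange(300)/5.0) + rng.normal(0, 10, 300)
        for lag in (1, 2, 5, 10):
            print(lag, poincare_descriptors(rr, lag))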

  8. Size and shape measurement in contemporary cephalometrics.

    PubMed

    McIntyre, Grant T; Mossey, Peter A

    2003-06-01

    The traditional method of analysing cephalograms--conventional cephalometric analysis (CCA)--involves the calculation of linear distance measurements, angular measurements, area measurements, and ratios. Because shape information cannot be determined from these 'size-based' measurements, an increasing number of studies employ geometric morphometric tools in the cephalometric analysis of craniofacial morphology. Most of the discussions surrounding the appropriateness of CCA, Procrustes superimposition, Euclidean distance matrix analysis (EDMA), thin-plate spline analysis (TPS), finite element morphometry (FEM), elliptical Fourier functions (EFF), and medial axis analysis (MAA) have centred upon mathematical and statistical arguments. Surprisingly, little information is available to assist the orthodontist in the clinical relevance of each technique. This article evaluates the advantages and limitations of the above methods currently used to analyse the craniofacial morphology on cephalograms and investigates their clinical relevance and possible applications.

  9. [Establishment of prescription research technology system in Chinese medicine secondary exploitation based on "component structure" theory].

    PubMed

    Cheng, Xu-Dong; Feng, Liang; Gu, Jun-Fei; Zhang, Ming-Hua; Jia, Xiao-Bin

    2014-11-01

    Chinese medicine prescriptions embody the wisdom of traditional Chinese medicine (TCM) clinical treatment decisions, which are based on the differentiation of symptoms and signs, and they are also the basis of the secondary exploitation of TCM. Studying a prescription helps to clarify the material basis of its efficacy and its pharmacological mechanism, which is an important guarantee for the modernization of TCM. Currently, however, there is no systematic treatment of the methods and technology for basic research on Chinese medicine prescriptions. This paper focuses on how to build an effective technology system for prescription research. Based on the "component structure" theory, a four-step technology system is proposed: prescription analysis, screening of the material basis, analysis of the material basis, and optimization and verification. The system analyzes the material basis at three levels, namely Chinese medicine pieces, constituents and compounds, so as to reflect the overall efficacy of Chinese medicine. Ideas of prescription optimization and remodeling are introduced into the system, which combines existing research with new techniques and methods in order to explore research approaches suitable for material-basis research and prescription remodeling. The system provides a reference for the secondary development of traditional Chinese medicine and for industrial upgrading.

  10. Characterization of heavy-metal-contaminated sediment by using unsupervised multivariate techniques and health risk assessment.

    PubMed

    Wang, Yeuh-Bin; Liu, Chen-Wuing; Wang, Sheng-Wei

    2015-03-01

    This study characterized the sediment quality of the severely contaminated Erjen River in Taiwan by using multivariate analysis methods, including factor analysis (FA), self-organizing maps (SOMs), and positive matrix factorization (PMF), together with health risk assessment. The SOMs classified the samples with similar heavy-metal contamination into five groups. FA extracted three major factors (a traditional electroplating and metal-surface-processing factor, a nontraditional heavy-metal-industry factor, and a natural geological factor) which together accounted for 80.8% of the variance. The SOMs and FA revealed the heavy-metal-contaminated-sediment hotspots in the middle and upper reaches of the major tributary in the dry season. The hazard index value for health risk via ingestion was 0.302. PMF further quantified the source apportionment, indicating that traditional electroplating and metal-surface-processing industries accounted for 47% of the health risk posed by heavy-metal-contaminated sediment. Contaminants discharged from traditional electroplating and metal-surface-processing industries in the middle and upper reaches of the major tributary must be eliminated first to improve the sediment quality in the Erjen River. The proposed assessment framework for heavy-metal-contaminated sediment can be applied to contaminated-sediment river sites in other regions. Copyright © 2014 Elsevier Inc. All rights reserved.
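
    A rough illustration of the unsupervised steps is sketched below with scikit-learn: FactorAnalysis stands in for FA, and NMF is used as a stand-in for PMF, sharing its non-negativity constraint but, unlike true PMF, not weighting by measurement uncertainties. The data matrix is a hypothetical placeholder (samples by heavy-metal concentrations).

        # Sketch: factor extraction and non-negative source apportionment
        # on a placeholder sediment data matrix.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis, NMF
        from sklearn.preprocessing import StandardScaler

        X = np.abs(np.random.randn(60, 8))       # placeholder: 60 samples x 8 metals

        fa = FactorAnalysis(n_components=3)
        fa.fit(StandardScaler().fit_transform(X))
        print("FA loadings:\n", fa.components_)

        nmf = NMF(n_components=3, max_iter=500)
        contributions = nmf.fit_transform(X)      # sample-wise source contributions
        profiles = nmf.components_                # source (factor) profiles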

  11. Student assessment of the educational benefits of using a CD-ROM for instruction of basic surgical skills.

    PubMed

    Howe, Lisa M; Boothe, Harry W; Hartsfield, Sandee M

    2005-01-01

    At Texas A&M University, introductory-level surgical lecture and laboratory notes were converted to a CD-ROM format that included illustrative photographs as well as instructional videos demonstrating the basic surgical skills that all students were required to master. The CD-ROM was distributed to all students in place of traditional paper notes in the second-year surgical class in the professional veterinary curriculum. The study reported here was designed to evaluate the educational benefits of the use of the CD-ROM in place of traditional paper notes by examining the attitudes and practices of students before and after exposure to the CD-ROM format. An anonymous survey was distributed to students in the second-year introductory surgery course on the first day of class and again on the last day of class. Responses to questions were tabulated, response frequencies determined, and Chi-square analysis performed to determine differences between initial and final responses. On the final survey, 89 per cent of students responded that the instructional videos definitely helped them prepare for the laboratory, and 77 per cent responded that they were more likely to practice techniques learned from the CD-ROM videos than those learned from traditional study materials. The majority of students believed that the CD-ROM improved both the course (60 per cent) and their learning experience (62 per cent) as compared to traditional paper notes. Including instructional videos on the CD-ROM enhanced the educational experience of the students by promoting preparedness for laboratories and promoting practice of techniques learned from the videos outside of the laboratory.

  12. A voxel-based approach to gray matter asymmetries.

    PubMed

    Luders, E; Gaser, C; Jancke, L; Schlaug, G

    2004-06-01

    Voxel-based morphometry (VBM) was used to analyze gray matter (GM) asymmetries in a large sample (n = 60) of male and female professional musicians with and without absolute pitch (AP). We chose to examine these particular groups because previous studies using traditional region-of-interest (ROI) analyses have shown differences in hemispheric asymmetry related to AP and gender. Voxel-based methods may have advantages over traditional ROI-based methods since the analysis can be performed across the whole brain with minimal user bias. After determining that the VBM method was sufficiently sensitive for the detection of differences in GM asymmetries between groups, we found that male AP musicians were more leftward lateralized in the anterior region of the planum temporale (PT) than male non-AP musicians. This confirmed the results of previous studies using ROI-based methods that showed an association between PT asymmetry and the AP phenotype. We further observed that male non-AP musicians revealed an increased leftward GM asymmetry in the postcentral gyrus compared to female non-AP musicians, again corroborating results of a previously published study using ROI-based methods. By analyzing hemispheric GM differences across our entire sample, we were able to partially confirm findings of previous studies using traditional morphometric techniques, as well as more recent, voxel-based analyses. In addition, we found some unusually pronounced GM asymmetries in our musician sample not previously detected in subjects unselected for musical training. Since we were able to validate gender- and AP-related brain asymmetries previously described using traditional ROI-based morphometric techniques, the results of our analyses support the use of VBM for examinations of GM asymmetries.

  13. Laser versus traditional techniques in cerebral and brain stem gliomas

    NASA Astrophysics Data System (ADS)

    Lombard, Gian F.

    1996-01-01

    In the medical literature, no significant studies have been published comparing the effectiveness of laser with traditional procedures in cerebral gliomas; for this reason we have studied two series totalling 220 tumors (200 supratentorial, 20 brain stem gliomas), 110 operated upon with laser and 100 with conventional techniques. Four surgical protocols were carried out: (1) traditional techniques; (2) carbon dioxide laser free hand; (3) carbon dioxide laser plus microscope; (4) multiple laser sources plus microscope plus neurosector plus CUSA. Two laser sources were used, alone or in combination (carbon dioxide and Nd:YAG at 1.06 or 1.32). Patients were assessed on the Karnofsky scale before and after operation and at 12, 24 and 36 months, and survival rates were recorded. Tumors were classified by histological examination, dimensions, vascularization, and topography (critical or non-critical areas). Results for supratentorial gliomas: survival time is the same in both series (laser and traditional). Post-op morbidity is significantly improved in the laser group (high-grade subgroup); long-term follow-up shows an improvement in quality of life up to 36 months in the low-grade subgroup.

  14. Experimental Aspects and Implementation of HPTLC

    NASA Astrophysics Data System (ADS)

    Patel, Rashmin B.; Patel, Mrunali R.; Patel, Bharat G.

    High-Performance Thin-Layer Chromatography (HPTLC) is a sophisticated instrumental technique. Many publications report that it provides excellent separation and qualitative and quantitative analysis of a wide range of compounds, such as herbal and botanical dietary supplements, nutraceuticals, traditional Western medicines, traditional Chinese medicines, and Ayurvedic (Indian) medicines (Sharma 2008). Comparative studies have often found HPTLC superior to High-Performance Liquid Chromatography (HPLC) in terms of the total cost and time required for analysis. HPTLC is an off-line process in which the various stages are carried out independently. Important features of HPTLC are the ability to analyze crude samples containing multiple components; the application of a large number of samples and a series of standards using the spray-on technique; a wide choice of solvents for development, since the mobile phase is fully evaporated before the detection step; the processing of standards and samples identically on the same plate, leading to better accuracy and precision of quantification; both universal and selective detection methods, with in situ spectra recorded in sequence to obtain positive identification of fractions; and storage of the total sample on the layer, without time constraints. In addition, HPTLC minimizes exposure risks and significantly reduces the disposal problems of toxic organic effluents, thereby reducing environmental pollution. In view of this, HPTLC-based methods can be considered a good alternative and are being explored as an important tool in routine analysis. This chapter provides detailed information regarding HPTLC-based analytical method development (Renger 1993; Renger 1998; Patel and Patel 2008; Patel et al. 2010).

  15. A baseline drift detrending technique for fast scan cyclic voltammetry.

    PubMed

    DeWaele, Mark; Oh, Yoonbae; Park, Cheonho; Kang, Yu Min; Shin, Hojin; Blaha, Charles D; Bennet, Kevin E; Kim, In Young; Lee, Kendall H; Jang, Dong Pyo

    2017-11-06

    Fast scan cyclic voltammetry (FSCV) has been commonly used to measure extracellular neurotransmitter concentrations in the brain. Due to the unstable nature of the background currents inherent in FSCV measurements, analysis of FSCV data is limited to very short periods of time when traditional background subtraction is used. In this paper, we propose the use of a zero-phase high pass filter (HPF) as the means to remove the background drift. Instead of the traditional method of low pass filtering across voltammograms to increase the signal to noise ratio, a HPF with a low cutoff frequency was applied to the temporal dataset at each voltage point to remove the background drift. As a result, the HPF, using cutoff frequencies between 0.001 Hz and 0.01 Hz, could be effectively applied to a set of FSCV data to remove the drifting patterns while preserving the temporal kinetics of the phasic dopamine response recorded in vivo. In addition, although a drift removal method using principal component analysis also reduced the background drift effectively, the HPF was significantly more effective in reducing the drift (unpaired t-test, p < 0.0001, t = 10.88) when applied to data collected from Tris buffer over 24 hours. The HPF was also applied to 5 hours of FSCV in vivo data. Electrically evoked dopamine peaks, observed in the nucleus accumbens, were clearly visible even without background subtraction. This technique provides a new, simple, yet robust approach to analyse FSCV data with an unstable background.
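
    A minimal sketch of the described detrending, assuming a Butterworth high-pass filter applied forward and backward (zero phase) along the time axis of each voltage point; the array shapes and scan rate are assumptions, and the cutoff is chosen inside the 0.001-0.01 Hz range reported above.

        # Sketch: zero-phase high-pass detrending of FSCV data. filtfilt runs
        # the filter forward and backward, so no phase distortion is added.
        import numpy as np
        from scipy.signal import butter, filtfilt

        def detrend_fscv(data, fs=10.0, cutoff=0.005, order=2):
            """data: (n_voltage_points, n_scans); fs: scan repetition rate (Hz)."""
            b, a = butter(order, cutoff / (fs / 2.0), btype="highpass")
            # filter the time course at every voltage point
            return filtfilt(b, a, data, axis=1)

        # e.g. 1000 voltage points x 6000 scans (10 min at 10 Hz) -- placeholder
        fscv = np.random.randn(1000, 6000)
        clean = detrend_fscv(fscv, fs=10.0, cutoff=0.005)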

  16. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    SERS methods for biomolecular analysis have several potential advantages over traditional biochemical approaches, including less specimen contact, being non-destructive to the specimen, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine lies between 1400 cm(-1) and 1500 cm(-1); this region was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration over the clinically relevant (55.9 mg/dl to 208 mg/dl) concentration range. The root-mean-square error of cross-validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human urine creatinine detection, and establishes the SERS platform technique for bodily fluid measurement.
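
    A minimal sketch of leave-one-out PLS calibration and the RMSECV computation of the kind reported, using scikit-learn; the spectra, reference values and number of latent variables are placeholders.

        # Sketch: leave-one-out PLS cross-validation and RMSECV.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        X = np.random.rand(10, 500)               # 10 placeholder SERS spectra
        y = np.random.uniform(55.9, 208.0, 10)    # reference creatinine (mg/dl)

        pls = PLSRegression(n_components=3)
        y_cv = cross_val_predict(pls, X, y, cv=LeaveOneOut())
        rmsecv = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))
        print(f"RMSECV = {rmsecv:.1f} mg/dl")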

  17. Applicability of Cone Beam Computed Tomography to the Assessment of the Vocal Tract before and after Vocal Exercises in Normal Subjects.

    PubMed

    Garcia, Elisângela Zacanti; Yamashita, Hélio Kiitiro; Garcia, Davi Sousa; Padovani, Marina Martins Pereira; Azevedo, Renata Rangel; Chiari, Brasília Maria

    2016-01-01

    Cone beam computed tomography (CBCT), which represents an alternative to traditional computed tomography and magnetic resonance imaging, may be a useful instrument to study vocal tract physiology related to vocal exercises. This study aims to evaluate the applicability of CBCT to the assessment of variations in the vocal tract of healthy individuals before and after vocal exercises. Voice recordings and CBCT images before and after vocal exercises performed by 3 speech-language pathologists without vocal complaints were collected and compared. Each participant performed 1 type of exercise, i.e., Finnish resonance tube technique, prolonged consonant "b" technique, or chewing technique. The analysis consisted of an acoustic analysis and tomographic imaging. Modifications of the vocal tract settings following vocal exercises were properly detected by CBCT, and changes in the acoustic parameters were, for the most part, compatible with the variations detected in image measurements. CBCT was shown to be capable of properly assessing the changes in vocal tract settings promoted by vocal exercises. © 2017 S. Karger AG, Basel.

  18. The floating anchored craniotomy

    PubMed Central

    Gutman, Matthew J.; How, Elena; Withers, Teresa

    2017-01-01

    Background: The “floating anchored” craniotomy is a technique utilized at our tertiary neurosurgery institution in which a floating craniotomy is substituted for the traditional decompressive craniectomy. The hypothesized advantages of this technique include adequate decompression, reduction in the intracranial pressure, obviating the need for a secondary cranioplasty, maintained bone protection, prevention of the syndrome of the trephined, and a potential reduction in axonal stretching. Methods: The bone plate is re-attached via multiple loosely affixed Vicryl sutures, enabling decompression but ensuring the bone returns to its anatomical position once cerebral edema has subsided. Results: From the analysis of 57 consecutive patients at our institution, we have found that the floating anchored craniotomy is comparable to decompressive craniectomy for intracranial pressure reduction and has some significant theoretical advantages. Conclusions: Despite the potential advantages of techniques that avoid the need for a second cranioplasty, they have not been widely adopted and have been omitted from trials examining the utility of decompressive surgery. This retrospective analysis of prospectively collected data suggests that the floating anchored craniotomy may be applicable instead of decompressive craniectomy. PMID:28713633

  19. Speckle noise reduction technique for Lidar echo signal based on self-adaptive pulse-matching independent component analysis

    NASA Astrophysics Data System (ADS)

    Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi

    2018-04-01

    Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Currently, effective speckle de-noising techniques are extremely scarce and need further development. In this study, a speckle noise reduction technique is proposed based on independent component analysis (ICA). Since the shape of the laser pulse itself normally changes little, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. In order to make the algorithm self-adaptive, the local Mean Square Error (MSE) was defined as the criterion for evaluating the iteration results. The experimental results demonstrate that the self-adaptive pulse-matching ICA (PM-ICA) method can effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a signal-to-noise ratio (SNR) improvement 4 dB greater than that of a traditional homomorphic wavelet method.
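
    The exact PM-ICA algorithm is not detailed in the abstract; the sketch below shows a generic ICA separation in the same spirit, decomposing a synthetic speckle-corrupted echo together with the reference pulse and keeping the component that best matches the reference. It illustrates the general idea only, not the authors' method.

        # Sketch: reference-guided ICA separation of a noisy echo (synthetic).
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 1, 2000)
        pulse = np.exp(-((t - 0.5) ** 2) / 2e-4)            # reference laser pulse
        echo = 0.6 * pulse + 0.3 * np.random.randn(t.size)  # speckle-corrupted echo

        X = np.column_stack([echo, pulse])                  # two observed mixtures
        sources = FastICA(n_components=2, random_state=0).fit_transform(X)

        # keep the source most correlated with the reference pulse
        corr = [abs(np.corrcoef(s, pulse)[0, 1]) for s in sources.T]
        recovered = sources[:, int(np.argmax(corr))]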

  20. Highly efficient peptide separations in proteomics Part 1. Unidimensional high performance liquid chromatography.

    PubMed

    Sandra, Koen; Moshir, Mahan; D'hondt, Filip; Verleysen, Katleen; Kas, Koen; Sandra, Pat

    2008-04-15

    Sample complexity and dynamic range constitute enormous challenges in proteome analysis. The back-end technology in typical proteomics platforms, namely mass spectrometry (MS), can only tolerate a certain complexity, has a limited dynamic range per spectrum and is very sensitive to ion suppression. Therefore, component overlap has to be minimized for successful mass spectrometric analysis and subsequent protein identification and quantification. The present review describes the advances that have been made in liquid-based separation techniques, with a focus on recent developments to boost the resolving power. The review is divided into two parts; the first part deals with unidimensional liquid chromatography and the second part with bi- and multidimensional liquid-based separation techniques. Part 1 mainly focuses on reversed-phase HPLC, because it is, and in the near future will remain, the technique of choice for hyphenation with MS. The impact of increasing the column length, decreasing the particle diameter, and replacing traditional packed beds with monoliths, among others, is described. The review is complemented with data obtained in the laboratories of the authors.

  1. Rapid detection of terbufos in stomach contents using desorption electrospray ionization mass spectrometry.

    PubMed

    Wilson, Christina R; Mulligan, Christopher C; Strueh, Kurt D; Stevenson, Gregory W; Hooser, Stephen B

    2014-05-01

    Desorption electrospray ionization mass spectrometry (DESI-MS) is an emerging analytical technique that permits the rapid and direct analysis of biological or environmental samples under ambient conditions. Highlighting the versatility of this technique, DESI-MS has been used for the rapid detection of illicit drugs, chemical warfare agents, agricultural chemicals, and pharmaceuticals from a variety of sample matrices. In diagnostic veterinary toxicology, analyzing samples using traditional analytical instrumentation typically includes extensive sample extraction procedures, which can be time consuming and labor intensive. Therefore, efforts to expedite sample analyses are a constant goal for diagnostic toxicology laboratories. In the current report, DESI-MS was used to directly analyze stomach contents from a dog exposed to the organophosphate insecticide terbufos. The total DESI-MS analysis time required to confirm the presence of terbufos and diagnose organophosphate poisoning in this case was approximately 5 min. This highlights the potential of this analytical technique in the field of veterinary toxicology for the rapid diagnosis and detection of toxicants in biological samples. © 2014 The Author(s).

  2. Industrial and occupational ergonomics in the petrochemical process industry: a regression trees approach.

    PubMed

    Bevilacqua, M; Ciarapica, F E; Giacchetta, G

    2008-07-01

    This work is an attempt to apply classification tree methods to data regarding accidents in a medium-sized refinery, so as to identify the important relationships between the variables, which can be considered as decision-making rules when adopting any measures for improvement. The results obtained using the CART (Classification And Regression Trees) method proved to be the most precise and, in general, they are encouraging concerning the use of tree diagrams as preliminary explorative techniques for the assessment of the ergonomic, management and operational parameters which influence high accident risk situations. The Occupational Injury analysis carried out in this paper was planned as a dynamic process and can be repeated systematically. The CART technique, which considers a very wide set of objective and predictive variables, shows new cause-effect correlations in occupational safety which had never been previously described, highlighting possible injury risk groups and supporting decision-making in these areas. The use of classification trees must not, however, be seen as an attempt to supplant other techniques, but as a complementary method which can be integrated into traditional types of analysis.
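
    A minimal sketch of fitting a CART model to accident records and reading off the induced decision rules, using scikit-learn; the features, classes and parameters are hypothetical.

        # Sketch: CART on occupational-accident records, printing the rules.
        import numpy as np
        from sklearn.tree import DecisionTreeClassifier, export_text

        X = np.random.rand(200, 4)           # placeholder ergonomic/operational factors
        y = np.random.randint(0, 2, 200)     # placeholder injury severity class

        cart = DecisionTreeClassifier(max_depth=3, min_samples_leaf=10).fit(X, y)
        print(export_text(cart, feature_names=[
            "shift_hours", "task_risk", "experience_yrs", "ppe_compliance"]))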

  3. [Computer-assisted image processing for quantifying histopathologic variables in the healing of colonic anastomosis in dogs].

    PubMed

    Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C

    1997-01-01

    The authors present experimental results from the computerized quantification of the tissue structures involved in the repair of colonic anastomoses performed with manual suture and with a biofragmentable ring. The variables quantified in this study were oedema fluid, myofiber tissue, blood vessels and cellular nuclei. Image processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. As a counterproof, the results were compared with the traditional diagnoses made by two pathologists, whose criteria were graded as absent, light, moderate or intense and then compared with the computer analysis. There was a statistically significant difference between the two techniques: the biofragmentable ring technique exhibited less oedema fluid, more organized myofiber tissue and a higher number of elongated cellular nuclei than the manual suture technique. The analysis of histometric variables through computational image processing was considered an efficient and powerful way to quantify the main inflammatory and reparative tissue changes.

  4. TH-E-19A-01: Quality and Safety in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, E; Ezzell, G; Miller, B

    2014-06-15

    Clinical radiotherapy data clearly demonstrate the link between the quality and safety of radiation treatments and the outcome for patients. The medical physicist plays an essential role in this process. To ensure the highest quality treatments, the medical physicist must understand and employ modern quality improvement techniques. This extends well beyond the duties traditionally associated with prescriptive QA measures. This session will review the current best practices for improving quality and safety in radiation therapy. General elements of quality management will be reviewed including: what makes a good quality management structure, the use of prospective risk analysis such as FMEA, and the use of incident learning. All of these practices are recommended in society-level documents and are incorporated into the new Practice Accreditation program developed by ASTRO. To be effective, however, these techniques must be practical in a resource-limited environment. This session will therefore focus on practical tools such as the newly-released radiation oncology incident learning system, RO-ILS, supported by AAPM and ASTRO. With these general constructs in mind, a case study will be presented of quality management in an SBRT service. An example FMEA risk assessment will be presented along with incident learning examples including root cause analysis. As the physicist's role as “quality officer” continues to evolve it will be essential to understand and employ the most effective techniques for quality improvement. This session will provide a concrete overview of the fundamentals in quality and safety. Learning Objectives: Recognize the essential elements of a good quality management system in radiotherapy. Understand the value of incident learning and the AAPM/ASTRO RO-ILS incident learning system. Appreciate failure mode and effects analysis as a risk assessment tool and its use in resource-limited environments. Understand the fundamental principles of good error proofing that extends beyond traditional prescriptive QA measures.
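
    As a small illustration of the FMEA scoring mentioned above, the sketch below computes risk priority numbers (RPN = occurrence x severity x detectability), the standard FMEA ranking; the failure modes and scores are invented examples, not drawn from the session.

        # Sketch: ranking failure modes by FMEA risk priority number (RPN).
        failure_modes = [
            # (description, occurrence, severity, detectability), each 1-10
            ("wrong isocenter shift entered", 4, 9, 5),
            ("MLC file not updated after replan", 3, 8, 6),
            ("image registration not reviewed", 5, 7, 4),
        ]

        for desc, o, s, d in sorted(failure_modes,
                                    key=lambda m: m[1] * m[2] * m[3],
                                    reverse=True):
            print(f"RPN {o * s * d:4d}  {desc}")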

  5. Model, analysis, and evaluation of the effects of analog VLSI arithmetic on linear subspace-based image recognition.

    PubMed

    Carvajal, Gonzalo; Figueroa, Miguel

    2014-07-01

    Typical image recognition systems operate in two stages: feature extraction to reduce the dimensionality of the input space, and classification based on the extracted features. Analog Very Large Scale Integration (VLSI) is an attractive technology to achieve compact and low-power implementations of these computationally intensive tasks for portable embedded devices. However, device mismatch limits the resolution of the circuits fabricated with this technology. Traditional layout techniques to reduce the mismatch aim to increase the resolution at the transistor level, without considering the intended application. Relating mismatch parameters to specific effects at the application level would allow designers to apply focalized mismatch compensation techniques according to predefined performance/cost tradeoffs. This paper models, analyzes, and evaluates the effects of mismatched analog arithmetic in both feature extraction and classification circuits. For the feature extraction, we propose analog adaptive linear combiners with on-chip learning for both Least Mean Square (LMS) and Generalized Hebbian Algorithm (GHA). Using mathematical abstractions of analog circuits, we identify mismatch parameters that are naturally compensated during the learning process, and propose cost-effective guidelines to reduce the effect of the rest. For the classification, we derive analog models for the circuits necessary to implement the Nearest Neighbor (NN) approach and Radial Basis Function (RBF) networks, and use them to emulate analog classifiers with standard databases of faces and handwritten digits. Formal analysis and experiments show how we can exploit adaptive structures and properties of the input space to compensate the effects of device mismatch at the application level, thus reducing the design overhead of traditional layout techniques. Results are also directly extensible to multiple application domains using linear subspace methods. Copyright © 2014 Elsevier Ltd. All rights reserved.
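
    A minimal software sketch of the adaptive linear combiner with LMS learning that the paper implements in analog hardware; this NumPy version only illustrates the learning rule, not the mismatch modelling, and all data are synthetic.

        # Sketch: adaptive linear combiner trained with the LMS rule.
        import numpy as np

        def lms_train(X, d, mu=0.01, epochs=10):
            """X: (n_samples, n_inputs); d: desired output; mu: step size."""
            w = np.zeros(X.shape[1])
            for _ in range(epochs):
                for x, target in zip(X, d):
                    e = target - w @ x          # instantaneous error
                    w += mu * e * x             # LMS weight update
            return w

        X = np.random.randn(500, 8)
        d = X @ np.array([0.5, -1.0, 0.2, 0.0, 0.3, 0.0, -0.4, 0.1])
        w = lms_train(X, d)                     # converges toward the true weights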

  6. Rapid analysis of adulterations in Chinese lotus root powder (LRP) by near-infrared (NIR) spectroscopy coupled with chemometric class modeling techniques.

    PubMed

    Xu, Lu; Shi, Peng-Tao; Ye, Zi-Hong; Yan, Si-Min; Yu, Xiao-Ping

    2013-12-01

    This paper develops a rapid analysis method for adulteration identification of a popular traditional Chinese food, lotus root powder (LRP), by near-infrared spectroscopy and chemometrics. 85 pure LRP samples were collected from 7 main lotus-producing areas of China to include most if not all of the significant variations likely to be encountered in unknown authentic materials. To evaluate model specificity, 80 adulterated LRP samples, prepared by blending pure LRP with different levels of four cheaper and commonly used starches, were measured and predicted. For multivariate quality models, two class modeling methods were used: the traditional soft independent modeling of class analogy (SIMCA) and a recently proposed partial least squares class model (PLSCM). Different data preprocessing techniques, including smoothing, derivatives and standard normal variate (SNV) transformation, were used to improve the classification performance. The results indicate that smoothing, second-order derivatives and SNV can improve the class models by enhancing the signal-to-noise ratio and reducing baseline and background shifts. The most accurate and stable models were obtained with SNV spectra for both SIMCA (sensitivity 0.909 and specificity 0.938) and PLSCM (sensitivity 0.909 and specificity 0.925). Moreover, both SIMCA and PLSCM could detect LRP samples mixed with 5% (w/w) or more of other cheaper starches, including cassava, sweet potato, potato and maize starches. Although it is difficult to perform an exhaustive collection of all pure LRP samples and possible adulterations, NIR spectrometry combined with class modeling techniques provides a reliable and effective method to detect most of the current LRP adulterations in the Chinese market. Copyright © 2013 Elsevier Ltd. All rights reserved.
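
    A minimal sketch of the pre-treatments the study found most helpful, SNV followed by a second-order derivative; a Savitzky-Golay derivative is assumed here, and the spectra and window settings are placeholders.

        # Sketch: SNV followed by a Savitzky-Golay second derivative.
        import numpy as np
        from scipy.signal import savgol_filter

        def snv(spectra):
            """Standard normal variate: centre and scale each spectrum."""
            mean = spectra.mean(axis=1, keepdims=True)
            std = spectra.std(axis=1, keepdims=True)
            return (spectra - mean) / std

        spectra = np.random.rand(85, 1500)          # placeholder NIR spectra
        pretreated = savgol_filter(snv(spectra), window_length=15,
                                   polyorder=3, deriv=2, axis=1)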

  7. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    PubMed

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models in simulating hydrological time series. However, when the nonlinear phenomenon is significant, multiple linear regression fails to develop an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for the purpose of predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, in order to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, the multiple linear regression analysis being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flows while the low and medium flow magnitudes were estimated closer to the observed data. The comparison of the prediction accuracy of the neuro-fuzzy and linear regression methods indicated that the neuro-fuzzy approach was more accurate in predicting river flow dynamics. The neuro-fuzzy model improved the root mean square error (RMSE) and mean absolute percentage error (MAPE) values of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling flow dynamics in the study area.

  8. Impact of proto-oncogene mutation detection in cytological specimens from thyroid nodules improves the diagnostic accuracy of cytology.

    PubMed

    Cantara, Silvia; Capezzone, Marco; Marchisotta, Stefania; Capuano, Serena; Busonero, Giulia; Toti, Paolo; Di Santo, Andrea; Caruso, Giuseppe; Carli, Anton Ferdinando; Brilli, Lucia; Montanaro, Annalisa; Pacini, Furio

    2010-03-01

    Fine-needle aspiration cytology (FNAC) is the gold standard for the differential diagnosis of thyroid nodules but has the limitation of inadequate sampling or indeterminate lesions. We aimed to verify whether searching for thyroid cancer-associated proto-oncogene mutations in cytological samples may improve the diagnostic accuracy of FNAC. One hundred seventy-four consecutive patients undergoing thyroid surgery were submitted to FNAC (on 235 thyroid nodules), which was used for cytology and molecular analysis of BRAF, RAS, RET, TRK, and PPARgamma mutations. At surgery these nodules were sampled to perform the same molecular testing. Mutations were found in 67 of 235 (28.5%) cytological samples. Of the 67 mutated samples, 23 (34.3%) were mutated in RAS, 33 (49.3%) in BRAF, and 11 (16.4%) in RET/PTC. In 88.2% of the cases, the mutation was confirmed in the tissue sample. The presence of mutations at cytology was associated with cancer 91.1% of the time and with follicular adenoma 8.9% of the time. BRAF or RET/PTC mutations were always associated with cancer, whereas RAS mutations were mainly associated with cancer (74%) but also with follicular adenoma (26%). The diagnostic performance of molecular analysis was superior to that of traditional cytology, with better sensitivity and specificity, and the combination of the two techniques further improved the total accuracy (93.2%), compared with molecular analysis (90.2%) or traditional cytology (83.0%) alone. Our findings demonstrate that molecular analysis of cytological specimens is feasible and that its results, in combination with cytology, improve the diagnostic performance of traditional cytology.

  9. Beverage and culture. "Zhourat", a multivariate analysis of the globalization of a herbal tea from the Middle East.

    PubMed

    Obón, Concepción; Rivera, Diego; Alcaraz, Francisco; Attieh, Latiffa

    2014-08-01

    The "Zhourat" herbal tea consists of a blend of wild flowers, herbs, leaves and fruits and is a typical beverage of Lebanon and Syria. We aim to evaluate cultural significance of "Zhourat", to determine cultural standards for its formulation including key ingredients and to determine acceptable variability levels in terms of number of ingredients and their relative proportions, in summary what is "Zhourat" and what is not "Zhourat" from an ethnobotanical perspective. For this purpose we develop a novel methodology to describe and analyse patterns of variation of traditional multi-ingredient herbal formulations, beverages and teas and to identify key ingredients, which are characteristics of a particular culture and region and to interpret health claims for the mixture. Factor analysis and hierarchical clustering techniques were used to display similarities between samples whereas salience index was used to determine the main ingredients which could help to distinguish a standard traditional blend from a global market-addressed formulation. The study revealed 77 main ingredients belonging to 71 different species of vascular plants. In spite of the "Zhourat's" highly variable content, the salience analysis resulted in a determined set of key botanical components including Rosa x damascena Herrm., Althaea damascena Mouterde, Matricaria chamomilla L., Aloysia citrodora Palau, Zea mays L. and Elaeagnus angustifolia L. The major health claims for "Zhourat" as digestive, sedative and for respiratory problems are culturally coherent with the analysis of the traditional medicinal properties uses of its ingredients. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. An Investigation of the Characterization of Cloud Contamination in Hyperspectral Radiances

    NASA Technical Reports Server (NTRS)

    McCarty, William; Jedlovec, Gary J.; LeMarshall, John

    2007-01-01

    In regions lacking direct observations, the assimilation of radiances from infrared and microwave sounders is the primary method for characterizing the atmosphere in the analysis process. In recent years, technological advances have led to the launching of more advanced sounders, particularly in the thermal infrared spectrum. With the advent of these hyperspectral sounders, the amount of data available for the analysis process has increased dramatically and will continue to do so. However, the utilization of infrared radiances in variational assimilation can be problematic in the presence of clouds, specifically in assessing the presence of clouds in an instantaneous field of view (IFOV) and the contamination of the individual channels within the IFOV. Various techniques have been developed to determine if a channel is contaminated by clouds. The work presented in this paper and the subsequent presentation investigates traditional techniques and compares them to a new technique, the CO2 sorting technique, which utilizes the high spectral resolution of the Atmospheric Infrared Sounder (AIRS) within the framework of the Gridpoint Statistical Interpolation (GSI) 3DVAR system. Ultimately, this work is done in preparation for the assessment of short-term forecast impacts with the regional assimilation of AIRS radiances within the analysis fields of the Weather Research and Forecasting Nonhydrostatic Mesoscale Model (WRF-NMM) at the NASA Short-term Prediction Research and Transition (SPoRT) Center.

  11. The use of single-date MODIS imagery for estimating large-scale urban impervious surface fraction with spectral mixture analysis and machine learning techniques

    NASA Astrophysics Data System (ADS)

    Deng, Chengbin; Wu, Changshan

    2013-12-01

    Urban impervious surface information is essential for urban and environmental applications at the regional/national scales. As a popular image processing technique, spectral mixture analysis (SMA) has rarely been applied to coarse-resolution imagery due to the difficulty of deriving endmember spectra using traditional endmember selection methods, particularly within heterogeneous urban environments. To address this problem, we derived endmember signatures through a least squares solution (LSS) technique with known abundances of sample pixels, and integrated these endmember signatures into SMA for mapping large-scale impervious surface fraction. In addition, with the same sample set, we carried out objective comparative analyses among SMA (i.e. fully constrained and unconstrained SMA) and machine learning (i.e. Cubist regression tree and Random Forests) techniques. Analysis of results suggests three major conclusions. First, with the extrapolated endmember spectra from stratified random training samples, the SMA approaches performed relatively well, as indicated by small MAE values. Second, Random Forests yields more reliable results than Cubist regression tree, and its accuracy is improved with increased sample sizes. Finally, comparative analyses suggest a tentative guide for selecting an optimal approach for large-scale fractional imperviousness estimation: unconstrained SMA might be a favorable option with a small number of samples, while Random Forests might be preferred if a large number of samples are available.
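
    A minimal sketch of the two least-squares steps described above: deriving endmember spectra from sample pixels with known abundances (the LSS step), then unconstrained unmixing of new pixels. Matrix shapes, band counts and the fraction indexing are assumptions.

        # Sketch: LSS endmember derivation followed by unconstrained SMA.
        import numpy as np

        bands, n_train, n_end = 7, 300, 3
        A_train = np.random.dirichlet(np.ones(n_end), n_train)   # known abundances
        R_train = np.random.rand(n_train, bands)                 # observed reflectance

        # Step 1 (LSS): solve R = A @ E for the endmember matrix E (n_end x bands)
        E, *_ = np.linalg.lstsq(A_train, R_train, rcond=None)

        # Step 2 (unconstrained SMA): solve r = E.T @ f for each new pixel
        R_new = np.random.rand(1000, bands)
        F, *_ = np.linalg.lstsq(E.T, R_new.T, rcond=None)        # fractions per pixel
        impervious_fraction = F[0]                               # hypothetical index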

  12. The Strengths and Weaknesses of Logic Formalisms to Support Mishap Analysis

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2002-01-01

    The increasing complexity of many safety critical systems poses new problems for mishap analysis. Techniques developed in the sixties and seventies cannot easily scale up to analyze incidents involving tightly integrated software and hardware components. Similarly, the realization that many failures have systemic causes has widened the scope of many mishap investigations. Organizations, including NASA and the NTSB, have responded by starting research and training initiatives to ensure that their personnel are well equipped to meet these challenges. One strand of research has identified a range of mathematically based techniques that can be used to reason about the causes of complex, adverse events. The proponents of these techniques have argued that they can be used to formally prove that certain events created the necessary and sufficient causes for a mishap to occur. Mathematical proofs can reduce the bias that is often perceived to affect the interpretation of adverse events. Others have opposed the introduction of these techniques by identifying social and political aspects of incident investigation that cannot easily be reconciled with a logic-based approach. Traditional theorem proving mechanisms cannot accurately capture the wealth of inductive, deductive and statistical forms of inference that investigators routinely use in their analysis of adverse events. This paper summarizes some of the benefits that logics provide, describes their weaknesses, and proposes a number of directions for future research.

  13. Using object-oriented analysis techniques to support system testing

    NASA Astrophysics Data System (ADS)

    Zucconi, Lin

    1990-03-01

    Testing of real-time control systems can be greatly facilitated by use of object-oriented and structured analysis modeling techniques. This report describes a project where behavior, process and information models built for a real-time control system were used to augment and aid traditional system testing. The modeling techniques used were an adaptation of the Ward/Mellor method for real-time systems analysis and design (Ward85) for object-oriented development. The models were used to simulate system behavior by means of hand execution of the behavior or state model and the associated process (data and control flow) and information (data) models. The information model, which uses an extended entity-relationship modeling technique, is used to identify application domain objects and their attributes (instance variables). The behavioral model uses state-transition diagrams to describe the state-dependent behavior of the object. The process model uses a transformation schema to describe the operations performed on or by the object. Together, these models provide a means of analyzing and specifying a system in terms of the static and dynamic properties of the objects which it manipulates. The various models were used to simultaneously capture knowledge about both the objects in the application domain and the system implementation. Models were constructed, verified against the software as-built and validated through informal reviews with the developer. These models were then hand-executed.

  14. Permeabilization of brain tissue in situ enables multiregion analysis of mitochondrial function in a single mouse brain.

    PubMed

    Herbst, Eric A F; Holloway, Graham P

    2015-02-15

    Mitochondrial function in the brain is traditionally assessed through analysing respiration in isolated mitochondria, a technique that possesses significant tissue and time requirements while also disrupting the cooperative mitochondrial reticulum. We permeabilized brain tissue in situ to permit analysis of mitochondrial respiration with the native mitochondrial morphology intact, removing the need for isolation time and minimizing tissue requirements to ∼2 mg wet weight. The permeabilized brain technique was validated against the traditional method of isolated mitochondria and was then further applied to assess regional variation in the mouse brain with ischaemia-reperfusion injuries. A transgenic mouse model overexpressing catalase within mitochondria was applied to show the contribution of mitochondrial reactive oxygen species to ischaemia-reperfusion injuries in different brain regions. This technique enhances the accessibility of addressing physiological questions in small brain regions and in applying transgenic mouse models to assess mechanisms regulating mitochondrial function in health and disease. Mitochondria function as the core energy providers in the brain and symptoms of neurodegenerative diseases are often attributed to their dysregulation. Assessing mitochondrial function is classically performed in isolated mitochondria; however, this process requires significant isolation time, demand for abundant tissue and disruption of the cooperative mitochondrial reticulum, all of which reduce reliability when attempting to assess in vivo mitochondrial bioenergetics. Here we introduce a method that advances the assessment of mitochondrial respiration in the brain by permeabilizing existing brain tissue to grant direct access to the mitochondrial reticulum in situ. The permeabilized brain preparation allows for instant analysis of mitochondrial function with unaltered mitochondrial morphology using significantly small sample sizes (∼2 mg), which permits the analysis of mitochondrial function in multiple subregions within a single mouse brain. Here this technique was applied to assess regional variation in brain mitochondrial function with acute ischaemia-reperfusion injuries and to determine the role of reactive oxygen species in exacerbating dysfunction through the application of a transgenic mouse model overexpressing catalase within mitochondria. Through creating accessibility to small regions for the investigation of mitochondrial function, the permeabilized brain preparation enhances the capacity for examining regional differences in mitochondrial regulation within the brain, as the majority of genetic models used for unique approaches exist in the mouse model. © 2014 The Authors. The Journal of Physiology © 2014 The Physiological Society.

  15. Comparison of W-Plasty vs Traditional Straight-Line Techniques for Primary Paramedian Forehead Flap Donor Site Closure.

    PubMed

    Jáuregui, Emmanuel J; Tummala, Neelima; Seth, Rahul; Arron, Sarah; Neuhaus, Isaac; Yu, Siegrid; Grekin, Roy; Knott, P Daniel

    2016-07-01

    The paramedian forehead flap (PMFF) donor site scar is hard to disguise and may be a source of patient dissatisfaction. To evaluate the aesthetic outcome of W-plasty vs traditional straight-line (SL) closure techniques of the PMFF donor site. A retrospective cohort study was conducted at the University of California, San Francisco Medical Center. Clinical history and operative reports were reviewed for 31 patients who underwent a PMFF procedure performed between November 1, 2011, and May 29, 2014. Blinded photographic analysis of postoperative photographs was performed. The pedicled component of the PMFF was raised primarily with either a W-plasty or traditional SL design. Standard photographs of the donor site, obtained at least 90 days after surgery, were reviewed and scored in a blinded fashion by 4 dermatologic surgeons using a 100-point visual analog scale (from 0 [worst possible outcome] to 100 [best possible outcome]) and a 5-point Likert scale (from very poor to excellent). Interrater reliability was assessed via Cronbach α testing. All 31 forehead flaps survived during the study period; 16 PMFFs were raised with the W-plasty technique and 15 with the SL technique. The W-plasty and SL groups were similar in terms of age, sex, and race/ethnicity (mean [SD] age, 68.4 [12.4] vs 61.8 [11.6] years; 13 [84%] vs 9 [60%] men; and 15 [94%] vs 13 [87%] white). Patients undergoing W-plasty closure had significantly higher mean visual analog scale scores than those undergoing SL closure (72.8 [18.3] vs 65.6 [18.1]; P = .03). Mean Likert scale scores for W-plasty were higher than those for SL closure, but the difference was not significant (3.77 [1.02] vs 3.43 [0.98]; P = .08). Overall interrater reliability for the visual analog scale and Likert scale scores was 0.67 and 0.58, respectively. Patients undergoing PMFF donor site closure using a primary W-plasty technique demonstrated better mean scar appearance of the forehead donor site than those undergoing SL closure. The primary W-plasty technique did not result in any PMFF losses and should be considered for appropriate patients. Level of evidence: 3.
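
    As a small illustration of the reliability statistic reported above, the sketch below computes Cronbach's alpha from a subjects-by-raters score matrix; the ratings are random placeholders.

        # Sketch: Cronbach's alpha for interrater reliability.
        import numpy as np

        def cronbach_alpha(scores):
            """scores: (n_subjects, n_raters) matrix of ratings."""
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_var = scores.var(axis=0, ddof=1).sum()   # sum of per-rater variances
            total_var = scores.sum(axis=1).var(ddof=1)    # variance of summed scores
            return k / (k - 1) * (1 - item_var / total_var)

        ratings = np.random.randint(40, 95, (31, 4))      # 31 flaps x 4 raters
        print(f"alpha = {cronbach_alpha(ratings):.2f}")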

  16. Applications of artificial intelligence V; Proceedings of the Meeting, Orlando, FL, May 18-20, 1987

    NASA Technical Reports Server (NTRS)

    Gilmore, John F. (Editor)

    1987-01-01

    The papers contained in this volume focus on current trends in applications of artificial intelligence. Topics discussed include expert systems, image understanding, artificial intelligence tools, knowledge-based systems, heuristic systems, manufacturing applications, and image analysis. Papers are presented on expert system issues in automated, autonomous space vehicle rendezvous; traditional versus rule-based programming techniques; applications to the control of optional flight information; methodology for evaluating knowledge-based systems; and real-time advisory system for airborne early warning.

  17. Working Group 3: Operations Analysis for Systems of System within a Networked C2 Context: Introduction, Purpose, and Approach

    DTIC Science & Technology

    2012-01-01

    1200: Session 3 – C2 Framework, OR Methods, MOOs, MOEs, MOPs Development Case Study … 1300-1630: Session 4 – Findings … Objective 1: Understand the impact of the application of traditional operational research techniques to networked C2 systems. • Objective 2: Develop … for the network. 3. Cost measures, including cost and time to implement the solution (for example, a basic rule-of-thumb I use for development …

  18. Reducing software mass through behavior control. [of planetary roving robots

    NASA Technical Reports Server (NTRS)

    Miller, David P.

    1992-01-01

    Attention is given to the tradeoff between communication and computation on a planetary rover: both subsystems are very power-intensive, and either can be the major driver of the rover's power subsystem and therefore of the minimum mass and size of the rover. Software techniques that can be used to reduce the requirements on both communication and computation, allowing the overall robot mass to be greatly reduced, are discussed. Novel approaches to autonomous control, called behavior control, employ an entirely different approach and, for many tasks, will yield a similar or superior level of autonomy to traditional control techniques while greatly reducing the computational demand. Traditional systems have several expensive processes that operate serially, while behavior techniques employ robot capabilities that run in parallel. Traditional systems make extensive world models, while behavior control systems use minimal world models or none at all.

  19. Pharmacovigilance of herbal medicines: the potential contributions of ethnobotanical and ethnopharmacological studies.

    PubMed

    Rodrigues, Eliana; Barnes, Joanne

    2013-01-01

    Typically, ethnobotanical/ethnopharmacological (EB/EP) surveys are used to describe uses, doses/dosages, sources and methods of preparation of traditional herbal medicines; their application to date in examining the adverse effects, contraindications and other safety aspects of these preparations is limited. From a pharmacovigilance perspective, numerous challenges exist in applying its existing methods to studying the safety profile of herbal medicines, particularly where used by indigenous cultures. This paper aims to contribute to the methodological aspects of EB/EP field work, and to extend the reach of pharmacovigilance, by proposing a tool comprising a list of questions that could be applied during interview and observational studies. The questions focus on the collection of information on the safety profile of traditional herbal medicines as it is embedded in traditional knowledge, as well as on identifying personal experiences (spontaneous reports) of adverse or undesirable effects associated with the use of traditional herbal medicines. Questions on the precise composition of traditional prescriptions or 'recipes', their preparation, storage, administration and dosing are also included. Strengths and limitations of the tool are discussed. From this interweaving of EB/EP and pharmacovigilance arises a concept of ethnopharmacovigilance for traditional herbal medicines: the scope of EB/EP is extended to include exploration of the potential harmful effects of medicinal plants, and the incorporation of pharmacovigilance questions into EB/EP studies provides a new opportunity for collection of 'general' traditional knowledge on the safety of traditional herbal medicines and, importantly, a conduit for collection of spontaneous reports of suspected adverse effects. Whether the proposed tool can yield data sufficiently rich and of an appropriate quality for application of EB/EP (e.g. data verification and quantitative analysis tools) and pharmacovigilance techniques (e.g. causality assessment and data mining) requires field testing.

  20. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.
