Sample records for modern analysis techniques

  1. Teaching Earth Signals Analysis Using the Java-DSP Earth Systems Edition: Modern and Past Climate Change

    ERIC Educational Resources Information Center

    Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.

    2014-01-01

    Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…

  2. Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis

    NASA Astrophysics Data System (ADS)

    Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr

    2017-10-01

    Most modern analyses in high energy physics use signal-versus-background classification techniques based on machine learning methods, and on neural networks in particular. Deep learning neural networks are the most promising modern technique for separating signal from background and can nowadays be widely and successfully implemented as part of a physics analysis. In this article we compare deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.
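
    For readers unfamiliar with this kind of classification, the sketch below shows the general pattern of a signal-versus-background classifier on synthetic data. It uses scikit-learn's MLPClassifier as a stand-in; the four Gaussian "kinematic" features, the network size, and all numbers are illustrative assumptions, not the networks or top-quark data used in the article.

```python
# Minimal signal-vs-background classification sketch (illustrative only).
# Assumes scikit-learn is available; the synthetic "kinematic" features are invented.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Toy "signal" and "background" events drawn from shifted Gaussians.
signal = rng.normal(loc=1.0, scale=1.0, size=(n, 4))
background = rng.normal(loc=0.0, scale=1.2, size=(n, 4))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```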

  3. An Introduction to Modern Missing Data Analyses

    ERIC Educational Resources Information Center

    Baraldi, Amanda N.; Enders, Craig K.

    2010-01-01

    A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches are advantageous over traditional techniques (e.g., deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of traditional…
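
    To make the contrast concrete, the sketch below compares simple mean imputation with an iterative (MICE-style) imputer on synthetic data with values missing at random. It assumes scikit-learn is available; the data-generating model and missingness rate are invented for illustration and are not taken from the article.

```python
# Illustrative comparison of mean imputation and iterative (MICE-style) imputation.
# Assumes scikit-learn; the data-generating model here is invented for the demo.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import SimpleImputer, IterativeImputer

rng = np.random.default_rng(1)
n = 1000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=n)   # correlated with x1
X_full = np.column_stack([x1, x2])

# Make 30% of x2 missing at random.
X_miss = X_full.copy()
mask = rng.random(n) < 0.3
X_miss[mask, 1] = np.nan

for name, imp in [("mean", SimpleImputer(strategy="mean")),
                  ("iterative", IterativeImputer(random_state=0))]:
    X_imp = imp.fit_transform(X_miss)
    rmse = np.sqrt(np.mean((X_imp[mask, 1] - X_full[mask, 1]) ** 2))
    print(f"{name:9s} imputation RMSE on missing entries: {rmse:.3f}")
```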

  4. Application of modern tools and techniques to maximize engineering productivity in the development of orbital operations plans for the space station program

    NASA Technical Reports Server (NTRS)

    Manford, J. S.; Bennett, G. R.

    1985-01-01

    The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.

  5. An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques

    ERIC Educational Resources Information Center

    de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.

    2017-01-01

    This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…

  6. Sample preparation for the analysis of isoflavones from soybeans and soy foods.

    PubMed

    Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A

    2009-01-02

    This manuscript provides a review of the current state, the most recent advances, and current trends and future prospects in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures, are discussed. The most commonly used methods for extraction of isoflavones with both conventional and “modern” techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.

  7. Supercritical fluid chromatography for GMP analysis in support of pharmaceutical development and manufacturing activities.

    PubMed

    Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J

    2016-01-05

    Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness; also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Pressure-Assisted Chelating Extraction as a Teaching Tool in Instrumental Analysis

    ERIC Educational Resources Information Center

    Sadik, Omowunmi A.; Wanekaya, Adam K.; Yevgeny, Gelfand

    2004-01-01

    A novel instrumental digestion technique using pressure-assisted chelating extraction (PACE) for the undergraduate laboratory is reported. This procedure is used to expose students to safe sample-preparation techniques, to correlate wet-chemical methods with modern instrumental analysis, and to compare the performance of PACE with conventional…

  9. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
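
    The core of HMM-based sequence analysis is a dynamic-programming decode. The sketch below is a minimal, generic log-space Viterbi decoder on a toy two-state model; it is not HMMER's profile-HMM implementation, and all probabilities are invented.

```python
# Minimal log-space Viterbi decoder for a generic HMM (not HMMER's profile HMM).
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """obs: sequence of observation indices; returns the most likely state path."""
    n_states = log_start.shape[0]
    T = len(obs)
    dp = np.full((T, n_states), -np.inf)       # best log-prob ending in each state
    back = np.zeros((T, n_states), dtype=int)  # backpointers
    dp[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        for s in range(n_states):
            scores = dp[t - 1] + log_trans[:, s]
            back[t, s] = np.argmax(scores)
            dp[t, s] = scores[back[t, s]] + log_emit[s, obs[t]]
    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Tiny two-state example with a three-symbol alphabet (all numbers invented).
log = np.log
start = log(np.array([0.6, 0.4]))
trans = log(np.array([[0.7, 0.3], [0.4, 0.6]]))
emit = log(np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]]))
print(viterbi([0, 1, 2, 2, 1], start, trans, emit))
```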

  10. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    DOE PAGES

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  11. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    PubMed Central

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed. PMID:25610632

  12. Safety analysis in test facility design

    NASA Astrophysics Data System (ADS)

    Valk, A.; Jonker, R. J.

    1990-09-01

    The application of safety analysis techniques, as developed in, for example, the nuclear and petrochemical industries, can be very beneficial in coping with the increasing complexity of modern test facility installations and their operations. To illustrate the various techniques available and their phasing within a project, an overview of the most commonly used techniques is presented. Two case studies are described: the hazard and operability study technique, and safety zoning in relation to the possible presence of asphyxiating atmospheres.

  13. A Course in Heterogeneous Catalysis: Principles, Practice, and Modern Experimental Techniques.

    ERIC Educational Resources Information Center

    Wolf, Eduardo E.

    1981-01-01

    Outlines a multidisciplinary course which comprises fundamental, practical, and experimental aspects of heterogeneous catalysis. The course structure is a combination of lectures and demonstrations dealing with the use of spectroscopic techniques for surface analysis. (SK)

  14. Identification of Microorganisms by Modern Analytical Techniques.

    PubMed

    Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica

    2017-11-01

    Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.

  15. Pedagogical Approach to the Modeling and Simulation of Oscillating Chemical Systems with Modern Software: The Brusselator Model

    ERIC Educational Resources Information Center

    Lozano-Parada, Jaime H.; Burnham, Helen; Martinez, Fiderman Machuca

    2018-01-01

    A classical nonlinear system, the "Brusselator", was used to illustrate the modeling and simulation of oscillating chemical systems using stability analysis techniques with modern software tools such as Comsol Multiphysics, Matlab, and Excel. A systematic approach is proposed in order to establish a regime of parametric conditions that…

  16. Practical Problems in the Cement Industry Solved by Modern Research Techniques

    ERIC Educational Resources Information Center

    Daugherty, Kenneth E.; Robertson, Les D.

    1972-01-01

    Practical chemical problems in the cement industry are being solved by such techniques as infrared spectroscopy, gas chromatography-mass spectrometry, X-ray diffraction, atomic absorption and arc spectroscopy, thermally evolved gas analysis, Mossbauer spectroscopy, transmission and scanning electron microscopy. (CP)

  17. Analytical description of the modern steam automobile

    NASA Technical Reports Server (NTRS)

    Peoples, J. A.

    1974-01-01

    The sensitivity of operating conditions upon performance of the modern steam automobile is discussed. The word modern has been used in the title to indicate that emphasis is upon miles per gallon rather than theoretical thermal efficiency. This has been accomplished by combining classical power analysis with the ideal Pressure-Volume diagram. Several parameters are derived which characterize performance capability of the modern steam car. The report illustrates that performance is dictated by the characteristics of the working medium, and the supply temperature. Performance is nearly independent of pressures above 800 psia. Analysis techniques were developed specifically for reciprocating steam engines suitable for automotive application. Specific performance charts have been constructed on the basis of water as a working medium. The conclusions and data interpretation are therefore limited within this scope.
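
    As a rough illustration of the kind of pressure-volume reasoning described above, the sketch below integrates an idealized indicator diagram (constant-pressure admission to cutoff, hyperbolic expansion, constant back pressure) to estimate indicated work per cycle. The cylinder volumes and pressures are invented and the model ignores clearance and compression; it is not the report's analysis.

```python
# Indicated work per cycle from an idealized steam-engine P-V diagram (illustrative).
# Constant-pressure admission to cutoff, hyperbolic (pV = const) expansion, and
# constant back-pressure exhaust; all numbers are invented, not the report's data.
import numpy as np

p_supply = 800.0 * 6894.76      # supply pressure, Pa (800 psia)
p_back = 20.0 * 6894.76         # back pressure, Pa
v_cutoff = 2.0e-4               # cylinder volume at cutoff, m^3
v_end = 1.0e-3                  # cylinder volume at end of stroke, m^3

# Work = admission work + expansion work - exhaust (return-stroke) work.
w_admission = p_supply * v_cutoff
w_expansion = p_supply * v_cutoff * np.log(v_end / v_cutoff)   # integral of pV = const
w_exhaust = p_back * v_end
w_indicated = w_admission + w_expansion - w_exhaust
print(f"indicated work per cycle: {w_indicated:.1f} J")
```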

  18. [Research progress and application prospect of near infrared spectroscopy in soil nutrition analysis].

    PubMed

    Ding, Hai-quan; Lu, Qi-peng

    2012-01-01

    "Digital agriculture" or "precision agriculture" is an important direction of modern agriculture technique. It is the combination of the modern information technique and traditional agriculture and becomes a hotspot field in international agriculture research in recent years. As a nondestructive, real-time, effective and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospect in agrology and gradually gained the recognition. The present paper intends to review the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should based on portable NIR spectrograph in order to acquire qualitative or quantitative information from real-time measuring in field. In addition, NIRS could be combined with space remote sensing to macroscopically control the way crop is growing and the nutrition crops need, to change the current state of our country's agriculture radically.

  19. Recommendations for Quantitative Analysis of Small Molecules by Matrix-assisted laser desorption ionization mass spectrometry

    PubMed Central

    Wang, Poguang; Giese, Roger W.

    2017-01-01

    Matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) has been used for quantitative analysis of small molecules for many years. It is usually preceded by an LC separation step when complex samples are tested. With the development several years ago of “modern MALDI” (automation, high-repetition-rate lasers, high-resolution peaks), the ease of use and performance of MALDI as a quantitative technique greatly increased. This review focuses on practical aspects of modern MALDI for quantitation of small molecules conducted in an ordinary way (no special reagents, devices or techniques for the spotting step of MALDI), and includes our ordinary, preferred methods. The review is organized as 18 recommendations with accompanying explanations, criticisms and exceptions. PMID:28118972

  20. Strategies for Fermentation Medium Optimization: An In-Depth Review

    PubMed Central

    Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.

    2017-01-01

    Optimization of the production medium is required to maximize metabolite yield. This can be achieved using a wide range of techniques, from classical “one-factor-at-a-time” approaches to modern statistical and mathematical techniques such as artificial neural networks (ANN) and genetic algorithms (GA). Every technique comes with its own advantages and disadvantages, and despite their drawbacks some techniques are applied to obtain the best results. Using various optimization techniques in combination can also provide the desired results. In this article an attempt has been made to review the media optimization techniques currently applied during fermentation for metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been done, and a logical basis for selecting the design of the fermentation medium is given in the present review. Overall, this review provides the rationale for selecting a suitable optimization technique for media design employed during the fermentation process of metabolite production. PMID:28111566
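
    As a toy illustration of the evolutionary approaches mentioned above, the sketch below runs a small genetic algorithm against an invented "yield" function of three medium components. It is a generic GA, not any specific strategy from the review, and all concentrations and parameters are made up.

```python
# Toy genetic algorithm for "medium optimization" on an invented yield function.
import numpy as np

rng = np.random.default_rng(3)

def yield_fn(x):
    """Invented smooth objective: peak yield at a particular medium composition."""
    optimum = np.array([5.0, 2.0, 8.0])          # g/L of three nutrients (made up)
    return np.exp(-np.sum((x - optimum) ** 2, axis=-1) / 10.0)

pop_size, n_genes, n_gen = 40, 3, 60
pop = rng.uniform(0.0, 10.0, size=(pop_size, n_genes))

for _ in range(n_gen):
    fitness = yield_fn(pop)
    # Tournament selection of parents.
    idx = rng.integers(0, pop_size, size=(pop_size, 2))
    parents = pop[np.where(fitness[idx[:, 0]] > fitness[idx[:, 1]],
                           idx[:, 0], idx[:, 1])]
    # Uniform crossover and Gaussian mutation, clipped to the feasible range.
    partners = parents[rng.permutation(pop_size)]
    cross = rng.random((pop_size, n_genes)) < 0.5
    children = np.where(cross, parents, partners)
    children += rng.normal(scale=0.3, size=children.shape)
    children = np.clip(children, 0.0, 10.0)
    # Elitism: carry over the best individual from the previous generation.
    children[0] = pop[np.argmax(fitness)]
    pop = children

best = pop[np.argmax(yield_fn(pop))]
print("best composition found:", best.round(2))
```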

  1. Forensic and homeland security applications of modern portable Raman spectroscopy.

    PubMed

    Izake, Emad L

    2010-10-10

    Modern detection and identification of chemical and biological hazards within the forensic and homeland security contexts may well require conducting the analysis in the field while adopting a non-contact approach to the hazard. Technological achievements in both surface-enhanced and resonance-enhanced Raman scattering have re-established Raman spectroscopy as the most adaptable spectroscopic technique for stand-off and non-contact analysis of hazards. On the other hand, spatially offset Raman spectroscopy has proved to be very valuable for non-invasive chemical analysis of hazards concealed within non-transparent containers and packaging. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  2. Neural data science: accelerating the experiment-analysis-theory cycle in large-scale neuroscience.

    PubMed

    Paninski, L; Cunningham, J P

    2018-06-01

    Modern large-scale multineuronal recording methodologies, including multielectrode arrays, calcium imaging, and optogenetic techniques, produce single-neuron resolution data of a magnitude and precision that were the realm of science fiction twenty years ago. The major bottlenecks in systems and circuit neuroscience no longer lie in simply collecting data from large neural populations, but rather in understanding these data: developing novel scientific questions, with corresponding analysis techniques and experimental designs to fully harness these new capabilities and meaningfully interrogate these questions. Advances in methods for signal processing, network analysis, dimensionality reduction, and optimal control, developed in lockstep with advances in experimental neurotechnology, promise major breakthroughs in multiple fundamental neuroscience problems. These trends are clear in a broad array of subfields of modern neuroscience; this review focuses on recent advances in methods for analyzing neural time-series data with single-neuronal precision. Copyright © 2018 Elsevier Ltd. All rights reserved.
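
    Dimensionality reduction is one of the method families mentioned above. The sketch below applies PCA to synthetic population spike counts driven by two invented latent oscillations; it assumes scikit-learn and is only meant to show the shape of such an analysis, not any method from the review.

```python
# Dimensionality reduction sketch: PCA on synthetic "population spike count" data.
# Assumes scikit-learn; the two latent signals driving the population are invented.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
n_neurons, n_timebins = 80, 2000
t = np.arange(n_timebins)
latents = np.vstack([np.sin(2 * np.pi * t / 200),        # slow oscillation
                     np.cos(2 * np.pi * t / 50)])        # faster oscillation
loadings = rng.normal(size=(n_neurons, 2))
rates = np.exp(0.5 * loadings @ latents)                 # positive firing rates
counts = rng.poisson(rates).T                            # shape (time, neurons)

pca = PCA(n_components=5)
scores = pca.fit_transform(counts - counts.mean(axis=0))
print("variance explained by first 5 PCs:",
      pca.explained_variance_ratio_.round(3))
```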

  3. Application of enhanced modern structured analysis techniques to Space Station Freedom electric power system requirements

    NASA Technical Reports Server (NTRS)

    Biernacki, John; Juhasz, John; Sadler, Gerald

    1991-01-01

    A team of Space Station Freedom (SSF) system engineers is conducting an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis (EMSA) methodology. Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.

  4. From experimental imaging techniques to virtual embryology.

    PubMed

    Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis

    2004-01-01

    Modern embryology increasingly relies on descriptive and functional three dimensional (3D) and four dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To cope with the technical requirements, new methods for high detailed in vivo imaging, as well as the generation of high resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence, in the context of embryo morphology, were recently developed and are under construction. These methods profoundly change the scientific applicability, appearance and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise and administrate embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods on the work of modern embryologists, including, research, teaching, the selection of specific model organisms, and potential collaborators.

  5. Applications of modern statistical methods to analysis of data in physical science

    NASA Astrophysics Data System (ADS)

    Wicker, James Eric

    Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970s, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960s, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960s and 1970s respectively, formed the basis of multivariate cluster analysis methodology for many years. However, shortcomings of these methods include strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values. In addition, we propose a generalization of the genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic-algorithm-based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We showcase how these algorithms can be used to process multivariate data from astronomical observations.
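
    The information-scoring idea contrasted with stepwise regression above can be illustrated with a Gaussian AIC computed over all candidate predictor subsets of a small synthetic regression problem. The sketch below is generic; the data, the true model and the exhaustive search are illustrative assumptions, not the spectroscopic application in the dissertation.

```python
# Information-criterion model comparison sketch (Gaussian AIC), not the author's code.
import itertools
import numpy as np

rng = np.random.default_rng(5)
n = 200
X = rng.normal(size=(n, 4))
# The true model uses only the first two predictors (made-up coefficients).
y = 1.5 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=1.0, size=n)

def aic(y, X_sub):
    """AIC for a least-squares fit with Gaussian errors: n*ln(RSS/n) + 2k."""
    design = np.column_stack([np.ones(len(y)), X_sub])
    beta, *_ = np.linalg.lstsq(design, y, rcond=None)
    rss = np.sum((y - design @ beta) ** 2)
    k = design.shape[1] + 1                      # coefficients + error variance
    return len(y) * np.log(rss / len(y)) + 2 * k

results = []
for size in range(1, 5):
    for subset in itertools.combinations(range(4), size):
        results.append((aic(y, X[:, subset]), subset))
print("best model by AIC uses predictors:", min(results)[1])
```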

  6. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    PubMed

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  7. CHAPTER 7: Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages

    PubMed Central

    Zhu, Rui; Zacharias, Lauren; Wooding, Kerry M.; Peng, Wenjing; Mechref, Yehia

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. PMID:28109440

  8. Optics for Processes, Products and Metrology

    NASA Astrophysics Data System (ADS)

    Mather, George

    1999-04-01

    Optical physics has a variety of applications in industry, including process inspection, coatings development, vision instrumentation, spectroscopy, and many others. Optics has been used extensively in the design of solar energy collection systems and coatings, for example. Also, with the availability of good CCD cameras and fast computers, it has become possible to develop real-time inspection and metrology devices that can accommodate the high throughputs encountered in modern production processes. More recently, developments in moiré interferometry show great promise for applications in the basic metals and electronics industries. The talk will illustrate applications of optics by discussing process inspection techniques for defect detection, part dimensioning, birefringence measurement, and the analysis of optical coatings in the automotive, glass, and optical disc industries. In particular, examples of optical techniques for the quality control of CD-R, MO, and CD-RW discs will be presented. In addition, the application of optical concepts to solar energy collector design and to metrology by moiré techniques will be discussed. Finally, some of the modern techniques and instruments used for qualitative and quantitative material analysis will be presented.

  9. Advances in the microrheology of complex fluids

    NASA Astrophysics Data System (ADS)

    Waigh, Thomas Andrew

    2016-07-01

    New developments in the microrheology of complex fluids are considered. Firstly the requirements for a simple modern particle tracking microrheology experiment are introduced, the error analysis methods associated with it and the mathematical techniques required to calculate the linear viscoelasticity. Progress in microrheology instrumentation is then described with respect to detectors, light sources, colloidal probes, magnetic tweezers, optical tweezers, diffusing wave spectroscopy, optical coherence tomography, fluorescence correlation spectroscopy, elastic- and quasi-elastic scattering techniques, 3D tracking, single molecule methods, modern microscopy methods and microfluidics. New theoretical techniques are also reviewed such as Bayesian analysis, oversampling, inversion techniques, alternative statistical tools for tracks (angular correlations, first passage probabilities, the kurtosis, motor protein step segmentation etc), issues in micro/macro rheological agreement and two particle methodologies. Applications where microrheology has begun to make some impact are also considered including semi-flexible polymers, gels, microorganism biofilms, intracellular methods, high frequency viscoelasticity, comb polymers, active motile fluids, blood clots, colloids, granular materials, polymers, liquid crystals and foods. Two large emergent areas of microrheology, non-linear microrheology and surface microrheology are also discussed.
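
    Particle-tracking microrheology starts from the mean-squared displacement (MSD) of a probe. The sketch below estimates the MSD of a simulated 2D Brownian track and recovers the diffusion coefficient from the slope; the diffusion coefficient, frame interval and track length are invented, and a real analysis would go on to apply the generalized Stokes-Einstein relation to obtain viscoelastic moduli.

```python
# Mean-squared displacement (MSD) sketch for a simulated 2D Brownian probe.
# The diffusion coefficient and frame rate are invented; real data come from tracking.
import numpy as np

rng = np.random.default_rng(6)
D, dt, n_steps = 0.1, 0.01, 5000             # um^2/s, s, number of frames
steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(n_steps, 2))
track = np.cumsum(steps, axis=0)             # x, y positions in micrometres

def msd(track, max_lag):
    lags = np.arange(1, max_lag + 1)
    out = np.empty(len(lags))
    for i, lag in enumerate(lags):
        disp = track[lag:] - track[:-lag]
        out[i] = np.mean(np.sum(disp ** 2, axis=1))
    return lags * dt, out

tau, m = msd(track, 100)
# For free diffusion in 2D, MSD(tau) ~ 4*D*tau, so the fitted slope recovers D.
D_est = np.polyfit(tau, m, 1)[0] / 4.0
print(f"estimated D = {D_est:.3f} um^2/s (true value {D})")
```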

  10. Atmospheric Chemistry and Transport from Space Observations

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark R.

    2002-01-01

    This lecture will cover the basic ideas of space observations of chemical constituents, modern analysis techniques and results. I will show analysis using TOMS, UARS, SAGE, Terra. I will show some of the planned missions for the US that will launch in the next few years.

  11. Loop shaping design for tracking performance in machine axes.

    PubMed

    Schinstock, Dale E; Wei, Zhouhong; Yang, Tao

    2006-01-01

    A modern interpretation of classical loop shaping control design methods is presented in the context of tracking control for linear motor stages. Target applications include noncontacting machines such as laser cutters and markers, water jet cutters, and adhesive applicators. The methods are directly applicable to the common PID controller and are pertinent to many electromechanical servo actuators other than linear motors. In addition to explicit design techniques a PID tuning algorithm stressing the importance of tracking is described. While the theory behind these techniques is not new, the analysis of their application to modern systems is unique in the research literature. The techniques and results should be important to control practitioners optimizing PID controller designs for tracking and in comparing results from classical designs to modern techniques. The methods stress high-gain controller design and interpret what this means for PID. Nothing in the methods presented precludes the addition of feedforward control methods for added improvements in tracking. Laboratory results from a linear motor stage demonstrate that with large open-loop gain very good tracking performance can be achieved. The resultant tracking errors compare very favorably to results from similar motions on similar systems that utilize much more complicated controllers.
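
    To make the tracking discussion concrete, the sketch below simulates a discrete-time PID loop around an invented second-order (mass-damper) plant following a sinusoidal reference. The plant parameters and gains are illustrative assumptions, not the linear-motor stage or the tuning algorithm of the paper.

```python
# Minimal PID tracking simulation on an invented second-order plant (mass-damper).
import numpy as np

dt, T = 1e-3, 2.0
t = np.arange(0.0, T, dt)
ref = np.sin(2 * np.pi * 1.0 * t)            # 1 Hz reference trajectory

m, b = 1.0, 2.0                              # plant: m*x'' + b*x' = u (made up)
Kp, Ki, Kd = 400.0, 200.0, 40.0              # illustrative high-gain PID values

x = v = integ = prev_err = 0.0
err_log = np.empty_like(t)
for i, r in enumerate(ref):
    err = r - x
    integ += err * dt
    deriv = (err - prev_err) / dt if i > 0 else 0.0
    u = Kp * err + Ki * integ + Kd * deriv
    # Explicit Euler step of the plant dynamics.
    a = (u - b * v) / m
    v += a * dt
    x += v * dt
    prev_err = err
    err_log[i] = err

print("RMS tracking error:", np.sqrt(np.mean(err_log ** 2)).round(4))
```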

  12. The Responses of Tenth-Grade Students to Four Novels.

    ERIC Educational Resources Information Center

    Grindstaff, Faye Louise

    To compare structural analysis with experiential reflective analysis as teaching techniques for literature, a study was made of the written responses of three groups of typical 10th-graders after reading four modern novels--Paul Annixter's "Swiftwater," Ray Bradbury's "Fahrenheit 451," Bel Kaufman's "Up the Down Staircase," and John…

  13. [Inheritance and evolution of acupuncture manipulation techniques of Zhejiang acupuncture masters in modern times].

    PubMed

    Yu, Daxiong; Ma, Ruijie; Fang, Jianqiao

    2015-05-01

    In modern times the regions of Zhejiang province have produced many eminent acupuncture masters, giving rise to acupuncture schools with numerous distinctive characteristics and an important influence at home and abroad. Through a collection of the literature on the acupuncture schools of Zhejiang and interviews with the parties involved, it has been found that the acupuncture manipulation techniques of these modern masters are distinctively featured. The techniques were developed on the basis of Neijing (Internal Classic), Jinzhenfu (Ode to Gold Needle) and Zhenjiu Dacheng (Great Compendium of Acupuncture and Moxibustion). Whether trained under the old masters or self-taught, every master lays emphasis on the study and interpretation of the classical theories and integrates the traditional with the modern. In this paper, the acupuncture manipulation techniques of modern Zhejiang acupuncture masters are presented from four aspects: needling techniques in the Internal Classic, the feijingzouqi needling technique, the penetrating needling technique, and innovations in acupuncture manipulation.

  14. Increasing the reliability of ecological models using modern software engineering techniques

    Treesearch

    Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff

    2009-01-01

    Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...

  15. [Discussion on the cultural loss and return of modern acupuncture].

    PubMed

    Liu, Bing; Zhao, Jing-sheng; Gao, Shu-zhong

    2009-08-01

    Philosophical ontology analysis was used in this study to explore the internal factors related to the cultural loss of modern acupuncture, and to establish theoretical constructs and a clinical model for its cultural return. It is indicated that the most important factors related to the cultural loss of modern acupuncture are the separation of technical characteristics from cultural connotations and the diversion of modern techniques away from classical acupuncture. An effective path to cultural return is to build a harmonious theoretical and clinical model for developing acupuncture. Building on acupuncture's own cultural roots, its traditional sense and cultural values should be strengthened to facilitate the cultural return of acupuncture in both theory and clinical practice.

  16. Modern Radiation Therapy for Nodal Non-Hodgkin Lymphoma—Target Definition and Dose Guidelines From the International Lymphoma Radiation Oncology Group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Illidge, Tim, E-mail: Tim.Illidge@ics.manchester.ac.uk; Specht, Lena; Yahalom, Joachim

    2014-05-01

    Radiation therapy (RT) is the most effective single modality for local control of non-Hodgkin lymphoma (NHL) and is an important component of therapy for many patients. Many of the historic concepts of dose and volume have recently been challenged by the advent of modern imaging and RT planning tools. The International Lymphoma Radiation Oncology Group (ILROG) has developed these guidelines after multinational meetings and analysis of available evidence. The guidelines represent an agreed consensus view of the ILROG steering committee on the use of RT in NHL in the modern era. The roles of reduced volume and reduced doses are addressed, integrating modern imaging with 3-dimensional planning and advanced techniques of RT delivery. In the modern era, in which combined-modality treatment with systemic therapy is appropriate, the previously applied extended-field and involved-field RT techniques that targeted nodal regions have now been replaced by limiting the RT to smaller volumes based solely on detectable nodal involvement at presentation. A new concept, involved-site RT, defines the clinical target volume. For indolent NHL, often treated with RT alone, larger fields should be considered. Newer treatment techniques, including intensity modulated RT, breath holding, image guided RT, and 4-dimensional imaging, should be implemented, and their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control.

  17. Introduction to Time Series Analysis

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.

    1986-01-01

    The field of time series analysis is explored from its logical foundations to the most modern data analysis techniques. The presentation is developed, as far as possible, for continuous data, so that the inevitable use of discrete mathematics is postponed until the reader has gained some familiarity with the concepts. The monograph seeks to provide the reader with both the theoretical overview and the practical details necessary to correctly apply the full range of these powerful techniques. In addition, the last chapter introduces many specialized areas where research is currently in progress.
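
    A minimal example of the spectral estimation such a monograph builds toward is the windowed periodogram below, which recovers the frequency of a synthetic noisy tone with the FFT. The sampling rate, tone frequency and noise level are invented for the demonstration.

```python
# Periodogram sketch: estimating the spectrum of a noisy sinusoid with the FFT.
import numpy as np

rng = np.random.default_rng(7)
fs, n = 100.0, 2048                          # sampling rate (Hz), number of samples
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 12.5 * t) + 0.5 * rng.normal(size=n)   # 12.5 Hz tone + noise

window = np.hanning(n)                       # taper to reduce spectral leakage
X = np.fft.rfft(x * window)
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
power = (np.abs(X) ** 2) / np.sum(window ** 2)

# Skip the DC bin when locating the dominant peak.
print("peak frequency estimate:", freqs[np.argmax(power[1:]) + 1], "Hz")
```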

  18. Injuries in students of three different dance techniques.

    PubMed

    Echegoyen, Soledad; Acuña, Eugenia; Rodríguez, Cristina

    2010-06-01

    As with any athlete, the dancer has a high risk for injury. Most studies carried out relate to classical and modern dance; however, there is a lack of reports on injuries involving other dance techniques. This study is an attempt to determine the differences in the incidence, the exposure-related rates, and the kind of injuries in three different dance techniques. A prospective study about dance injuries was carried out between 2004 and 2007 on students of modern, Mexican folkloric, and Spanish dance at the Escuela Nacional de Danza. A total of 1,168 injuries were registered in 444 students; the injury rate was 4 injuries/student for modern dance and 2 injuries/student for Mexican folkloric and Spanish dance. The rate per training hours was 4 for modern, 1.8 for Mexican folkloric, and 1.5 injuries/1,000 hr of training for Spanish dance. The lower extremity is the most frequent structure injured (70.47%), and overuse injuries comprised 29% of the total. The most frequent injuries were strain, sprain, back pain, and patellofemoral pain. This study has a consistent medical diagnosis of the injuries and is the first attempt in Mexico to compare the incidence of injuries in different dance techniques. To decrease the frequency of student injury, it is important to incorporate prevention programs into dance program curricula. More studies are necessary to define causes and mechanisms of injury, as well as an analysis of training methodology, to decrease the incidence of the muscle imbalances resulting in injury.

  19. The identification of synthetic organic pigments in modern paints and modern paintings using pyrolysis-gas chromatography-mass spectrometry.

    PubMed

    Russell, Joanna; Singer, Brian W; Perry, Justin J; Bacon, Anne

    2011-05-01

    A collection of more than 70 synthetic organic pigments were analysed using pyrolysis-gas chromatography-mass spectrometry (Py-GC-MS). We report on the analysis of diketo-pyrrolo-pyrrole, isoindolinone and perylene pigments which are classes not previously reported as being analysed by this technique. We also report on a number of azo pigments (2-naphthol, naphthol AS, arylide, diarylide, benzimidazolone and disazo condensation pigments) and phthalocyanine pigments, the Py-GC-MS analysis of which has not been previously reported. The members of each class were found to fragment in a consistent way and the pyrolysis products are reported. The technique was successfully applied to the analysis of paints used by the artist Francis Bacon (1909-1992), to simultaneously identify synthetic organic pigments and synthetic binding media in two samples of paint taken from Bacon's studio and micro-samples taken from three of his paintings and one painting attributed to him.

  20. Space Suit Performance: Methods for Changing the Quality of Quantitative Data

    NASA Technical Reports Server (NTRS)

    Cowley, Matthew; Benson, Elizabeth; Rajulu, Sudhakar

    2014-01-01

    NASA is currently designing a new space suit capable of working in deep space and on Mars. Designing a suit is very difficult and often requires trade-offs between performance, cost, mass, and system complexity. To verify that new suits will enable astronauts to perform to their maximum capacity, prototype suits must be built and tested with human subjects. However, engineers and flight surgeons often have difficulty understanding and applying traditional representations of human data without training. To overcome these challenges, NASA is developing modern simulation and analysis techniques that focus on 3D visualization. Understanding actual performance early in the design cycle is extremely advantageous for increasing performance capabilities, reducing the risk of injury, and reducing costs. The primary objective of this project was to test modern simulation and analysis techniques for evaluating the performance of a human operating in extra-vehicular space suits.

  1. Using the Mini-Session Course Format to Train Students in the Practical Aspects of Modern Mass Spectrometry

    ERIC Educational Resources Information Center

    Rosado, Dale A., Jr.; Masterson, Tina S.; Masterson, Douglas S.

    2011-01-01

    Mass spectrometry (MS) has been gaining in popularity in recent years owing in large part to the development of soft-ionization techniques such as matrix-assisted laser desorption ionization (MALDI) and electrospray ionization (ESI). These soft-ionization techniques have opened up the field of MS analysis to biomolecules, polymers, and other high…

  2. Market Analysis. What Is It? How Does It Fit into Comprehensive Institutional Planning?

    ERIC Educational Resources Information Center

    Groff, Warren

    The basic principles of market analysis are examined in this paper especially as they relate to institutional planning. Introductory material presents background information, including: (1) a description of two projects undertaken to implement modern management techniques at small colleges; (2) an examination of three marketing philosophies; and…

  3. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.

  4. Maximizing the U.S. Army’s Future Contribution to Global Security Using the Capability Portfolio Analysis Tool (CPAT)

    DOE PAGES

    Davis, Scott J.; Edwards, Shatiel B.; Teper, Gerald E.; ...

    2016-02-01

    We report that recent budget reductions have posed tremendous challenges to the U.S. Army in managing its portfolio of ground combat systems (tanks and other fighting vehicles), thus placing many important programs at risk. To address these challenges, the Army and a supporting team developed and applied the Capability Portfolio Analysis Tool (CPAT) to optimally invest in ground combat modernization over the next 25–35 years. CPAT provides the Army with the analytical rigor needed to help senior Army decision makers allocate scarce modernization dollars to protect soldiers and maintain capability overmatch. CPAT delivers unparalleled insight into multiple-decade modernization planning using a novel multiphase mixed-integer linear programming technique and illustrates a cultural shift toward analytics in the Army’s acquisition thinking and processes. CPAT analysis helped shape decisions to continue modernization of the $10 billion Stryker family of vehicles (originally slated for cancellation) and to strategically reallocate over $20 billion to existing modernization programs by not pursuing the Ground Combat Vehicle program as originally envisioned. Ultimately, more than 40 studies have been completed using CPAT, applying operations research methods to optimally prioritize billions of taxpayer dollars and allowing Army acquisition executives to base investment decisions on analytically rigorous evaluations of portfolio trade-offs.
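
    CPAT itself is a large multiphase mixed-integer linear program; the sketch below shows only the flavor of such a formulation, a single-period binary portfolio selection under one budget constraint, solved with scipy.optimize.milp (SciPy 1.9+). All program values, costs and the budget are invented.

```python
# Toy portfolio-selection MILP in the spirit of CPAT (all numbers are invented).
# Assumes SciPy >= 1.9 for scipy.optimize.milp.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

value = np.array([10.0, 6.0, 8.0, 4.0, 7.0])   # capability value of each program
cost = np.array([5.0, 3.0, 4.0, 2.0, 4.0])     # cost in $B
budget = 9.0

# Maximize total value  <=>  minimize -value @ x, with x binary (0/1 per program).
res = milp(c=-value,
           constraints=LinearConstraint(cost[np.newaxis, :], -np.inf, budget),
           integrality=np.ones(5),
           bounds=Bounds(0, 1))
selected = np.flatnonzero(res.x > 0.5)
print("fund programs:", selected, "total value:", value[selected].sum())
```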

  5. Introduction to Modern Methods in Light Microscopy.

    PubMed

    Ryan, Joel; Gerhold, Abby R; Boudreau, Vincent; Smith, Lydia; Maddox, Paul S

    2017-01-01

    For centuries, light microscopy has been a key method in biological research, from the early work of Robert Hooke describing biological organisms as cells, to the latest in live-cell and single-molecule systems. Here, we introduce some of the key concepts related to the development and implementation of modern microscopy techniques. We briefly discuss the basics of optics in the microscope, super-resolution imaging, quantitative image analysis, live-cell imaging, and provide an outlook on active research areas pertaining to light microscopy.

  6. A pilot modeling technique for handling-qualities research

    NASA Technical Reports Server (NTRS)

    Hess, R. A.

    1980-01-01

    A brief survey of the more dominant analysis techniques used in closed-loop handling-qualities research is presented. These techniques are shown to rely on so-called classical and modern analytical models of the human pilot which have their foundation in the analysis and design principles of feedback control. The optimal control model of the human pilot is discussed in some detail and a novel approach to the a priori selection of pertinent model parameters is discussed. Frequency domain and tracking performance data from 10 pilot-in-the-loop simulation experiments involving 3 different tasks are used to demonstrate the parameter selection technique. Finally, the utility of this modeling approach in handling-qualities research is discussed.

  7. What can comparative effectiveness research, propensity score and registry study bring to Chinese medicine?

    PubMed

    Liao, Xing; Xie, Yan-ming

    2014-10-01

    The impact of evidence-based medicine and clinical epidemiology on clinical research has contributed to the development of Chinese medicine in modern times over the past two decades. Many concepts and methods of modern science and technology are emerging in Chinese medicine research, resulting in constant progress. Systematic reviews, randomized controlled trials and other advanced mathematical approaches and statistical analysis methods have brought reform to Chinese medicine. In this new era, Chinese medicine researchers have many opportunities and challenges. On the one hand, Chinese medicine researchers need to dedicate themselves to providing enough evidence to the world through rigorous studies, whilst on the other hand, they also need to keep up with the speed of modern medicine research. For example, real-world studies, comparative effectiveness research, propensity score techniques and registry studies have recently emerged. This article aims to inspire Chinese medicine researchers to explore new areas by introducing these new ideas and new techniques.

  8. Current role of modern radiotherapy techniques in the management of breast cancer

    PubMed Central

    Ozyigit, Gokhan; Gultekin, Melis

    2014-01-01

    Breast cancer is the most common type of malignancy in females. Advances in systemic therapies and radiotherapy (RT) provided long survival rates in breast cancer patients. RT has a major role in the management of breast cancer. During the past 15 years several developments took place in the field of imaging and irradiation techniques, intensity modulated RT, hypofractionation and partial-breast irradiation. Currently, improvements in the RT technology allow us a subsequent decrease in the treatment-related complications such as fibrosis and long-term cardiac toxicity while improving the loco-regional control rates and cosmetic results. Thus, it is crucial that modern radiotherapy techniques should be carried out with maximum care and efficiency. Several randomized trials provided evidence for the feasibility of modern radiotherapy techniques in the management of breast cancer. However, the role of modern radiotherapy techniques in the management of breast cancer will continue to be defined by the mature results of randomized trials. Current review will provide an up-to-date evidence based data on the role of modern radiotherapy techniques in the management of breast cancer. PMID:25114857

  9. An Information-Systems Program for the Language Sciences. Final Report on Survey-and-Analysis Stage, 1967-1968.

    ERIC Educational Resources Information Center

    Freeman, Robert R.; And Others

    The main results of the survey-and-analysis stage include a substantial collection of preliminary data on the language-sciences information user community, its professional specialties and information channels, its indexing tools, and its terminologies. The prospects and techniques for the development of a modern, discipline-based information…

  10. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.

  11. Photovoltaic power system reliability considerations

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.

    1980-01-01

    This paper describes an example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems. This particular application was for a solar cell power system demonstration project in Tangaye, Upper Volta, Africa. The techniques involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of a fail-safe and planned spare parts engineering philosophy.

  12. Trends of Modern Contraceptive Use among Young Married Women Based on the 2000, 2005, and 2011 Ethiopian Demographic and Health Surveys: A Multivariate Decomposition Analysis

    PubMed Central

    Worku, Abebaw Gebeyehu; Tessema, Gizachew Assefa; Zeleke, Atinkut Alamirrew

    2015-01-01

    Introduction: Accessing family planning can reduce a significant proportion of maternal, infant, and childhood deaths. In Ethiopia, use of modern contraceptive methods is low but it is increasing. This study aimed to analyze the trends and determinants of changes in modern contraceptive use over time among young married women in Ethiopia. Methods: The study used data from the three Demographic and Health Surveys conducted in Ethiopia in 2000, 2005, and 2011. Young married women age 15–24 years, with sample sizes of 2,157 in 2000, 1,904 in 2005, and 2,146 in 2011, were included. A logit-based decomposition analysis technique was used to analyze the factors contributing to the recent changes. STATA 12 was employed for data management and analyses. All calculations presented in this paper were weighted for the sampling probabilities and non-response. Complex sampling procedures were also considered during testing of statistical significance. Results: Among young married women, modern contraceptive prevalence increased from 6% in 2000 to 16% in 2005 and to 36% in 2011. The decomposition analysis indicated that 34% of the overall change in modern contraceptive use was due to differences in women’s characteristics. Changes in the composition of young women’s characteristics according to age, educational status, religion, couple concordance on family size, and fertility preference were the major sources of this increase. Two-thirds of the increase in modern contraceptive use was due to differences in coefficients. Most importantly, the increase was due to change in contraceptive use behavior among the rural population (33%) and among Orthodox Christians (16%) and Protestants (4%). Conclusions: Modern contraceptive use among young married women has shown a remarkable increase over the last decade in Ethiopia. Programmatic interventions targeting poor, younger (adolescent), illiterate, and Muslim women would help to maintain the increasing trend in modern contraceptive use. PMID:25635389
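
    The logit-based decomposition mentioned above splits a change in prevalence between two surveys into a part due to changed characteristics (endowments) and a part due to changed coefficients (behavior). The sketch below performs a minimal nonlinear decomposition on synthetic two-survey data using scikit-learn's LogisticRegression; the covariate, coefficients and sample sizes are invented, and the counterfactual ordering shown is one of several possible choices, not the authors' exact procedure.

```python
# Minimal nonlinear (logit) decomposition sketch on synthetic two-survey data.
# Assumes scikit-learn; covariates, coefficients and sample sizes are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)

def simulate(n, mean_edu, beta):
    edu = rng.normal(mean_edu, 1.0, n)            # e.g. years of schooling
    X = edu[:, None]
    p = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * edu)))
    y = rng.binomial(1, p)
    return X, y

# "Survey A" (earlier) and "survey B" (later): both composition and behavior shift.
X_a, y_a = simulate(2000, mean_edu=4.0, beta=(-3.0, 0.3))
X_b, y_b = simulate(2000, mean_edu=6.0, beta=(-2.0, 0.3))

fit_a = LogisticRegression().fit(X_a, y_a)
fit_b = LogisticRegression().fit(X_b, y_b)

p_aa = fit_a.predict_proba(X_a)[:, 1].mean()      # A's traits, A's coefficients
p_ba = fit_a.predict_proba(X_b)[:, 1].mean()      # B's traits, A's coefficients
p_bb = fit_b.predict_proba(X_b)[:, 1].mean()      # B's traits, B's coefficients

print(f"total change:            {p_bb - p_aa:.3f}")
print(f"due to characteristics:  {p_ba - p_aa:.3f}")
print(f"due to coefficients:     {p_bb - p_ba:.3f}")
```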

  13. Four Bad Habits of Modern Psychologists

    PubMed Central

    Grice, James; Cota, Lisa; Taylor, Zachery; Garner, Samantha; Medellin, Eliwid; Vest, Adam

    2017-01-01

    Four data sets from studies included in the Reproducibility Project were re-analyzed to demonstrate a number of flawed research practices (i.e., “bad habits”) of modern psychology. Three of the four studies were successfully replicated, but re-analysis showed that in one study most of the participants responded in a manner inconsistent with the researchers’ theoretical model. In the second study, the replicated effect was shown to be an experimental confound, and in the third study the replicated statistical effect was shown to be entirely trivial. The fourth study was an unsuccessful replication, yet re-analysis of the data showed that questioning the common assumptions of modern psychological measurement can lead to novel techniques of data analysis and potentially interesting findings missed by traditional methods of analysis. Considered together, these new analyses show that while it is true replication is a key feature of science, causal inference, modeling, and measurement are equally important and perhaps more fundamental to obtaining truly scientific knowledge of the natural world. It would therefore be prudent for psychologists to confront the limitations and flaws in their current analytical methods and research practices. PMID:28805739

  14. Four Bad Habits of Modern Psychologists.

    PubMed

    Grice, James; Barrett, Paul; Cota, Lisa; Felix, Crystal; Taylor, Zachery; Garner, Samantha; Medellin, Eliwid; Vest, Adam

    2017-08-14

    Four data sets from studies included in the Reproducibility Project were re-analyzed to demonstrate a number of flawed research practices (i.e., "bad habits") of modern psychology. Three of the four studies were successfully replicated, but re-analysis showed that in one study most of the participants responded in a manner inconsistent with the researchers' theoretical model. In the second study, the replicated effect was shown to be an experimental confound, and in the third study the replicated statistical effect was shown to be entirely trivial. The fourth study was an unsuccessful replication, yet re-analysis of the data showed that questioning the common assumptions of modern psychological measurement can lead to novel techniques of data analysis and potentially interesting findings missed by traditional methods of analysis. Considered together, these new analyses show that while it is true replication is a key feature of science, causal inference, modeling, and measurement are equally important and perhaps more fundamental to obtaining truly scientific knowledge of the natural world. It would therefore be prudent for psychologists to confront the limitations and flaws in their current analytical methods and research practices.

  15. "To Educate Children from Birth": A Genealogical Analysis of Some Practices of Subjectivation in Spanish and French Scientific Childcare (1898-1939)

    ERIC Educational Resources Information Center

    Jiménez-Alonso, Belén; Loredo-Narciandi, José Carlos

    2016-01-01

    The aim of this paper is to analyse certain techniques of subjectivation in modern child-rearing and the way in which medical discourse leads to the construction of children through those techniques. As a case study, several manuals on childcare used during the first third of the twentieth century in Spain and France have been selected. A…

  16. How to Compare the Security Quality Requirements Engineering (SQUARE) Method with Other Methods

    DTIC Science & Technology

    2007-08-01

    Attack Trees for Modeling and Analysis; 2.8 Misuse and Abuse Cases; 2.9 Formal Methods; 2.9.1 Software Cost Reduction; 2.9.2 Common...modern or efficient techniques. • Requirements analysis typically is either not performed at all (identified requirements are directly specified without...any analysis or modeling) or analysis is restricted to functional requirements and ignores quality requirements, other nonfunctional requirements

  17. Design techniques for low-voltage analog integrated circuits

    NASA Astrophysics Data System (ADS)

    Rakús, Matej; Stopjaková, Viera; Arbet, Daniel

    2017-08-01

    In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (IC) are performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic threshold MOS transistors and MOS transistors operating in weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with the power supply voltage of 600 mV (or even lower), which is the key feature towards integrated systems for modern portable applications.

  18. Atmospheric aerosols: A literature summary of their physical characteristics and chemical composition

    NASA Technical Reports Server (NTRS)

    Harris, F. S., Jr.

    1976-01-01

    This report contains a summary of 199 recent references on the characterization of atmospheric aerosols with respect to their composition, sources, size distribution, and time changes, and with particular reference to the chemical elements measured by modern techniques, especially activation analysis.

  19. Modern separation techniques coupled to high performance mass spectrometry for glycolipid analysis.

    PubMed

    Sarbu, Mirela; Zamfir, Alina Diana

    2018-01-21

    Glycolipids (GLs), involved in biological processes and pathologies such as viral, neurodegenerative and oncogenic transformations, are in the focus of research related to method development for structural analysis. This review highlights modern separation techniques coupled to mass spectrometry (MS) for the investigation of GLs from various biological matrices. The first section is dedicated to methods which, although they provide separation in a non-liquid phase, are able to supply important data on the composition of complex mixtures. While classical thin layer chromatography (TLC) is useful for MS analyses of the fractionated samples, ultramodern ion mobility spectrometry (IMS), characterized by high reproducibility, makes it possible to discover minor species and to work with low sample amounts, in addition to providing conformational separation with isomer discrimination. The second section highlights the advantages, applications and limitations of liquid-based separation techniques such as high performance liquid chromatography (HPLC) and hydrophilic interaction liquid chromatography (HILIC) in direct or indirect coupling to MS for glycolipidomics surveys. On- and off-line capillary electrophoresis (CE) MS, offering remarkable separation efficiency for GLs, is also presented and critically assessed from the technical and application perspective in the final part of the review. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Image analysis in modern ophthalmology: from acquisition to computer assisted diagnosis and telemedicine

    NASA Astrophysics Data System (ADS)

    Marrugo, Andrés G.; Millán, María S.; Cristóbal, Gabriel; Gabarda, Salvador; Sorel, Michal; Sroubek, Filip

    2012-06-01

    Medical digital imaging has become a key element of modern health care procedures. It provides visual documentation and a permanent record for the patients and, most importantly, the ability to extract information about many diseases. Modern ophthalmology thrives and develops on the advances in digital imaging and computing power. In this work we present an overview of recent image processing techniques proposed by the authors in the area of digital eye fundus photography. Our applications range from retinal image quality assessment to image restoration via blind deconvolution and visualization of structural changes in time between patient visits. All are proposed within a framework for improving and assisting the medical practice and the forthcoming scenario of the information chain in telemedicine.

  1. Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)

    ERIC Educational Resources Information Center

    Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.

    2016-01-01

    The relevance of the present issue is caused by the strong need for alternative methods of learning foreign languages and the need for language training and retraining for modern professionals. The aim of the article is to identify the basic techniques and skills in using various modern techniques in the context of modern educational tasks. The…

  2. Testing Web Applications with Mutation Analysis

    ERIC Educational Resources Information Center

    Praphamontripong, Upsorn

    2017-01-01

    Web application software uses new technologies that have novel methods for integration and state maintenance that amount to new control flow mechanisms and new variable scoping. While modern web development technologies enhance the capabilities of web applications, they introduce challenges that current testing techniques do not adequately test…

  3. Prehistoric Iroquois Medicine

    ERIC Educational Resources Information Center

    Hosbach, Richard E.; Doyle, Robert E.

    1976-01-01

    Study of pre-1750 medicine reveals that Iroquois diagnosis and treatment of disease was more advanced than the medicine of their European counterparts. The Iroquois developed a cure for scurvy, treated hypertension and head lice, and even designed sauna baths. Indian psychiatry also included modern day techniques such as dream analysis. (MR)

  4. Double Density Dual Tree Discrete Wavelet Transform implementation for Degraded Image Enhancement

    NASA Astrophysics Data System (ADS)

    Vimala, C.; Aruna Priya, P.

    2018-04-01

    The wavelet transform is a main tool for image processing applications in modern life. A Double Density Dual Tree Discrete Wavelet Transform is used and investigated for image denoising. Images are considered for the analysis, and the performance is compared with the discrete wavelet transform and the Double Density DWT. Peak Signal to Noise Ratio values and Root Mean Square error are calculated for the denoised images in all three wavelet techniques, and the performance is evaluated. The proposed technique gives better performance than the other two wavelet techniques.
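
    A minimal sketch of the denoise-and-score workflow described in this abstract, using the ordinary 2-D discrete wavelet transform from PyWavelets as a stand-in (the double-density dual-tree transform itself is not provided by PyWavelets); the wavelet, decomposition level and threshold value are illustrative assumptions.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def dwt_denoise(image, wavelet="db4", level=2, thresh=20.0):
        # Decompose, soft-threshold the detail coefficients, reconstruct
        coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
        shrunk = [coeffs[0]] + [
            tuple(pywt.threshold(c, thresh, mode="soft") for c in detail)
            for detail in coeffs[1:]
        ]
        out = pywt.waverec2(shrunk, wavelet)
        return out[: image.shape[0], : image.shape[1]]  # crop any reconstruction padding

    def rmse(ref, test):
        return float(np.sqrt(np.mean((ref.astype(float) - test) ** 2)))

    def psnr(ref, test, peak=255.0):
        return float(20.0 * np.log10(peak / rmse(ref, test)))
    ```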

  5. Multidimensional Analysis of Nuclear Detonations

    DTIC Science & Technology

    2015-09-17

    Features on the nuclear weapons testing films because of the expanding and emissive nature of the nuclear fireball. The use of these techniques to produce...Treaty (New Start Treaty) have reduced the acceptable margins of error. Multidimensional analysis provides the modern approach to nuclear weapon ...scientific community access to the information necessary to expand upon the knowledge of nuclear weapon effects. This data set has the potential to provide

  6. Memory Forensics: Review of Acquisition and Analysis Techniques

    DTIC Science & Technology

    2013-11-01

    Management Overview Processes running on modern multitasking operating systems operate on an abstraction of RAM, called virtual memory [7]. In these systems...information such as user names, email addresses and passwords [7]. Analysts also use tools such as WinHex to identify headers or other suspicious data within

  7. Developing and Assessing E-Learning Techniques for Teaching Forecasting

    ERIC Educational Resources Information Center

    Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian

    2014-01-01

    In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…

  8. Analysis of Autopilot Behavior

    NASA Technical Reports Server (NTRS)

    Sherry, Lance; Polson, Peter; Feay, Mike; Palmer, Everett; Null, Cynthia H. (Technical Monitor)

    1998-01-01

    Aviation and cognitive science researchers have identified situations in which the pilot's expectations for the behavior of autopilot avionics are not matched by the actual behavior of the avionics. These "automation surprises" have been attributed to differences between the pilot's model of the behavior of the avionics and the actual behavior encoded in the avionics software. A formal technique is described for the analysis and measurement of the behavior of the cruise pitch modes of a modern Autopilot. The analysis characterizes the behavior of the Autopilot as situation-action rules. The behavior of the cruise pitch mode logic for a contemporary Autopilot was found to include 177 rules, including Level Change (23), Vertical Speed (16), Altitude Capture (50), and Altitude Hold (88). These rules are determined based on the values of 62 inputs. Analysis of the rule-based model also shed light on the factors cited in the literature as contributors to "automation surprises."

  9. Introduction. [usefulness of modern remote sensing techniques for studying components of California water resources

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.

    1973-01-01

    Since May 1970, personnel on several campuses of the University of California have been conducting investigations which seek to determine the usefulness of modern remote sensing techniques for studying various components of California's earth resources complex. Emphasis has been given to California's water resources as exemplified by the Feather River project and other aspects of the California Water Plan. This study is designed to consider in detail the supply, demand, and impact relationships. The specific geographic areas studied are the Feather River drainage in northern California, the Chino-Riverside Basin and Imperial Valley areas in southern California, and selected portions of the west side of San Joaquin Valley in central California. An analysis is also given on how an effective benefit-cost study of remote sensing in relation to California's water resources might best be made.

  10. Flexible use and technique extension of logistics management

    NASA Astrophysics Data System (ADS)

    Xiong, Furong

    2011-10-01

    As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and has expanded in China; this is the widely recognized historical development track of modern logistics. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port development, China's modern logistics industry will attain leap-forward development at a strong pace and will catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of modern logistics management techniques in China, which has practical and guiding significance.

  11. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    PubMed

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.

  12. Evaluation of portable Raman spectroscopy and handheld X-ray fluorescence analysis (hXRF) for the direct analysis of glyptics

    NASA Astrophysics Data System (ADS)

    Lauwers, D.; Candeias, A.; Coccato, A.; Mirao, J.; Moens, L.; Vandenabeele, P.

    2016-03-01

    In archaeometry, the advantages of the combined use of Raman spectroscopy and X-ray fluorescence spectroscopy have been extensively discussed for applications such as the analysis of paintings, manuscripts, pottery, etc. Here, we demonstrate for the first time the advantage of using both techniques for analysing glyptics. These engraved gemstones or glass materials were originally used as stamps to identify the owner, for instance on letters, but also on wine vessels. For this research, a set of 64 glyptics (42 Roman glass specimens and 22 modern ones), belonging to the collection of the museum 'Quinta das Cruzes' in Funchal (Madeira, Portugal), was analysed with portable Raman spectroscopy and handheld X-ray fluorescence (hXRF). These techniques were also used to confirm the gemological identification of these precious objects and can give extra information about the glass composition. Raman spectroscopy identifies the molecular composition as well as the crystalline phases present. On the other hand, hXRF results show that the antique Roman glass samples are characterised by low Pb and Sn levels and that the modern specimens can be discriminated into two groups: lead-based and non-lead-based ones.

  13. A tale of two species: neural integration in zebrafish and monkeys

    PubMed Central

    Joshua, Mati; Lisberger, Stephen G.

    2014-01-01

    Selection of a model organism creates a tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called “neural integration” that occurs in the brainstems of larval zebrafish, non-human primates, and species “in between”. We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for the details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. PMID:24797331

  14. A tale of two species: Neural integration in zebrafish and monkeys.

    PubMed

    Joshua, M; Lisberger, S G

    2015-06-18

    Selection of a model organism creates tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called "neural integration" that occurs in the brainstems of larval zebrafish, primates, and species "in between". We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  15. Portable XRF and principal component analysis for bill characterization in forensic science.

    PubMed

    Appoloni, C R; Melquiades, F L

    2014-02-01

    Several modern techniques have been applied to prevent the counterfeiting of money bills. The objective of this study was to demonstrate the potential of the Portable X-ray Fluorescence (PXRF) technique and the multivariate analysis method of Principal Component Analysis (PCA) for the classification of bills in order to use it in forensic science. Bills of Dollar, Euro and Real (Brazilian currency) were measured directly at different colored regions, without any previous preparation. Spectra interpretation allowed the identification of Ca, Ti, Fe, Cu, Sr, Y, Zr and Pb. PCA separated the bills into three groups and subgroups among the Brazilian currency. In conclusion, the samples were classified according to their origin, identifying the elements responsible for differentiation and the basic pigment composition. PXRF allied to multivariate discriminant methods is a promising technique for the rapid and non-destructive identification of false bills in forensic science. Copyright © 2013 Elsevier Ltd. All rights reserved.
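
    A compact sketch of the analysis pattern described above: element intensities per measurement spot are standardized and projected onto their first two principal components, whose scores can then be inspected for grouping by currency. The input file and its column layout are hypothetical placeholders.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    # Hypothetical matrix: one row per measured spot, one column per element
    # intensity (e.g. Ca, Ti, Fe, Cu, Sr, Y, Zr, Pb).
    intensities = np.loadtxt("bill_xrf_intensities.csv", delimiter=",")

    scores = PCA(n_components=2).fit_transform(
        StandardScaler().fit_transform(intensities)
    )
    print(scores[:5])  # PC1/PC2 scores; a scatter plot would reveal the currency groups
    ```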

  16. Research study on stabilization and control: Modern sampled data control theory

    NASA Technical Reports Server (NTRS)

    Kuo, B. C.; Singh, G.; Yackel, R. A.

    1973-01-01

    A numerical analysis of spacecraft stability parameters was conducted. The analysis is based on a digital approximation by point by point state comparison. The technique used is that of approximating a continuous data system by a sampled data model by comparison of the states of the two systems. Application of the method to the digital redesign of the simplified one axis dynamics of the Skylab is presented.

  17. A Self-Instructional Approach To the Teaching of Enzymology Involving Computer-Based Sequence Analysis and Molecular Modelling.

    ERIC Educational Resources Information Center

    Attwood, Paul V.

    1997-01-01

    Describes a self-instructional assignment approach to the teaching of advanced enzymology. Presents an assignment that offers a means of teaching enzymology to students that exposes them to modern computer-based techniques of analyzing protein structure and relates structure to enzyme function. (JRH)

  18. User’s guide to SNAP for ArcGIS® :ArcGIS interface for scheduling and network analysis program

    Treesearch

    Woodam Chung; Dennis Dykstra; Fred Bower; Stephen O’Brien; Richard Abt; John. and Sessions

    2012-01-01

    This document introduces a software package named SNAP for ArcGIS®, which has been developed to streamline scheduling and transportation planning for timber harvest areas. Using modern optimization techniques, it can be used to spatially schedule timber harvest with consideration of harvesting costs, multiple products, alternative...

  19. A Laboratory Experiment for Rapid Determination of the Stability of Vitamin C

    ERIC Educational Resources Information Center

    Adem, Seid M.; Lueng, Sam H.; Elles, Lisa M. Sharpe; Shaver, Lee Alan

    2016-01-01

    Experiments in laboratory manuals intended for general, organic, and biological (GOB) chemistry laboratories include few opportunities for students to engage in instrumental methods of analysis. Many of these students seek careers in modern health-related fields where experience in spectroscopic techniques would be beneficial. A simple, rapid,…

  20. State of the art in treatment of facial paralysis with temporalis tendon transfer.

    PubMed

    Sidle, Douglas M; Simon, Patrick

    2013-08-01

    Temporalis tendon transfer is a technique for dynamic facial reanimation. Since its inception, nearly 80 years ago, it has undergone a wealth of innovation to produce the modern operation. The purpose of this review is to update the literature as to the current techniques and perioperative management of patients undergoing temporalis tendon transfer. The modern technique focuses on the minimally invasive approaches and aesthetic refinements to enhance the final product of the operation. The newest techniques as well as preoperative assessment and postoperative rehabilitation are discussed. When temporalis tendon transfer is indicated for facial reanimation, the modern operation offers a refined technique that produces an aesthetically acceptable outcome. Preoperative smile assessment and postoperative smile rehabilitation are necessary and are important adjuncts to a successful operation.

  1. Coal thickness gauge using RRAS techniques, part 1. [radiofrequency resonance absorption

    NASA Technical Reports Server (NTRS)

    Rollwitz, W. L.; King, J. D.

    1978-01-01

    A noncontacting sensor having a measurement range of 0 to 6 in or more, and with an accuracy of 0.5 in or better is needed to control the machinery used in modern coal mining so that the thickness of the coal layer remaining over the rock is maintained within selected bounds. The feasibility of using the radiofrequency resonance absorption (RRAS) techniques of electron magnetic resonance (EMR) and nuclear magnetic resonance (NMR) as the basis of a coal thickness gauge is discussed. The EMR technique was found, by analysis and experiments, to be well suited for this application.

  2. Computer literacy enhancement in the Teaching Hospital Olomouc. Part I: project management techniques. Short communication.

    PubMed

    Sedlár, Drahomír; Potomková, Jarmila; Rehorová, Jarmila; Seckár, Pavel; Sukopová, Vera

    2003-11-01

    Information explosion and globalization make great demands on keeping pace with the new trends in the healthcare sector. The contemporary level of computer and information literacy among most health care professionals in the Teaching Hospital Olomouc (Czech Republic) is not satisfactory for efficient exploitation of modern information technology in diagnostics, therapy and nursing. The present contribution describes the application of two basic problem solving techniques (brainstorming, SWOT analysis) to develop a project aimed at information literacy enhancement.

  3. Modern Hardware Technologies and Software Techniques for On-Line Database Storage and Access.

    DTIC Science & Technology

    1985-12-01

    of the information in a message narrative. This method employs artificial intelligence techniques to extract information. In simplest terms, an...distribution (tape replacement) systems, database distribution, on-line mass storage, videogame ROM (juke-box), media cost...training of great intelligence for the analyst would be required. If, on the other hand, a sentence analysis scheme simple enough for the low-level

  4. Considerations for monitoring raptor population trends based on counts of migrants

    USGS Publications Warehouse

    Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.

    1989-01-01

    Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Data recording and missing data hamper coding of data and their use with modern analytical techniques. Coefficients of variation among years in counts averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed including regression, non-parametric rank correlation trend analysis, and moving averages.
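
    A small sketch of the trend tools this record weighs (regression, non-parametric rank-correlation trend tests, and moving averages), applied to an annual series of migration counts; the counts below are hypothetical illustration values, not data from the study.

    ```python
    import numpy as np
    from scipy import stats

    years = np.arange(1979, 1986)
    counts = np.array([532, 610, 480, 705, 660, 720, 810])  # hypothetical annual totals

    slope, intercept, r, p_lin, se = stats.linregress(years, counts)  # linear trend
    rho, p_rank = stats.spearmanr(years, counts)                      # rank-correlation trend
    moving_avg = np.convolve(counts, np.ones(3) / 3, mode="valid")    # 3-year moving average

    print(f"slope={slope:.1f} birds/yr (p={p_lin:.3f}), Spearman rho={rho:.2f} (p={p_rank:.3f})")
    print("3-yr moving average:", np.round(moving_avg, 1))
    ```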

  5. Digital image processing for information extraction.

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1973-01-01

    The modern digital computer has made practical image processing techniques for handling nonlinear operations in both the geometrical and the intensity domains, various types of nonuniform noise cleanup, and the numerical analysis of pictures. An initial requirement is that a number of anomalies caused by the camera (e.g., geometric distortion, MTF roll-off, vignetting, and nonuniform intensity response) must be taken into account or removed to avoid their interference with the information extraction process. Examples illustrating these operations are discussed along with computer techniques used to emphasize details, perform analyses, classify materials by multivariate analysis, detect temporal differences, and aid in human interpretation of photos.

  6. Extracting elastic properties of an atomically thin interfacial layer by time-domain analysis of femtosecond acoustics

    NASA Astrophysics Data System (ADS)

    Chen, H.-Y.; Huang, Y.-R.; Shih, H.-Y.; Chen, M.-J.; Sheu, J.-K.; Sun, C.-K.

    2017-11-01

    Modern devices adopting denser designs and complex 3D structures have created many more interfaces than before, where atomically thin interfacial layers could form. However, fundamental information such as the elastic property of the interfacial layers is hard to measure. The elastic property of the interfacial layer is of great importance in both thermal management and nano-engineering of modern devices. Appropriate techniques to probe the elastic properties of interfacial layers as thin as only several atoms are thus critically needed. In this work, we demonstrated the feasibility of utilizing the time-resolved femtosecond acoustics technique to extract the elastic properties and mass density of a 1.85-nm-thick interfacial layer, with the aid of transmission electron microscopy. We believe that this femtosecond acoustics approach will provide a strategy to measure the absolute elastic properties of atomically thin interfacial layers.

  7. Nanoscale surface characterization using laser interference microscopy

    NASA Astrophysics Data System (ADS)

    Ignatyev, Pavel S.; Skrynnik, Andrey A.; Melnik, Yury A.

    2018-03-01

    Nanoscale surface characterization is one of the most significant parts of modern materials development and application. Modern microscopes are expensive and complicated tools, and their use for industrial tasks is limited by laborious sample preparation and measurement procedures and by low operation speed. The laser modulation interference microscopy (MIM) method for real-time quantitative and qualitative analysis of glass, metals, ceramics, and various coatings has a spatial resolution of 0.1 nm vertically and up to 100 nm laterally. It is proposed as an alternative to traditional scanning electron microscopy (SEM) and atomic force microscopy (AFM) methods. It is demonstrated that, in the case of roughness metrology for super smooth (Ra >1 nm) surfaces, the application of laser interference microscopy techniques is preferable to conventional SEM and AFM. A comparison of lateral-dimension measurements of a semiconductor test structure obtained with SEM, AFM, and a white light interferometer also demonstrates the advantages of the MIM technique.

  8. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.

  9. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.

  10. A performance model for GPUs with caches

    DOE PAGES

    Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...

    2014-06-24

    To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
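
    A rough illustration of what a sampling-based linear runtime model looks like in practice: time a kernel on a few small launch sizes, fit a least-squares line, and extrapolate to the full problem size. This shows only the general idea described in the record; the sample sizes and timings below are hypothetical.

    ```python
    import numpy as np

    def predict_runtime(sample_sizes, sample_times_ms, full_size):
        # Fit runtime = slope * size + intercept from a handful of sampled runs
        slope, intercept = np.polyfit(sample_sizes, sample_times_ms, 1)
        return slope * full_size + intercept

    # e.g. measured runtimes (ms) for 1k, 2k and 4k work-groups, extrapolated to 1M
    print(predict_runtime([1_000, 2_000, 4_000], [1.9, 3.6, 7.4], 1_000_000))
    ```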

  11. High-performance liquid chromatography coupled with tandem mass spectrometry technology in the analysis of Chinese Medicine Formulas: A bibliometric analysis (1997-2015).

    PubMed

    He, Xi-Ran; Li, Chun-Guang; Zhu, Xiao-Shu; Li, Yuan-Qing; Jarouche, Mariam; Bensoussan, Alan; Li, Ping-Ping

    2017-01-01

    There is a recognized challenge in analyzing traditional Chinese medicine formulas because of their complex chemical compositions. The application of modern analytical techniques such as high-performance liquid chromatography coupled with tandem mass spectrometry has significantly improved the characterization of various compounds from traditional Chinese medicine formulas. This study aims to conduct a bibliometric analysis to recognize the overall trend of high-performance liquid chromatography coupled with tandem mass spectrometry approaches in the analysis of traditional Chinese medicine formulas, its significance, and possible underlying interactions between individual herbs in these formulas. Electronic databases were searched systematically, and the identified studies were collected and analyzed using Microsoft Access 2010, GraphPad 5.0 software and the Ucinet software package. 338 publications between 1997 and 2015 were identified and analyzed in terms of annual growth and accumulated publications, top journals, forms of traditional Chinese medicine preparations, highly studied formulas and single herbs, and social network analysis of single herbs. There has been a significantly increasing trend in the use of high-performance liquid chromatography coupled with tandem mass spectrometry related techniques in the analysis of commonly used forms of traditional Chinese medicine formulas over the last 3 years. Stringent quality control is of great significance for the modernization and globalization of traditional Chinese medicine, and this bibliometric analysis provided the first comprehensive summary within this field. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Foodomics: MS-based strategies in modern food science and nutrition.

    PubMed

    Herrero, Miguel; Simó, Carolina; García-Cañas, Virginia; Ibáñez, Elena; Cifuentes, Alejandro

    2012-01-01

    Modern research in food science and nutrition is moving from classical methodologies to advanced analytical strategies in which MS-based techniques play a crucial role. In this context, Foodomics has been recently defined as a new discipline that studies food and nutrition domains through the application of advanced omics technologies in which MS techniques are considered indispensable. Applications of Foodomics include the genomic, transcriptomic, proteomic, and/or metabolomic study of foods for compound profiling, authenticity, and/or biomarker detection related to food quality or safety; the development of new transgenic foods, food contaminants, and whole toxicity studies; new investigations on food bioactivity, food effects on human health, etc. This review work does not intend to provide an exhaustive review of the many works published so far on food analysis using MS techniques. The aim of the present work is to provide an overview of the different MS-based strategies that have been (or can be) applied in the new field of Foodomics, discussing their advantages and drawbacks. Besides, some ideas about the foreseen development and applications of MS techniques in this new discipline are also provided. Copyright © 2011 Wiley Periodicals, Inc.

  13. Chemical composition and speciation of particulate organic matter from modern residential small-scale wood combustion appliances.

    PubMed

    Czech, Hendryk; Miersch, Toni; Orasche, Jürgen; Abbaszade, Gülcin; Sippula, Olli; Tissari, Jarkko; Michalke, Bernhard; Schnelle-Kreis, Jürgen; Streibel, Thorsten; Jokiniemi, Jorma; Zimmermann, Ralf

    2018-01-15

    Combustion technologies for small-scale wood combustion appliances are continuously being developed to decrease emissions of various pollutants and increase energy conversion. One strategy to reduce emissions is the implementation of air staging technology in the secondary air supply, which has become an established technique for modern wood combustion appliances. On that account, emissions from a modern masonry heater fuelled with three types of common logwood (beech, birch and spruce) and a modern pellet boiler fuelled with commercial softwood pellets were investigated; these represent typical combustion appliances in northern Europe. In particular, emphasis was put on the organic constituents of PM2.5, including polycyclic aromatic hydrocarbons (PAHs), oxygenated PAHs (OPAHs) and phenolic species, by targeted and non-targeted mass spectrometric analysis techniques. Compared to conventional wood stoves and pellet boilers, organic emissions from the modern appliances were reduced by at least one order of magnitude, but to a different extent for individual species. Hence, characteristic ratios of emission constituents and emission profiles for wood combustion identification and speciation do not hold for this type of advanced combustion technology. Additionally, an overall substantial reduction of typical wood combustion markers, such as phenolic species and anhydrous sugars, was observed. Finally, it was found that slow ignition of logwood changes the distribution of characteristic resin acids and phytosterols as well as their thermal alteration products, which are used as markers for specific wood types. Our results should be considered for wood combustion identification in positive matrix factorisation or chemical mass balance analyses in northern Europe. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Analysis of Slabs-on-Grade for a Variety of Loading and Support Conditions.

    DTIC Science & Technology

    1984-12-01

    applications, namely the problem of a slab-on-grade, as encountered in the analysis and design of rigid pavements. This is one of the few...proper design and construction methods are adhered to. There are several additional reasons, entirely due to recent developments, that warrant the...conservative designs led to almost imperceptible pavement deformations, thus warranting the term "rigid pavements". Modern-day analytical techniques

  15. The "Prediflood" database of historical floods in Catalonia (NE Iberian Peninsula) AD 1035-2013, and its potential applications in flood analysis

    NASA Astrophysics Data System (ADS)

    Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.

    2014-07-01

    "Prediflood" is a database of historical floods occurred in Catalonia (NE Iberian Peninsula), between 10th Century and 21th Century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, apt to be used in multidisciplinary flood analysis techniques, as meteorological or hydraullic reconstructions.

  16. Nonequilibrium flow computations. 1: An analysis of numerical formulations of conservation laws

    NASA Technical Reports Server (NTRS)

    Liu, Yen; Vinokur, Marcel

    1988-01-01

    Modern numerical techniques employing properties of flux Jacobian matrices are extended to general, nonequilibrium flows. Generalizations of the Beam-Warming scheme, Steger-Warming and van Leer Flux-vector splittings, and Roe's approximate Riemann solver are presented for 3-D, time-varying grids. The analysis is based on a thermodynamic model that includes the most general thermal and chemical nonequilibrium flow of an arbitrary gas. Various special cases are also discussed.

  17. Pricing the Future in the Seventeenth Century: Calculating Technologies in Competition.

    PubMed

    Deringer, William

    Time is money. But how much? What is money in the future worth to you today? This question of "present value" arises in myriad economic activities, from valuing financial securities to real estate transactions to governmental cost-benefit analysis-even the economics of climate change. In modern capitalist practice, one calculation offers the only "rational" way to answer: compound-interest discounting. In the early modern period, though, economic actors used at least two alternative calculating technologies for thinking about present value, including a vernacular technique called years purchase and discounting by simple interest. All of these calculations had different strengths and affordances, and none was unquestionably better or more "rational" than the others at the time. The history of technology offers distinct resources for understanding such technological competitions, and thus for understanding the emergence of modern economic temporality.

  18. Modern modelling techniques are data hungry: a simulation study for predicting dichotomous endpoints.

    PubMed

    van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W

    2014-12-22

    Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5 year survival), 1731 patients with traumatic brain injury (22.3% 6 month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques: support vector machines (SVM), neural nets (NN), and random forests (RF) and two classical techniques: logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20 fold, 10 fold and 6 fold replication of subjects, where we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC-curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and a high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable to achieve a stable AUC and a small optimism than classical modelling techniques such as LR. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
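
    A brief sketch of the kind of "data hungriness" experiment described here: each model is refit on increasingly large development subsets and its discrimination is tracked on a held-out validation set until the AUC plateaus. Synthetic data stands in for the clinical cohorts, and the models and subset sizes are illustrative assumptions only.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a clinical cohort with a dichotomous endpoint
    X, y = make_classification(n_samples=20_000, n_features=10, weights=[0.8], random_state=0)
    X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

    for n in (200, 1_000, 5_000, 10_000):
        for name, model in (("LR", LogisticRegression(max_iter=1000)),
                            ("RF", RandomForestClassifier(n_estimators=200, random_state=0))):
            model.fit(X_dev[:n], y_dev[:n])
            auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
            print(f"n={n:>6}  {name}: validation AUC = {auc:.3f}")
    ```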

  19. Current trends in sample preparation for cosmetic analysis.

    PubMed

    Zhong, Zhixiong; Li, Gongke

    2017-01-01

    The widespread applications of cosmetics in modern life make their analysis particularly important from a safety point of view. There is a wide variety of restricted ingredients and prohibited substances that primarily influence the safety of cosmetics. Sample preparation for cosmetic analysis is a crucial step as the complex matrices may seriously interfere with the determination of target analytes. In this review, some new developments (2010-2016) in sample preparation techniques for cosmetic analysis, including liquid-phase microextraction, solid-phase microextraction, matrix solid-phase dispersion, pressurized liquid extraction, cloud point extraction, ultrasound-assisted extraction, and microwave digestion, are presented. Furthermore, the research and progress in sample preparation techniques and their applications in the separation and purification of allowed ingredients and prohibited substances are reviewed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. A volumetric technique for fossil body mass estimation applied to Australopithecus afarensis.

    PubMed

    Brassey, Charlotte A; O'Mahoney, Thomas G; Chamberlain, Andrew T; Sellers, William I

    2018-02-01

    Fossil body mass estimation is a well established practice within the field of physical anthropology. Previous studies have relied upon traditional allometric approaches, in which the relationship between one/several skeletal dimensions and body mass in a range of modern taxa is used in a predictive capacity. The lack of relatively complete skeletons has thus far limited the potential application of alternative mass estimation techniques, such as volumetric reconstruction, to fossil hominins. Yet across vertebrate paleontology more broadly, novel volumetric approaches are resulting in predicted values for fossil body mass very different to those estimated by traditional allometry. Here we present a new digital reconstruction of Australopithecus afarensis (A.L. 288-1; 'Lucy') and a convex hull-based volumetric estimate of body mass. The technique relies upon identifying a predictable relationship between the 'shrink-wrapped' volume of the skeleton and known body mass in a range of modern taxa, and subsequent application to an articulated model of the fossil taxon of interest. Our calibration dataset comprises whole body computed tomography (CT) scans of 15 species of modern primate. The resulting predictive model is characterized by a high correlation coefficient (r² = 0.988) and a percentage standard error of 20%, and performs well when applied to modern individuals of known body mass. Application of the convex hull technique to A. afarensis results in a relatively low body mass estimate of 20.4 kg (95% prediction interval 13.5-30.9 kg). A sensitivity analysis on the articulation of the chest region highlights the sensitivity of our approach to the reconstruction of the trunk, and the incomplete nature of the preserved ribcage may explain the low values for predicted body mass here. We suggest that the heaviest of previous estimates would require the thorax to be expanded to an unlikely extent, yet this can only be properly tested when more complete fossils are available. Copyright © 2017 Elsevier Ltd. All rights reserved.
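
    A minimal sketch of the convex-hull step in a volumetric mass estimate: compute the "shrink-wrapped" volume of a point cloud sampled over an articulated skeletal model, then predict mass from a calibration regression fitted to modern taxa. The input file and the calibration coefficients below are hypothetical placeholders, not values from the study.

    ```python
    import numpy as np
    from scipy.spatial import ConvexHull

    points = np.loadtxt("skeleton_points.xyz")   # hypothetical (n, 3) surface points, metres
    hull_volume = ConvexHull(points).volume      # "shrink-wrapped" volume in m^3

    # Hypothetical log-log calibration from whole-body CT scans of modern primates:
    # log10(mass_kg) = a + b * log10(volume_m3)
    a, b = 2.9, 0.95
    mass_kg = 10 ** (a + b * np.log10(hull_volume))
    print(f"predicted body mass: {mass_kg:.1f} kg")
    ```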

  1. Applications of absorption spectroscopy using quantum cascade lasers.

    PubMed

    Zhang, Lizhu; Tian, Guang; Li, Jingsong; Yu, Benli

    2014-01-01

    Infrared laser absorption spectroscopy (LAS) is a promising modern technique for sensing trace gases with high sensitivity, selectivity, and high time resolution. Mid-infrared quantum cascade lasers, operating in a pulsed or continuous wave mode, have potential as spectroscopic sources because of their narrow linewidths, single mode operation, tunability, high output power, reliability, low power consumption, and compactness. This paper reviews some important developments in modern laser absorption spectroscopy based on the use of quantum cascade laser (QCL) sources. Among the various laser spectroscopic methods, this review is focused on selected absorption spectroscopy applications of QCLs, with particular emphasis on molecular spectroscopy, industrial process control, combustion diagnostics, and medical breath analysis.

  2. Detection of Gunshot Residues Using Mass Spectrometry

    PubMed Central

    Blanes, Lucas; Cole, Nerida; Doble, Philip; Roux, Claude

    2014-01-01

    In recent years, forensic scientists have become increasingly interested in the detection and interpretation of organic gunshot residues (OGSR) due to the increasing use of lead- and heavy metal-free ammunition. This has also been prompted by the identification of gunshot residue- (GSR-) like particles in environmental and occupational samples. Various techniques have been investigated for their ability to detect OGSR. Mass spectrometry (MS) coupled to a chromatographic system is a powerful tool due to its high selectivity and sensitivity. Further, modern MS instruments can detect and identify a number of explosives and additives which may require different ionization techniques. Finally, MS has been applied to the analysis of both OGSR and inorganic gunshot residue (IGSR), although the “gold standard” for analysis is scanning electron microscopy with energy dispersive X-ray microscopy (SEM-EDX). This review presents an overview of the technical attributes of currently available MS and ionization techniques and their reported applications to GSR analysis. PMID:24977168

  3. Orientational imaging of a single plasmonic nanoparticle using dark-field hyperspectral imaging

    NASA Astrophysics Data System (ADS)

    Mehta, Nishir; Mahigir, Amirreza; Veronis, Georgios; Gartia, Manas Ranjan

    2017-08-01

    The orientation of plasmonic nanostructures is an important feature in many nanoscale applications such as catalysis, biosensors, DNA interactions, protein detection, hotspots for surface enhanced Raman spectroscopy (SERS), and fluorescence resonance energy transfer (FRET) experiments. However, due to the diffraction limit, it is challenging to obtain the exact orientation of a nanostructure using a standard optical microscope. Hyperspectral imaging microscopy is a state-of-the-art visualization technology that combines modern optics with hyperspectral imaging and a computer system to provide the identification and quantitative spectral analysis of nano- and microscale structures. In this work, we initially use a transmitted dark-field imaging technique to locate a single nanoparticle on a glass substrate. We then employ a hyperspectral imaging technique at the same spot to investigate the orientation of the single nanoparticle. No special tagging or staining of the nanoparticle is required, as is typically the case in traditional microscopy techniques. Different orientations have been identified by carefully understanding and calibrating the shift in spectral response from each orientation of similarly sized nanoparticles. The recorded wavelengths are between 300 nm and 900 nm. The orientations measured by hyperspectral microscopy were validated using finite difference time domain (FDTD) electrodynamics calculations and scanning electron microscopy (SEM) analysis. The combination of high-resolution nanometer-scale imaging techniques and modern numerical modeling capabilities thus enables a meaningful advance in our knowledge of manipulating and fabricating shaped nanostructures. This work will advance our understanding of the behavior of small nanoparticle clusters useful for sensing, nanomedicine, and surface sciences.

  4. Estrutura do Verbo no Portugues Coloquial (Verb Structure in Colloquial Portuguese).

    ERIC Educational Resources Information Center

    Pontes, Eunice

    In this study the author uses the techniques of modern descriptive linguistics to analyze various features of the Portuguese verb system. The analysis is based on the colloquial, spontaneous speech of educated natives of Rio de Janeiro and is divided into four chapters: Phonology (pp. 6-29), Morphophonemics (pp. 30-49), Morphology (pp. 50-86), and…

  5. The Evaluation of Oral/Aural Skills within the BA Finals Examination: Analysis and Interim Proposals.

    ERIC Educational Resources Information Center

    Crawshaw, Robert

    An examination of the principles and techniques of oral testing in British university-level final examinations in modern languages discusses: (1) the shortcomings of present oral testing procedures; (2) the theoretical controversy surrounding the design and value of oral proficiency tests, arising from research in English as a second language…

  6. Modern Data Analysis techniques in Noise and Vibration Problems

    DTIC Science & Technology

    1981-11-01

    ...Hilbert transforms of one another. This property is found again in the study of causality: this defines a practical criterion characterizing a signal, therefore, by...

  7. Life in the Pinball Machine: Looking Back with Bob Mager

    ERIC Educational Resources Information Center

    Taylor, Ray

    2005-01-01

    At 81 years old, Robert F. (Bob) Mager is the granddaddy of modern performance analysis and instructional design techniques. Although he has retired from the profession, he is still actively learning. He is currently working on his fourth novel, and is also an award-winning ventriloquist and is taking flamenco lessons. Perhaps best known in the…

  8. A complexity science-based framework for global joint operations analysis to support force projection: LDRD Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, Craig R.

    2015-01-01

    The military is undergoing a significant transformation as it modernizes for the information age and adapts to address an emerging asymmetric threat beyond traditional cold war era adversaries. Techniques such as traditional large-scale, joint services war gaming analysis are no longer adequate to support program evaluation activities and mission planning analysis at the enterprise level because the operating environment is evolving too quickly. New analytical capabilities are necessary to address modernization of the Department of Defense (DoD) enterprise. This presents significant opportunity to Sandia in supporting the nation at this transformational enterprise scale. Although Sandia has significant experience with engineering system of systems (SoS) and Complex Adaptive System of Systems (CASoS), significant fundamental research is required to develop modeling, simulation and analysis capabilities at the enterprise scale. This report documents an enterprise modeling framework which will enable senior level decision makers to better understand their enterprise and required future investments.

  9. Task 7: Endwall treatment inlet flow distortion analysis

    NASA Technical Reports Server (NTRS)

    Hall, E. J.; Topp, D. A.; Heidegger, N. J.; McNulty, G. S.; Weber, K. F.; Delaney, R. A.

    1996-01-01

    The overall objective of this study was to develop a 3-D numerical analysis for compressor casing treatment flowfields, and to perform a series of detailed numerical predictions to assess the effectiveness of various endwall treatments for enhancing the efficiency and stall margin of modern high speed fan rotors. Particular attention was given to examining the effectiveness of endwall treatments in countering the undesirable effects of inflow distortion. Calculations were performed using three different gridding techniques based on the type of casing treatment being tested and the level of complexity desired in the analysis. In each case, the casing treatment itself is modeled as a discrete object in the overall analysis, and the flow through the casing treatment is determined as part of the solution. A series of calculations was performed for both treated and untreated modern fan rotors, both with and without inflow distortion. The effectiveness of the various treatments was quantified, and several physical mechanisms by which endwall treatments achieve their effectiveness are discussed.

  10. C3: A Command-line Catalogue Cross-matching tool for modern astrophysical survey data

    NASA Astrophysics Data System (ADS)

    Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio

    2017-06-01

    In the current data-driven science era, data analysis techniques must evolve quickly to cope with data volumes that have grown to the petabyte scale. In particular, because modern astrophysics is based on multi-wavelength data organized into large catalogues, astronomical catalogue cross-matching methods, whose cost depends strongly on catalogue size, must ensure efficiency, reliability and scalability. Furthermore, multi-band data are archived and reduced in different ways, so the resulting catalogues may differ from one another in format, resolution, data structure, etc., which requires cross-matching features to be as general as possible. We present C3 (Command-line Catalogue Cross-match), a multi-platform application designed to efficiently cross-match massive catalogues from modern surveys. Conceived as a stand-alone command-line process or as a module within a generic data reduction/analysis pipeline, it provides maximum flexibility in terms of portability, configuration, coordinates and cross-matching types, and it ensures high performance by using a multi-core parallel processing paradigm and a sky partitioning algorithm.
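    To illustrate the kind of operation C3 performs, the following is a minimal sketch of a positional cross-match between two catalogues, using simple declination-zone bucketing as a stand-in for a sky-partitioning scheme. It is not C3's implementation; the function names, the matching radius, and the zone width are illustrative assumptions.

```python
import numpy as np

def angular_sep_deg(ra1, dec1, ra2, dec2):
    """Great-circle separation in degrees (haversine formula)."""
    ra1, dec1, ra2, dec2 = map(np.radians, (ra1, dec1, ra2, dec2))
    d = 2 * np.arcsin(np.sqrt(np.sin((dec2 - dec1) / 2) ** 2 +
                              np.cos(dec1) * np.cos(dec2) * np.sin((ra2 - ra1) / 2) ** 2))
    return np.degrees(d)

def cross_match(cat1, cat2, radius_arcsec=1.0, zone_deg=0.5):
    """Match each source in cat1 to the nearest cat2 source within radius.

    cat1, cat2: arrays of shape (N, 2) holding (RA, Dec) in degrees.
    Sources are bucketed into declination zones so only neighbouring zones
    are searched, a toy version of a sky-partitioning algorithm.
    """
    radius = radius_arcsec / 3600.0
    zones = {}
    for j, (_, dec) in enumerate(cat2):
        zones.setdefault(int(dec // zone_deg), []).append(j)
    matches = []
    for i, (ra, dec) in enumerate(cat1):
        z = int(dec // zone_deg)
        candidates = [j for dz in (-1, 0, 1) for j in zones.get(z + dz, [])]
        if not candidates:
            continue
        seps = angular_sep_deg(ra, dec, cat2[candidates, 0], cat2[candidates, 1])
        k = int(np.argmin(seps))
        if seps[k] <= radius:
            matches.append((i, candidates[k], seps[k] * 3600.0))
    return matches

# Tiny illustrative example: the same sources jittered by about 0.2 arcsec
rng = np.random.default_rng(0)
cat_a = np.column_stack([rng.uniform(0, 10, 500), rng.uniform(-5, 5, 500)])
cat_b = cat_a + rng.normal(scale=0.2 / 3600.0, size=cat_a.shape)
print(len(cross_match(cat_a, cat_b, radius_arcsec=1.0)))
```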

  11. Image Quality Analysis of Various Gastrointestinal Endoscopes: Why Image Quality Is a Prerequisite for Proper Diagnostic and Therapeutic Endoscopy

    PubMed Central

    Ko, Weon Jin; An, Pyeong; Ko, Kwang Hyun; Hahm, Ki Baik; Hong, Sung Pyo

    2015-01-01

    Arising from human curiosity about looking within the human body, endoscopy has undergone significant advances in modern medicine. Direct visualization of the gastrointestinal (GI) tract by traditional endoscopy was first introduced over 50 years ago, after which fairly rapid advancement from rigid esophagogastric scopes to flexible scopes and high-definition videoscopes has occurred. In an effort towards early detection of precancerous lesions in the GI tract, several high-technology imaging scopes have been developed, including narrow band imaging, autofluorescence imaging, magnified endoscopy, and confocal microendoscopy. However, these modern developments have skewed fundamental imaging technology towards red-green-blue rendering and have obscured the advantages of other endoscope techniques. In this review article, we describe the importance of image quality analysis, using a survey, and consider the diversity of endoscope system selection in order to better achieve diagnostic and therapeutic goals. These aims can ultimately be achieved through the adoption of modern endoscopy systems that deliver high image quality. PMID:26473119

  12. An east Asian perspective of mind-body.

    PubMed

    Nagatomo, S; Leisman, G

    1996-08-01

    This paper addresses the need to re-examine the mind-body dualism established since Descartes. Descartes' dualism has been regarded by modern philosophers as an extremely insufficient solution to the problem of mind and body, from which derives a long opposition in modern epistemology between idealism and empiricism. This dualism, bifurcating the regions of spirit and matter, and the dichotomous models of thinking based on it, have long dominated the world of modern philosophy and science. The paper examines states of conscious experience from an East Asian perspective, allowing analysis of achieved supernormal consciousness rather than a focus on the "normal" or "subnormal." The nature of the "transformation" of human consciousness will be studied both philosophically, as a transformation from "provisional" dualism to non-dualism, and neurophysiologically. The theoretical structure of the transformation will be examined, in part, through the model provided by the Japanese medieval Zen master Takuan Sôhô. Then, to verify Takuan's theoretical explanation, a toposcopic analysis of electroencephalograms recorded while individuals perform the martial arts technique of tôate will be presented.

  13. Yoga and mental health: A dialogue between ancient wisdom and modern psychology

    PubMed Central

    Vorkapic, Camila Ferreira

    2016-01-01

    Background: Many yoga texts make reference to the importance of mental health and to the use of specific techniques in the treatment of mental disorders. Several concepts utilized in modern psychology may not originate in contemporary ideas; instead, they seem to share a common root with ancient wisdom. Aims: The goal of this perspective article is to correlate modern techniques used in psychology and psychiatry with yogic practices in the treatment of mental disorders. Materials and Methods: The article presents a dialogue between the yogic approach to the treatment of mental disorders and concepts used in modern psychology, such as metacognition, disidentification, deconditioning and interoceptive exposure. Conclusions: Contemplative research suggests that modern interventions in psychology might not derive from modern concepts after all, but instead share great similarity with ancient yogic knowledge, giving us the opportunity to integrate the psychological wisdom of both East and West. PMID:26865774

  14. Evaluation of portable Raman spectroscopy and handheld X-ray fluorescence analysis (hXRF) for the direct analysis of glyptics.

    PubMed

    Lauwers, D; Candeias, A; Coccato, A; Mirao, J; Moens, L; Vandenabeele, P

    2016-03-15

    In archaeometry, the advantages of the combined use of Raman spectroscopy and X-ray fluorescence spectroscopy have been extensively discussed for applications such as the analysis of paintings, manuscripts, pottery, etc. Here, we demonstrate for the first time the advantage of using both techniques for analysing glyptics. These engraved gemstones or glass materials were originally used as stamps to identify the owner, for instance on letters, but also on wine vessels. For this research, a set of 64 glyptics (42 Roman glass specimens and 22 modern ones), belonging to the collection of the museum 'Quinta das Cruzes' in Funchal (Madeira, Portugal), was analysed with portable Raman spectroscopy and handheld X-ray fluorescence (hXRF). These techniques were also used to confirm the gemological identification of these precious objects and can give extra information about the glass composition. Raman spectroscopy identifies the molecular composition as well as the crystalline phases present. The hXRF results, in turn, show that the antique Roman glass samples are characterised by low Pb and Sn levels and that the modern specimens can be discriminated into two groups: lead-based and non-lead-based ones. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts

    ERIC Educational Resources Information Center

    Nunnery, Christopher Edward

    2011-01-01

    Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…

  16. The "Prediflood" database of historical floods in Catalonia (NE Iberian Peninsula) AD 1035-2013, and its potential applications in flood analysis

    NASA Astrophysics Data System (ADS)

    Barriendos, M.; Ruiz-Bellet, J. L.; Tuset, J.; Mazón, J.; Balasch, J. C.; Pino, D.; Ayala, J. L.

    2014-12-01

    "Prediflood" is a database of historical floods that occurred in Catalonia (NE Iberian Peninsula), between the 11th century and the 21st century. More than 2700 flood cases are catalogued, and more than 1100 flood events. This database contains information acquired under modern historiographical criteria and it is, therefore, suitable for use in multidisciplinary flood analysis techniques, such as meteorological or hydraulic reconstructions.

  17. Development and application of a time-history analysis for rotorcraft dynamics based on a component approach

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Hallock, D. W.

    1985-01-01

    A time history analysis for rotorcraft dynamics based on dynamical substructures, and nonstructural mathematical and aerodynamic components is described. The analysis is applied to predict helicopter ground resonance and response to rotor damage. Other applications illustrate the stability and steady vibratory response of stopped and gimballed rotors, representative of new technology. Desirable attributes expected from modern codes are realized, although the analysis does not employ a complete set of techniques identified for advanced software. The analysis is able to handle a comprehensive set of steady state and stability problems with a small library of components.

  18. Analysis of intracranial pressure: past, present, and future.

    PubMed

    Di Ieva, Antonio; Schmitz, Erika M; Cusimano, Michael D

    2013-12-01

    The monitoring of intracranial pressure (ICP) is an important tool in medicine for its ability to portray the brain's compliance status. The bedside monitor displays the ICP waveform and intermittent mean values to guide physicians in the management of patients, particularly those having sustained a traumatic brain injury. Researchers in the fields of engineering and physics have investigated various mathematical analysis techniques applicable to the waveform in order to extract additional diagnostic and prognostic information, although these largely remain limited to research applications. The purpose of this review is to present the current techniques used to monitor and interpret ICP and to explore the potential of advanced mathematical techniques to provide information about system perturbations from states of homeostasis. We discuss the limits of each proposed technique and propose that nonlinear analysis could be a reliable approach to describing ICP signals over time, with the fractal dimension as a potentially predictive, clinically meaningful biomarker. Our goal is to stimulate translational research that can move modern analysis of ICP using these techniques into widespread practical use, and to investigate the clinical utility of a tool capable of simplifying the multiple variables obtained from various sensors.
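    Since the review proposes the fractal dimension of the ICP signal as a potential biomarker, a minimal sketch of one common estimator, Higuchi's method, is given below for a synthetic trace. This is a generic estimator, not the review authors' implementation; the signal, kmax, and sampling choices are illustrative assumptions.

```python
import numpy as np

def higuchi_fd(x, kmax=10):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's method."""
    x = np.asarray(x, dtype=float)
    n = x.size
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if idx.size < 2:
                continue
            # normalised curve length of the sub-sampled series
            length = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((idx.size - 1) * k * k)
            lengths.append(length)
        lk.append(np.mean(lengths))
    # the fractal dimension is the slope of log L(k) against log(1/k)
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return slope

# Synthetic "ICP-like" trace: a slow oscillation plus noise (purely illustrative)
t = np.linspace(0, 60, 6000)
icp = 10 + 2 * np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)
print(higuchi_fd(icp))
```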

  19. Efficient and robust analysis of complex scattering data under noise in microwave resonators.

    PubMed

    Probst, S; Song, F B; Bushev, P A; Ustinov, A V; Weides, M

    2015-02-01

    Superconducting microwave resonators are reliable circuits widely used for detection and as test devices for material research. A reliable determination of their external and internal quality factors is crucial for many modern applications, which either require fast measurements or operate in the single-photon regime with small signal-to-noise ratios. Here, we use the circle fit technique with diameter correction and provide a step-by-step guide for implementing an algorithm for robust fitting and calibration of complex resonator scattering data in the presence of noise. The speedup and robustness of the analysis are achieved by employing an algebraic rather than an iterative fit technique for the resonance circle.
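    The algebraic fit mentioned above can be illustrated with a simple Kasa-style least-squares circle fit to complex S21 data; this sketch omits the diameter correction and calibration steps described in the paper, and the synthetic resonance parameters are assumptions.

```python
import numpy as np

def algebraic_circle_fit(s21):
    """Fit a circle to complex scattering data (Kasa-style algebraic fit).

    Solves the linear least-squares system obtained from writing
    x^2 + y^2 = 2a x + 2b y + (r^2 - a^2 - b^2), so no iteration is needed.
    Returns (centre as a complex number, radius).
    """
    x, y = s21.real, s21.imag
    A = np.column_stack([x, y, np.ones_like(x)])
    z = x**2 + y**2
    c, *_ = np.linalg.lstsq(A, z, rcond=None)
    xc, yc = c[0] / 2.0, c[1] / 2.0
    r = np.sqrt(c[2] + xc**2 + yc**2)
    return complex(xc, yc), r

# Synthetic resonance circle with noise (illustrative parameters only)
theta = np.linspace(0, 2 * np.pi, 400)
true_centre, true_r = 0.5 - 0.2j, 0.3
s21 = (true_centre + true_r * np.exp(1j * theta)
       + 0.005 * (np.random.randn(400) + 1j * np.random.randn(400)))
print(algebraic_circle_fit(s21))
```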

  20. 3D polymer gel dosimetry using a 3D (DESS) and a 2D MultiEcho SE (MESE) sequence

    NASA Astrophysics Data System (ADS)

    Maris, Thomas G.; Pappas, Evangelos; Karolemeas, Kostantinos; Papadakis, Antonios E.; Zacharopoulou, Fotini; Papanikolaou, Nickolas; Gourtsoyiannis, Nicholas

    2006-12-01

    The utilization of 3D techniques in Magnetic Resonance Imaging data acquisition and post-processing analysis is a prerequisite, especially when modern radiotherapy techniques (conformal RT, IMRT, stereotactic RT) are to be used. The aim of this work is to compare a 3D Double Echo Steady State (DESS) and a 2D Multiple Echo Spin Echo (MESE) sequence in 3D MRI radiation dosimetry, using two different MRI scanners and N-vinylpyrrolidone (VIPAR)-based polymer gels.

  1. GIS-based identification of active lineaments within the Krasnokamensk Area, Transbaikalia, Russia

    NASA Astrophysics Data System (ADS)

    Petrov, V. A.; Lespinasse, M.; Ustinov, S. A.; Cialec, C.

    2017-07-01

    Lineament analysis was carried out using detailed digital elevation models (DEMs) of the Krasnokamensk Area, southeastern Transbaikalia (Russia). The results of this research confirm the presence of already known faults, but also identify previously unknown fault zones. The primary focus was on identifying small discontinuities and their relationship with extended fault zones. The developed technique allowed construction and identification of active lineaments, together with the orientation of their compression and extension axes in the horizontal plane, their sense of shear (right- or left-lateral), and their geodynamic setting of formation (compression or extension). The identification of active faults and the definition of their kinematics on digital elevation models were confirmed by measuring the velocities and directions of modern horizontal surface motions with geodetic GPS, and by identifying the principal stress axis directions of the modern stress field using present-day earthquake data. The results are needed for sound environmental management decisions.

  2. New quality assurance program integrating "modern radiotherapy" within the German Hodgkin Study Group.

    PubMed

    Kriz, J; Baues, C; Engenhart-Cabillic, R; Haverkamp, U; Herfarth, K; Lukas, P; Schmidberger, H; Marnitz-Schulze, S; Fuchs, M; Engert, A; Eich, H T

    2017-02-01

    Field designs in the radiotherapy (RT) of Hodgkin's lymphoma (HL) have changed substantially, from extended-field RT (EF-RT) to involved-field RT (IF-RT) and now to involved-node RT (IN-RT) and involved-site RT (IS-RT), as have treatment techniques. The purpose of this article is to describe the establishment of a quality assurance program (QAP) covering modern RT techniques and field designs within the German Hodgkin Study Group (GHSG). In the era of modern conformal RT, this QAP had to be fundamentally adapted, and a new evaluation process has been intensively discussed by the radiotherapeutic expert panel of the GHSG. The expert panel developed guidelines and criteria to analyse "modern" field designs and treatment techniques. This work is based on a dataset of 11 patients treated within the sixth study generation (HD16-17). To develop a QAP for "modern RT", the expert panel defined criteria for analysing current RT procedures. The consensus on a modified QAP in ongoing and future trials is presented. With this schedule, the QAP of the GHSG could serve as a model for other study groups.

  3. A solution for future designs using techniques from vernacular architecture in southern Iran

    NASA Astrophysics Data System (ADS)

    Mirahmadi, Fatima; Altan, Hasim

    2018-02-01

    Nowadays, technologies and techniques for a comfortable life are widely available, and even people with low incomes can access facilities to stay warm in winter and cool in summer. Many years ago, when there were no advanced systems for human needs, passive strategies played a large role in people's lives. This paper concentrates on a small city in Iran that used particular strategies to solve people's environmental problems. The city, Evaz, is located in the Fars region of Iran, about 20 km from Gerash and 370 km southeast of Shiraz. Evaz receives minimal rainfall, so water is scarce in the area; cisterns (water storage structures) have therefore been used for many years and are studied in more detail in this paper. In summer the climate is hot and dry, with daytime temperatures sometimes reaching around 46 °C. Winters are typically cold and likewise dry, while the climate is moderate in autumn and spring. This study identifies some of these past strategies and describes them in detail, analysing how they can be transformed and connected with modern and traditional fundamentals. Furthermore, the study develops solutions that combine modern and traditional design techniques to suggest more effective ways to save energy while remaining sustainable for the future.

  4. Analysis of Oxygen, Anaesthesia Agent and Flows in Anaesthesia Machine

    PubMed Central

    Garg, Rakesh; Gupta, Ramesh Chand

    2013-01-01

    Technical advances in anaesthesia workstations have made peri-operative anaesthesia safer. Apart from other monitoring options, respiratory gas analysis has become an integral part of the modern anaesthesia workstation. Monitoring devices such as an oxygen analyser with an audible alarm, a carbon dioxide analyser, and a vapour analyser (whenever a volatile anaesthetic is delivered) have also been recommended by various anaesthesia societies. This review article discusses various techniques for the analysis of flows, volumes and concentrations of various anaesthetic agents, including oxygen, nitrous oxide and volatile anaesthetic agents. PMID:24249881

  5. Cluster Analysis in Sociometric Research: A Pattern-Oriented Approach to Identifying Temporally Stable Peer Status Groups of Girls

    ERIC Educational Resources Information Center

    Zettergren, Peter

    2007-01-01

    A modern clustering technique was applied to age-10 and age-13 sociometric data with the purpose of identifying longitudinally stable peer status clusters. The study included 445 girls from a Swedish longitudinal study. The identified temporally stable clusters of rejected, popular, and average girls were essentially larger than corresponding…

  6. Portable Raman monitoring of modern cleaning and consolidation operations of artworks on mineral supports.

    PubMed

    Martínez-Arkarazo, I; Sarmiento, A; Maguregui, M; Castro, K; Madariaga, J M

    2010-08-01

    Any restoration performed on cultural heritage artworks must guarantee a low impact on the treated surfaces. Although completely risk-free methods do not exist, the use of tailor-made procedures and continuous monitoring with portable instrumentation is surely one of the best approaches to conducting a modern restoration process. In this work, portable Raman monitoring, sometimes combined with spectroscopic techniques providing elemental composition, is the key analytical technique in the proposed three-step restoration protocol: (a) in situ analysis of the surface to be treated (original composition and degradation products/pollutants) and of the cleaning agents used as extractants, (b) a thermodynamic study of the species involved in the treatment in order to design a suitable restoration method and (c) application and monitoring of the treatment. Two cleaning operations based on new technologies were studied and applied to two artworks on mineral supports: a wall painting affected by nitrate impact, and a black-crusted stone (chalk) altarpiece. Raman bands of nitrate and gypsum, respectively, decreased after the step-by-step operations in each case, which helped restorers decide when the treatment was complete, thus avoiding any further damage to the treated surfaces of the artworks.

  7. Professional Competence of a Teacher in Higher Educational Institution

    ERIC Educational Resources Information Center

    Abykanova, Bakytgul; Tashkeyeva, Gulmira; Idrissov, Salamat; Bilyalova, Zhupar; Sadirbekova, Dinara

    2016-01-01

    Modern reality brings certain corrections to the understanding of the forms and methods of teaching various courses in higher educational institutions. A special role among the educational techniques and means in the college educational environment is played by modern technologies, that is, by the techniques, means and approaches that are aimed at…

  8. Community to Classroom: Reflections on Community-Centered Pedagogy in Contemporary Modern Dance Technique

    ERIC Educational Resources Information Center

    Fitzgerald, Mary

    2017-01-01

    This article reflects on the ways in which socially engaged arts practices can contribute to reconceptualizing the contemporary modern dance technique class as a powerful site of social change. Specifically, the author considers how incorporating socially engaged practices into pedagogical models has the potential to foster responsible citizenship…

  9. Taming the Wild: A Unified Analysis of Hogwild!-Style Algorithms.

    PubMed

    De Sa, Christopher; Zhang, Ce; Olukotun, Kunle; Ré, Christopher

    2015-12-01

    Stochastic gradient descent (SGD) is a ubiquitous algorithm for a variety of machine learning problems. Researchers and industry have developed several techniques to optimize SGD's runtime performance, including asynchronous execution and reduced precision. Our main result is a martingale-based analysis that enables us to capture the rich noise models that may arise from such techniques. Specifically, we use our new analysis in three ways: (1) we derive convergence rates for the convex case (Hogwild!) with relaxed assumptions on the sparsity of the problem; (2) we analyze asynchronous SGD algorithms for non-convex matrix problems including matrix completion; and (3) we design and analyze an asynchronous SGD algorithm, called Buckwild!, that uses lower-precision arithmetic. We show experimentally that our algorithms run efficiently for a variety of problems on modern hardware.
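    As an illustration of the Hogwild! pattern analysed in the paper, the sketch below runs lock-free asynchronous SGD on a least-squares problem: several threads update a shared weight vector in place with no synchronisation. It is not the authors' code, and Python threading is used only to show the update pattern; a production implementation would use native threads, sparse updates, and possibly reduced precision as in Buckwild!.

```python
import threading
import numpy as np

def hogwild_sgd(X, y, n_threads=4, epochs=10, lr=0.01):
    """Lock-free asynchronous SGD for least squares (the Hogwild! pattern):
    every thread updates the shared weight vector in place without locking.
    Purely illustrative; real implementations exploit sparsity and native threads."""
    n, d = X.shape
    w = np.zeros(d)                                 # shared parameters

    def worker(rows):
        rng = np.random.default_rng()
        for _ in range(epochs):
            for i in rng.permutation(rows):
                grad = (X[i] @ w - y[i]) * X[i]     # gradient of 0.5*(x_i.w - y_i)^2
                np.subtract(w, lr * grad, out=w)    # unsynchronised in-place update

    chunks = np.array_split(np.arange(n), n_threads)
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w

# Tiny synthetic example
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=1000)
print(hogwild_sgd(X, y))
```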

  10. Forensic analysis of dyed textile fibers.

    PubMed

    Goodpaster, John V; Liszewski, Elisa A

    2009-08-01

    Textile fibers are a key form of trace evidence, and the ability to reliably associate or discriminate them is crucial for forensic scientists worldwide. While microscopic and instrumental analysis can be used to determine the composition of the fiber itself, additional specificity is gained by examining fiber color. This is particularly important when the bulk composition of the fiber is relatively uninformative, as it is with cotton, wool, or other natural fibers. Such analyses pose several problems, including extremely small sample sizes, the desire for nondestructive techniques, and the vast complexity of modern dye compositions. This review will focus on more recent methods for comparing fiber color by using chromatography, spectroscopy, and mass spectrometry. The increasing use of multivariate statistics and other data analysis techniques for the differentiation of spectra from dyed fibers will also be discussed.

  11. Quantitatively Excessive Normal Tissue Toxicity and Poor Target Coverage in Postoperative Lung Cancer Radiotherapy Meta-analysis.

    PubMed

    Abuodeh, Yazan; Naghavi, Arash O; Echevarria, Michelle; DeMarco, MaryLou; Tonner, Brian; Feygelman, Vladimir; Stevens, Craig W; Perez, Bradford A; Dilling, Thomas J

    2018-01-01

    A previous meta-analysis (MA) found postoperative radiotherapy (PORT) in lung cancer patients to be detrimental in N0/N1 patients and equivocal in the N2 setting. We hypothesized that treatment plans generated using MA protocols had worse dosimetric outcomes compared to modern plans. We retrieved plans for 13 patients who received PORT with modern planning. A plan was recreated for each patient using the 8 protocols included in MA. Dosimetric values were then compared between the modern and simulated MA plans. A total of 104 MA plans were generated. Median prescribed dose was 50.4 (range, 50-60) Gy in the modern plans and 53.2 (30-60) Gy in the MA protocols. Median planning volume coverage was 96% (93%-100%) in the modern plans, versus 58% (0%-100%) in the MA plans (P < .001). Internal target volume coverage was 100% (99%-100%) versus 65% (0%-100%), respectively (P < .001). Organs at risk received the following doses: spinal cord maximum dose, 36.8 (4.6-50.4) Gy versus 46.8 (2.9-74.0) Gy (P < .001); esophageal mean dose, 22.9 (5.5-35) Gy versus 30.5 (11.1-52.5) Gy (P = .003); heart V30 (percentage of volume of an organ receiving at least a dose of 30 Gy), 16% (0%-45%) versus 35% (0%-79%) (P = .047); mean lung dose, 12.4 (3.4-24.3) Gy versus 14.8 (4.1-27.4) Gy (P = .008); and lung V20, 18% (4%-34%) versus 25% (8%-67%) (P = .023). We quantitatively confirm the inferiority of the techniques used in the PORT MA. Our analysis showed a lower therapeutic ratio in the MA plans, which may explain the poor outcomes in the MA. The findings of the MA are not relevant in the era of modern treatment planning. Copyright © 2017 Elsevier Inc. All rights reserved.
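    For readers unfamiliar with the dose-volume notation used above (e.g. heart V30, lung V20, mean lung dose), the sketch below shows how such metrics are computed from a flat array of per-voxel organ doses. The function and the synthetic dose distribution are illustrative assumptions, not output from the study's planning system.

```python
import numpy as np

def dvh_metrics(dose_gy, threshold_gy):
    """Compute Vx (% of organ volume receiving at least `threshold_gy`)
    and the mean dose from per-voxel doses for one organ.
    Assumes equal voxel volumes."""
    dose_gy = np.asarray(dose_gy, dtype=float)
    vx = 100.0 * np.mean(dose_gy >= threshold_gy)
    return vx, dose_gy.mean()

# Illustrative per-voxel lung doses in Gy (synthetic distribution)
lung_dose = np.random.gamma(2.0, 6.0, size=50000)
v20, mean_lung = dvh_metrics(lung_dose, 20.0)
print(f"Lung V20 = {v20:.1f}%, mean lung dose = {mean_lung:.1f} Gy")
```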

  12. Double-negative metamaterial for mobile phone application

    NASA Astrophysics Data System (ADS)

    Hossain, M. I.; Faruque, M. R. I.; Islam, M. T.

    2017-01-01

    In this paper, a new design and analysis of a metamaterial and its applications to modern handsets are presented. The proposed metamaterial unit-cell design consists of two connected square spiral structures, which increases the effective medium ratio. The finite integration technique, as implemented in Computer Simulation Technology (CST) Microwave Studio, is utilized in this investigation, and the measurements are taken in an anechoic chamber. Good agreement is observed between simulated and measured results. The results indicate that the proposed metamaterial can successfully cover cellular phone frequency bands. Moreover, the use of the proposed metamaterial in modern handset antennas is also analyzed. The results reveal that the proposed metamaterial attachment significantly reduces specific absorption rate values without degrading the antenna performance.

  13. Advanced techniques in placental biology -- workshop report.

    PubMed

    Nelson, D M; Sadovsky, Y; Robinson, J M; Croy, B A; Rice, G; Kniss, D A

    2006-04-01

    Major advances in placental biology have been realized as new technologies have been developed and existing methods have been refined in many areas of biological research. Classical anatomy and whole-organ physiology tools once used to analyze placental structure and function have been supplanted by more sophisticated techniques adapted from molecular biology, proteomics, computational biology and bioinformatics. In addition, significant refinements in the morphological study of the placenta and its constituent cell types have improved our ability to assess form and function in a highly integrated manner. To offer an overview of modern technologies used by investigators to study the placenta, this workshop, Advanced Techniques in Placental Biology, assembled experts who discussed fundamental principles and real-time examples of four separate methodologies. Y. Sadovsky presented the principles of microRNA function as an endogenous mechanism of gene regulation. J. Robinson demonstrated the utility of correlative microscopy, in which light-level and transmission electron microscopy are combined to provide cellular and subcellular views of placental cells. A. Croy lectured on the use of microdissection techniques, which are invaluable for isolating very small subsets of cell types for molecular analysis. Finally, G. Rice presented an overview of methods for profiling complex protein mixtures within tissue and/or fluid samples that, when refined, will offer databases underpinning a systems approach to modern trophoblast biology.

  14. Cofactors in the RNA World

    NASA Technical Reports Server (NTRS)

    Ditzler, Mark A.

    2014-01-01

    RNA world theories figure prominently in many scenarios for the origin and early evolution of life. These theories posit that RNA molecules played a much larger role in ancient biology than they do now, acting both as the dominant biocatalysts and as the repository of genetic information. Many features of modern RNA biology are potential examples of molecular fossils from an RNA world, such as the pervasive involvement of nucleotides in coenzymes, the existence of natural aptamers that bind these coenzymes, the existence of natural ribozymes, a biosynthetic pathway in which deoxynucleotides are produced from ribonucleotides, and the central role of ribosomal RNA in protein synthesis in the peptidyl transferase center of the ribosome. Here, we use both a top-down approach that evaluates RNA function in modern biology and a bottom-up approach that examines the capacities of RNA independent of modern biology. These complementary approaches exploit multiple in vitro evolution techniques coupled with high-throughput sequencing and bioinformatics analysis. Together they advance our understanding of the most primitive organisms, their early evolution, and their eventual transition to modern biochemistry.

  15. Introduction to the Minireview Series on Modern Technologies for In-cell Biochemistry.

    PubMed

    Lutsenko, Svetlana

    2016-02-19

    The last decade has seen enormous progress in the exploration and understanding of the behavior of molecules in their natural cellular environments at increasingly high spatial and temporal resolution. Advances in microscopy and the development of new fluorescent reagents as well as genetic editing techniques have enabled quantitative analysis of protein interactions, intracellular trafficking, metabolic changes, and signaling. Modern biochemistry now faces new and exciting challenges. Can experiments that are traditionally performed "in vitro", e.g. analysis of protein folding and conformational transitions, be done in cells? Can the structure and behavior of endogenous and/or non-tagged recombinant proteins be analyzed and altered within the cell or in cellular compartments? How can molecules and their actions be studied mechanistically in tissues and organs? Is personalized cellular biochemistry a reality? This thematic series summarizes recent studies that illustrate some first steps toward successfully answering these modern biochemical questions. The first minireview focuses on the utilization of three-dimensional primary enteroids and organoids for mechanistic studies of intestinal biology with molecular resolution. The second minireview describes the application of single-chain antibodies (nanobodies) for monitoring and regulating protein dynamics in vitro and in cells. The third minireview highlights advances in using NMR spectroscopy for the analysis of protein folding and assembly in cells. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Contemporary imaging of mild TBI: the journey toward diffusion tensor imaging to assess neuronal damage.

    PubMed

    Fox, W Christopher; Park, Min S; Belverud, Shawn; Klugh, Arnett; Rivet, Dennis; Tomlin, Jeffrey M

    2013-04-01

    The aim was to follow the progression of neuroimaging as a means of non-invasive evaluation of mild traumatic brain injury (mTBI), in order to provide recommendations based on reproducible, defined imaging findings. A comprehensive literature review and analysis of contemporary published articles was performed to study the progression of neuroimaging findings as a non-invasive 'biomarker' for mTBI. Multiple imaging modalities exist to support the evaluation of patients with mTBI, including ultrasound (US), computed tomography (CT), single photon emission computed tomography (SPECT), positron emission tomography (PET), and magnetic resonance imaging (MRI). These techniques continue to evolve with the development of fractional anisotropy (FA), fiber tractography (FT), and diffusion tensor imaging (DTI). Modern imaging techniques, when applied in the appropriate clinical setting, may serve as a valuable tool for the diagnosis and management of patients with mTBI. An understanding of modern neuroanatomical imaging will enhance our ability to analyse injury and recognize the manifestations of mTBI.

  17. Hypothesis Testing, "p" Values, Confidence Intervals, Measures of Effect Size, and Bayesian Methods in Light of Modern Robust Techniques

    ERIC Educational Resources Information Center

    Wilcox, Rand R.; Serang, Sarfaraz

    2017-01-01

    The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…

  18. Constraints on primary and secondary particulate carbon sources using chemical tracer and 14C methods during CalNex-Bakersfield

    NASA Astrophysics Data System (ADS)

    Sheesley, Rebecca J.; Nallathamby, Punith Dev; Surratt, Jason D.; Lee, Anita; Lewandowski, Michael; Offenberg, John H.; Jaoui, Mohammed; Kleindienst, Tadeusz E.

    2017-10-01

    The present study investigates primary and secondary sources of organic carbon for Bakersfield, CA, USA as part of the 2010 CalNex study. The method used here involves integrated sampling designed to allow detailed and specific chemical analysis of particulate matter (PM) in the Bakersfield airshed. To achieve this objective, filter samples were taken during thirty-four 23-hr periods between 19 May and 26 June 2010 and analyzed for organic tracers by gas chromatography-mass spectrometry (GC-MS). Contributions to organic carbon (OC) were determined by two organic tracer-based techniques: primary OC by chemical mass balance and secondary OC by a mass fraction method. Radiocarbon (14C) measurements of the total organic carbon were also made to determine the split between modern and fossil carbon and thereby constrain unknown sources of OC not accounted for by either tracer-based attribution technique. From the analysis, OC contributions from four primary sources and four secondary sources were determined, comprising three sources of modern carbon and five sources of fossil carbon. The major primary sources of OC were vegetative detritus (9.8%), diesel exhaust (2.3%), gasoline exhaust (<1.0%), and lubricating-oil-impacted motor vehicle exhaust (30%); measured secondary sources resulted from isoprene (1.5%), α-pinene (<1.0%), toluene (<1.0%), and naphthalene (<1.0%, as an upper limit) contributions. The average observed organic carbon (OC) was 6.42 ± 2.33 μgC m-3. The 14C-derived apportionment indicated that modern and fossil components were nearly equivalent on average; however, the fossil contribution ranged from 32% to 66% over the five-week campaign. With the fossil primary and secondary sources aggregated, only 25% of the fossil organic carbon could not be attributed. In contrast, nearly 80% of the modern carbon could not be attributed to the primary and secondary sources accessible to this analysis, which included tracers of biomass burning, vegetative detritus and secondary biogenic carbon. The results of the current study contribute a source-based evaluation of the carbonaceous aerosol at CalNex Bakersfield.
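    The modern/fossil split from a radiocarbon measurement is conventionally obtained from the sample's fraction of modern carbon. The sketch below shows that arithmetic under an assumed contemporary-carbon reference value; the record does not state the value used in the study, so the 1.05 default and the example measurement are illustrative assumptions.

```python
def fossil_modern_split(fraction_modern_measured, fm_contemporary=1.05):
    """Split total organic carbon into modern and fossil fractions from a
    radiocarbon measurement. Fossil carbon has a fraction modern (fm) of 0;
    contemporary (biogenic/biomass) carbon is assigned fm_contemporary, an
    assumed value accounting for bomb-14C enrichment (the study-specific
    reference may differ)."""
    modern = min(fraction_modern_measured / fm_contemporary, 1.0)
    return modern, 1.0 - modern

# Example: a filter sample measured at fm = 0.52 (hypothetical value)
modern_frac, fossil_frac = fossil_modern_split(0.52)
print(f"modern: {modern_frac:.0%}, fossil: {fossil_frac:.0%}")
```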

  19. Constraints on primary and secondary particulate carbon sources using chemical tracer and 14C methods during CalNex-Bakersfield

    PubMed Central

    Sheesley, Rebecca J.; Nallathamby, Punith Dev; Surratt, Jason D.; Lee, Anita; Lewandowski, Michael; Offenberg, John H.; Jaoui, Mohammed; Kleindienst, Tadeusz E.

    2018-01-01

    The present study investigates primary and secondary sources of organic carbon for Bakersfield, CA, USA as part of the 2010 CalNex study. The method used here involves integrated sampling designed to allow detailed and specific chemical analysis of particulate matter (PM) in the Bakersfield airshed. To achieve this objective, filter samples were taken during thirty-four 23-hr periods between 19 May and 26 June 2010 and analyzed for organic tracers by gas chromatography-mass spectrometry (GC-MS). Contributions to organic carbon (OC) were determined by two organic tracer-based techniques: primary OC by chemical mass balance and secondary OC by a mass fraction method. Radiocarbon (14C) measurements of the total organic carbon were also made to determine the split between modern and fossil carbon and thereby constrain unknown sources of OC not accounted for by either tracer-based attribution technique. From the analysis, OC contributions from four primary sources and four secondary sources were determined, comprising three sources of modern carbon and five sources of fossil carbon. The major primary sources of OC were vegetative detritus (9.8%), diesel exhaust (2.3%), gasoline exhaust (<1.0%), and lubricating-oil-impacted motor vehicle exhaust (30%); measured secondary sources resulted from isoprene (1.5%), α-pinene (<1.0%), toluene (<1.0%), and naphthalene (<1.0%, as an upper limit) contributions. The average observed organic carbon (OC) was 6.42 ± 2.33 μgC m−3. The 14C-derived apportionment indicated that modern and fossil components were nearly equivalent on average; however, the fossil contribution ranged from 32% to 66% over the five-week campaign. With the fossil primary and secondary sources aggregated, only 25% of the fossil organic carbon could not be attributed. In contrast, nearly 80% of the modern carbon could not be attributed to the primary and secondary sources accessible to this analysis, which included tracers of biomass burning, vegetative detritus and secondary biogenic carbon. The results of the current study contribute a source-based evaluation of the carbonaceous aerosol at CalNex Bakersfield. PMID:29681757

  20. Laughter in popular games and in sport. The other health of human play.

    PubMed

    Eichberg, Henning

    2013-01-01

    Hurling in Cornwall, la soule in Brittany, Shrovetide football in England: popular games have normally been treated as forerunners of modern sport, sport having regulated the space and the time of the game, the (non-)violence of behaviour, the control of results, and the planning, strategy, tactics, techniques and evaluation of the competitive action. This is told as a story of social improvement and progress--and of turning unhealthy wildness into civilized 'healthy' sport activity. What sociological analyses of game-playing tended to ignore was the laughter of the participants. With the seriousness of modern sport, as it was established in the nineteenth century, a culture of laughter disappeared. This study tries to counter this mainstream view with a phenomenology of laughter in popular games. A contrasting attention is turned towards the seriousness of sporting competition, the smile in modern sport and fitness, and the 'underground' dimension of laughter in modern sports. In comparative analysis, laughter is revealed as a bodily discourse about the imperfect human being. It tells an oppositional story about the perfectionism in the order of Western thinking--in sports as well as in health. The bodily 'physiology' of laughter, the exploding psychical energy, and the inter-bodily social relations in laughter, play and games point towards the multi-dimensionality of health as formulated by the WHO: "physical, mental, and social well-being".

  1. Reconstructing Tropical Southwest Pacific Climate Variability and Mean State Changes at Vanuatu during the Medieval Climate Anomaly using Geochemical Proxies from Corals

    NASA Astrophysics Data System (ADS)

    Lawman, A. E.; Quinn, T. M.; Partin, J. W.; Taylor, F. W.; Thirumalai, K.; WU, C. C.; Shen, C. C.

    2017-12-01

    The Medieval Climate Anomaly (MCA: 950-1250 CE) is identified as a period during the last 2 millennia with Northern Hemisphere surface temperatures similar to the present. However, our understanding of tropical climate variability during the MCA is poorly constrained due to a lack of sub-annually resolved proxy records. We investigate seasonal and interannual variability during the MCA using geochemical records developed from two well-preserved Porites lutea fossilized corals from the tropical southwest Pacific (Tasmaloum, Vanuatu; 15.6°S, 166.9°E). Absolute U/Th dates of 1127.1 ± 2.7 CE and 1105.1 ± 3.0 CE indicate that the selected fossil corals lived during the MCA. We use paired coral Sr/Ca and δ18O measurements to reconstruct sea surface temperature (SST) and the δ18O of seawater (a proxy for salinity). To provide context for the fossil coral records and test whether the mean state and climate variability at Vanuatu during the MCA are similar to the modern climate, our analysis also incorporates two modern coral records from Sabine Bank (15.9°S, 166.0°E) and Malo Channel (15.7°S, 167.2°E), Vanuatu, for comparison. We quantify the uncertainty in our modern and fossil coral SST estimates via replication with multiple, overlapping coral records. Both the modern and fossil corals reproduce their respective mean SST value over their common period of overlap, which is 25 years in both cases. Based on over 100 years of monthly Sr/Ca data from each time period, we find that SSTs at Vanuatu during the MCA are 1.3 ± 0.7°C cooler relative to the modern. We also find that the median amplitude of the annual cycle is 0.8 ± 0.3°C larger during the MCA relative to the modern. Multiple data analysis techniques, including the standard deviation and the difference between the 95th and 5th percentiles of the annual SST cycle estimates, also show that the MCA has greater annual SST variability relative to the modern. Stable isotope data acquisition is ongoing, and when complete we will have a suite of records of paired coral Sr/Ca and δ18O measurements. We will apply similar statistical techniques developed for the Sr/Ca-SST record to investigate variability in the δ18O of seawater (salinity). Modern salinity variability at Vanuatu arises from hydrological anomalies associated with the El Niño-Southern Oscillation in the tropical Pacific.
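    The annual-cycle statistics quoted above come from monthly Sr/Ca converted to SST. As a minimal sketch of that step, the code below applies a generic linear Sr/Ca-SST calibration and computes the per-year amplitude and the 95th-5th percentile spread; the calibration slope and intercept and the synthetic record are assumptions, not the study's site-specific values.

```python
import numpy as np

def srca_to_sst(sr_ca_mmol_mol, slope=-0.06, intercept=10.5):
    """Convert coral Sr/Ca (mmol/mol) to SST (deg C) with a linear calibration
    Sr/Ca = slope * SST + intercept, i.e. SST = (Sr/Ca - intercept) / slope.
    The slope and intercept here are placeholder values; published Porites
    calibrations are site- and colony-specific."""
    return (np.asarray(sr_ca_mmol_mol, dtype=float) - intercept) / slope

def annual_cycle_stats(monthly_sst):
    """Annual-cycle statistics from a monthly SST series: the median per-year
    (max - min) amplitude, plus the 95th-5th percentile spread."""
    sst = np.asarray(monthly_sst, dtype=float)
    years = sst[: (sst.size // 12) * 12].reshape(-1, 12)
    amplitude = years.max(axis=1) - years.min(axis=1)
    return np.median(amplitude), np.percentile(sst, 95) - np.percentile(sst, 5)

# Synthetic 30-year monthly Sr/Ca record (illustrative only)
t = np.arange(360)
sr_ca = 8.9 + 0.1 * np.cos(2 * np.pi * t / 12) + 0.01 * np.random.randn(360)
print(annual_cycle_stats(srca_to_sst(sr_ca)))
```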

  2. Calculating lunar retreat rates using tidal rhythmites

    USGS Publications Warehouse

    Kvale, E.P.; Johnson, H.W.; Sonett, C.P.; Archer, A.W.; Zawistoski, A.N.N.

    1999-01-01

    Tidal rhythmites are small-scale sedimentary structures that can preserve a hierarchy of astronomically induced tidal periods. They can also preserve a record of periodic nontidal sedimentation. If properly interpreted and understood, tidal rhythmites can be an important component of paleoastronomy and can be used to extract information on ancient lunar orbital dynamics, including changes in the Earth-Moon distance through geologic time. Herein we present techniques that can be used to calculate ancient Earth-Moon distances. Each of these techniques, when used on a modern high-tide data set, results in calculated estimates of lunar orbital periods and an Earth-Moon distance that fall well within 1 percent of the actual values. Comparisons to results from modern tidal data indicate that ancient tidal rhythmite records as short as 4 months can provide suitable estimates of lunar orbital periods if these tidal records are complete. An understanding of basic tidal theory allows for the evaluation of the completeness of the ancient tidal record as derived from an analysis of tidal rhythmites. Utilizing the techniques presented herein, it appears from the rock record that lunar orbital retreat slowed sometime during the mid-Paleozoic. Copyright © 1999, SEPM (Society for Sedimentary Geology).
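    The record does not reproduce the authors' formulas, but one standard step in such reconstructions is converting an inferred ancient lunar orbital (sidereal) period into an Earth-Moon distance through Kepler's third law. The sketch below shows only that two-body conversion, using present-day constants; it is not the paper's full procedure.

```python
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24     # kg
M_MOON = 7.346e22      # kg

def earth_moon_distance(sidereal_month_days):
    """Semi-major axis of the lunar orbit (m) from the sidereal month via
    Kepler's third law: a^3 = G (M_E + M_M) T^2 / (4 pi^2).
    A simplified two-body estimate for illustration only."""
    T = sidereal_month_days * 86400.0
    return (G * (M_EARTH + M_MOON) * T**2 / (4.0 * math.pi**2)) ** (1.0 / 3.0)

# The modern sidereal month (~27.32 days) recovers ~3.84e8 m;
# a shorter ancient month implies a smaller Earth-Moon distance.
print(earth_moon_distance(27.32))
print(earth_moon_distance(25.0))
```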

  3. Music-therapy analyzed through conceptual mapping

    NASA Astrophysics Data System (ADS)

    Martinez, Rodolfo; de la Fuente, Rebeca

    2002-11-01

    Conceptual maps have lately been employed as a learning tool, as a modern study technique, and as a new way to understand intelligence, which allows the development of a strong theoretical framework against which to test a research hypothesis. This paper presents a music-therapy analysis based on this tool, producing a conceptual mapping network that ranges from magic to the rigor of the hard sciences.

  4. Scaling Support Vector Machines On Modern HPC Platforms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Yang; Fu, Haohuan; Song, Shuaiwen

    2015-02-01

    We designed and implemented MIC-SVM, a highly efficient parallel SVM for x86-based multicore and many-core architectures, such as Intel Ivy Bridge CPUs and the Intel Xeon Phi co-processor (MIC). We propose several novel analysis methods and optimization techniques to fully utilize the multilevel parallelism provided by these architectures; these techniques can also serve as general optimization methods for other machine learning tools.

  5. On Enthusing Students about Big Data and Social Media Visualization and Analysis Using R, RStudio, and RMarkdown

    ERIC Educational Resources Information Center

    Stander, Julian; Dalla Valle, Luciana

    2017-01-01

    We discuss the learning goals, content, and delivery of a University of Plymouth intensive module delivered over four weeks entitled MATH1608PP Understanding Big Data from Social Networks, aimed at introducing students to a broad range of techniques used in modern Data Science. This module made use of R, accessed through RStudio, and some popular…

  6. FT. Sam 91 Whiskey Combat Medic Medical Simulation Training Quantitative Integration Enhancement Program

    DTIC Science & Technology

    2011-07-01

    …joined the project team in the statistical and research coordination role. Dr. Collin is an employee of the University of Pittsburgh. A successful… 3. Submit to Ft. Detrick (completed). Milestone: statistical analysis planning; 1. Review planned data metrics and data-gathering tools… approach to performance assessment for continuous quality improvement. Analyzing data with modern statistical techniques to determine the…

  7. Skin prick tests and allergy diagnosis.

    PubMed

    Antunes, João; Borrego, Luís; Romeira, Ana; Pinto, Paula

    2009-01-01

    Skin testing remains an essential diagnostic tool in modern allergy practice. Significant variability has been reported regarding technical procedures, interpretation of results and documentation. This review aims to consolidate methodological recommendations through a critical analysis of past and recent data. This will allow a better understanding of skin prick test (SPT) history; technique; (contra-)indications; interpretation of results; diagnostic pitfalls; adverse reactions; and variability factors.

  8. Error modelling of quantum Hall array resistance standards

    NASA Astrophysics Data System (ADS)

    Marzano, Martina; Oe, Takehiko; Ortolano, Massimo; Callegaro, Luca; Kaneko, Nobu-Hisa

    2018-04-01

    Quantum Hall array resistance standards (QHARSs) are integrated circuits composed of interconnected quantum Hall effect elements that allow the realization of virtually arbitrary resistance values. In recent years, techniques have been presented to efficiently design QHARS networks. An open problem is the evaluation of the accuracy of a QHARS, which is affected by contact and wire resistances. In this work, we present a general and systematic procedure for the error modelling of QHARSs, based on modern circuit analysis techniques and Monte Carlo evaluation of the uncertainty. As a practical example, the method is applied to the characterization of a 1 MΩ QHARS developed by the National Metrology Institute of Japan. Software tools are provided to apply the procedure to other arrays.
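    To illustrate the Monte Carlo part of such an error model, the sketch below propagates random wire resistances through a toy series-parallel array of quantized Hall elements and reports the relative deviation from the nominal value. The network topology, the wire-resistance distribution, and all numerical choices are assumptions for illustration, not the NMIJ array analysed in the paper.

```python
import numpy as np

R_H = 25812.807 / 2.0     # quantized Hall resistance of one element at filling factor 2 (ohm)

def qhars_monte_carlo(n_series=10, n_parallel=2, wire_mohm=1.0, trials=100_000, seed=1):
    """Monte Carlo evaluation of the relative error of a toy series-parallel
    array of quantum Hall elements caused by wire/contact resistances.
    Each of the n_parallel branches holds n_series elements plus one random
    series wire resistance drawn uniformly from [0, wire_mohm] milliohm."""
    rng = np.random.default_rng(seed)
    nominal = n_series * R_H / n_parallel
    wires = rng.uniform(0.0, wire_mohm * 1e-3, size=(trials, n_parallel))
    branch = n_series * R_H + wires                 # resistance of each branch
    total = 1.0 / np.sum(1.0 / branch, axis=1)      # parallel combination
    rel_err = (total - nominal) / nominal
    return rel_err.mean(), rel_err.std()

mean_err, std_err = qhars_monte_carlo()
print(f"relative error: {mean_err:.2e} +/- {std_err:.2e}")
```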

  9. On three-dimensional misorientation spaces.

    PubMed

    Krakow, Robert; Bennett, Robbie J; Johnstone, Duncan N; Vukmanovic, Zoja; Solano-Alvarez, Wilberth; Lainé, Steven J; Einsle, Joshua F; Midgley, Paul A; Rae, Catherine M F; Hielscher, Ralf

    2017-10-01

    Determining the local orientation of crystals in engineering and geological materials has become routine with the advent of modern crystallographic mapping techniques. These techniques enable many thousands of orientation measurements to be made, directing attention towards how such orientation data are best studied. Here, we provide a guide to the visualization of misorientation data in three-dimensional vector spaces, reduced by crystal symmetry, to reveal crystallographic orientation relationships. Domains for all point group symmetries are presented, and an analysis methodology is developed and applied to identify crystallographic relationships, indicated by clusters in the misorientation space, in examples from materials science and geology. This analysis aids the determination of active deformation mechanisms, and the evaluation of cluster centres and spread enables a more accurate description of transformation processes, supporting arguments regarding provenance.

  10. On three-dimensional misorientation spaces

    NASA Astrophysics Data System (ADS)

    Krakow, Robert; Bennett, Robbie J.; Johnstone, Duncan N.; Vukmanovic, Zoja; Solano-Alvarez, Wilberth; Lainé, Steven J.; Einsle, Joshua F.; Midgley, Paul A.; Rae, Catherine M. F.; Hielscher, Ralf

    2017-10-01

    Determining the local orientation of crystals in engineering and geological materials has become routine with the advent of modern crystallographic mapping techniques. These techniques enable many thousands of orientation measurements to be made, directing attention towards how such orientation data are best studied. Here, we provide a guide to the visualization of misorientation data in three-dimensional vector spaces, reduced by crystal symmetry, to reveal crystallographic orientation relationships. Domains for all point group symmetries are presented, and an analysis methodology is developed and applied to identify crystallographic relationships, indicated by clusters in the misorientation space, in examples from materials science and geology. This analysis aids the determination of active deformation mechanisms, and the evaluation of cluster centres and spread enables a more accurate description of transformation processes, supporting arguments regarding provenance.

  11. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeological research. SSM is a full-field, multi-material analytical technique, and is presented here as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique's application to inter-sample comparison was demonstrated through analysis of the principal component (PC) weights. It was found that SSM can provide highly detailed qualitative and quantitative insight into archaeological inter- and intra-sample variability. The technique has value for archaeological, biomechanical and forensic applications, including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
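    The PCA step at the core of such a shape model can be sketched in a few lines: given registered shapes flattened into vectors, the mean shape, the modes of variation, and the per-specimen PC weights fall out of a singular value decomposition. The array sizes and synthetic data below are assumptions; alignment, registration, and the enamel/dentin parameter are omitted.

```python
import numpy as np

def build_ssm(shapes):
    """Build a PCA-based statistical shape model from aligned shapes.

    shapes: array of shape (n_samples, n_points * n_dims), one row per
    registered specimen. Returns the mean shape, the principal components
    (modes of variation), the per-specimen PC weights used for inter-sample
    comparison, and the fraction of variance explained by each mode."""
    shapes = np.asarray(shapes, dtype=float)
    mean_shape = shapes.mean(axis=0)
    centred = shapes - mean_shape
    # SVD of the centred data gives the principal components directly
    U, S, Vt = np.linalg.svd(centred, full_matrices=False)
    components = Vt                      # rows are modes of variation
    weights = centred @ Vt.T             # PC scores for each specimen
    explained = S**2 / np.sum(S**2)      # fraction of variance per mode
    return mean_shape, components, weights, explained

# Illustrative: 20 specimens, 100 surface points in 3-D
rng = np.random.default_rng(0)
shapes = rng.normal(size=(20, 300))
mean_shape, modes, weights, explained = build_ssm(shapes)
print(explained[:3])
```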

  12. Online marketing of food and beverages to children: a content analysis.

    PubMed

    Brady, Jennifer; Mendelson, Rena; Farrell, Amber; Wong, Sharon

    2010-01-01

    The goal was to assess websites sponsored by food and beverage manufacturers that have pledged to market branded food and beverage products to children responsibly, by ratifying the Children's Food and Beverage Advertising Initiative (CFBAI). A content analysis was conducted of 24 purposively sampled websites sponsored by 10 companies that promote food and beverage products to children. All are participant members of the CFBAI. Of the 24 websites analyzed, the majority targeted children below age 12 (83%). An array of innovative online marketing techniques, most notably free website membership (63%), leader boards (50%), adver-games (79%), and branded downloadable content (76%), were used to encourage children's engagement with branded food and beverage promotions. Food and beverage manufacturers are engaging children with dynamic online marketing techniques that challenge regulatory codes governing broadcast media. These techniques may contradict the spirit of the CFBAI. Innovative regulatory guidelines are needed to address modern marketing media.

  13. Air-to-air radar flight testing

    NASA Astrophysics Data System (ADS)

    Scott, Randall E.

    1988-06-01

    This volume in the AGARD Flight Test Techniques Series describes flight test techniques, flight test instrumentation, ground simulation, data reduction and analysis methods used to determine the performance characteristics of a modern air-to-air (a/a) radar system. Following a general coverage of specification requirements, test plans, support requirements, development and operational testing, and management information systems, the report goes into more detailed flight test techniques covering a/a radar capabilities of: detection, manual acquisition, automatic acquisition, tracking a single target, and detection and tracking of multiple targets. There follows a section on additional flight test considerations such as electromagnetic compatibility, electronic countermeasures, displays and controls, degraded and backup modes, radome effects, environmental considerations, and use of testbeds. Other sections cover ground simulation, flight test instrumentation, and data reduction and analysis. The final sections deal with reporting and a discussion of considerations for the future and how they may affect radar flight testing.

  14. Microsystems in medicine.

    PubMed

    Wallrabe, U; Ruther, P; Schaller, T; Schomburg, W K

    1998-03-01

    The complexity of modern surgical and analytical methods requires the miniaturisation of many medical devices. The LIGA technique and also mechanical microengineering are well known for the batch fabrication of microsystems. Actuators and sensors are developed based on these techniques. The hydraulic actuation principle is advantageous for medical applications since the energy may be supplied by pressurised balanced salt solution. Some examples are turbines, pumps and valves. In addition, optical sensors and components are useful for analysis and inspection as represented by microspectrometers and spherical lenses. Finally, plastic containers with microporous bottoms allow a 3-dimensional growth of cell culture systems.

  15. Spacecraft Charging Calculations: NASCAP-2K and SEE Spacecraft Charging Handbook

    NASA Technical Reports Server (NTRS)

    Davis, V. A.; Neergaard, L. F.; Mandell, M. J.; Katz, I.; Gardner, B. M.; Hilton, J. M.; Minor, J.

    2002-01-01

    For fifteen years, the NASA/Air Force Charging Analyzer Program for Geosynchronous Orbits (NASCAP/GEO) has been the workhorse of spacecraft charging calculations. Two new tools, the Space Environment and Effects (SEE) Spacecraft Charging Handbook (recently released) and Nascap-2K (under development), use improved numeric techniques and modern user interfaces to tackle the same problem. The SEE Spacecraft Charging Handbook provides first-order, lower-resolution solutions, while Nascap-2K provides higher-resolution results appropriate for detailed analysis. This paper illustrates how the improvements in the numeric techniques affect the results.

  16. Polyatomic interferences on high precision uranium isotope ratio measurements by MC-ICP-MS: Applications to environmental sampling for nuclear safeguards

    DOE PAGES

    Pollington, Anthony D.; Kinman, William S.; Hanson, Susan K.; ...

    2015-09-04

    Modern mass spectrometry and separation techniques have made measurement of major uranium isotope ratios a routine task; however, accurate and precise measurement of the minor uranium isotopes remains a challenge as sample size decreases. One particular challenge is the presence of isobaric interferences and their impact on the accuracy of measurements of the minor isotopes 234U and 236U. Here, we present techniques used for routine U isotopic analysis of environmental nuclear safeguards samples and evaluate polyatomic interferences that negatively impact accuracy, as well as methods to mitigate their impacts.

  17. Modern Material Analysis Instruments Add a New Dimension to Materials Characterization and Failure Analysis

    NASA Technical Reports Server (NTRS)

    Panda, Binayak

    2009-01-01

    Modern analytical tools can yield invaluable results during materials characterization and failure analysis. Scanning electron microscopes (SEMs) provide significant analytical capabilities, including angstrom-level resolution. These systems can be equipped with a silicon drift detector (SDD) for very fast yet precise analytical mapping of phases, as well as electron back-scattered diffraction (EBSD) units to map grain orientations, chambers that admit large samples, variable pressure for wet samples, and quantitative analysis software to examine phases. Advanced solid-state electronics have also improved surface and bulk analysis instruments: Secondary ion mass spectroscopy (SIMS) can quantitatively determine and map light elements such as hydrogen, lithium, and boron - with their isotopes. Its high sensitivity detects impurities at parts per billion (ppb) levels. X-ray photo-electron spectroscopy (XPS) can determine oxidation states of elements, as well as identifying polymers and measuring film thicknesses on coated composites. This technique is also known as electron spectroscopy for chemical analysis (ESCA). Scanning Auger electron spectroscopy (SAM) combines surface sensitivity, spatial lateral resolution (10 nm), and depth profiling capabilities to describe elemental compositions of near and below surface regions down to the chemical state of an atom.

  18. Technologies of stage magic: Simulation and dissimulation.

    PubMed

    Smith, Wally

    2015-06-01

    The craft of stage magic is presented in this article as a site to study the interplay of people and technology. The focus is on conjuring in the 19th and early 20th centuries, a time when magicians eagerly appropriated new optical, mechanical and electrical technologies into their acts. Also at this time, a modern style of conjuring emerged, characterized by minimal apparatus and a natural manner of performance. Applying Lucy Suchman's perspective of human-machine reconfigurations, conjuring in this modern style is interpreted as an early form of simulation, coupled with techniques of dissimulation. Magicians simulated the presence of supernatural agency for public audiences, while dissimulating the underlying methods and mechanisms. Dissimulation implies that the secret inner workings of apparatus were not simply concealed but were rendered absent. This, in turn, obscured the production of supernatural effects in the translation of agencies within an assembly of performers, assistants, apparatus, apparatus-builders, and so on. How this was achieved is investigated through an analysis of key instructional texts written by and for magicians working in the modern style. Techniques of dissimulation are identified in the design of apparatus for three stage illusions, and in the new naturalness of the performer's manner. To explore the significance of this picture of stage magic, and its reliance on techniques of dissimulation, a parallel is drawn between conjuring and recent performances of computerized life forms, especially those of social robotics. The paper concludes by considering what is revealed about the production of agency in stage magic's peculiar human-machine assemblies.

  19. Analysis of view synthesis prediction architectures in modern coding standards

    NASA Astrophysics Data System (ADS)

    Tian, Dong; Zou, Feng; Lee, Chris; Vetro, Anthony; Sun, Huifang

    2013-09-01

    Depth-based 3D formats are currently being developed as extensions to both the AVC and HEVC standards. The availability of depth information facilitates the generation of intermediate views for advanced 3D applications and displays, and also enables more efficient coding of the multiview input data through view synthesis prediction techniques. This paper outlines several approaches that have been explored to realize view synthesis prediction in modern video coding standards such as AVC and HEVC. The benefits and drawbacks of various architectures are analyzed in terms of performance, complexity, and other design considerations. It is concluded that block-based VSP for multiview video signals provides attractive coding gains with complexity comparable to that of traditional motion/disparity compensation.
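
    As a rough illustration of the view synthesis idea underlying VSP, the sketch below forward-warps a reference frame using its depth map, assuming a rectified, purely horizontal camera shift so that each pixel moves by the disparity f·B/Z. The function name, focal length and baseline are hypothetical placeholders, not part of the AVC/HEVC extensions discussed above.

    ```python
    import numpy as np

    def synthesize_view(ref_img, depth, f=1000.0, baseline=0.05):
        """Forward-warp a reference view to a neighbouring viewpoint.

        Assumes a rectified, purely horizontal camera shift, so each pixel
        moves by disparity d = f * baseline / depth.  Z-buffering keeps the
        nearest sample when several pixels land on the same target column.
        """
        h, w = depth.shape
        synth = np.zeros_like(ref_img)
        zbuf = np.full((h, w), np.inf)
        for y in range(h):
            for x in range(w):
                d = f * baseline / depth[y, x]          # disparity in pixels
                xt = int(round(x - d))                  # target column
                if 0 <= xt < w and depth[y, x] < zbuf[y, xt]:
                    zbuf[y, xt] = depth[y, x]
                    synth[y, xt] = ref_img[y, x]
        return synth
    ```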

  20. Ulisse Aldrovandi's Pandechion epistemonicon and the use of paper technology in Renaissance natural history.

    PubMed

    Kraemer, Fabian

    2014-01-01

    Reconstructing the formation and use of the hitherto neglected Pandechion epistemonicon, Ulisse Aldrovandi's (1522-1605) extant manuscript encyclopaedia, this article shows that early modern naturalists in many ways shared a world of paper with the members of several other professions. An analysis of the Pandechion suggests that Renaissance naturalists who applied the humanist jack-of-all-trades, the commonplace book, in their own field sometimes considerably altered its form. Aldrovandi tested and recombined different techniques so as to arrive at the paper technology that he considered to be the most fit for his purposes. He thereby drew on administrative practices as well as on the bookkeeping practices of early modern merchants that he knew first-hand.

  1. Recent development of electrochemiluminescence sensors for food analysis.

    PubMed

    Hao, Nan; Wang, Kun

    2016-10-01

    Food quality and safety are closely related to human health. In the face of unceasing food safety incidents, various analytical techniques, such as mass spectrometry, chromatography, spectroscopy, and electrochemistry, have been applied in food analysis. High sensitivity usually requires expensive instruments and complicated procedures. Although these modern analytical techniques are sensitive enough to ensure food safety, sometimes their applications are limited because of the cost, usability, and speed of analysis. Electrochemiluminescence (ECL) is a powerful analytical technique that is attracting more and more attention because of its outstanding performance. In this review, the mechanisms of ECL and common ECL luminophores are briefly introduced. Then an overall review of the principles and applications of ECL sensors for food analysis is provided. ECL can be flexibly combined with various separation techniques. Novel materials (e.g., various nanomaterials) and strategies (e.g., immunoassay, aptasensors, and microfluidics) have been progressively introduced into the design of ECL sensors. By illustrating some selected representative works, we summarize the state of the art in the development of ECL sensors for toxins, heavy metals, pesticides, residual drugs, illegal additives, viruses, and bacteria. Compared with other methods, ECL can provide rapid, low-cost, and sensitive detection for various food contaminants in complex matrixes. However, there are also some limitations and challenges. Improvements suited to the characteristics of food analysis are still necessary.

  2. "Reinventing" Techniques for the Estimation of the Area of Irregular Plane Figures: From the Eighteenth Century to the Modern Classroom

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis

    2010-01-01

    The issue of the area of irregular shapes is absent from the modern mathematical textbooks in elementary education in Greece. However, there exists a collection of books written for educational purposes by famous Greek scholars dating from the eighteenth century, which propose certain techniques concerning the estimation of the area of such…

  3. OPERATIONS RESEARCH IN THE DESIGN OF MANAGEMENT INFORMATION SYSTEMS

    DTIC Science & Technology

    management information systems is concerned with the identification and detailed specification of the information and data processing...of advanced data processing techniques in management information systems today, the close coordination of operations research and data systems activities has become a practical necessity for the modern business firm.... information systems in which mathematical models are employed as the basis for analysis and systems design. Operations research provides a

  4. Tin Oxide Chemistry from Macquer (1758) to Mendeleeff (1891) as Revealed in the Textbooks and Other Literature of the Era

    ERIC Educational Resources Information Center

    de Berg, Kevin C.

    2008-01-01

    Eight chemistry textbooks written from 1758 to 1891 have been analyzed for the way they present the chemistry of the oxides of tin. This analysis gives insight into the foundation of a number of chemical ideas such as nomenclature and composition used in modern chemistry. Four major preparation techniques for the production of tin oxides emerge…

  5. Civil and mechanical engineering applications of sensitivity analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komkov, V.

    1985-07-01

    In this largely tutorial presentation, the historical development of optimization theories has been outlined as they applied to mechanical and civil engineering designs and the development of modern sensitivity techniques during the last 20 years has been traced. Some of the difficulties and the progress made in overcoming them have been outlined. Some of the recently developed theoretical methods have been stressed to indicate their importance to computer-aided design technology.

  6. Combining Feature Extraction Methods to Assist the Diagnosis of Alzheimer's Disease.

    PubMed

    Segovia, F; Górriz, J M; Ramírez, J; Phillips, C

    2016-01-01

    Neuroimaging data such as (18)F-FDG PET are widely used to assist the diagnosis of Alzheimer's disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems allow new ROIs to be determined and take advantage of the huge amount of information comprised in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyse a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of the variance (such as principal component analysis), on the factorization of the data (such as non-negative matrix factorization) and on classical magnitudes (such as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different combination approaches: i) using a single classifier and a multiple kernel learning approach and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database) of the selected feature extraction methods.
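
    A minimal sketch of the ensemble-with-majority-voting route is given below, assuming scikit-learn and synthetic stand-in data; the SVC classifiers, the raw-intensity pipeline substituting for the Haralick texture features, and all dimensions are illustrative choices rather than the configuration used in the study.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA, NMF
    from sklearn.ensemble import VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # X: (n_subjects, n_voxels) flattened, non-negative PET intensities; y: labels.
    rng = np.random.default_rng(0)
    X = rng.random((40, 500))
    y = rng.integers(0, 2, 40)

    # One classifier per feature-extraction route, combined by majority voting.
    pca_clf = make_pipeline(PCA(n_components=10), StandardScaler(), SVC())
    nmf_clf = make_pipeline(NMF(n_components=10, max_iter=500), StandardScaler(), SVC())
    raw_clf = make_pipeline(StandardScaler(), SVC())   # stand-in for a texture-feature route

    ensemble = VotingClassifier(
        estimators=[("pca", pca_clf), ("nmf", nmf_clf), ("raw", raw_clf)],
        voting="hard",                                  # majority vote across the three decisions
    )
    print(cross_val_score(ensemble, X, y, cv=5).mean())
    ```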

  7. Wide-field optical mapping of neural activity and brain haemodynamics: considerations and novel approaches

    PubMed Central

    Ma, Ying; Shaik, Mohammed A.; Kozberg, Mariel G.; Thibodeaux, David N.; Zhao, Hanzhi T.; Yu, Hang

    2016-01-01

    Although modern techniques such as two-photon microscopy can now provide cellular-level three-dimensional imaging of the intact living brain, the speed and fields of view of these techniques remain limited. Conversely, two-dimensional wide-field optical mapping (WFOM), a simpler technique that uses a camera to observe large areas of the exposed cortex under visible light, can detect changes in both neural activity and haemodynamics at very high speeds. Although WFOM may not provide single-neuron or capillary-level resolution, it is an attractive and accessible approach to imaging large areas of the brain in awake, behaving mammals at speeds fast enough to observe widespread neural firing events, as well as their dynamic coupling to haemodynamics. Although such wide-field optical imaging techniques have a long history, the advent of genetically encoded fluorophores that can report neural activity with high sensitivity, as well as modern technologies such as light emitting diodes and sensitive and high-speed digital cameras have driven renewed interest in WFOM. To facilitate the wider adoption and standardization of WFOM approaches for neuroscience and neurovascular coupling research, we provide here an overview of the basic principles of WFOM, considerations for implementation of wide-field fluorescence imaging of neural activity, spectroscopic analysis and interpretation of results. This article is part of the themed issue ‘Interpreting BOLD: a dialogue between cognitive and cellular neuroscience’. PMID:27574312

  8. Vibration-Based Method Developed to Detect Cracks in Rotors During Acceleration Through Resonance

    NASA Technical Reports Server (NTRS)

    Sawicki, Jerzy T.; Baaklini, George Y.; Gyekenyesi, Andrew L.

    2004-01-01

    In recent years, there has been an increasing interest in developing rotating machinery shaft crack-detection methodologies and online techniques. Shaft crack problems present a significant safety and loss hazard in nearly every application of modern turbomachinery. In many cases, the rotors of modern machines are rapidly accelerated from rest to operating speed, to reduce the excessive vibrations at the critical speeds. The vibration monitoring during startup or shutdown has been receiving growing attention (ref. 1), especially for machines such as aircraft engines, which are subjected to frequent starts and stops, as well as high speeds and acceleration rates. It has been recognized that the presence of angular acceleration strongly affects the rotor's maximum response to unbalance and the speed at which it occurs. Unfortunately, conventional nondestructive evaluation (NDE) methods have unacceptable limits in terms of their application for online crack detection. Some of these techniques are time consuming and inconvenient for turbomachinery service testing. Almost all of these techniques require that the vicinity of the damage be known in advance, and they can provide only local information, with no indication of the structural strength at a component or system level. In addition, the effectiveness of these experimental techniques is affected by the high measurement noise levels existing in complex turbomachine structures. Therefore, the use of vibration monitoring along with vibration analysis has been receiving increasing attention.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spong, D.A.

    The design techniques and physics analysis of modern stellarator configurations for magnetic fusion research rely heavily on high performance computing and simulation. Stellarators, which are fundamentally 3-dimensional in nature, offer significantly more design flexibility than more symmetric devices such as the tokamak. By varying the outer boundary shape of the plasma, a variety of physics features, such as transport, stability, and heating efficiency can be optimized. Scientific visualization techniques are an important adjunct to this effort as they provide a necessary ergonomic link between the numerical results and the intuition of the human researcher. The authors have developed a variety of visualization techniques for stellarators which both facilitate the design optimization process and allow the physics simulations to be more readily understood.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamışlıoğlu, Miraç, E-mail: m.kamislioglu@gmail.com; Külahcı, Fatih, E-mail: fatihkulahci@firat.edu.tr

    Nonlinear time series analysis techniques have wide application in the geoscience and geophysics fields, and modern nonlinear methods provide considerable evidence for explaining seismicity phenomena. In this study, nonlinear time series analysis, fractal analysis and spectral analysis were carried out to investigate the chaotic behavior of radon gas (222Rn) concentrations released during seismic events. Nonlinear time series methods (Lyapunov exponent, Hurst phenomenon, correlation dimension and false nearest neighbours) were applied to the East Anatolian Fault Zone (EAFZ), Turkey, and its surroundings, with about 35,136 radon measurements for each region. The behavior of 222Rn, which is used in earthquake prediction studies, was thus investigated.
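
    Of the nonlinear measures listed, the Hurst exponent is the simplest to sketch. The following rescaled-range (R/S) estimator is a generic illustration on synthetic data, not the processing chain applied to the EAFZ radon series, and the window sizes are arbitrary.

    ```python
    import numpy as np

    def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
        """Estimate the Hurst exponent of a 1-D series by rescaled-range (R/S) analysis."""
        x = np.asarray(x, dtype=float)
        rs_means = []
        for n in window_sizes:
            rs_vals = []
            for start in range(0, len(x) - n + 1, n):
                seg = x[start:start + n]
                dev = np.cumsum(seg - seg.mean())       # cumulative deviation from the mean
                r = dev.max() - dev.min()               # range of the cumulative deviation
                s = seg.std()
                if s > 0:
                    rs_vals.append(r / s)
            rs_means.append(np.mean(rs_vals))
        # log(R/S) grows roughly as H * log(n); the slope of the fit is the Hurst exponent.
        slope, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
        return slope

    # Uncorrelated noise should give H close to 0.5; persistent series give H > 0.5.
    print(hurst_rs(np.random.default_rng(1).standard_normal(4096)))
    ```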

  11. Fluorescence analysis of ubiquinone and its application in quality control of medical supplies

    NASA Astrophysics Data System (ADS)

    Timofeeva, Elvira O.; Gorbunova, Elena V.; Chertov, Aleksandr N.

    2017-02-01

    Antioxidant-related disorders such as redox potential imbalance in the human body are an important question for modern clinical diagnostics. Applying fluorescence analysis to the optical diagnostics of ubiquinone, an antioxidant widely distributed in the human body, is one step towards developing a device for clinical diagnostics of redox potential. Fluorescence was recorded with a spectrometer using a narrow-band UV irradiation source (maxima at 287 and 330 nm) as the excitation radiation. Ubiquinone concentrations from 0.25 to 2.5 mmol/l in the explored samples were used for the investigation. The recorded data were processed using correlation analysis and a differential analytical technique. The fourth derivative of the fluorescence spectrum provided the basis for a multicomponent analysis of the solutions. As a technique for clinical diagnostics, fluorescence analysis with a processing method that includes differential spectrophotometry is a step forward towards redox potential calculation and towards quality control in pharmacy for better health care.
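
    A fourth-derivative spectrum of the kind used here can be obtained with a smoothing differentiator such as a Savitzky-Golay filter. The sketch below uses SciPy on a synthetic two-band spectrum; the window length and polynomial order are illustrative tuning choices, not the parameters of the reported method.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter

    # Wavelengths (nm) and fluorescence intensities of a hypothetical two-band spectrum.
    wl = np.linspace(350, 600, 1000)
    spectrum = np.exp(-((wl - 450) / 15) ** 2) + 0.5 * np.exp(-((wl - 480) / 10) ** 2)

    # Smoothed fourth derivative: extrema sharpen and separate the overlapping bands,
    # which is what makes multicomponent analysis of mixed solutions possible.
    step = wl[1] - wl[0]
    d4 = savgol_filter(spectrum, window_length=31, polyorder=5, deriv=4, delta=step)
    print(wl[np.argmax(np.abs(d4))])   # wavelength of the strongest derivative feature
    ```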

  12. On three-dimensional misorientation spaces

    PubMed Central

    Bennett, Robbie J.; Vukmanovic, Zoja; Solano-Alvarez, Wilberth; Lainé, Steven J.; Einsle, Joshua F.; Midgley, Paul A.; Rae, Catherine M. F.; Hielscher, Ralf

    2017-01-01

    Determining the local orientation of crystals in engineering and geological materials has become routine with the advent of modern crystallographic mapping techniques. These techniques enable many thousands of orientation measurements to be made, directing attention towards how such orientation data are best studied. Here, we provide a guide to the visualization of misorientation data in three-dimensional vector spaces, reduced by crystal symmetry, to reveal crystallographic orientation relationships. Domains for all point group symmetries are presented and an analysis methodology is developed and applied to identify crystallographic relationships, indicated by clusters in the misorientation space, in examples from materials science and geology. This analysis aids the determination of active deformation mechanisms and evaluation of cluster centres and spread enables more accurate description of transformation processes supporting arguments regarding provenance. PMID:29118660
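
    The misorientation underlying such analyses reduces to a small calculation: for two orientation matrices g1 and g2, the misorientation is g2 · S · g1^T minimised over the crystal symmetry operators S. The sketch below, with a hypothetical function name and the identity as the default symmetry set, illustrates that calculation rather than the full three-dimensional vector-space treatment of the paper.

    ```python
    import numpy as np

    def misorientation_angle(g1, g2, sym_ops=(np.eye(3),)):
        """Minimum rotation angle (degrees) taking orientation g1 to g2.

        g1, g2 are 3x3 rotation matrices.  sym_ops is an iterable of the crystal
        point-group rotation operators; passing only the identity gives the raw
        (symmetry-unreduced) misorientation angle.
        """
        best = 180.0
        for s in sym_ops:
            m = g2 @ s @ g1.T                                  # candidate misorientation matrix
            cos_theta = np.clip((np.trace(m) - 1.0) / 2.0, -1.0, 1.0)
            best = min(best, np.degrees(np.arccos(cos_theta)))
        return best

    # Example: two orientations differing by a 10-degree rotation about z.
    c, s = np.cos(np.radians(10)), np.sin(np.radians(10))
    g1 = np.eye(3)
    g2 = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
    print(misorientation_angle(g1, g2))   # ~10 degrees
    ```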

  13. Light Water Reactor Sustainability Program A Reference Plan for Control Room Modernization: Planning and Analysis Phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques Hugo; Ronald Boring; Lew Hanes

    2013-09-01

    The U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) program is collaborating with a U.S. nuclear utility to bring about a systematic fleet-wide control room modernization. To facilitate this upgrade, a new distributed control system (DCS) is being introduced into the control rooms of these plants. The DCS will upgrade the legacy plant process computer and emergency response facility information system. In addition, the DCS will replace an existing analog turbine control system with a display-based system. With technology upgrades comes the opportunity to improve the overall human-system interaction between the operators and the control room. To optimize operator performance, the LWRS Control Room Modernization research team followed a human-centered approach published by the U.S. Nuclear Regulatory Commission. NUREG-0711, Rev. 3, Human Factors Engineering Program Review Model (O’Hara et al., 2012), prescribes four phases for human factors engineering. This report provides examples of the first phase, Planning and Analysis. The three elements of Planning and Analysis in NUREG-0711 that are most crucial to initiating control room upgrades are: • Operating Experience Review: Identifies opportunities for improvement in the existing system and provides lessons learned from implemented systems. • Function Analysis and Allocation: Identifies which functions at the plant may be optimally handled by the DCS vs. the operators. • Task Analysis: Identifies how tasks might be optimized for the operators. Each of these elements is covered in a separate chapter. Examples are drawn from workshops with reactor operators that were conducted at the LWRS Human System Simulation Laboratory (HSSL) and at the respective plants. The findings in this report represent generalized accounts of more detailed proprietary reports produced for the utility for each plant. The goal of this LWRS report is to disseminate the technique and provide examples sufficient to serve as a template for other utilities’ projects for control room modernization.

  14. Feasibility of Tactical Air Delivery Resupply Using Gliders

    DTIC Science & Technology

    2016-12-01

    using modern design and manufacturing techniques including AutoCAD, 3D printing, laser cutting and CorelDraw, and conducting field testing and...Sparrow,” ...the desired point(s) of impact due to the atmospheric three-dimensional (3D) wind and density field encountered by the descending load under canopy

  15. Modern Education in China. Bulletin, 1919, No. 44

    ERIC Educational Resources Information Center

    Edmunds, Charles K.

    1919-01-01

    The Chinese conception of life's values is so different from that of western peoples that they have failed to develop modern technique and scientific knowledge. Now that they have come to see the value of these, rapid and fundamental changes are taking place. When modern scientific knowledge is added to the skill which the Chinese already have in…

  16. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  17. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; ...

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  18. A profile of the demographics and training characteristics of professional modern dancers.

    PubMed

    Weiss, David S; Shah, Selina; Burchette, Raoul J

    2008-01-01

    Modern dancers are a unique group of artists, performing a diverse repertoire in dance companies of various sizes. In this study, 184 professional modern dancers in the United States (males N=49, females N=135), including members of large and small companies as well as freelance dancers, were surveyed regarding their demographics and training characteristics. The mean age of the dancers was 30.1 +/- 7.3 years, and they had danced professionally for 8.9 +/- 7.2 years. The average Body Mass Index (BMI) was 23.6 +/- 2.4 for males and 20.5 +/- 1.7 for females. Females had started taking dance class earlier (age 6.5 +/- 4.2 years) as compared to males (age 15.6 +/- 6.2 years). Females were more likely to have begun their training in ballet, while males more often began with modern classes (55% and 51% respectively, p < 0.0001). The professional modern dancers surveyed spent 8.3 +/- 6.0 hours in class and 17.2 +/- 12.6 hours in rehearsal each week. Eighty percent took modern technique class and 67% reported that they took ballet technique class. The dancers who specified what modern technique they studied (N=84) reported between two and four different techniques. The dancers also participated in a multitude of additional exercise regimens for a total of 8.2 +/- 6.6 hours per week, with the most common types being Pilates, yoga, and upper body weightlifting. The dancers wore many different types of footwear, depending on the style of dance being performed. For modern dance alone, dancers wore 12 different types of footwear. Reflecting the diversity of the dancers and companies surveyed, females reported performing for 23.3 +/- 14.0 weeks (range: 2-52 weeks) per year; males reported performing 20.4 +/- 13.9 weeks (range: 1-40) per year. Only 18% of the dancers did not have any health insurance, with 54% having some type of insurance provided by their employer. However, 23% of the dancers purchased their own insurance, and 22% had insurance provided by their families. Only 16% of dancers reported that they had Workers' Compensation coverage, despite the fact that they were all professionals, including many employed by major modern dance companies across the United States. It is concluded that understanding the training profile of the professional modern dancer should assist healthcare providers in supplying appropriate medical care for these performers.

  19. Looking ahead in systems engineering

    NASA Technical Reports Server (NTRS)

    Feigenbaum, Donald S.

    1966-01-01

    Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.

  20. Hyperspectral imaging coupled with chemometric analysis for non-invasive differentiation of black pens

    NASA Astrophysics Data System (ADS)

    Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna

    2016-11-01

    Differentiation of the written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet were registered. The spectral characteristics embodied in every pixel were extracted from an image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups, in a non-invasive manner.
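
    Of the three statistical methods mentioned, the spectral angle mapper is the most compact to illustrate: each pixel spectrum is compared with reference ink spectra by the angle between them and assigned to the closest reference. The sketch below is a generic, unoptimised illustration with hypothetical array shapes and threshold, not the workflow used on the 45-ink dataset.

    ```python
    import numpy as np

    def spectral_angle(pixel, reference):
        """Spectral angle (radians) between a pixel spectrum and a reference spectrum."""
        cos_a = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
        return np.arccos(np.clip(cos_a, -1.0, 1.0))

    def classify_cube(cube, references, max_angle=0.1):
        """Assign each pixel of an (h, w, bands) hypercube to the closest reference ink.

        Returns an (h, w) label map; -1 marks pixels whose best angle exceeds max_angle.
        """
        h, w, _ = cube.shape
        labels = np.full((h, w), -1)
        for i in range(h):
            for j in range(w):
                angles = [spectral_angle(cube[i, j], r) for r in references]
                k = int(np.argmin(angles))
                if angles[k] <= max_angle:
                    labels[i, j] = k
        return labels
    ```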

  1. AMS-14C analysis of modern teeth: A comparison between two sample preparation techniques

    NASA Astrophysics Data System (ADS)

    Solis, C.; Solis-Meza, E.; Morales, M. E.; Rodriguez-Ceja, M.; Martínez-Carrillo, M. A.; Garcia-Calderon, D.; Huerta, A.; Chávez, E.

    2017-09-01

    AMS-14C analysis of modern teeth has become important for forensic studies. The 14C content in human teeth reflects the atmospheric 14C concentration at the time of their formation and allows the actual year of birth to be calculated. Through AMS, it is possible to measure 14C concentrations in a tissue with high precision. However, there is a debate about which fraction is best for teeth carbon dating: collagen or enamel. This work focuses on results obtained from enamel and collagen extracted from Mexican individuals in order to compare them. Collagen from dental pieces donated by people older than 60 years has been included to understand the turnover process and the usefulness of collagen for determining the date of birth. Our results indicate that when a single dental piece is available, the enamel method allows determination of the tooth formation date. Dating collagen from the same tooth helps to discriminate whether the formation date belongs to the left or the right side of the bomb peak, and also corroborates the ages obtained through enamel analysis.
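
    The underlying dating step is an inversion of the atmospheric bomb curve: the measured F14C of enamel or collagen is matched against atmospheric F14C by year, which generally yields one candidate year on the rising limb and one on the falling limb. The sketch below uses invented, purely illustrative curve values; a real analysis would interpolate a published atmospheric compilation.

    ```python
    import numpy as np

    # Illustrative (not real) atmospheric F14C values by year.
    years = np.array([1955, 1960, 1963, 1970, 1980, 1990, 2000, 2010])
    f14c = np.array([1.00, 1.20, 1.90, 1.55, 1.30, 1.15, 1.08, 1.04])

    def candidate_formation_years(sample_f14c):
        """Return the years where the bomb curve crosses the measured F14C value.

        The curve rises then falls, so a single measurement generally yields two
        candidates (one per limb); a second fraction or a second tooth is used to
        decide which side of the peak applies.
        """
        hits = []
        for i in range(len(years) - 1):
            lo, hi = sorted((f14c[i], f14c[i + 1]))
            if lo <= sample_f14c <= hi:
                # linear interpolation within the bracketing segment
                t = (sample_f14c - f14c[i]) / (f14c[i + 1] - f14c[i])
                hits.append(years[i] + t * (years[i + 1] - years[i]))
        return hits

    print(candidate_formation_years(1.25))   # one candidate year on each limb
    ```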

  2. Advances in numerical and applied mathematics

    NASA Technical Reports Server (NTRS)

    South, J. C., Jr. (Editor); Hussaini, M. Y. (Editor)

    1986-01-01

    This collection of papers covers some recent developments in numerical analysis and computational fluid dynamics. Some of these studies are of a fundamental nature. They address basic issues such as intermediate boundary conditions for approximate factorization schemes, existence and uniqueness of steady states for time dependent problems, and pitfalls of implicit time stepping. The other studies deal with modern numerical methods such as total variation diminishing schemes, higher order variants of vortex and particle methods, spectral multidomain techniques, and front tracking techniques. There is also a paper on adaptive grids. The fluid dynamics papers treat the classical problems of incompressible flows in helically coiled pipes, vortex breakdown, and transonic flows.

  3. Computer vision applications for coronagraphic optical alignment and image processing.

    PubMed

    Savransky, Dmitry; Thomas, Sandrine J; Poyneer, Lisa A; Macintosh, Bruce A

    2013-05-10

    Modern coronagraphic systems require very precise alignment between optical components and can benefit greatly from automated image processing. We discuss three techniques commonly employed in the fields of computer vision and image analysis as applied to the Gemini Planet Imager, a new facility instrument for the Gemini South Observatory. We describe how feature extraction and clustering methods can be used to aid in automated system alignment tasks, and also present a search algorithm for finding regular features in science images used for calibration and data processing. Along with discussions of each technique, we present our specific implementation and show results of each one in operation.
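
    As a toy illustration of the feature-extraction-plus-clustering idea (not the Gemini Planet Imager pipeline itself), the sketch below detects bright calibration-like spots as thresholded local maxima and groups their coordinates with k-means; the image, threshold and cluster count are synthetic placeholders.

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from sklearn.cluster import KMeans

    def find_spots(img, size=5, threshold=0.5):
        """Locate bright spots (e.g. calibration or satellite spots) as local maxima."""
        peaks = (img == ndi.maximum_filter(img, size)) & (img > threshold)
        return np.column_stack(np.nonzero(peaks))        # (row, col) coordinates

    # Synthetic frame with four bright spots on a dim background.
    rng = np.random.default_rng(0)
    img = rng.random((128, 128)) * 0.3
    for r, c in [(20, 20), (20, 100), (100, 20), (100, 100)]:
        img[r, c] = 1.0

    coords = find_spots(img)
    # Cluster the spot coordinates into the expected number of groups, e.g. to
    # track a regular grid of calibration features during alignment.
    centers = KMeans(n_clusters=4, n_init=10).fit(coords).cluster_centers_
    print(centers)
    ```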

  4. The design of a turboshaft speed governor using modern control techniques

    NASA Technical Reports Server (NTRS)

    Delosreyes, G.; Gouchoe, D. R.

    1986-01-01

    The objectives of this program were: to verify the model of off schedule compressor variable geometry in the T700 turboshaft engine nonlinear model; to evaluate the use of the pseudo-random binary noise (PRBN) technique for obtaining engine frequency response data; and to design a high performance power turbine speed governor using modern control methods. Reduction of T700 engine test data generated at NASA-Lewis indicated that the off schedule variable geometry effects were accurate as modeled. Analysis also showed that the PRBN technique combined with the maximum likelihood model identification method produced a Bode frequency response that was as accurate as the response obtained from standard sinewave testing methods. The frequency response verified the accuracy of linear models consisting of engine partial derivatives and used for design. A power turbine governor was designed using the Linear Quadratic Regulator (LQR) method of full state feedback control. A Kalman filter observer was used to estimate helicopter main rotor blade velocity. Compared to the baseline T700 power turbine speed governor, the LQR governor reduced droop up to 25 percent for a 490 shaft horsepower transient in 0.1 sec simulating a wind gust, and up to 85 percent for a 700 shaft horsepower transient in 0.5 sec simulating a large collective pitch angle transient.
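
    A minimal sketch of the LQR design step is shown below for a hypothetical two-state linearised model, using SciPy's continuous algebraic Riccati solver; the matrices and weights are illustrative stand-ins, not the T700 partial-derivative model or the gains reported in the study, and the Kalman-filter observer for rotor-blade velocity is omitted.

    ```python
    import numpy as np
    from scipy.linalg import solve_continuous_are

    def lqr(A, B, Q, R):
        """Continuous-time LQR gain: u = -K x minimises the quadratic state/effort cost."""
        P = solve_continuous_are(A, B, Q, R)
        K = np.linalg.solve(R, B.T @ P)
        return K

    # Hypothetical 2-state linearised model (e.g. power-turbine speed and load-path state).
    A = np.array([[-2.0, 1.0],
                  [0.5, -1.0]])
    B = np.array([[1.0],
                  [0.0]])
    Q = np.diag([10.0, 1.0])     # penalise speed droop most heavily
    R = np.array([[0.1]])        # penalise fuel-flow control effort

    K = lqr(A, B, Q, R)
    print(K)                     # full-state feedback gain
    ```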

  5. [Aerobic methylobacteria as promising objects of modern biotechnology].

    PubMed

    Doronina, N V; Toronskava, L; Fedorov, D N; Trotsenko, Yu A

    2015-01-01

    The experimental data of the past decade concerning the metabolic peculiarities of aerobic methylobacteria and the prospects for their use in different fields of modern biotechnology, including genetic engineering techniques, have been summarized.

  6. Compilation of Abstracts of Theses Submitted by Candidates for Degrees

    DTIC Science & Technology

    1987-09-30

    Parallel, Multiple Backend Database Systems Feudo, C.V. Modern Hardware Technologies 88 MAJ, USA and Software Techniques for Online Database Storage...and its Application in the Wargaming, Research and Analysis (W.A.R.) Lab Waltenserger, G.M. On Limited War, Escalation 524 CPT, USRF Control, and...TECHNIQUES FOR ONLINE DATABASE STORAGE AND ACCESS Christopher V. Feudo Major, United States Army B.S., United States Military Academy, 1972

  7. Role of U.S. Security Assistance in Modernizing the Portuguese Armed Forces: A Historical Analysis.

    DTIC Science & Technology

    1986-09-01

    Portuguese Air Force Fiscal Year 1986 IMET/FMS Training Program Security Assistance Management Manual * "Portuguese Navy: A Naval Fleet that is...of the techniques of fiscal management and, within the limits that he had set for the regime, his program of economic recovery succeeded.... What...1984). Currency: Escudo * Agriculture: generally developed; 8.8% of GDP; main crops - grains, potatoes, olives, grapes (wine); deficit foods - sugar

  8. Chemical signatures of fossilized resins and recent plant exudates.

    PubMed

    Lambert, Joseph B; Santiago-Blay, Jorge A; Anderson, Ken B

    2008-01-01

    Amber is one of the few gemstones based on an organic structure. Found over most of the world, it is the fossil form of sticky plant exudates called resins. Investigation of amber by modern analytical techniques provides structural information and insight into the identity of the ancient plants that produced the source resin. Mass spectrometric analysis of materials separated by gas chromatography has identified specific compounds that are the basis of a reliable classification of the different types of amber. NMR spectroscopy of bulk, solid amber provides a complementary classification. NMR spectroscopy also can be used to characterize modern resins as well as other types of plant exudates such as gums, gum resins, and kinos, which strongly resemble resins in appearance but have very different molecular constitutions.

  9. Decomposition techniques

    USGS Publications Warehouse

    Chao, T.T.; Sanzolone, R.F.

    1992-01-01

    Sample decomposition is a fundamental and integral step in the procedure of geochemical analysis. It is often the limiting factor to sample throughput, especially with the recent application of the fast and modern multi-element measurement instrumentation. The complexity of geological materials makes it necessary to choose the sample decomposition technique that is compatible with the specific objective of the analysis. When selecting a decomposition technique, consideration should be given to the chemical and mineralogical characteristics of the sample, elements to be determined, precision and accuracy requirements, sample throughput, technical capability of personnel, and time constraints. This paper addresses these concerns and discusses the attributes and limitations of many techniques of sample decomposition along with examples of their application to geochemical analysis. The chemical properties of reagents as to their function as decomposition agents are also reviewed. The section on acid dissolution techniques addresses the various inorganic acids that are used individually or in combination in both open and closed systems. Fluxes used in sample fusion are discussed. The promising microwave-oven technology and the emerging field of automation are also examined. A section on applications highlights the use of decomposition techniques for the determination of Au, platinum group elements (PGEs), Hg, U, hydride-forming elements, rare earth elements (REEs), and multi-elements in geological materials. Partial dissolution techniques used for geochemical exploration, which have been treated in detail elsewhere, are not discussed here; nor are fire-assaying for noble metals and decomposition techniques for X-ray fluorescence or nuclear methods. © 1992.

  10. 20th International Conference for Students and Young Scientists: Modern Techniques and Technologies (MTT'2014)

    NASA Astrophysics Data System (ADS)

    2014-10-01

    The active involvement of young researchers in scientific processes and the acquisition of scientific experience by gifted youth currently have great value for the development of science. One of the research activities of National Research Tomsk Polytechnic University aimed at preparing and forming the next generation of scientists is the International Conference of Students and Young Scientists "Modern Techniques and Technologies", which was held in 2014 for the twentieth time. Great experience in the organization of scientific events has been acquired through the years of holding the conference. All the necessary resources are available: a team of organizers - employees of Tomsk Polytechnic University, premises provided with modern office and demonstration equipment, and leading scientists - professors of TPU - as well as the status of the university as a leading research university in Russia. In this way the conference is able to attract leading international scientists for collaboration. Over the previous years the conference has proved itself to be a major scientific event at the international level, attracting more than 600 students and young scientists from Russia, the CIS and other countries. The conference provides oral plenary and section reports. The conference is organized around lectures, in which leading Russian and foreign scientists deliver plenary presentations to young audiences. An important indicator of this scientific event is the breadth of the scientific fields covered: energy, heat and power, instrument making, engineering, systems and devices for medical purposes, electromechanics, material science, computer science and control in technical systems, nanotechnologies and nanomaterials, physical methods in science and technology, control and quality management, and design and technology of artistic materials processing. The main issues considered by young researchers at the conference were related to the analysis of contemporary problems using new techniques and the application of new technologies.

  11. [THE TECHNOLOGY "CELL BLOCK" IN CYTOLOGICAL PRACTICE].

    PubMed

    Volchenko, N N; Borisova, O V; Baranova, I B

    2015-08-01

    The article presents summary information concerning the application of "cell block" technology in cytological practice. The possibilities of implementing various modern techniques (immunocytochemical analysis, FISH, CISH, polymerase chain reaction) with the "cell block" method are demonstrated. The original results of a study of "cell block" technology performed with gelatin, AgarCyto and the Shandon Cytoblock set are presented. The diagnostic effectiveness of "cell block" technology was compared with that of the common cytological smear, and immunocytochemical analysis on "cell block" samples was compared with fluid cytology. Application of "cell block" technology is necessary for ensuring the preservation of cell elements for subsequent immunocytochemical and molecular genetic analysis.

  12. Impact of the macroeconomic factors on university budgeting in the US and Russia

    NASA Astrophysics Data System (ADS)

    Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly

    2017-10-01

    This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in the areas of data science and machine learning have made it possible to utilise automated techniques to address several problems of humankind, ranging from genetic engineering and particle physics to sociology and economics. This paper is the first step towards creating a robust toolkit which will help universities withstand macroeconomic challenges utilising modern predictive analytics techniques.

  13. Image analysis and machine learning for detecting malaria.

    PubMed

    Poostchi, Mahdieh; Silamut, Kamolrat; Maude, Richard J; Jaeger, Stefan; Thoma, George

    2018-04-01

    Malaria remains a major burden on global health, with roughly 200 million cases worldwide and more than 400,000 deaths per year. Besides biomedical research and political efforts, modern information technology is playing a key role in many attempts at fighting the disease. One of the barriers toward a successful mortality reduction has been inadequate malaria diagnosis in particular. To improve diagnosis, image analysis software and machine learning methods have been used to quantify parasitemia in microscopic blood slides. This article gives an overview of these techniques and discusses the current developments in image analysis and machine learning for microscopic malaria diagnosis. We organize the different approaches published in the literature according to the techniques used for imaging, image preprocessing, parasite detection and cell segmentation, feature computation, and automatic cell classification. Readers will find the different techniques listed in tables, with the relevant articles cited next to them, for both thin and thick blood smear images. We also discuss the latest developments in sections devoted to deep learning and smartphone technology for future malaria diagnosis. Published by Elsevier Inc.
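
    The reviewed pipeline (preprocessing, cell segmentation, feature computation, classification) can be caricatured in a few lines. The sketch below uses a crude global threshold and toy per-cell features on a synthetic image, purely to show where each stage sits; the resulting feature matrix would then be passed to a classifier such as a random forest or, in the deep-learning approaches surveyed, a CNN.

    ```python
    import numpy as np
    from scipy import ndimage as ndi

    def segment_cells(gray, threshold=0.5, min_size=50):
        """Very rough cell segmentation: global threshold plus connected components."""
        mask = gray < threshold                    # stained cells are darker than background
        labels, n = ndi.label(mask)
        return [gray[labels == lab] for lab in range(1, n + 1)
                if np.count_nonzero(labels == lab) >= min_size]

    def cell_features(pixels):
        """Toy per-cell features; real systems add texture, colour and morphology."""
        return [pixels.size, pixels.mean(), pixels.std(), pixels.min()]

    # Stand-in for a normalised thin-smear image; X is the per-cell feature matrix
    # that a downstream classifier would consume.
    image = np.random.default_rng(0).random((200, 200))
    X = np.array([cell_features(p) for p in segment_cells(image)])
    print(X.shape)
    ```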

  14. Challenge Paper: Validation of Forensic Techniques for Criminal Prosecution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert F.; Endicott-Popovsky, Barbara E.; Frincke, Deborah A.

    2007-04-10

    Abstract: As in many domains, there is increasing agreement in the user and research community that digital forensics analysts would benefit from the extension, development and application of advanced techniques in performing large scale and heterogeneous data analysis. Modern digital forensics analysis of cyber-crimes and cyber-enabled crimes often requires scrutiny of massive amounts of data. For example, a case involving network compromise across multiple enterprises might require forensic analysis of numerous sets of network logs and computer hard drives, potentially involving hundreds of gigabytes of heterogeneous data, or even terabytes or petabytes of data. Also, the goal for forensic analysis is to not only determine whether the illicit activity being considered is taking place, but also to identify the source of the activity and the full extent of the compromise or impact on the local network. Even after this analysis, there remains the challenge of using the results in subsequent criminal and civil processes.

  15. Comparison of extraction techniques of robenidine from poultry feed samples.

    PubMed

    Wilga, Joanna; Kot-Wasik, Agata; Namieśnik, Jacek

    2007-10-31

    In this paper, the effectiveness of six different commonly applied extraction techniques for the determination of robenidine in poultry feed has been compared. The sample preparation techniques included shaking, Soxhlet, Soxtec, ultrasonically assisted extraction, microwave-assisted extraction and accelerated solvent extraction. Comparison of these techniques was done with respect to recovery, extraction temperature and time, reproducibility and solvent consumption. Every single extract was subjected to clean-up using an aluminium oxide column (a Pasteur pipette filled with 1 g of aluminium oxide), from which robenidine was eluted with 10 ml of methanol. The eluate from the clean-up column was collected in a volumetric flask, and finally it was analysed by HPLC-DAD-MS. In general, all extraction techniques were capable of isolating robenidine from poultry feed, but the recovery obtained using modern extraction techniques was higher than that obtained using conventional techniques. In particular, accelerated solvent extraction was superior to the other techniques, which highlights the advantages of this sample preparation technique. However, in routine analysis, shaking and ultrasonically assisted extraction are still the preferred methods for the isolation of robenidine and other coccidiostats.

  16. Nanoscale infrared spectroscopy as a non-destructive probe of extraterrestrial samples.

    PubMed

    Dominguez, Gerardo; Mcleod, A S; Gainsforth, Zack; Kelly, P; Bechtel, Hans A; Keilmann, Fritz; Westphal, Andrew; Thiemens, Mark; Basov, D N

    2014-12-09

    Advances in the spatial resolution of modern analytical techniques have tremendously augmented the scientific insight gained from the analysis of natural samples. Yet, while techniques for the elemental and structural characterization of samples have achieved sub-nanometre spatial resolution, infrared spectral mapping of geochemical samples at vibrational 'fingerprint' wavelengths has remained restricted to spatial scales >10 μm. Nevertheless, infrared spectroscopy remains an invaluable contactless probe of chemical structure, details of which offer clues to the formation history of minerals. Here we report on the successful implementation of infrared near-field imaging, spectroscopy and analysis techniques capable of sub-micron scale mineral identification within natural samples, including a chondrule from the Murchison meteorite and a cometary dust grain (Iris) from NASA's Stardust mission. Complementary to scanning electron microscopy, energy-dispersive X-ray spectroscopy and transmission electron microscopy probes, this work evidences a similarity between chondritic and cometary materials, and inaugurates a new era of infrared nano-spectroscopy applied to small and invaluable extraterrestrial samples.

  17. Understanding PGM-free Catalysts by Linking Density Functional Theory Calculations and Structural Analysis: Perspectives and Challenges

    DOE PAGES

    Gonzales, Ivana; Artyushkova, Kateryna; Atanassov, Plamen

    2018-03-13

    Here, we discuss perspectives and challenges in applying density functional theory for the calculation of spectroscopic properties of platinum group metal (PGM)-free electrocatalysts for oxygen reduction. More specifically, we discuss recent advances in the density functional theory calculations of core-level shifts in binding energies of N 1s electrons as measured by X-ray photoelectron spectroscopy. The link between the density functional theory calculations, the electrocatalytic performance of the catalysts, and structural analysis using modern spectroscopic techniques is expected to significantly increase our understanding of PGM-free catalysts at the molecular level.

  18. Arthrodesis following failed total knee arthroplasty: comprehensive review and meta-analysis of recent literature.

    PubMed

    Damron, T A; McBeath, A A

    1995-04-01

    With the increasing duration of follow up on total knee arthroplasties, more revision arthroplasties are being performed. When revision is not advisable, a salvage procedure such as arthrodesis or resection arthroplasty is indicated. This article provides a comprehensive review of the literature regarding arthrodesis following failed total knee arthroplasty. In addition, a statistical meta-analysis of five studies using modern arthrodesis techniques is presented. A statistically significant greater fusion rate with intramedullary nail arthrodesis compared to external fixation is documented. Gram negative and mixed infections are found to be significant risk factors for failure of arthrodesis.

  19. Understanding PGM-free Catalysts by Linking Density Functional Theory Calculations and Structural Analysis: Perspectives and Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzales, Ivana; Artyushkova, Kateryna; Atanassov, Plamen

    Here, we discuss perspectives and challenges in applying density functional theory for the calculation of spectroscopic properties of platinum group metal (PGM)-free electrocatalysts for oxygen reduction. More specifically, we discuss recent advances in the density functional theory calculations of core-level shifts in binding energies of N 1s electrons as measured by X-ray photoelectron spectroscopy. The link between the density functional theory calculations, the electrocatalytic performance of the catalysts, and structural analysis using modern spectroscopic techniques is expected to significantly increase our understanding of PGM-free catalysts at the molecular level.

  20. Apollo management: A key to the solution of the social-economical dilemma - The transferability of space-travel managerial techniques to the civil sector

    NASA Technical Reports Server (NTRS)

    Puttkamer, J. V.

    1973-01-01

    An analysis has been conducted to find out whether the management techniques developed in connection with the Apollo project could be used for dealing with such urgent problems of modern society as the crisis of the cities, the increasing environmental pollution, and the steadily growing traffic. Basic concepts and definitions of program and system management are discussed together with details regarding the employment of these concepts in connection with the solution of the problems of the Apollo program. Principles and significance of a systems approach are considered, giving attention to planning, system analysis, system integration, and project management. An application of the methods of project management to the problems of the civil sector is possible if the special characteristics of each particular case are taken into account.

  1. [Application of Finite Element Method in Thoracolumbar Spine Traumatology].

    PubMed

    Zhang, Min; Qiu, Yong-gui; Shao, Yu; Gu, Xiao-feng; Zeng, Ming-wei

    2015-04-01

    The finite element method (FEM) is a mathematical technique that uses modern computer technology for stress analysis. It has gradually been adopted for simulating human body structures in the biomechanical field and is now widely used in research on thoracolumbar spine traumatology. This paper reviews the establishment and verification of thoracolumbar spine finite element models and the status of thoracolumbar spine FEM research in different fields, and discusses its prospects and value in forensic thoracolumbar traumatology.
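
    The stress-analysis core of the finite element method can be illustrated with a one-dimensional axially loaded bar, far simpler than any thoracolumbar model: assemble element stiffness matrices, apply boundary conditions and loads, solve for displacements, and recover strain and stress. All material and load values below are arbitrary illustrative numbers.

    ```python
    import numpy as np

    # Minimal 1-D finite-element sketch: an axially loaded elastic bar.
    # E (Pa), cross-section A (m^2), length L (m), end load F (N), number of elements.
    E, A, L, F, n_elem = 210e9, 1e-4, 1.0, 1000.0, 4
    le = L / n_elem
    k = E * A / le * np.array([[1, -1], [-1, 1]])     # element stiffness matrix

    K = np.zeros((n_elem + 1, n_elem + 1))            # assemble global stiffness
    for e in range(n_elem):
        K[e:e + 2, e:e + 2] += k

    f = np.zeros(n_elem + 1)
    f[-1] = F                                         # point load at the free end
    u = np.zeros(n_elem + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])         # node 0 is fixed (u = 0)

    strain = np.diff(u) / le
    stress = E * strain                               # uniform here: F / A = 10 MPa
    print(u, stress)
    ```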

  2. The influence of surface finishing methods on touch-sensitive reactions

    NASA Astrophysics Data System (ADS)

    Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.

    2017-02-01

    This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of non-traditional materials and finishing techniques are appearing. The present information-oriented society enhances the visual aesthetics of new jewelry forms, decoration techniques (depth and surface), and the synthesis of different materials, which together reveal a bias towards the positive effects of visual design. Today, the jewelry industry includes not only traditional techniques but also such improved techniques as computer-assisted design, 3D prototyping and other alternatives that raise the level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface), and a comparative analysis of different approaches to surface finishing. Assessment of the appearance or effect of jewelry is based on proposed evaluation criteria, on the premise that advanced visual aesthetics is grounded in touch-sensitive responses.

  3. Reirradiation of head and neck cancer using modern highly conformal techniques.

    PubMed

    Ho, Jennifer C; Phan, Jack

    2018-04-23

    Locoregional disease recurrence or development of a second primary cancer after definitive radiotherapy for head and neck cancers remains a treatment challenge. Reirradiation utilizing traditional techniques has been limited by concern for serious toxicity. With the advent of newer, more precise radiotherapy techniques, such as intensity-modulated radiotherapy (IMRT), proton radiotherapy, and stereotactic body radiotherapy (SBRT), there has been renewed interest in curative-intent head and neck reirradiation. However, as most studies were retrospective, single-institutional experiences, the optimal modality is not clear. We provide a comprehensive review of the outcomes of relevant studies using these 3 head and neck reirradiation techniques, followed by an analysis and comparison of the toxicity, tumor control, concurrent systemic therapy, and prognostic factors. Overall, there is evidence that IMRT, proton therapy, and SBRT reirradiation are feasible treatment options that offer a chance for durable local control and survival. Prospective studies, particularly randomized trials, are needed. © 2018 Wiley Periodicals, Inc.

  4. Development of modern human subadult age and sex estimation standards using multi-slice computed tomography images from medical examiner's offices

    NASA Astrophysics Data System (ADS)

    Stock, Michala K.; Stull, Kyra E.; Garvin, Heather M.; Klales, Alexandra R.

    2016-10-01

    Forensic anthropologists are routinely asked to estimate a biological profile (i.e., age, sex, ancestry and stature) from a set of unidentified remains. In contrast to the abundance of collections and techniques associated with adult skeletons, there is a paucity of modern, documented subadult skeletal material, which limits the creation and validation of appropriate forensic standards. Many are forced to use antiquated methods derived from small sample sizes, which given documented secular changes in the growth and development of children, are not appropriate for application in the medico-legal setting. Therefore, the aim of this project is to use multi-slice computed tomography (MSCT) data from a large, diverse sample of modern subadults to develop new methods to estimate subadult age and sex for practical forensic applications. The research sample will consist of over 1,500 full-body MSCT scans of modern subadult individuals (aged birth to 20 years) obtained from two U.S. medical examiner's offices. Statistical analysis of epiphyseal union scores, long bone osteometrics, and os coxae landmark data will be used to develop modern subadult age and sex estimation standards. This project will result in a database of information gathered from the MSCT scans, as well as the creation of modern, statistically rigorous standards for skeletal age and sex estimation in subadults. Furthermore, the research and methods developed in this project will be applicable to dry bone specimens, MSCT scans, and radiographic images, thus providing both tools and continued access to data for forensic practitioners in a variety of settings.

  5. Locality-Aware CTA Clustering For Modern GPUs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Ang; Song, Shuaiwen; Liu, Weifeng

    2017-04-08

    In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.

  6. Modern adjuncts and technologies in microsurgery: an historical and evidence-based review.

    PubMed

    Pratt, George F; Rozen, Warren M; Chubb, Daniel; Whitaker, Iain S; Grinsell, Damien; Ashton, Mark W; Acosta, Rafael

    2010-11-01

    While modern reconstructive surgery was revolutionized with the introduction of microsurgical techniques, microsurgery itself has seen the introduction of a range of technological aids and modern techniques aiming to improve dissection times, anastomotic times, and overall outcomes. These include improved preoperative planning, anastomotic aids, and earlier detection of complications with higher salvage rates. Despite the potential for substantial impact, many of these techniques have been evaluated in a limited fashion, and the evidence for each has not been universally explored. The purpose of this review was to establish and quantify the evidence for each technique. A search of relevant medical databases was performed to identify literature providing evidence for each technology. Levels of evidence were thus accumulated and applied to each technique. There is a relative paucity of evidence for many of the more recent technologies described in the field of microsurgery, with no randomized controlled trials, and most studies in the field comprising case series only. Current evidence-based suggestions include the use of computed tomographic angiography (CTA) for the preoperative planning of perforator flaps, the intraoperative use of a mechanical anastomotic coupling aid (particularly the Unilink® coupler), and postoperative flap monitoring with strict protocols using clinical bedside monitoring and/or the implantable Doppler probe. Despite the breadth of technologies introduced into the field of microsurgery, there is substantial variation in the degree of evidence presented for each, suggesting a role for much future research, particularly from emerging technologies such as robotics and modern simulators. Copyright © 2010 Wiley-Liss, Inc.

  7. A Survey of Architectural Techniques For Improving Cache Power Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Sparsh

    Modern processors use increasingly large on-chip caches. Also, with each CMOS technology generation, there has been a significant increase in their leakage energy consumption. For this reason, cache power management has become a crucial research issue in modern processor design. To address this challenge and also meet the goals of sustainable computing, researchers have proposed several techniques for improving the energy efficiency of cache architectures. This paper surveys recent architectural techniques for improving cache power efficiency and also presents a classification of these techniques based on their characteristics. To provide an application perspective, this paper also reviews several real-world processor chips that employ cache energy saving techniques. The aim of this survey is to enable engineers and researchers to get insights into the techniques for improving cache power efficiency and to motivate them to invent novel solutions for enabling low-power operation of caches.

  8. [Modern bacterial taxonomy: techniques review--application to bacteria that nodulate leguminous plants (BNL)].

    PubMed

    Zakhia, Frédéric; de Lajudie, Philippe

    2006-03-01

    Taxonomy is the science that studies the relationships between organisms. It comprises classification, nomenclature, and identification. Modern bacterial taxonomy is polyphasic, meaning that it is based on several molecular techniques, each one retrieving information at a different cellular level (proteins, fatty acids, DNA...). The results obtained are combined and analysed to reach a "consensus taxonomy" for a microorganism. Until 1970, only a small number of classification techniques were available to microbiologists, mainly phenotypic characterization (for example, the ability of a Rhizobium strain to nodulate a given legume species). With the development of characterization techniques based on the polymerase chain reaction, bacterial taxonomy has undergone great changes. In particular, the classification of the legume nodulating bacteria has been repeatedly modified over the last 20 years. We present here a review of the molecular techniques currently used in bacterial characterization, with examples of their application to the study of the legume nodulating bacteria.

  9. Seeing the Forest and the Trees: Western Forestry Systems and Soviet Engineers, 1955-1964.

    PubMed

    Kochetkova, Elena

    This article examines the transfer of technology from Finnish enterprises to Soviet industry during the USSR's period of technological modernization between 1955 and 1964. It centers on the forestry sector, which was a particular focus of modernization programs and a key area for the transfer of foreign techniques and expertise. The aim of the article is to investigate the role of trips made by Soviet specialists to foreign (primarily Finnish) enterprises in order to illustrate the nontechnological influences that occurred during the transfer of technologies across the cold war border. To do so, the article is divided into two parts: the first presents a general analysis of technology transfer from a micro-level perspective, while the second investigates the cultural influences behind technological transfer in the Soviet-Finnish case. This study contends that although the Soviet government expected its specialists to import advanced foreign technical experience, they brought not only the technologies and expertise needed for modernizing the industry, but also a changed view on Soviet workplace management and everyday practices.

  10. Symmetry-Based Techniques for Qualitative Understanding of Rovibrational Effects in Spherical-Top Molecular Spectra and Dynamics

    NASA Astrophysics Data System (ADS)

    Mitchell, Justin Chadwick

    2011-12-01

    Using light to probe the structure of matter is as natural as opening our eyes. Modern physics and chemistry have turned this art into a rich science, measuring the delicate interactions possible at the molecular level. Perhaps the most commonly used tool in computational spectroscopy is that of matrix diagonalization. While this is invaluable for calculating everything from molecular structure and energy levels to dipole moments and dynamics, the process of numerical diagonalization is an opaque one. This work applies symmetry and semi-classical techniques to elucidate numerical spectral analysis for high-symmetry molecules. Semi-classical techniques, such as potential energy surfaces, have long been used to help understand molecular vibronic and rovibronic spectra and dynamics. This investigation focuses on newer semi-classical techniques that apply Rotational Energy Surfaces (RES) to rotational energy level clustering effects in high-symmetry molecules. Such clusters exist in rigid rotor molecules as well as deformable spherical tops. This study begins by using the simplicity of rigid symmetric top molecules to clarify the classical-quantum correspondence of RES semi-classical analysis and then extends it to a more precise and complete theory of modern high-resolution spectra. RES analysis is extended to molecules having more complex and higher rank tensorial rotational and rovibrational Hamiltonians than were possible to understand before. Such molecules are shown to produce an extraordinary range of rotational level clusters, corresponding to a panoply of symmetries ranging from C4v to C2 and C1 (no symmetry) with a corresponding range of new angular momentum localization and J-tunneling effects. Using RES topography analysis and the commutation duality relations between symmetry group operators in the lab-frame and those in the body-frame, it is shown how to better describe and catalog complex splittings found in rotational level clusters. Symmetry character analysis is generalized to give analytic eigensolutions. An appendix provides vibrational analogies. For the first time, interactions between molecular vibrations (polyads) are described semi-classically by multiple RES. This is done for the ν3/2ν4 dyad of CF4. The nine-surface RES topology of the U(9) dyad agrees with both computational and experimental work. A connection between this and a simpler U(2) example is detailed in an Appendix.

  11. A text comprehension approach to questionnaire readability: An example using gambling disorder measures.

    PubMed

    Peter, Samuel C; Whelan, James P; Pfund, Rory A; Meyers, Andrew W

    2018-06-14

    Although readability has been traditionally operationalized and even become synonymous with the concept of word and sentence length, modern text analysis theory and technology have shifted toward multidimensional comprehension-based analytic techniques. In an effort to make use of these advancements and demonstrate their general utility, 6 commonly used measures of gambling disorder were submitted to readability analyses using 2 of these advanced approaches, Coh-Metrix and Question Understanding Aid (QUAID), and one traditional approach, the Flesch-Kincaid Grade Level. As hypothesized, significant variation was found across measures, with some questionnaires emerging as more appropriate than others for use in samples that may include individuals with low literacy. Recommendations are made for the use of these modern approaches to readability to inform decisions on measure selection and development. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
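
    The abstract above contrasts the traditional Flesch-Kincaid Grade Level with comprehension-based tools such as Coh-Metrix and QUAID. As a point of reference, the sketch below shows how the traditional index is computed from word, sentence and syllable counts; the syllable counter is a crude vowel-group heuristic and the sample item is invented, so the numbers are purely illustrative.

    ```python
    # Minimal sketch of the traditional readability index mentioned above: the
    # Flesch-Kincaid Grade Level. The syllable counter is a rough heuristic, so
    # results will only approximate what dedicated tools report.
    import re

    def count_syllables(word: str) -> int:
        # Count groups of consecutive vowels as one syllable (crude heuristic).
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text: str) -> float:
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

    item = "How often have you bet more money than you could comfortably afford to lose?"
    print(round(flesch_kincaid_grade(item), 1))
    ```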

  12. Discourse of 'transformational leadership' in infection control.

    PubMed

    Koteyko, Nelya; Carter, Ronald

    2008-10-01

    The article explores the impact of the 'transformational leadership' style in the role of modern matron with regard to infection control practices. Policy and guidance on the modern matron role suggest that it is distinctive in its combination of management and clinical components, and in its reliance on transformational leadership. Senior nurses are therefore expected to motivate staff by creating high expectations, modelling appropriate behaviour, and providing personal attention to followers by giving respect and responsibility. In this article, we draw on policy documents and interview data to explore the potential impact of this new management style on infection control practices. Combining the techniques of discourse analysis and corpus linguistics, we identify examples where matrons appear to disassociate themselves from the role of 'an empowered manager' who has control over human and financial resources to resolve problems in infection control efficiently.

  13. Data Mining Techniques for Customer Relationship Management

    NASA Astrophysics Data System (ADS)

    Guo, Feng; Qin, Huilin

    2017-10-01

    Data mining has made customer relationship management (CRM) a new area where firms can gain a competitive advantage, and it plays a key role in firms' management decisions. In this paper, we first analyze the value and application fields of data mining techniques for CRM, and then explore how data mining is applied to customer churn analysis. A new business culture is developing today: the conventional production-centered, sales-oriented market strategy is gradually shifting towards a customer-centered, service-oriented one. Customers' value orientation increasingly shapes that of firms, and the customer base has become one of the most important strategic resources. Therefore, understanding customers' needs and identifying the customers who contribute the most has become the driving force of most modern businesses.

  14. Biomagnetic separation of Salmonella Typhimurium with high affine and specific ligand peptides isolated by phage display technique

    NASA Astrophysics Data System (ADS)

    Steingroewer, Juliane; Bley, Thomas; Bergemann, Christian; Boschke, Elke

    2007-04-01

    Analyses of food-borne pathogens are of great importance in order to minimize the health risk for customers. Thus, very sensitive and rapid detection methods are required. Current conventional culture techniques are very time consuming. Modern immunoassays and biochemical analysis also require pre-enrichment steps resulting in a turnaround time of at least 24 h. Biomagnetic separation (BMS) is a promising more rapid method. In this study we describe the isolation of high affine and specific peptides from a phage-peptide library, which combined with BMS allows the detection of Salmonella spp. with a similar sensitivity as that of immunomagnetic separation using antibodies.

  15. Tidal analysis of Met rocket wind data

    NASA Technical Reports Server (NTRS)

    Bedinger, J. F.; Constantinides, E.

    1976-01-01

    A method of analyzing Met Rocket wind data is described. Modern tidal theory and specialized analytical techniques were used to resolve specific tidal modes and prevailing components in observed wind data. A representation of the wind which is continuous in both space and time was formulated. Such a representation allows direct comparison with theory, allows the derivation of other quantities such as temperature and pressure which in turn may be compared with observed values, and allows the formation of a wind model which extends over a broader range of space and time. Significant diurnal tidal modes with wavelengths of 10 and 7 km were present in the data and were resolved by the analytical technique.
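
    As an illustration of the kind of harmonic decomposition described above, the sketch below fits a prevailing wind plus diurnal (24 h) and semidiurnal (12 h) tidal components to a synthetic hourly wind series by linear least squares. The data, amplitudes and sampling are invented; the original analysis of Met Rocket profiles is considerably more involved.

    ```python
    # Illustrative sketch (not the authors' code): resolve a prevailing wind and
    # diurnal/semidiurnal tidal components from an hourly wind series by linear
    # least squares on a cosine/sine basis.
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.arange(0.0, 72.0, 1.0)                       # time, hours (synthetic 3-day series)
    u = (5.0                                            # prevailing wind, m/s
         + 8.0 * np.cos(2 * np.pi * (t - 6.0) / 24.0)   # diurnal tide
         + 3.0 * np.cos(2 * np.pi * (t - 2.0) / 12.0)   # semidiurnal tide
         + rng.normal(0.0, 0.5, t.size))

    # Design matrix: mean term plus cos/sin at 24 h and 12 h periods.
    A = np.column_stack([np.ones_like(t),
                         np.cos(2 * np.pi * t / 24.0), np.sin(2 * np.pi * t / 24.0),
                         np.cos(2 * np.pi * t / 12.0), np.sin(2 * np.pi * t / 12.0)])
    c, *_ = np.linalg.lstsq(A, u, rcond=None)
    print("prevailing:", c[0],
          "diurnal amplitude:", np.hypot(c[1], c[2]),
          "semidiurnal amplitude:", np.hypot(c[3], c[4]))
    ```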

  16. Measurement of in situ sulfur isotopes by laser ablation multi-collector ICPMS: opening Pandora’s Box

    USGS Publications Warehouse

    Ridley, William I.; Pribil, Michael; Koenig, Alan E.; Slack, John F.

    2015-01-01

    Laser ablation multi-collector ICPMS is a modern tool for in situ measurement of S isotopes. Advantages of the technique are speed of analysis and relatively minor matrix effects combined with spatial resolution sufficient for many applications. The main disadvantage is a more destructive sampling mechanism relative to the ion microprobe technique. Recent advances in instrumentation allow precise measurement with spatial resolutions down to 25 microns. We describe specific examples from economic geology where increased spatial resolution has greatly expanded insights into the sources and evolution of fluids that cause mineralization and illuminated genetic relations between individual deposits in single mineral districts.
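
    Sulfur isotope results of this kind are conventionally reported in delta notation relative to V-CDT. The snippet below shows that conversion for a hypothetical measured 34S/32S ratio; the reference ratio used is a nominal literature value, not one taken from the paper.

    ```python
    # Sketch of the standard delta notation used to report S-isotope data,
    # assuming a nominal V-CDT 34S/32S reference ratio (value approximate).
    R_VCDT = 0.0441626          # nominal 34S/32S of the V-CDT reference (assumption)

    def delta34S_permil(r_sample: float, r_standard: float = R_VCDT) -> float:
        """delta34S in per mil: (R_sample / R_standard - 1) * 1000."""
        return (r_sample / r_standard - 1.0) * 1000.0

    print(round(delta34S_permil(0.0449), 1))   # hypothetical measured ratio
    ```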

  17. Performing Quantitative Imaging Acquisition, Analysis and Visualization Using the Best of Open Source and Commercial Software Solutions.

    PubMed

    Shenoy, Shailesh M

    2016-07-01

    A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. Also, one must consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute their methods to the community using the software and the potential for achieving automation to improve productivity.

  18. Multivariate analysis techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bendavid, Josh; Fisher, Wade C.; Junk, Thomas R.

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
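
    As a minimal illustration of the signal-versus-background separation described above, the sketch below trains a boosted-decision-tree-style classifier on synthetic three-feature events in which background is five times more copious than signal. It is not any experiment's actual analysis chain; features, yields and the classifier choice are assumptions for illustration.

    ```python
    # Minimal sketch of multivariate signal/background separation on synthetic
    # data: train a gradient-boosted classifier and report its ROC area.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    signal = rng.normal(loc=[1.0, 0.5, 0.0], scale=1.0, size=(n, 3))
    background = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.2, size=(5 * n, 3))  # background dominates
    X = np.vstack([signal, background])
    y = np.concatenate([np.ones(n), np.zeros(5 * n)])

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = GradientBoostingClassifier().fit(X_train, y_train)
    scores = clf.predict_proba(X_test)[:, 1]
    print("ROC AUC:", round(roc_auc_score(y_test, scores), 3))
    ```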

  19. Modern contraceptive utilization and associated factors among married pastoralist women in Bale eco-region, Bale Zone, South East Ethiopia.

    PubMed

    Belda, Semere Sileshi; Haile, Mekonnen Tegegne; Melku, Abulie Takele; Tololu, Abdurehaman Kalu

    2017-03-14

    Women who live in remote rural areas face particular barriers to contraception and often do not use modern contraceptive methods. The predictors of modern contraceptive utilization by pastoralist women in the Bale eco-region may be context-specific and are not well known. Therefore, this study aims to assess modern contraceptive utilization and its determinants among married pastoralist women in Bale eco-region, Oromia regional state, South East Ethiopia. A community-based cross-sectional study was conducted from 20th November 2015 to 30th February 2016. A structured questionnaire was used to interview 549 married pastoralist women who were selected by a multistage sampling technique. The data were analyzed with SPSS version 21; multivariate logistic regression analysis was used to identify predictors of modern contraceptive use (P-value <0.05), and odds ratios with 95% confidence intervals were used to assess the strength of associations between variables. Current modern contraceptive use among married pastoralist women was 20.8%. Among users, 78.1% relied on the injectable method. The common reasons for non-use of modern contraceptive methods included religious opposition (55.9%), desire for more children (28.3%), fear of side effects (25.5%), and husband's opposition (17.5%). Couple discussion (AOR = 4.63, 95% CI: 2.15, 9.98), perceived husband's approval (AOR = 8.00, 95% CI: 3.52, 18.19), discussion with a health extension worker (AOR = 5.99, 95% CI: 1.81, 19.85), and perceived cultural acceptability (AOR = 2.10, 95% CI: 1.09, 4.03) were the independent predictors of modern contraceptive use by married pastoralist women in Bale eco-region. The study identified low modern contraceptive utilization by pastoralist women, and the majority of contraceptive users relied on short-acting methods. Unfavourable perceptions of the religious and cultural acceptability of modern contraceptive methods were among the major reasons for their low utilization. Family planning programs should be tailored to actively involve pastoralist women, husbands, and religious leaders in pastoralist communities.
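
    For readers unfamiliar with how adjusted odds ratios of the kind quoted above are obtained, the sketch below fits a multivariate logistic regression to synthetic data with statsmodels and exponentiates the coefficients and their confidence limits. The variable names, effect sizes and data are invented stand-ins, not the study's dataset or its SPSS output.

    ```python
    # Sketch of obtaining adjusted odds ratios (AOR) with 95% CIs from a
    # multivariate logistic regression, on synthetic data (not the study's).
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 549
    df = pd.DataFrame({
        "couple_discussion": rng.integers(0, 2, n),
        "husband_approval": rng.integers(0, 2, n),
        "hew_discussion": rng.integers(0, 2, n),
    })
    # Simulate the outcome from an assumed logistic model.
    logit_p = (-2.0 + 1.5 * df["couple_discussion"]
               + 2.0 * df["husband_approval"] + 1.7 * df["hew_discussion"])
    df["uses_modern_fp"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit_p))

    X = sm.add_constant(df[["couple_discussion", "husband_approval", "hew_discussion"]])
    res = sm.Logit(df["uses_modern_fp"].astype(float), X).fit(disp=0)
    aor = np.exp(res.params)                      # exponentiated coefficients = AORs
    ci = np.exp(res.conf_int())
    print(pd.concat([aor.rename("AOR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
    ```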

  20. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
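
    The quantitative workflow described above can be reduced, in one dimension, to integrating a calibrated refractive-index gradient and applying the Gladstone-Dale relation. The sketch below does exactly that on a synthetic boundary-layer-like profile; the Gladstone-Dale constant and reference index are nominal values for air, and the gradient field is invented.

    ```python
    # One-dimensional sketch of the quantitative schlieren workflow: integrate a
    # calibrated refractive-index gradient dn/dy to get n(y), then convert to
    # density with the Gladstone-Dale relation n - 1 = K * rho.
    import numpy as np

    K_AIR = 2.26e-4                         # Gladstone-Dale constant for air, m^3/kg (approx.)
    y = np.linspace(0.0, 0.02, 200)         # wall-normal coordinate, m
    dy = y[1] - y[0]

    # Synthetic calibrated gradient field (what a BOS or lens-calibrated schlieren
    # measurement would provide), here a smooth boundary-layer-like profile.
    dn_dy = -2.0e-3 * np.exp(-y / 0.005)

    n_ref = 1.000262                        # known refractive index at y = 0 (assumption)
    n = n_ref + np.cumsum(dn_dy) * dy       # integrate the measured gradient
    rho = (n - 1.0) / K_AIR                 # Gladstone-Dale: density field, kg/m^3
    print(rho[0], rho[-1])
    ```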

  1. Urinary stone composition in Israel: current status and variation with age and sex--a bicenter study.

    PubMed

    Usman, Kalba D; Golan, Shay; Abdin, Tamer; Livne, Pinhas M; Pode, Dov; Duvdevani, Mordechai; Lifshitz, David

    2013-12-01

    The epidemiologic data regarding stone composition in Israel are based on anachronistic methods of stone analysis. Historically, Israel was noted for an unusually high percentage of uric acid stones. The aim of the study was to describe the current stone composition distribution in Israel, using modern techniques of urinary stone analysis. Age and sex correlations were investigated. In a bicenter study, using infrared spectroscopy and X-ray diffraction, stones from five hundred and thirty eight (538) patients were analyzed and demographic data recorded. The study cohort included 401 men (74.5%) and 137 women (25.5%) with a male to female ratio of 2.9:1 and a median age of 48 years (range 2-85 years). While calcium oxalate monohydrate was the predominant component in both sexes, it was lower in female patients (77.3% vs 65%). The rate of infection stones (struvite+carbonate apatite) was significantly higher in women (35.7% vs 10.2%). Uric acid stones were found in only 14.5% of the patients and increased with age. Conversely, the rate of calcium oxalate dihydrate decreased with age. Modern techniques of urinary stone analysis showed that the most frequent stone component in Israel is calcium oxalate monohydrate. In contrast to earlier reports and in accordance with reports from other countries, the overall frequency of uric acid is 14.5%. With age, the frequency of uric acid increases reaching 21% in persons >60 years old. A significant sex difference was noted in the distribution of calcium oxalate stones and infection stones. The classic 3:1 ratio was maintained, however.

  2. Application of contrast media in post-mortem imaging (CT and MRI).

    PubMed

    Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice

    2015-09-01

    The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider by using different media.

  3. Large ensemble modeling of last deglacial retreat of the West Antarctic Ice Sheet: comparison of simple and advanced statistical techniques

    NASA Astrophysics Data System (ADS)

    Pollard, D.; Chang, W.; Haran, M.; Applegate, P.; DeConto, R.

    2015-11-01

    A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~ 20 000 years. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree quite well with the more advanced techniques, but only for a large ensemble with full factorial parameter sampling. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds. Each run is extended 5000 years into the "future" with idealized ramped climate warming. In the majority of runs with reasonable scores, this produces grounding-line retreat deep into the West Antarctic interior, and the analysis provides sea-level-rise envelopes with well defined parametric uncertainty bounds.
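
    The "simple averaging weighted by the aggregate score" mentioned above can be illustrated as follows: convert each run's misfit into a weight and form weighted statistics of the quantity of interest. The Gaussian weighting, ensemble size and numbers below are assumptions for illustration, not the paper's scoring formula.

    ```python
    # Sketch of score-weighted ensemble averaging: each run's sea-level-rise
    # estimate is weighted by a score derived from its model-data misfit.
    import numpy as np

    rng = np.random.default_rng(2)
    n_runs = 625
    misfit = rng.gamma(shape=2.0, scale=1.0, size=n_runs)      # aggregate misfit per run (synthetic)
    esl_rise = rng.normal(3.3, 1.0, size=n_runs)               # equivalent sea-level rise, m (synthetic)

    weights = np.exp(-0.5 * (misfit / misfit.std()) ** 2)      # lower misfit -> higher weight (assumption)
    weights /= weights.sum()

    mean_esl = np.sum(weights * esl_rise)
    var_esl = np.sum(weights * (esl_rise - mean_esl) ** 2)
    print(f"weighted mean = {mean_esl:.2f} m, weighted std = {np.sqrt(var_esl):.2f} m")
    ```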

  4. A quality improvement management model for renal care.

    PubMed

    Vlchek, D L; Day, L M

    1991-04-01

    The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.
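
    One of the Deming-style statistical tools alluded to above is the individuals control chart, whose limits are set from the average moving range. The sketch below computes such limits for an invented series of monthly rates; the 2.66 factor is the standard constant for two-point moving ranges.

    ```python
    # Sketch of an individuals (X) control chart with moving-range limits,
    # applied to an invented series of monthly outcome rates.
    import numpy as np

    monthly_rate = np.array([4.1, 3.8, 4.5, 4.0, 3.9, 4.7, 4.2, 3.6, 4.4, 4.3, 3.7, 4.8])  # synthetic
    center = monthly_rate.mean()
    mr_bar = np.abs(np.diff(monthly_rate)).mean()      # average moving range

    ucl = center + 2.66 * mr_bar                       # 2.66 = 3/d2 for n = 2
    lcl = max(0.0, center - 2.66 * mr_bar)
    signals = (monthly_rate > ucl) | (monthly_rate < lcl)
    print(f"CL={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, signals={signals.sum()}")
    ```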

  5. [The modern approaches to organization of delivery system in Nizhniy Novgorod].

    PubMed

    Ryzhova, N K; Lazarev, V N

    2014-01-01

    The article presents data concerning reproductive demographic processes in Nizhniy Novgorod. The number of women of fertile age and the maternal mortality indicator were selected as objects of analysis. The structure of the causes of maternal mortality is presented, and the corresponding classification was developed on its basis. To prevent maternal losses, the development of specialized centers and the implementation of high-tech blood-preserving techniques were proposed. The routing and accompaniment of women in critical ("closer to death") conditions are also considered.

  6. Digital signal processing and control and estimation theory -- Points of tangency, area of intersection, and parallel directions

    NASA Technical Reports Server (NTRS)

    Willsky, A. S.

    1976-01-01

    A number of current research directions in the fields of digital signal processing and modern control and estimation theory were studied. Topics such as stability theory, linear prediction and parameter identification, system analysis and implementation, two-dimensional filtering, decentralized control and estimation, image processing, and nonlinear system theory were examined in order to uncover some of the basic similarities and differences in the goals, techniques, and philosophy of the two disciplines. An extensive bibliography is included.
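
    Among the shared topics listed above, linear prediction and parameter identification lend themselves to a compact example: estimating autoregressive coefficients by least squares on lagged samples. The model order, coefficients and noise level below are invented.

    ```python
    # Sketch of linear prediction / parameter identification: estimate AR(2)
    # coefficients of a synthetic signal by least squares on lagged samples.
    import numpy as np

    rng = np.random.default_rng(3)
    a1, a2 = 1.5, -0.75                      # true AR(2) coefficients (invented)
    x = np.zeros(500)
    for k in range(2, x.size):
        x[k] = a1 * x[k - 1] + a2 * x[k - 2] + rng.normal(0.0, 0.1)

    # Each sample is predicted from its two predecessors.
    X = np.column_stack([x[1:-1], x[:-2]])
    y = x[2:]
    a_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(a_hat)                             # should be close to [1.5, -0.75]
    ```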

  7. Application of real-time digitization techniques in beam measurement for accelerators

    NASA Astrophysics Data System (ADS)

    Zhao, Lei; Zhan, Lin-Song; Gao, Xing-Shun; Liu, Shu-Bin; An, Qi

    2016-04-01

    Beam measurement is very important for accelerators. In this paper, modern digital beam measurement techniques based on IQ (In-phase & Quadrature-phase) analysis are discussed. Based on this method and high-speed high-resolution analog-to-digital conversion, we have completed three beam measurement electronics systems designed for the China Spallation Neutron Source (CSNS), Shanghai Synchrotron Radiation Facility (SSRF), and Accelerator Driven Sub-critical system (ADS). Core techniques of hardware design and real-time system calibration are discussed, and performance test results of these three instruments are also presented. Supported by National Natural Science Foundation of China (11205153, 10875119), Knowledge Innovation Program of the Chinese Academy of Sciences (KJCX2-YW-N27), and the Fundamental Research Funds for the Central Universities (WK2030040029),and the CAS Center for Excellence in Particle Physics (CCEPP).
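
    The IQ analysis referred to above amounts to mixing the digitized signal with reference cosine and sine waves and averaging over an integer number of cycles to recover amplitude and phase. The sketch below demonstrates this on a synthetic tone; the sample rate and intermediate frequency are illustrative, not those of the CSNS, SSRF or ADS electronics.

    ```python
    # Sketch of IQ (in-phase / quadrature) demodulation of a digitized IF tone.
    import numpy as np

    fs = 117.0e6                    # sample rate, Hz (illustrative)
    f_if = 13.0e6                   # intermediate frequency, Hz (fs/f_if = 9 samples per cycle)
    n = 900                         # exactly 100 IF cycles
    t = np.arange(n) / fs

    rng = np.random.default_rng(6)
    amp_true, phase_true = 0.8, 0.6
    x = amp_true * np.cos(2 * np.pi * f_if * t + phase_true) + rng.normal(0.0, 0.01, n)

    i = 2.0 * np.mean(x * np.cos(2 * np.pi * f_if * t))     # in-phase component  -> A*cos(phi)
    q = -2.0 * np.mean(x * np.sin(2 * np.pi * f_if * t))    # quadrature component -> A*sin(phi)
    print(np.hypot(i, q), np.arctan2(q, i))                 # recovered amplitude and phase
    ```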

  8. Extending the knowledge in histochemistry and cell biology.

    PubMed

    Heupel, Wolfgang-Moritz; Drenckhahn, Detlev

    2010-01-01

    Central to modern Histochemistry and Cell Biology is the need to visualize cellular and molecular processes. In the past several years, a variety of techniques has emerged that bridge traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from a bird's-eye down to a worm's-eye view, focusing on tissues, cells, proteins or, finally, single molecules. Applications of new approaches, in combination with well-established traditional techniques of mRNA, DNA or protein analysis, have led to enlightening studies which have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles that represent the progress made in "histo-biochemical" techniques and their manifold applications.

  9. Application of a system modification technique to dynamic tuning of a spinning rotor blade

    NASA Technical Reports Server (NTRS)

    Spain, C. V.

    1987-01-01

    An important consideration in the development of modern helicopters is the vibratory response of the main rotor blade. One way to minimize vibration levels is to ensure that natural frequencies of the spinning main rotor blade are well removed from integer multiples of the rotor speed. A technique for dynamically tuning a finite-element model of a rotor blade to accomplish that end is demonstrated. A brief overview is given of the general purpose finite element system known as Engineering Analysis Language (EAL) which was used in this work. A description of the EAL System Modification (SM) processor is then given along with an explanation of special algorithms developed to be used in conjunction with SM. Finally, this technique is demonstrated by dynamically tuning a model of an advanced composite rotor blade.
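
    The tuning goal described above, keeping blade natural frequencies away from integer multiples of the rotor speed, can be expressed as a simple margin check. The frequencies, rotor speed and required margin in the sketch below are invented and stand in for values that would come from the EAL finite-element model.

    ```python
    # Sketch of the per-rev separation check behind dynamic tuning: how far does
    # each blade natural frequency sit from an integer multiple of rotor speed?
    import numpy as np

    rotor_speed_hz = 4.3                                  # rotor rotational frequency (illustrative)
    natural_freqs_hz = np.array([5.1, 12.6, 21.9, 30.2])  # blade modes from an FE model (illustrative)
    min_margin = 0.20                                     # required separation, per-rev units (assumption)

    per_rev = natural_freqs_hz / rotor_speed_hz           # frequency in multiples of rotor speed
    distance_to_integer = np.abs(per_rev - np.round(per_rev))

    for f, p, d in zip(natural_freqs_hz, per_rev, distance_to_integer):
        status = "OK" if d >= min_margin else "RETUNE"
        print(f"{f:5.1f} Hz = {p:4.2f}/rev, margin {d:4.2f} -> {status}")
    ```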

  10. Direct dating of human fossils.

    PubMed

    Grün, Rainer

    2006-01-01

    The methods that can be used for the direct dating of human remains comprise of radiocarbon, U-series, electron spin resonance (ESR), and amino acid racemization (AAR). This review gives an introduction to these methods in the context of dating human bones and teeth. Recent advances in ultrafiltration techniques have expanded the dating range of radiocarbon. It now seems feasible to reliably date bones up to 55,000 years. New developments in laser ablation mass spectrometry permit the in situ analysis of U-series isotopes, thus providing a rapid and virtually non-destructive dating method back to about 300,000 years. This is of particular importance when used in conjunction with non-destructive ESR analysis. New approaches in AAR analysis may lead to a renaissance of this method. The potential and present limitations of these direct dating techniques are discussed for sites relevant to the reconstruction of modern human evolution, including Florisbad, Border Cave, Tabun, Skhul, Qafzeh, Vindija, Banyoles, and Lake Mungo. (c) 2006 Wiley-Liss, Inc.

  11. Synthesis, growth, structure and nonlinear optical properties of a semiorganic 2-carboxy pyridinium dihydrogen phosphate single crystal

    NASA Astrophysics Data System (ADS)

    Nagapandiselvi, P.; Baby, C.; Gopalakrishnan, R.

    2015-09-01

    A new semiorganic compound, namely 2-carboxy pyridinium dihydrogen phosphate (2CPDP), was synthesised and grown as single crystals by the slow evaporation solution growth technique. Single-crystal XRD showed that 2CPDP belongs to the monoclinic crystal system with space group P21/n. The molecular structure was further confirmed by modern spectroscopic techniques such as FT-NMR (1H, 13C & 31P), FT-IR, UV-Vis-NIR and fluorescence spectroscopy. The UV-Vis-NIR analysis revealed the suitability of the crystal for nonlinear optical applications, and the photoactive nature of the material was established from the fluorescence studies. TG-DSC analysis showed that 2CPDP is thermally stable up to 170 °C. The dependence of the dielectric properties on frequency and temperature was also studied. Nonlinear optical absorption, determined from open-aperture Z-scan analysis employing a picosecond Nd:YAG laser, revealed that 2CPDP is a promising candidate for optical limiting applications.

  12. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles.

    PubMed

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F; Perez, Danny

    2017-10-21

    Modern molecular-dynamics-based techniques are extremely powerful to investigate the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.
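
    A faithful Perron Cluster Cluster Analysis requires a dedicated implementation (for example the PCCA+ routines in Markov-state-model packages), but the underlying idea can be sketched with a toy row-stochastic transition matrix: metastable clusters follow the sign structure of the slow eigenvectors. The four-state matrix below is invented and unrelated to the Pt nanoparticle trajectories.

    ```python
    # Toy sketch of the spectral idea behind Perron-cluster-style analysis:
    # the sign structure of the slowest non-stationary eigenvector of a
    # row-stochastic transition matrix splits the states into metastable sets.
    import numpy as np

    # States {0,1} and {2,3} interconvert quickly within each pair but rarely
    # cross between pairs (two metastable clusters).
    T = np.array([[0.89, 0.10, 0.01, 0.00],
                  [0.10, 0.89, 0.00, 0.01],
                  [0.01, 0.00, 0.89, 0.10],
                  [0.00, 0.01, 0.10, 0.89]])

    eigvals, eigvecs = np.linalg.eig(T)              # right eigenvectors of the row-stochastic matrix
    order = np.argsort(-eigvals.real)
    second = eigvecs[:, order[1]].real               # slowest non-stationary mode

    clusters = (second > 0).astype(int)              # sign split -> two metastable clusters
    print("eigenvalues:", np.round(eigvals.real[order], 3))
    print("cluster assignment:", clusters)
    ```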

  13. Cluster analysis of accelerated molecular dynamics simulations: A case study of the decahedron to icosahedron transition in Pt nanoparticles

    NASA Astrophysics Data System (ADS)

    Huang, Rao; Lo, Li-Ta; Wen, Yuhua; Voter, Arthur F.; Perez, Danny

    2017-10-01

    Modern molecular-dynamics-based techniques are extremely powerful to investigate the dynamical evolution of materials. With the increase in sophistication of the simulation techniques and the ubiquity of massively parallel computing platforms, atomistic simulations now generate very large amounts of data, which have to be carefully analyzed in order to reveal key features of the underlying trajectories, including the nature and characteristics of the relevant reaction pathways. We show that clustering algorithms, such as the Perron Cluster Cluster Analysis, can provide reduced representations that greatly facilitate the interpretation of complex trajectories. To illustrate this point, clustering tools are used to identify the key kinetic steps in complex accelerated molecular dynamics trajectories exhibiting shape fluctuations in Pt nanoclusters. This analysis provides an easily interpretable coarse representation of the reaction pathways in terms of a handful of clusters, in contrast to the raw trajectory that contains thousands of unique states and tens of thousands of transitions.

  14. Digital prototyping technique applied for redesigning plastic products

    NASA Astrophysics Data System (ADS)

    Pop, A.; Andrei, A.

    2015-11-01

    After products have been on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated ones, and redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology such as Digital Prototyping in industry is an effective way to produce new products. The paper demonstrates and highlights the effectiveness of the Digital Prototyping concept in reducing both the design time of a new product and the costs required for this step. The results show that using Digital Prototyping techniques to design a new product from an existing one available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be carried out by one skilled engineer quickly and effectively.

  15. Nondestructive Analysis of Apollo Samples by Micro-CT and Micro-XRF Analysis: A PET Style Examination

    NASA Technical Reports Server (NTRS)

    Zeigler, Ryan A.

    2014-01-01

    An integral part of any sample return mission is the initial description and classification of returned samples by the preliminary examination team (PET). The goal of a PET is to characterize and classify the returned samples, making this information available to the general research community who can then conduct more in-depth studies on the samples. A PET strives to minimize the impact their work has on the sample suite, which often limits the PET work to largely visual measurements and observations like optical microscopy. More modern techniques can also be utilized by future PET to nondestructively characterize astromaterials in a more rigorous way. Here we present our recent analyses of Apollo samples 14321 and 14305 by micro-CT and micro-XRF (respectively), assess the potential for discovery of "new" Apollo samples for scientific study, and evaluate the usefulness of these techniques in future PET efforts.

  16. Setting a limit on anthropogenic sources of atmospheric 81Kr through Atom Trap Trace Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zappala, J. C.; Bailey, K.; Jiang, W.

    In this study, we place a 2.5% limit on the anthropogenic contribution to the modern abundance of 81Kr/Kr in the atmosphere at the 90% confidence level. Due to its simple production and transport in the terrestrial environment, 81Kr (half-life = 230,000 years) is an ideal tracer for old water and ice with mean residence times in the range of 10^5–10^6 years. In recent years, 81Kr-dating has been made available to the earth science community thanks to the development of Atom Trap Trace Analysis (ATTA), a laser-based atom counting technique. Further upgrades and improvements to the ATTA technique now allow us to demonstrate 81Kr/Kr measurements with relative uncertainties of 1% and place this new limit on anthropogenic 81Kr. As a result of this limit, we have removed a potential systematic constraint for 81Kr-dating.

  17. Setting a limit on anthropogenic sources of atmospheric 81Kr through Atom Trap Trace Analysis

    DOE PAGES

    Zappala, J. C.; Bailey, K.; Jiang, W.; ...

    2017-02-09

    In this study, we place a 2.5% limit on the anthropogenic contribution to the modern abundance of 81Kr/Kr in the atmosphere at the 90% confidence level. Due to its simple production and transport in the terrestrial environment, 81Kr (half-life = 230,000 years) is an ideal tracer for old water and ice with mean residence times in the range of 10^5–10^6 years. In recent years, 81Kr-dating has been made available to the earth science community thanks to the development of Atom Trap Trace Analysis (ATTA), a laser-based atom counting technique. Further upgrades and improvements to the ATTA technique now allow us to demonstrate 81Kr/Kr measurements with relative uncertainties of 1% and place this new limit on anthropogenic 81Kr. As a result of this limit, we have removed a potential systematic constraint for 81Kr-dating.

  18. A vibrational spectroscopic and principal component analysis of triarylmethane dyes by comparative laboratory and portable instrumentation

    NASA Astrophysics Data System (ADS)

    Doherty, B.; Vagnini, M.; Dufourmantelle, K.; Sgamellotti, A.; Brunetti, B.; Miliani, C.

    2014-03-01

    This contribution examines the utility of vibrational spectroscopy by bench and portable Raman/surface enhanced Raman and infrared methods for the investigation of ten early triarylmethane dye powder references and dye solutions applied on paper. The complementary information afforded by the techniques is shown to play a key role in the identification of specific spectral marker ranges to distinguish early synthetic dyes of art-historical interest through the elaboration of an in-house database of modern organic dyes. Chemometric analysis has permitted a separation of data by the discrimination of di-phenyl-naphthalenes and triphenylmethanes (di-amino and tri-amino derivatives). This work serves as a prelude to the validation of a non-invasive working method for in situ characterization of these synthetic dyes through a careful comparison of respective strengths and limitations of each portable technique.
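
    The chemometric separation mentioned above typically starts from a principal component analysis of the spectra matrix (rows = samples, columns = spectral channels), with the scores used to discriminate dye classes. The sketch below runs PCA on two synthetic spectral families; band positions and noise are invented, not the measured dye spectra.

    ```python
    # Sketch of the chemometric step: PCA of a spectra matrix, with the first
    # principal component separating two synthetic dye families.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(4)
    wavenumbers = np.linspace(400, 1800, 350)

    def band(center, width):
        # Gaussian band on the synthetic wavenumber axis.
        return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

    # Two invented dye families with different marker bands, 10 spectra each.
    family_a = np.array([band(1170, 15) + 0.6 * band(1580, 20) + rng.normal(0, 0.02, wavenumbers.size)
                         for _ in range(10)])
    family_b = np.array([band(1350, 15) + 0.8 * band(1615, 20) + rng.normal(0, 0.02, wavenumbers.size)
                         for _ in range(10)])
    spectra = np.vstack([family_a, family_b])

    scores = PCA(n_components=2).fit_transform(spectra)
    print(np.round(scores[:, 0], 2))    # PC1 scores: the two families separate in sign
    ```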

  19. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-08

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with.

  20. Principles of Metamorphic Petrology

    NASA Astrophysics Data System (ADS)

    Williams, Michael L.

    2009-05-01

    The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.

  1. Application of modern autoradiography to nuclear forensic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsons-Davis, Tashi; Knight, Kim; Fitzgerald, Marc

    Modern autoradiography techniques based on phosphorimaging technology using image plates (IPs) and digital scanning can identify heterogeneities in activity distributions and reveal material properties, serving to inform subsequent analyses. Here, we have adopted these advantages for applications in nuclear forensics, the technical analysis of radioactive or nuclear materials found outside of legal control to provide data related to provenance, production history, and trafficking route for the materials. IP autoradiography is a relatively simple, non-destructive method for sample characterization that records an image reflecting the relative intensity of alpha and beta emissions from a two-dimensional surface. Such data are complementary to information gathered from radiochemical characterization via bulk counting techniques, and can guide the application of other spatially resolved techniques such as scanning electron microscopy (SEM) and secondary ion mass spectrometry (SIMS). IP autoradiography can image large 2-dimensional areas (up to 20 × 40 cm), with relatively low detection limits for actinides and other radioactive nuclides, and sensitivity to a wide dynamic range (10^5) of activity density in a single image. Distributions of radioactivity in nuclear materials can be generated with a spatial resolution of approximately 50 μm using IP autoradiography and digital scanning. While the finest grain silver halide films still provide the best possible resolution (down to ~10 μm), IP autoradiography has distinct practical advantages such as shorter exposure times, no chemical post-processing, reusability, rapid plate scanning, and automated image digitization. Sample preparation requirements are minimal, and the analytical method does not consume or alter the sample. These advantages make IP autoradiography ideal for routine screening of nuclear materials, and for the identification of areas of interest for subsequent micro-characterization methods. In this article we present a summary of our setup, as modified for nuclear forensic sample analysis and related research, and provide examples of data from select samples from the nuclear fuel cycle and historical nuclear test debris.

  2. Application of modern autoradiography to nuclear forensic analysis

    DOE PAGES

    Parsons-Davis, Tashi; Knight, Kim; Fitzgerald, Marc; ...

    2018-05-20

    Modern autoradiography techniques based on phosphorimaging technology using image plates (IPs) and digital scanning can identify heterogeneities in activity distributions and reveal material properties, serving to inform subsequent analyses. Here, we have adopted these advantages for applications in nuclear forensics, the technical analysis of radioactive or nuclear materials found outside of legal control to provide data related to provenance, production history, and trafficking route for the materials. IP autoradiography is a relatively simple, non-destructive method for sample characterization that records an image reflecting the relative intensity of alpha and beta emissions from a two-dimensional surface. Such data are complementary to information gathered from radiochemical characterization via bulk counting techniques, and can guide the application of other spatially resolved techniques such as scanning electron microscopy (SEM) and secondary ion mass spectrometry (SIMS). IP autoradiography can image large 2-dimensional areas (up to 20 × 40 cm), with relatively low detection limits for actinides and other radioactive nuclides, and sensitivity to a wide dynamic range (10^5) of activity density in a single image. Distributions of radioactivity in nuclear materials can be generated with a spatial resolution of approximately 50 μm using IP autoradiography and digital scanning. While the finest grain silver halide films still provide the best possible resolution (down to ~10 μm), IP autoradiography has distinct practical advantages such as shorter exposure times, no chemical post-processing, reusability, rapid plate scanning, and automated image digitization. Sample preparation requirements are minimal, and the analytical method does not consume or alter the sample. These advantages make IP autoradiography ideal for routine screening of nuclear materials, and for the identification of areas of interest for subsequent micro-characterization methods. In this article we present a summary of our setup, as modified for nuclear forensic sample analysis and related research, and provide examples of data from select samples from the nuclear fuel cycle and historical nuclear test debris.

  3. Application of modern autoradiography to nuclear forensic analysis.

    PubMed

    Parsons-Davis, Tashi; Knight, Kim; Fitzgerald, Marc; Stone, Gary; Caldeira, Lee; Ramon, Christina; Kristo, Michael

    2018-05-01

    Modern autoradiography techniques based on phosphorimaging technology using image plates (IPs) and digital scanning can identify heterogeneities in activity distributions and reveal material properties, serving to inform subsequent analyses. Here, we have adopted these advantages for applications in nuclear forensics, the technical analysis of radioactive or nuclear materials found outside of legal control to provide data related to provenance, production history, and trafficking route for the materials. IP autoradiography is a relatively simple, non-destructive method for sample characterization that records an image reflecting the relative intensity of alpha and beta emissions from a two-dimensional surface. Such data are complementary to information gathered from radiochemical characterization via bulk counting techniques, and can guide the application of other spatially resolved techniques such as scanning electron microscopy (SEM) and secondary ion mass spectrometry (SIMS). IP autoradiography can image large 2-dimensional areas (up to 20 × 40 cm), with relatively low detection limits for actinides and other radioactive nuclides, and sensitivity to a wide dynamic range (10^5) of activity density in a single image. Distributions of radioactivity in nuclear materials can be generated with a spatial resolution of approximately 50 μm using IP autoradiography and digital scanning. While the finest grain silver halide films still provide the best possible resolution (down to ∼10 μm), IP autoradiography has distinct practical advantages such as shorter exposure times, no chemical post-processing, reusability, rapid plate scanning, and automated image digitization. Sample preparation requirements are minimal, and the analytical method does not consume or alter the sample. These advantages make IP autoradiography ideal for routine screening of nuclear materials, and for the identification of areas of interest for subsequent micro-characterization methods. In this paper we present a summary of our setup, as modified for nuclear forensic sample analysis and related research, and provide examples of data from select samples from the nuclear fuel cycle and historical nuclear test debris. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Crossroads: Modern Interactive Intersections and Accessible Pedestrian Signals

    ERIC Educational Resources Information Center

    Barlow, Janet M.; Franck, Lukas

    2005-01-01

    This article discusses the interactive nature of modern actuated intersections and the effect of that interface on pedestrians who are visually impaired. Information is provided about accessible pedestrian signals (APS), the role of blindness professionals in APS installation decisions, and techniques for crossing streets with APS.

  5. The research and realization of digital management platform for ultra-precision optical elements within life-cycle

    NASA Astrophysics Data System (ADS)

    Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun

    2014-08-01

    To address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes that adopts modern manufacturing, information and management techniques. The architecture and system integration of the digital management platform are discussed. The platform enables information sharing and interaction across the information flow, control flow and value stream over the whole life cycle, from user requirements to end of life, and it also enhances process control, collaborative research and the service capability for ultra-precision optical elements.

  6. Pollen assemblages as paleoenvironmental proxies in the Florida Everglades

    USGS Publications Warehouse

    Willard, D.A.; Weimer, L.M.; Riegel, W.L.

    2001-01-01

    Analysis of 170 pollen assemblages from surface samples in eight vegetation types in the Florida Everglades indicates that these wetland sub-environments are distinguishable from the pollen record and that they are useful proxies for hydrologic and edaphic parameters. Vegetation types sampled include sawgrass marshes, cattail marshes, sloughs with floating aquatics, wet prairies, brackish marshes, tree islands, cypress swamps, and mangrove forests. The distribution of these vegetation types is controlled by specific environmental parameters, such as hydrologic regime, nutrient availability, disturbance level, substrate type, and salinity; ecotones between vegetation types may be sharp. Using R-mode cluster analysis of pollen data, we identified diagnostic species groupings; Q-mode cluster analysis was used to differentiate pollen signatures of each vegetation type. Cluster analysis and the modern analog technique were applied to interpret vegetational and environmental trends over the last two millennia at a site in Water Conservation Area 3A. The results show that close modern analogs exist for assemblages in the core and indicate past hydrologic changes at the site, correlated with both climatic and land-use changes. The ability to differentiate marshes with different hydrologic and edaphic requirements using the pollen record facilitates assessment of relative impacts of climatic and anthropogenic changes on this wetland ecosystem on smaller spatial and temporal scales than previously were possible. © 2001 Elsevier Science B.V.
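    As a rough illustration of the modern analog technique mentioned above, the sketch below computes the squared chord distance, a dissimilarity measure commonly used with pollen percentage data, between a downcore assemblage and a few modern surface samples, then picks the closest analog. The vegetation types and percentages are invented for illustration and are not taken from the study.

```python
import numpy as np

def squared_chord_distance(p, q):
    """Dissimilarity commonly used for pollen percentage (proportion) data."""
    return np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

# Hypothetical pollen proportions (rows sum to 1) for a few modern vegetation types
modern = {
    "sawgrass marsh": np.array([0.55, 0.20, 0.15, 0.10]),
    "cattail marsh":  np.array([0.10, 0.60, 0.20, 0.10]),
    "mangrove":       np.array([0.05, 0.10, 0.15, 0.70]),
}
fossil_sample = np.array([0.50, 0.25, 0.15, 0.10])   # downcore assemblage

distances = {name: squared_chord_distance(fossil_sample, spectrum)
             for name, spectrum in modern.items()}
best = min(distances, key=distances.get)
print(distances)
print("closest modern analog:", best)
```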

  7. Brain tumor classification using AFM in combination with data mining techniques.

    PubMed

    Huml, Marlene; Silye, René; Zauner, Gerald; Hutterer, Stephan; Schilcher, Kurt

    2013-01-01

    Although classification of astrocytic tumors is standardized by the WHO grading system, which is mainly based on microscopy-derived, histomorphological features, there is great interobserver variability. The main causes are thought to be the complexity of morphological details varying from tumor to tumor and from patient to patient, variations in the technical histopathological procedures like staining protocols, and finally the individual experience of the diagnosing pathologist. Thus, to raise astrocytoma grading to a more objective standard, this paper proposes a methodology based on atomic force microscopy (AFM)-derived images made from histopathological samples in combination with data mining techniques. By comparing AFM images with corresponding light microscopy images of the same area, the progressive formation of cavities due to cell necrosis was identified as a typical morphological marker for a computer-assisted analysis. Using genetic programming as a tool for feature analysis, a best model was created that achieved 94.74% classification accuracy in distinguishing grade II tumors from grade IV ones. Combined with modern image analysis techniques, AFM may become an important tool in astrocytic tumor diagnosis. In this way, patients suffering from grade II tumors, which carry a lower risk of malignant transformation, can be identified unambiguously and would benefit from early adjuvant therapies.

  8. Adjoint-Based Aerodynamic Design of Complex Aerospace Configurations

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.

    2016-01-01

    An overview of twenty years of adjoint-based aerodynamic design research at NASA Langley Research Center is presented. Adjoint-based algorithms provide a powerful tool for efficient sensitivity analysis of complex large-scale computational fluid dynamics (CFD) simulations. Unlike alternative approaches for which computational expense generally scales with the number of design parameters, adjoint techniques yield sensitivity derivatives of a simulation output with respect to all input parameters at the cost of a single additional simulation. With modern large-scale CFD applications often requiring millions of compute hours for a single analysis, the efficiency afforded by adjoint methods is critical in realizing a computationally tractable design optimization capability for such applications.
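    To make the cost argument above concrete, the following sketch (a toy linear model, not NASA's ASAT or FUN3D implementation) compares the adjoint route to the gradient with a finite-difference check: one extra linear solve yields the sensitivity of the output with respect to all design parameters at once, whereas finite differences would need one additional solve per parameter.

```python
import numpy as np

def solve_state(p, b):
    """State equation A(p) u = b, with A = diag(p) plus a weak uniform coupling."""
    A = np.diag(p) + 0.1 * np.ones((len(p), len(p)))
    return A, np.linalg.solve(A, b)

rng = np.random.default_rng(0)
n = 50                         # number of design parameters
p = 2.0 + rng.random(n)
b = rng.random(n)
c = rng.random(n)              # output functional J(p) = c^T u(p)

A, u = solve_state(p, b)

# Adjoint solve: a single extra linear system gives the full gradient
lam = np.linalg.solve(A.T, c)
grad_adjoint = -lam * u        # dA/dp_i = e_i e_i^T, so dJ/dp_i = -lam_i * u_i

# Finite-difference check on one parameter (the full gradient would need n solves)
eps, i = 1e-6, 7
p_pert = p.copy()
p_pert[i] += eps
_, u_pert = solve_state(p_pert, b)
fd = (c @ u_pert - c @ u) / eps
print(grad_adjoint[i], fd)     # the two values should agree to several digits
```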

  9. [The choice of color in fixed prosthetics: what steps should be followed for a reliable outcome?].

    PubMed

    Vanheusden, Alain; Mainjot, Amélie

    2004-01-01

    The creation of a perfectly-matched esthetic fixed restoration is undeniably one of the most difficult challenges in modern dentistry. The final outcome depends on several essential steps: the use of an appropriate light source, the accurate analysis and correct evaluation of patient's teeth parameters (morphology, colour, surface texture,...), the clear and precise transmission of this data to the laboratory and the sound interpretation of it by a dental technician who masters esthetic prosthetic techniques perfectly. The purpose of this paper was to give a reproducible clinical method to the practitioner in order to achieve a reliable dental colorimetric analysis.

  10. APPLICATION OF SPATIAL INFORMATION TECHNOLOGY TO PETROLEUM RESOURCE ASSESSMENT ANALYSIS.

    USGS Publications Warehouse

    Miller, Betty M.; Domaratz, Michael A.

    1984-01-01

    Petroleum resource assessment procedures require the analysis of a large volume of spatial data. The US Geological Survey (USGS) has developed and applied spatial information handling procedures and digital cartographic techniques to a recent study involving the assessment of oil and gas resource potential for 74 million acres of designated and proposed wilderness lands in the western United States. The part of the study which dealt with the application of spatial information technology to petroleum resource assessment procedures is reviewed. A method was designed to expedite the gathering, integrating, managing, manipulating and plotting of spatial data from multiple data sources that are essential in modern resource assessment procedures.

  11. Design and Fabrication of DebriSat - A Representative LEO Satellite for Improvements to Standard Satellite Breakup Models

    NASA Technical Reports Server (NTRS)

    Clark, S.; Dietrich, A.; Fitz-Coy, N.; Weremeyer, M.; Liou, J.-C.

    2012-01-01

    This paper discusses the design and fabrication of DebriSat, a 50 kg satellite developed to be representative of a modern low Earth orbit satellite in terms of its components, materials used, and fabrication procedures. DebriSat will be the target of a future hypervelocity impact experiment to determine the physical characteristics of debris generated after an on-orbit collision of a modern LEO satellite. The major ground-based satellite impact experiment used by DoD and NASA in their development of satellite breakup models was SOCIT, conducted in 1992. The target used for that experiment was a Navy transit satellite (40 cm, 35 kg) fabricated in the 1960s. Modern satellites are very different in materials and construction techniques from those built 40 years ago. Therefore, there is a need to conduct a similar experiment using a modern target satellite to improve the fidelity of the satellite breakup models. To ensure that DebriSat is truly representative of typical LEO missions, a comprehensive study of historical LEO satellite designs and missions within the past 15 years for satellites ranging from 1 kg to 5000 kg was conducted. This study identified modern trends in hardware, material, and construction practices utilized in recent LEO missions. Although DebriSat is an engineering model, specific attention is placed on the quality, type, and quantity of the materials used in its fabrication to ensure the integrity of the outcome. With the exception of software, all other aspects of the satellite's design, fabrication, and assembly integration and testing will be as rigorous as that of an actual flight vehicle. For example, to simulate survivability of launch loads, DebriSat will be subjected to a vibration test. In addition, the satellite will undergo thermal vacuum tests to verify that the components and overall systems meet typical environmental standards. Proper assembly and integration techniques will involve comprehensive joint analysis, including the precise torqueing of fasteners and thread locking. Finally, the implementation of process documentation and verification procedures is discussed to provide a comprehensive overview of the design and fabrication of this representative LEO satellite.

  12. The historic surface ozone record, 1896-1975, and its relation to modern measurements

    NASA Astrophysics Data System (ADS)

    Galbally, I. E.; Tarasick, D. W.; Stähelin, J.; Wallington, T. J.; Steinbacher, M.; Schultz, M.; Cooper, O. R.

    2017-12-01

    Tropospheric ozone is a greenhouse gas, a key component of atmospheric chemistry, and is detrimental to human health and plant productivity. The historic surface ozone record 1896-1975 has been constructed from measurements selected for (a) instrumentation whose ozone response can be traced to modern tropospheric ozone measurement standards, (b) samples taken when there is low probability of chemical interference and (c) sampling locations, heights and times when atmospheric mixing will minimise vertical gradients of ozone in the planetary boundary layer above and around the measurement location. Early measurements with the Schönbein filter paper technique cannot be related to modern methods with any degree of confidence. The potassium iodide-arsenite technique used at Montsouris for 1876-1910 is valid for measuring ozone; however, due to the presence of the interfering gases sulfur dioxide, ammonia and nitrogen oxides, the measured ozone concentrations are not representative of the regional atmosphere. The use of these data sets for trend analyses is not recommended. In total, 58 acceptable sets of measurements are currently identified, commencing in Europe in 1896, Greenland in 1932 and globally by the late 1950s. Between 1896 and 1944 there were 21 studies (median duration 5 days) with a median mole fraction of 23 nmol mol⁻¹ (range of study averages 15-62 nmol mol⁻¹). Between 1950 and 1975 there were 37 studies (median duration approx. 21 months) with a median mole fraction of 22 nmol mol⁻¹ (range of study averages 13-49 nmol mol⁻¹), all measured under conditions likely to give ozone mole fractions similar to those in the planetary boundary layer. These time series are matched with modern measurements from the Tropospheric Ozone Assessment Report (TOAR) Ozone Database and used to examine changes between the historic and modern observations. These historic ozone levels are higher than previously accepted for surface ozone in the late 19th and early 20th centuries. This historic surface ozone analysis provides a new test for historical reconstructions by Climate-Chemistry models.

  13. Applications of SLR

    NASA Technical Reports Server (NTRS)

    Schutz, Bob E.

    1993-01-01

    Satellite Laser Ranging (SLR) has a rich history of development which began in the 1960s with 10 meter-level first generation systems. These systems evolved with order of magnitude improvements to the systems that now produce several millimeter single shot range precisions. What began, in part, as an interesting application of the new laser technology has become an essential component of modern, precision space geodesy, which in turn enables contributions to a variety of science areas. Modern space geodesy is the beneficiary of technological developments which have enabled precision geodetic measurements. Aside from SLR and its closely related technique, Lunar Laser Ranging (LLR), Very Long Baseline Interferometry (VLBI) has made prominent science contributions also. In recent years, the Global Positioning System (GPS) has demonstrated a rapidly growing popularity as the result of demonstrated low cost with high precision instrumentation. Other modern techniques such as DORIS have demonstrated the ability to make significant science contributions; furthermore, PRARE can be expected to contribute in its own right. An appropriate question is: 'Why should several techniques be financially supported?' While there are several answers, I offer the opinion that, in consideration of the broad science areas that are the beneficiaries of space geodesy, no single technique can meet all the requirements and/or expectations of the science areas in which space geodesy contributes or has the potential for contributing. The more well-known science areas include plate tectonics, earthquake processes, Earth rotation/orientation, gravity (static and temporal), ocean circulation, and land and ice topography, to name a few applications. It is unfortunate that the modern space geodesy techniques are often viewed as competitive, but this view is usually encouraged by funding competition, especially in an era of growing needs but diminishing budgets. The techniques are, for the most part, complementary and the ability to reduce the data to geodetic parameters from several techniques promotes confidence in the geophysical interpretations. In the following sections, the current SLR applications are reviewed in the context of the other techniques. The strengths and limitations of SLR are reviewed and speculation about future prospects is offered.

  14. Geometric morphometrics in primatology: craniofacial variation in Homo sapiens and Pan troglodytes.

    PubMed

    Lynch, J M; Wood, C G; Luboga, S A

    1996-01-01

    Traditionally, morphometric studies have relied on statistical analysis of distances, angles or ratios to investigate morphometric variation among taxa. Recently, geometric techniques have been developed for the direct analysis of landmark data. In this paper, we offer a summary (with examples) of three of these newer techniques, namely shape coordinate, thin-plate spline and relative warp analyses. Shape coordinate analysis detected significant craniofacial variation between 4 modern human populations, with African and Australian Aboriginal specimens being relatively prognathous compared with their Eurasian counterparts. In addition, the Australian specimens exhibited greater basicranial flexion than all other samples. The observed relationships between size and craniofacial shape were weak. The decomposition of shape variation into affine and non-affine components is illustrated via a thin-plate spline analysis of Homo and Pan cranial landmarks. We note differences between Homo and Pan in the degree of prognathism and basicranial flexion and the position and orientation of the foramen magnum. We compare these results with previous studies of these features in higher primates and discuss the utility of geometric morphometrics as a tool in primatology and physical anthropology. We conclude that many studies of morphological variation, both within and between taxa, would benefit from the graphical nature of these techniques.
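    As a minimal illustration of the shape-coordinate idea described above, the sketch below superimposes two hypothetical 2D landmark configurations with an ordinary Procrustes fit (removing translation, scale and rotation) and reports the residual shape difference; the landmark coordinates are invented for demonstration and are not taken from the study.

```python
import numpy as np
from scipy.spatial import procrustes

# Hypothetical craniofacial landmarks (x, y) for two specimens
specimen_a = np.array([[0.0, 0.0], [1.0, 0.1], [1.9, 1.0], [1.0, 2.0], [0.1, 1.1]])
specimen_b = np.array([[0.2, 0.1], [1.3, 0.0], [2.4, 1.2], [1.1, 2.3], [0.0, 1.4]])

# Procrustes superimposition removes differences in position, size and orientation,
# leaving only shape; `disparity` is the sum of squared residuals between the fits.
mtx_a, mtx_b, disparity = procrustes(specimen_a, specimen_b)
print("Procrustes (shape) disparity:", disparity)
print("aligned landmark residuals:\n", mtx_b - mtx_a)
```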

  15. A Simple Laser Microphone for Classroom Demonstration

    ERIC Educational Resources Information Center

    Moses, James M.; Trout, K. P.

    2006-01-01

    Communication through the modulation of electromagnetic radiation has become a foundational technique in modern technology. In this paper we discuss a modern day method of eavesdropping based upon the modulation of laser light reflected from a window pane. A simple and affordable classroom demonstration of a "laser microphone" is…

  16. Poster — Thur Eve — 74: Distributed, asynchronous, reactive dosimetric and outcomes analysis using DICOMautomaton

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.

    2014-08-15

    Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created — but currently existing — analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.

  17. Kinematic analysis of modern dance movement “stag jump” within the context of impact loads, injury to the locomotor system and its prevention

    PubMed Central

    Gorwa, Joanna; Dworak, Lechosław B.; Michnik, Robert; Jurkojć, Jacek

    2014-01-01

    Background This paper presents a case study of kinematic analysis of the modern dance movement known as the “stag jump”. Detailed analysis of the kinematic structure of this movement as performed by the dancers, accompanied by measurements of impact forces during landing, will allow the authors to determine, in subsequent model-based research phases, the forces acting in knee joints of the lower landing limb. Material/Methods Two professional modern dancers participated in the study: a male and a female. The study consisted in recording the values of ground reaction and body motion, and then determining and analyzing kinematic parameters of performed movements. Results The results of measurement of joint angles in the landing lower limb, pelvis, and foot position in relation to the ground, as well as the level of vertical components of ground reaction, provided insight into the loading response phase of the “stag jump”. The measurements and obtained results show differences between the man and woman in ground reactions and kinematic quantities. Conclusions The results obtained during the research may be used in the development and teaching of dancing movements. Training sessions, carried out in the biomechanical laboratory, with active participation of dancing teachers, could form a basis for a prevention model of injuries and physical overloads occurring within this occupational group. Primary differences in the “stag jump” performance technique probably result from the different educational path the man and the woman went through. PMID:24971626

  18. New horizons in mouse immunoinformatics: reliable in silico prediction of mouse class I histocompatibility major complex peptide binding affinity.

    PubMed

    Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R

    2004-11-21

    Quantitative structure-activity relationship (QSAR) analysis is a main cornerstone of modern informatic disciplines. Predictive computational models, based on QSAR technology, of peptide-major histocompatibility complex (MHC) binding affinity have now become a vital component of modern day computational immunovaccinology. Historically, such approaches have been built around semi-qualitative, classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D(b), H2-K(b) and H2-K(k). As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online ( http://www.jenner.ac.uk/MHCPred).

  19. Modern radiation therapy for primary cutaneous lymphomas: field and dose guidelines from the International Lymphoma Radiation Oncology Group.

    PubMed

    Specht, Lena; Dabaja, Bouthaina; Illidge, Tim; Wilson, Lynn D; Hoppe, Richard T

    2015-05-01

    Primary cutaneous lymphomas are a heterogeneous group of diseases. They often remain localized, and they generally have a more indolent course and a better prognosis than lymphomas in other locations. They are highly radiosensitive, and radiation therapy is an important part of the treatment, either as the sole treatment or as part of a multimodality approach. Radiation therapy of primary cutaneous lymphomas requires the use of special techniques that form the focus of these guidelines. The International Lymphoma Radiation Oncology Group has developed these guidelines after multinational meetings and analysis of available evidence. The guidelines represent an agreed consensus view of the International Lymphoma Radiation Oncology Group steering committee on the use of radiation therapy in primary cutaneous lymphomas in the modern era. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  20. The modern temperature-accelerated dynamics approach

    DOE PAGES

    Zamora, Richard J.; Uberuaga, Blas P.; Perez, Danny; ...

    2016-06-01

    Accelerated molecular dynamics (AMD) is a class of MD-based methods used to simulate atomistic systems in which the metastable state-to-state evolution is slow compared with thermal vibrations. Temperature-accelerated dynamics (TAD) is a particularly efficient AMD procedure in which the predicted evolution is hastened by elevating the temperature of the system and then recovering the correct state-to-state dynamics at the temperature of interest. TAD has been used to study various materials applications, often revealing surprising behavior beyond the reach of direct MD. This success has inspired several algorithmic performance enhancements, as well as the analysis of its mathematical framework. Recently, these enhancements have leveraged parallel programming techniques to enhance both the spatial and temporal scaling of the traditional approach. Here, we review the ongoing evolution of the modern TAD method and introduce the latest development: speculatively parallel TAD.

  1. A Comparative Analysis for Selection of Appropriate Mother Wavelet for Detection of Stationary Disturbances

    NASA Astrophysics Data System (ADS)

    Kamble, Saurabh Prakash; Thawkar, Shashank; Gaikwad, Vinayak G.; Kothari, D. P.

    2017-12-01

    Detection of disturbances is the first step toward mitigation. Power electronics plays a crucial role in the modern power system, making system operation efficient, but it also introduces stationary disturbances and adds impurities to the supply. This happens because of the non-linear loads used in present-day power systems, which inject disturbances such as harmonics, flicker and sag into the power grid. These impurities can damage equipment, so it is necessary to mitigate them very quickly. Digital signal processing techniques are therefore incorporated for detection. Signal processing techniques like the fast Fourier transform, the short-time Fourier transform and the wavelet transform are widely used for the detection of disturbances. Among these, the wavelet transform is widely used because of its better detection capabilities, but which mother wavelet to use for detection remains an open question. Depending upon their periodicity, disturbances are classified as stationary or non-stationary. This paper presents the importance of the selection of the mother wavelet for analyzing stationary disturbances using the discrete wavelet transform. Signals with stationary disturbances of various frequencies are generated using MATLAB. The analysis of these signals is carried out using various mother wavelets, such as Daubechies and bi-orthogonal wavelets, and the measured root mean square (RMS) value of the stationary disturbance is obtained. The measured value obtained by the discrete wavelet transform is compared with the exact RMS value of the frequency component, and the percentage differences are presented, which helps to select the optimum mother wavelet.
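    A minimal sketch of this comparison is shown below, using Python and PyWavelets rather than MATLAB: a 50 Hz signal carrying a 250 Hz stationary disturbance is decomposed with several candidate mother wavelets, the RMS of the disturbance is estimated from the detail band that contains it, and the percentage error against the exact RMS is reported. The signal parameters and wavelet choices are illustrative assumptions, not the paper's actual test cases.

```python
import numpy as np
import pywt

fs = 1600
t = np.arange(0, 1, 1 / fs)
fundamental = np.sin(2 * np.pi * 50 * t)          # 50 Hz supply component
disturbance = 0.2 * np.sin(2 * np.pi * 250 * t)   # stationary harmonic disturbance
signal = fundamental + disturbance
exact_rms = 0.2 / np.sqrt(2)                      # exact RMS of the disturbance

for wavelet in ["db4", "db8", "bior3.5"]:
    coeffs = pywt.wavedec(signal, wavelet, level=4)
    # Level-2 detail band spans fs/8..fs/4 = 200..400 Hz and holds the 250 Hz component
    d2 = coeffs[-2]
    measured_rms = np.sqrt(np.sum(d2 ** 2) / len(signal))
    error = 100 * abs(measured_rms - exact_rms) / exact_rms
    print(f"{wavelet:8s} RMS = {measured_rms:.4f}  error = {error:.2f}%")
```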

  2. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis

    PubMed Central

    Steele, Joe; Bastola, Dhundy

    2014-01-01

    Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
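    As a toy illustration of the word-frequency approach reviewed above, the sketch below counts overlapping k-mers (words) in two short sequences and evaluates the D2 statistic, i.e. the inner product of their word-count vectors; the sequences and word length are made up for demonstration.

```python
from collections import Counter

def kmer_counts(seq, k=3):
    """Count all overlapping k-mers (words) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2_statistic(seq_a, seq_b, k=3):
    """D2: sum over shared words of the product of their counts."""
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())

a = "ATGGCGTACGTTAGCATGGCGT"
b = "ATGGCGTTCGTTAGCATGGAGT"
print("D2(a, b) =", d2_statistic(a, b, k=3))
```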

  3. Adhesion, friction, wear, and lubrication research by modern surface science techniques.

    NASA Technical Reports Server (NTRS)

    Keller, D. V., Jr.

    1972-01-01

    The field of surface science has undergone intense revitalization with the introduction of low-energy electron diffraction, Auger electron spectroscopy, ellipsometry, and other surface analytical techniques which have been refined within the last decade. These developments have permitted submono- and monolayer structure analysis as well as chemical identification and quantitative analysis. The application of a number of these techniques to the solution of problems in the fields of friction, lubrication, and wear is examined in detail for the particular case of iron, and in general to illustrate how the accumulation of pure data will contribute toward the establishment of physicochemical concepts which are required to understand the mechanisms that are operational in friction systems. In the case of iron, LEED, Auger and microcontact studies have established that hydrogen and light-saturated organic vapors do not establish interfaces which prevent iron from welding, whereas oxygen and some oxygen and sulfur compounds do reduce welding as well as the coefficient of friction. Interpretation of these data suggests a mechanism of sulfur interaction in lubricating systems.

  4. Tracking flow of leukocytes in blood for drug analysis

    NASA Astrophysics Data System (ADS)

    Basharat, Arslan; Turner, Wesley; Stephens, Gillian; Badillo, Benjamin; Lumpkin, Rick; Andre, Patrick; Perera, Amitha

    2011-03-01

    Modern microscopy techniques allow imaging of circulating blood components under vascular flow conditions. The resulting video sequences provide unique insights into the behavior of blood cells within the vasculature and can be used as a method to monitor and quantitate the recruitment of inflammatory cells at sites of vascular injury/inflammation and potentially serve as a pharmacodynamic biomarker, helping screen new therapies and individualize dose and combinations of drugs. However, manual analysis of these video sequences is intractable, requiring hours per 400 second video clip. In this paper, we present an automated technique to analyze the behavior and recruitment of human leukocytes in whole blood under physiological conditions of shear through a simple multi-channel fluorescence microscope in real-time. This technique detects and tracks the recruitment of leukocytes to a bioactive surface coated on a flow chamber. Rolling cells (cells which partially bind to the bioactive matrix) are detected, counted, and have their velocity measured and graphed. The challenges here include: high cell density, appearance similarity, and low (1 Hz) frame rate. Our approach performs frame differencing based motion segmentation, track initialization and online tracking of individual leukocytes.
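    A bare-bones sketch of the frame-differencing detection step (not the authors' full tracking pipeline) is given below using OpenCV on a synthetic pair of frames: the difference image is thresholded and connected-component statistics yield candidate cell detections that a tracker could then link across frames. Frame contents, threshold and image size are arbitrary assumptions.

```python
import numpy as np
import cv2

rng = np.random.default_rng(0)
h, w = 120, 160
frame_prev = (20 * rng.random((h, w))).astype(np.uint8)   # dim background speckle
frame_curr = frame_prev.copy()
cv2.circle(frame_curr, (80, 60), 4, 255, -1)              # a bright cell that has appeared

# Frame differencing -> threshold -> connected components gives candidate detections
diff = cv2.absdiff(frame_curr, frame_prev)
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
n_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)

for i in range(1, n_labels):                               # label 0 is the background
    x, y = centroids[i]
    area = stats[i, cv2.CC_STAT_AREA]
    print(f"candidate cell at ({x:.1f}, {y:.1f}), area {area} px")
```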

  5. Progress of a Cross-Correlation Based Optical Strain Measurement Technique for Detecting Radial Growth on a Rotating Disk

    NASA Technical Reports Server (NTRS)

    Clem, Michelle M.; Abdul-Aziz, Ali; Woike, Mark R.; Fralick, Gustave C.

    2015-01-01

    The modern turbine engine operates in a harsh environment at high speeds and is repeatedly exposed to combined high mechanical and thermal loads. The cumulative effects of these external forces lead to high stresses and strains on the engine components, such as the rotating turbine disks, which may eventually lead to a catastrophic failure if left undetected. The operating environment makes it difficult to use conventional strain gauges; therefore, non-contact strain measurement techniques are of interest to NASA and the turbine engine community. This presentation describes one such approach: the use of cross-correlation analysis to measure strain experienced by the engine turbine disk with the goal of assessing potential faults and damage.
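    For intuition about how a cross-correlation peak yields a displacement (and hence a strain) estimate, the following sketch shifts a synthetic speckle-like intensity profile by a known number of pixels and recovers that shift from the location of the correlation peak; the signal, shift and gauge length are arbitrary assumptions, not the NASA measurement setup.

```python
import numpy as np
from scipy.signal import correlate

rng = np.random.default_rng(1)
n = 512
pattern = rng.random(n)
pattern -= pattern.mean()                # zero-mean speckle-like intensity profile
true_shift = 7                           # pixels of apparent radial growth between states
shifted = np.roll(pattern, true_shift)

xcorr = correlate(shifted, pattern, mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)    # lag of the correlation peak
print("estimated shift (px):", lag)      # -> 7

# A nominal strain estimate over the gauge length expressed in pixels
strain = lag / n
print("apparent strain:", strain)
```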

  6. Modern Observational Techniques for Comets

    NASA Technical Reports Server (NTRS)

    Brandt, J. C. (Editor); Greenberg, J. M. (Editor); Donn, B. (Editor); Rahe, J. (Editor)

    1981-01-01

    Techniques are discussed in the following areas: astrometry, photometry, infrared observations, radio observations, spectroscopy, imaging of coma and tail, image processing of observation. The determination of the chemical composition and physical structure of comets is highlighted.

  7. Metabolic reconstruction, constraint-based analysis and game theory to probe genome-scale metabolic networks.

    PubMed

    Ruppin, Eytan; Papin, Jason A; de Figueiredo, Luis F; Schuster, Stefan

    2010-08-01

    With the advent of modern omics technologies, it has become feasible to reconstruct (quasi-) whole-cell metabolic networks and characterize them in more and more detail. Computer simulations of the dynamic behavior of such networks are difficult due to a lack of kinetic data and to computational limitations. In contrast, network analysis based on appropriate constraints such as the steady-state condition (constraint-based analysis) is feasible and allows one to derive conclusions about the system's metabolic capabilities. Here, we review methods for the reconstruction of metabolic networks, modeling techniques such as flux balance analysis and elementary flux modes and current progress in their development and applications. Game-theoretical methods for studying metabolic networks are discussed as well. Copyright © 2010 Elsevier Ltd. All rights reserved.
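    As a minimal, hypothetical example of the constraint-based (flux balance) analysis discussed above, the sketch below maximizes a "biomass" flux in a three-reaction toy network subject to the steady-state constraint S v = 0 and simple flux bounds, using SciPy's linear programming solver; the network is invented purely for illustration.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions):
# R1: uptake -> A ; R2: A -> B ; R3: B -> biomass (objective)
S = np.array([
    [1, -1,  0],   # metabolite A
    [0,  1, -1],   # metabolite B
])
bounds = [(0, 5), (0, 10), (0, 10)]    # uptake capped at 5, other fluxes at 10
c = np.array([0, 0, -1])               # maximize v3 (biomass) -> minimize -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "biomass flux:", -res.fun)
```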

  8. Design of Launch Vehicle Flight Control Systems Using Ascent Vehicle Stability Analysis Tool

    NASA Technical Reports Server (NTRS)

    Jang, Jiann-Woei; Alaniz, Abran; Hall, Robert; Bedossian, Nazareth; Hall, Charles; Jackson, Mark

    2011-01-01

    A launch vehicle represents a complicated flex-body structural environment for flight control system design. The Ascent-vehicle Stability Analysis Tool (ASAT) is developed to address this complexity in the design and analysis of a launch vehicle. The design objective for the flight control system of a launch vehicle is to best follow guidance commands while robustly maintaining system stability. A constrained optimization approach takes advantage of modern computational control techniques to simultaneously design multiple control systems in compliance with required design specs. "Tower Clearance" and "Load Relief" designs have been achieved for liftoff and max dynamic pressure flight regions, respectively, in the presence of large wind disturbances. The robustness of the flight control system designs has been verified in frequency-domain Monte Carlo analysis using ASAT.

  9. Techniques of EMG signal analysis: detection, processing, classification and applications

    PubMed Central

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further highlight some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of the EMG signal and its analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
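    A minimal processing chain of the kind surveyed above (band-pass filtering, rectification and an RMS envelope as a crude activation detector) is sketched below on a surrogate EMG trace; the filter band, window length and signal model are illustrative assumptions rather than the paper's specific algorithms.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                     # sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
# Surrogate EMG: noise whose amplitude grows when the "muscle" contracts after t = 1 s
raw = rng.standard_normal(t.size) * (0.2 + 0.8 * (t > 1.0))

# 1) Band-pass 20-450 Hz to suppress motion artefact and high-frequency noise
b, a = butter(4, [20, 450], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, raw)

# 2) Full-wave rectification
rectified = np.abs(filtered)

# 3) Moving-RMS envelope (100 ms window) as a simple activation measure
win = int(0.1 * fs)
envelope = np.sqrt(np.convolve(filtered ** 2, np.ones(win) / win, mode="same"))
print("mean envelope before/after onset:",
      envelope[t < 1.0].mean(), envelope[t > 1.0].mean())
```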

  10. Adventures in Modern Time Series Analysis: From the Sun to the Crab Nebula and Beyond

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey

    2014-01-01

    With the generation of long, precise, and finely sampled time series, the Age of Digital Astronomy is uncovering and elucidating energetic dynamical processes throughout the Universe. Exploiting these opportunities requires effective data analysis techniques that rapidly and automatically implement advanced concepts. The Time Series Explorer, under development in collaboration with Tom Loredo, provides tools ranging from simple but optimal histograms to time and frequency domain analysis for arbitrary data modes with any time sampling. Much of this development owes its existence to Joe Bredekamp and the encouragement he provided over several decades. Sample results for solar chromospheric activity, gamma-ray activity in the Crab Nebula, active galactic nuclei and gamma-ray bursts will be displayed.

  11. Double pulse laser induced breakdown spectroscopy: A potential tool for the analysis of contaminants and macro/micronutrients in organic mineral fertilizers.

    PubMed

    Nicolodelli, Gustavo; Senesi, Giorgio Saverio; de Oliveira Perazzoli, Ivan Luiz; Marangoni, Bruno Spolon; De Melo Benites, Vinícius; Milori, Débora Marcondes Bastos Pereira

    2016-09-15

    Organic fertilizers are obtained from waste of plant or animal origin. One of the advantages of organic fertilizers is that, through composting, they recycle organic waste of urban and agricultural origin whose disposal would otherwise cause environmental impacts. The need for fast and accurate analysis of both major and minor/trace elements contained in new-generation organic mineral and inorganic fertilizers has promoted the application of modern analytical techniques. In particular, laser induced breakdown spectroscopy (LIBS) is proving to be a very promising, quick and practical technique to detect and measure contaminants and nutrients in fertilizers. Although this technique presents some limitations, such as low sensitivity compared to other spectroscopic techniques, the use of double pulse (DP) LIBS is an alternative to conventional single pulse (SP) LIBS. Macronutrients (Ca, Mg, K, P), micronutrients (Cu, Fe, Na, Mn, Zn) and a contaminant (Cr) in fertilizer were evaluated using LIBS in SP and DP configurations. A comparative study for both configurations was performed using optimized key parameters for improving LIBS performance. The limit of detection (LOD) values obtained by DP LIBS improved by up to seven times compared to SP LIBS. In general, the marked improvement obtained when using the DP system in the simultaneous LIBS quantitative determination for fertilizer analysis could be ascribed to the larger ablated mass of the sample. The results presented in this study show the promising potential of the DP LIBS technique for qualitative analysis of fertilizers, without requiring sample preparation with chemical reagents. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Dynamic speckle interferometry of microscopic processes in solid state and thin biological objects

    NASA Astrophysics Data System (ADS)

    Vladimirov, A. P.

    2015-08-01

    A modernized theory of dynamic speckle interferometry is considered. It is shown that the time-averaged radiation intensity contains parameters characterizing changes in the wave phase. An expression for the time autocorrelation function of the radiation intensity is also derived. It is shown that, as the averaging time tends to zero, the formulas reduce to the earlier expressions. The results of experiments on high-cycle material fatigue and on cell metabolism analysis conducted using the time-averaging technique are discussed. Good reproducibility of the results is demonstrated. It is shown that the upgraded technique allows analyzing the accumulation of fatigue damage, detecting the moment of crack initiation and determining its growth velocity under uninterrupted cyclic loading. It is also demonstrated that, in experiments with a cell monolayer, the technique allows studying metabolic changes both in an individual cell and in a group of cells.
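    To illustrate the kind of quantity the theory works with, the sketch below computes the normalized time autocorrelation function of a surrogate single-pixel speckle intensity trace and extracts a simple correlation time; the signal model and the 1/e criterion are arbitrary illustrative choices, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 100.0                               # frames per second
t = np.arange(0, 20, 1 / fs)
# Surrogate single-pixel speckle intensity: a slow oscillation plus camera noise
intensity = np.cos(2 * np.pi * 0.3 * t) + 0.1 * rng.standard_normal(t.size)

x = intensity - intensity.mean()
acf = np.correlate(x, x, mode="full")[x.size - 1:]
acf /= acf[0]                            # normalized time autocorrelation function

# A simple correlation time: first lag where the autocorrelation falls below 1/e
tau_idx = np.argmax(acf < 1 / np.e)
print("correlation time ~ %.2f s" % (tau_idx / fs))
```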

  13. Chemical and biological threat-agent detection using electrophoresis-based lab-on-a-chip devices.

    PubMed

    Borowsky, Joseph; Collins, Greg E

    2007-10-01

    The ability to separate complex mixtures of analytes has made capillary electrophoresis (CE) a powerful analytical tool since its modern configuration was first introduced over 25 years ago. The technique found new utility with its application to the microfluidics-based lab-on-a-chip platform (i.e., microchip), which resulted in ever smaller footprints, sample volumes, and analysis times. These features, coupled with the technique's potential for portability, have prompted recent interest in the development of novel analyzers for chemical and biological threat agents. This article will comment on three main areas of microchip CE as applied to the separation and detection of threat agents: detection techniques and their corresponding limits of detection, sampling protocol and preparation time, and system portability. These three areas typify the broad utility of lab-on-a-chip for meeting critical, present-day security needs, in addition to illustrating areas wherein advances are necessary.

  14. FEM Techniques for High Stress Detection in Accelerated Fatigue Simulation

    NASA Astrophysics Data System (ADS)

    Veltri, M.

    2016-09-01

    This work presents the theory and a numerical validation study in support of a novel method for a priori identification of fatigue critical regions, with the aim of accelerating durability design in large FEM problems. The investigation is placed in the context of modern full-body structural durability analysis, where a computationally intensive dynamic solution could be required to identify areas with potential for fatigue damage initiation. The early detection of fatigue critical areas can drive a simplification of the problem size, leading to appreciable improvement in solution time and model handling while allowing processing of the critical areas in higher detail. The proposed technique is applied to a real life industrial case in a comparative assessment with established practices. Synthetic damage prediction quantification and visualization techniques allow for a quick and efficient comparison between methods, outlining potential application benefits and boundaries.

  15. Developments in flow visualization methods for flight research

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.

    1990-01-01

    With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.

  16. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    NASA Astrophysics Data System (ADS)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role. It would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions. For processing these discontinuities in an image, a good edge enhancement technique is essential. The proposed work uses a new approach to edge enhancement based on hybridized smoothing filters, and introduces a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of these swarm intelligence techniques through the combination of hybrid filters generated by the algorithms for image edge enhancement.

  17. The 4DILAN Project (4TH Dimension in Landscape and Artifacts Analyses)

    NASA Astrophysics Data System (ADS)

    Chiabrando, F.; Naretto, M.; Sammartano, G.; Sambuelli, L.; Spanò, A.; Teppati Losè, L.

    2017-05-01

    The project is part of the wider application and subsequent spread of innovative digital technologies involving robotic systems. Modern society needs knowledge and investigation of the environment and of the related built landscape; therefore it increasingly requires new types of information. The goal can be achieved through the innovative integration of methods to set new analysis strategies for the knowledge of the built heritage and cultural landscape. The experimental cooperation between different disciplines and the related tools and techniques, which this work suggests for the analysis of the architectural heritage and the historical territory, are the following: - 3D metric survey techniques with active and passive sensors, the latter operating both in terrestrial mode and from an aerial point of view; in some circumstances, beyond the use of terrestrial LiDAR, even the newest mobile mapping systems using SLAM technology (simultaneous localization and mapping) have been tested. - Techniques of non-destructive investigation, such as geophysical analysis of the subsoil and built structures, in particular GPR (Ground Penetrating Radar) techniques. - Historic and stratigraphic surveys carried out primarily through the study and interpretation of documentary sources, cartography and historical iconography, closely related to the existing data or latent material. The experience gained through applying these investigation techniques to built spaces and man-made environments has been achieved with the aim of improving the ability to analyse the transformations and layers accrued over time that are no longer directly readable or interpretable from the material evidence.

  18. Instrumentation and fusion for congenital spine deformities.

    PubMed

    Hedequist, Daniel J

    2009-08-01

    A retrospective clinical review. To review the use of modern instrumentation of the spine for congenital spinal deformities. Spinal instrumentation has evolved since the advent of the Harrington rod. There is a paucity of literature discussing the use of modern spinal instrumentation in congenital spine deformity cases. This review focuses on modern instrumentation techniques for congenital scoliosis and kyphosis. A systematic review of the literature was performed to discuss spinal implant use for congenital deformities. Spinal instrumentation may be safely and effectively used in cases of congenital spinal deformity. Spinal surgeons taking care of children with congenital spine deformities need to be trained in all aspects of modern spinal instrumentation.

  19. How Farmers Learn about Environmental Issues: Reflections on a Sociobiographical Approach

    ERIC Educational Resources Information Center

    Vandenabeele, Joke; Wildemeersch, Danny

    2012-01-01

    At the time of this research, protests of farmers against new environmental policy measures received much media attention. News reports suggested that farmers' organizations rejected the idea that modern farming techniques cause damage to the environment and even tried to undermine attempts to reconcile the goals of modern agriculture with…

  20. Older Learning Engagement in the Modern City

    ERIC Educational Resources Information Center

    Lido, Catherine; Osborne, Michael; Livingston, Mark; Thakuriah, Piyushimita; Sila-Nowicka, Katarzyna

    2016-01-01

    This research employs novel techniques to examine older learners' journeys, educationally and physically, in order to gain a "three-dimensional" picture of lifelong learning in the modern urban context of Glasgow. The data offers preliminary analyses of an ongoing 1,500 household survey by the Urban Big Data Centre (UBDC). A sample of…

  1. Commodification of Ghana's Volta River: An Example of Ellul's Autonomy of Technique

    ERIC Educational Resources Information Center

    Agbemabiese, Lawrence; Byrne, John

    2005-01-01

    Jacques Ellul argued that modernity's nearly exclusive reliance on science and technology to design society would threaten human freedom. Of particular concern for Ellul was the prospect of the technical milieu overwhelming culture. The commodification of the Volta River in order to modernize Ghana illustrates the Ellulian dilemma of the autonomy…

  2. Modern Methodology and Techniques Aimed at Developing the Environmentally Responsible Personality

    ERIC Educational Resources Information Center

    Ponomarenko, Yelena V.; Zholdasbekova, Bibisara A.; Balabekov, Aidarhan T.; Kenzhebekova, Rabiga I.; Yessaliyev, Aidarbek A.; Larchenkova, Liudmila A.

    2016-01-01

    The article discusses the positive impact of an environmentally responsible individual as the social unit able to live in harmony with the natural world, himself/herself and other people. The purpose of the article is to provide theoretical substantiation of modern teaching methods. The authors considered the experience of philosophy, psychology,…

  3. [Significance of bone mineral density and modern cementing technique for in vitro cement penetration in total shoulder arthroplasty].

    PubMed

    Pape, G; Raiss, P; Kleinschmidt, K; Schuld, C; Mohr, G; Loew, M; Rickert, M

    2010-12-01

    Loosening of the glenoid component is one of the major causes of failure in total shoulder arthroplasty. Possible risk factors for loosening of cemented components include an eccentric loading, poor bone quality, inadequate cementing technique and insufficient cement penetration. The application of a modern cementing technique has become an established procedure in total hip arthroplasty. The goal of modern cementing techniques in general is to improve the cement penetration into the cancellous bone. Modern cementing techniques include the cement vacuum-mixing technique, retrograde filling of the cement under pressurisation and the use of a pulsatile lavage system. The main purpose of this study was to analyse cement penetration into the glenoid bone by using modern cement techniques and to investigate the relationship between the bone mineral density (BMD) and the cement penetration. Furthermore, we measured the temperature at the glenoid surface before and after jet-lavage of different patients during total shoulder arthroplasty. It is known that the surrounding temperature of the bone has an effect on the polymerisation of the cement. Data from this experiment provide the temperature setting for the in-vitro study. The glenoid surface temperature was measured in 10 patients with a hand-held non-contact temperature measurement device. The bone mineral density was measured by DEXA. Eight paired cadaver scapulae were allocated (n = 16). Each pair comprised two scapulae from one donor (matched-pair design). Two different glenoid components were used, one with pegs and the other with a keel. The glenoids for the in-vitro study were prepared with the bone compaction technique by the same surgeon in all cases. Pulsatile lavage was used to clean the glenoid of blood and bone fragments. Low viscosity bone cement was applied retrogradely into the glenoid by using a syringe. A constant pressure was applied with a modified force sensor impactor. Micro-computed tomography scans were applied to analyse the cement penetration into the cancellous bone. The mean temperature during the in-vivo arthroplasty of the glenoid was 29.4 °C (27.2-31 °C) before and 26.2 °C (25-27.5 °C) after jet-lavage. The overall peak BMD was 0.59 (range 0.33-0.99) g/cm². Mean cement penetration was 107.9 (range 67.6-142.3) mm² in the peg group and 128.3 (range 102.6-170.8) mm² in the keel group. The thickness of the cement layer varied from 0 to 2.1 mm in the pegged group and from 0 to 2.4 mm in the keeled group. A strong negative correlation between BMD and mean cement penetration was found for the peg group (r² = -0.834; p < 0.01) and for the keel group (r² = -0.727; p < 0.041). Micro-CT shows an inhomogeneous dispersion of the cement into the cancellous bone. Data from the in-vivo temperature measurement indicate that the temperature at the glenohumeral surface under operation differs from the body core temperature and should be considered in further in-vitro studies with human specimens. Bone mineral density is negatively correlated to cement penetration in the glenoid. The application of a modern cementing technique in the glenoid provides sufficient cement penetration although there is an inhomogeneous dispersion of the cement. The findings of this study should be considered in further discussions about cementing technique and cement penetration into the cancellous bone of the glenoid. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Cache Energy Optimization Techniques For Modern Processors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mittal, Sparsh

    2013-01-01

    Modern multicore processors are employing large last-level caches, for example Intel's E7-8800 processor uses 24MB L3 cache. Further, with each CMOS technology generation, leakage energy has been dramatically increasing and hence, leakage energy is expected to become a major source of energy dissipation, especially in last-level caches (LLCs). The conventional schemes of cache energy saving either aim at saving dynamic energy or are based on properties specific to first-level caches, and thus these schemes have limited utility for last-level caches. Further, several other techniques require offline profiling or per-application tuning and hence are not suitable for product systems. In this book, we present novel cache leakage energy saving schemes for single-core and multicore systems; desktop, QoS, real-time and server systems. Also, we present cache energy saving techniques for caches designed with both conventional SRAM devices and emerging non-volatile devices such as STT-RAM (spin-torque transfer RAM). We present software-controlled, hardware-assisted techniques which use dynamic cache reconfiguration to configure the cache to the most energy efficient configuration while keeping the performance loss bounded. To profile and test a large number of potential configurations, we utilize low-overhead, micro-architecture components, which can be easily integrated into modern processor chips. We adopt a system-wide approach to save energy to ensure that cache reconfiguration does not increase energy consumption of other components of the processor. We have compared our techniques with state-of-the-art techniques and have found that our techniques outperform them in terms of energy efficiency and other relevant metrics. The techniques presented in this book have important applications in improving energy-efficiency of higher-end embedded, desktop, QoS, real-time, server processors and multitasking systems. This book is intended to be a valuable guide for both newcomers and veterans in the field of cache power management. It will help graduate students, CAD tool developers and designers in understanding the need of energy efficiency in modern computing systems. Further, it will be useful for researchers in gaining insights into algorithms and techniques for micro-architectural and system-level energy optimization using dynamic cache reconfiguration. We sincerely believe that the "food for thought" presented in this book will inspire the readers to develop even better ideas for designing "green" processors of tomorrow.

  5. cudaMap: a GPU accelerated program for gene expression connectivity mapping.

    PubMed

    McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong

    2013-10-11

    Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.

  6. Avoid lost discoveries, because of violations of standard assumptions, by using modern robust statistical methods.

    PubMed

    Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence

    2013-03-01

    Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
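
    The contrast the authors draw can be seen in a few lines with made-up data: a robust location estimate and a rank-based correlation are far less disturbed by a handful of outliers than the ordinary mean and Pearson correlation. The example below is generic and does not reproduce the specific robust estimators used in the trial analysis.

```python
# Illustration with synthetic data: classic versus more outlier-resistant estimators.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=50)
y = 0.8 * x + rng.normal(scale=0.3, size=50)
x[:3] += 8
y[:3] -= 8                                             # inject a few gross outliers

print("mean:         ", np.mean(y))
print("20% trimmed:  ", stats.trim_mean(y, 0.2))       # robust location estimate
print("Pearson r:    ", stats.pearsonr(x, y)[0])       # pulled down by the outliers
print("Spearman rho: ", stats.spearmanr(x, y)[0])      # rank-based, less affected
```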

  7. Machine learning and social network analysis applied to Alzheimer's disease biomarkers.

    PubMed

    Di Deco, Javier; González, Ana M; Díaz, Julia; Mato, Virginia; García-Frank, Daniel; Álvarez-Linera, Juan; Frank, Ana; Hernández-Tamames, Juan A

    2013-01-01

    Because the number of deaths due to Alzheimer's disease is increasing, scientists have a strong interest in early-stage diagnosis of this disease. Alzheimer's patients show different kinds of brain alterations, such as morphological, biochemical and functional ones. Currently, magnetic resonance imaging techniques make it possible to obtain a huge number of biomarkers, and it is difficult to appraise which of them best explain how the pathology evolves as distinct from normal ageing. Machine Learning methods facilitate an efficient analysis of complex data and can be used to discover which biomarkers are more informative. Moreover, automatic models can learn from historical data to suggest the diagnosis of new patients. Social Network Analysis (SNA) views social relationships in terms of network theory, consisting of nodes and connections. The resulting graph-based structures are often very complex; there can be many kinds of connections between the nodes. SNA has emerged as a key technique in modern sociology. It has also gained a significant following in medicine, anthropology, biology, information science, etc., and has become a popular topic of speculation and study. This paper presents a review of machine learning and SNA techniques and then a new approach to analyzing magnetic resonance imaging biomarkers with these techniques, obtaining relevant relationships that can explain the different phenotypes in dementia, in particular, different stages of Alzheimer's disease.
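
    A minimal sketch of combining the two ideas is given below: pairwise correlations between (invented) MRI biomarkers are turned into a graph, and a network centrality measure flags the most connected biomarkers. It assumes the networkx package and does not reproduce the paper's actual pipeline.

```python
# Hedged sketch: biomarker correlation network plus a simple SNA-style measure.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
names = ["hippocampal_volume", "cortical_thickness", "ventricle_volume", "fornix_FA"]
data = rng.normal(size=(40, len(names)))           # stand-in: 40 subjects x 4 biomarkers
corr = np.corrcoef(data, rowvar=False)

G = nx.Graph()
for i in range(len(names)):
    for j in range(i + 1, len(names)):
        if abs(corr[i, j]) > 0.2:                  # keep only the stronger associations
            G.add_edge(names[i], names[j], weight=abs(corr[i, j]))

print(nx.degree_centrality(G))                     # most "connected" biomarkers
```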

  8. [Attitude of pregnant women towards labour--study of forms of preparation and preferences].

    PubMed

    Kosińska, Katarzyna; Krychowska, Alina; Wielgoś, Mirosław; Myszewska, Aleksandra; Przyboś, Andrzej

    2005-12-01

    The aim of this study was to assess the knowledge of alternative delivery techniques among pregnant women and their preferences concerning the course of labour. 275 women hospitalized in obstetric wards in Puck and the Ist Clinic in Warsaw were surveyed by questionnaire between July 2003 and February 2004. The mean age of the women was 26 +/- 4.9 years; 55.7% of them were nulliparous and 44.3% multiparous. Student's t-test was used for statistical analysis. The majority of the surveyed women knew of alternative positions during delivery and of possible analgesic techniques. 25.1% of the women attended a labour school. 81.2% wanted to give birth in the hospital, 10% at home and 8.8% in the delivery room. 51.1% preferred waterbirth and 22.5% the obstetric chair--most of them came from big cities, were better educated and attended a labour school. Almost half of all women were in favour of epidural anaesthesia during delivery. Caesarean section on request was supported by 13.8%. For 67.4%, the presence of loved ones during labour was important. Labour school has a significant influence on women's knowledge and their preferences. Waterbirth and other modern delivery techniques are very popular among better educated women from big cities, while those with lower education from small cities and villages prefer "classic" labour. Therefore, promotion of modern delivery methods and active participation in labour should be concentrated on these groups of women. Nowadays obstetric departments should ensure not only the safety of giving birth but also complete personal comfort for pregnant women.

  9. The effects of modern cementing techniques on the longevity of total hip arthroplasty.

    PubMed

    Poss, R; Brick, G W; Wright, R J; Roberts, D W; Sledge, C B

    1988-07-01

    Modern prosthetic design and cementing techniques have dramatically improved femoral component fixation. Compared to studies reported in the 1970s, the incidence of radiographic loosening for periods up to 5 years postoperatively has been reduced by at least a factor of 10. These results are the benchmark by which alternative forms of femoral component fixation must be measured. With the likelihood of increased longevity of total hip arthroplasty resulting from improved fixation, the problems of wear debris from the bearing surfaces and loss of bone stock with time will become preeminent.

  10. [Modern biology, imagery and forensic medicine: contributions and limitations in examination of skeletal remains].

    PubMed

    Lecomte, Dominique; Plu, Isabelle; Froment, Alain

    2012-06-01

    Forensic examination is often requested when skeletal remains are discovered. Detailed visual observation can provide much information, such as the human or animal origin, sex, age, stature, and ancestry, and approximate time since death. New three-dimensional imaging techniques can provide further information (osteometry, facial reconstruction). Bone chemistry, and particularly measurement of stable or unstable carbon and nitrogen isotopes, yields information on diet and time since death, respectively. Genetic analyses of ancient DNA are also developing rapidly. Although seldom used in a judicial context, these modern anthropologic techniques are nevertheless available for the most complex cases.

  11. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    PubMed

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  12. First GIS Analysis of Modern Stone Tools Used by Wild Chimpanzees (Pan troglodytes verus) in Bossou, Guinea, West Africa

    PubMed Central

    Arroyo, Adrian; Matsuzawa, Tetsuro; de la Torre, Ignacio

    2015-01-01

    Stone tool use by wild chimpanzees of West Africa offers a unique opportunity to explore the evolutionary roots of technology during human evolution. However, detailed analyses of chimpanzee stone artifacts are still lacking, thus precluding a comparison with the earliest archaeological record. This paper presents the first systematic study of stone tools used by wild chimpanzees to crack open nuts in Bossou (Guinea-Conakry), and applies pioneering analytical techniques to such artifacts. Automatic morphometric GIS classification enabled the creation of maps of use wear over the stone tools (anvils, hammers, and hammers/anvils), which were blind tested with GIS spatial analysis of damage patterns identified visually. Our analysis shows that chimpanzee stone tool use wear can be systematized and specific damage patterns discerned, allowing discrimination between active and passive pounders in lithic assemblages. In summary, our results demonstrate the heuristic potential of combined suites of GIS techniques for the analysis of battered artifacts, and have enabled the creation of a referential framework of analysis in which wild chimpanzee battered tools can for the first time be directly compared to the early archaeological record. PMID:25793642

  13. Physics and engineering aspects of cell and tissue imaging systems: microscopic devices and computer assisted diagnosis.

    PubMed

    Chen, Xiaodong; Ren, Liqiang; Zheng, Bin; Liu, Hong

    2013-01-01

    The conventional optical microscopes have been used widely in scientific research and in clinical practice. The modern digital microscopic devices combine the power of optical imaging and computerized analysis, archiving and communication techniques. It has a great potential in pathological examinations for improving the efficiency and accuracy of clinical diagnosis. This chapter reviews the basic optical principles of conventional microscopes, fluorescence microscopes and electron microscopes. The recent developments and future clinical applications of advanced digital microscopic imaging methods and computer assisted diagnosis schemes are also discussed.

  14. New Applications of Portable Raman Spectroscopy in Agri-Bio-Photonics

    NASA Astrophysics Data System (ADS)

    Voronine, Dmitri; Scully, Rob; Sanders, Virgil

    2014-03-01

    Modern optical techniques based on Raman spectroscopy are being used to monitor and analyze the health of cattle, crops and their natural environment. These optical tools are now available to perform fast, noninvasive analysis of live animals and plants in situ. We will report new applications of a portable handheld Raman spectrometer to the identification and taxonomy of plants. In addition, detection of organic food residues will be demonstrated. Advantages and limitations of current portable instruments will be discussed with suggestions for improved performance by applying enhanced Raman spectroscopic schemes.

  15. ANALOG: a program for estimating paleoclimate parameters using the method of modern analogs

    USGS Publications Warehouse

    Schweitzer, Peter N.

    1994-01-01

    Beginning in the 1970s with CLIMAP, paleoclimatologists have been trying to derive quantitative estimates of climatic parameters from the sedimentary record. In general the procedure is to observe the modern distribution of some component of surface sediment that depends on climate, find an empirical relationship between climate and the character of sediments, then extrapolate past climate by studying older sediments in the same way. Initially the empirical relationship between climate and components of the sediment was determined using a multiple regression technique (Imbrie and Kipp, 1971). In these studies sea-floor sediments were examined to determine the percentage of various species of planktonic foraminifera present in them. Supposing that the distribution of foraminiferal assemblages depended strongly on the extremes of annual sea-surface temperature (SST), the foraminiferal assemblages (refined through use of varimax factor analysis) were regressed against the average SST during the coolest and warmest months of the year. The result was a set of transfer functions, equations that could be used to estimate cool and warm SST from the faunal composition of a sediment sample. Assuming that the ecological preference of the species had remained constant throughout the last several hundred thousand years, these transfer functions could be used to estimate SSTs during much of the late Pleistocene. Hutson (1980) and Overpeck, Webb, and Prentice (1985) proposed an alternative approach to estimating paleoclimatic parameters. Their 'method of modern analogs' revolved not around the existence of a few climatically-sensitive faunal assemblages but rather on the expectation that similar climatic regimes should foster similar faunal and floral assemblages. From a large pool of modern samples, those few are selected whose faunal compositions are most similar to a given fossil sample. Paleoclimate estimates are derived using the climatic character of only the most similar modern samples, the modern analogs of the fossil sample. This report describes how to use the program ANALOG to carry out the method of modern analogs. It is assumed that the user has faunal census estimates of one or more fossil samples, and one or more sets of faunal data from modern samples. Furthermore, the user must understand the taxonomic categories represented in the data sets, and be able to recognize taxa that are or may be considered equivalent in the analysis. ANALOG provides the user with flexibility in input data format, output data content, and choice of distance measure, and allows the user to determine which taxa from each modern and fossil data file are compared. Most of the memory required by the program is allocated dynamically, so that, on systems that permit program segments to grow, the program consumes only as many system resources as are needed to accomplish its task.
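
    The core of the method described above can be sketched in a few lines: compute a dissimilarity between a fossil census and every modern census, keep the k most similar modern samples, and average their observed SSTs. The squared-chord distance, the toy census data and the choice of k below are illustrative only; ANALOG itself lets the user choose the distance measure and the taxa compared.

```python
# Minimal modern-analog estimate from toy foraminiferal census data.
import numpy as np

def squared_chord(p, q):
    return float(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def modern_analog_estimate(fossil, modern_fauna, modern_sst, k=3):
    d = np.array([squared_chord(fossil, m) for m in modern_fauna])
    nearest = np.argsort(d)[:k]                    # indices of the k most similar modern samples
    return float(np.mean(np.asarray(modern_sst)[nearest]))

# fractional abundances of four taxa in four modern core-top samples
modern_fauna = np.array([[0.50, 0.30, 0.10, 0.10],
                         [0.10, 0.20, 0.40, 0.30],
                         [0.40, 0.40, 0.10, 0.10],
                         [0.05, 0.15, 0.50, 0.30]])
modern_sst = [24.0, 8.0, 21.0, 6.0]                # warm-season SST at each modern site
fossil = np.array([0.45, 0.35, 0.10, 0.10])
print(modern_analog_estimate(fossil, modern_fauna, modern_sst, k=2))   # ~22.5 degrees C
```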

  16. Impact of Starting Point and Bicortical Purchase of C1 Lateral Mass Screws on Atlantoaxial Fusion: Meta-Analysis and Review of the Literature.

    PubMed

    Elliott, Robert E; Tanweer, Omar; Smith, Michael L; Frempong-Boadu, Anthony

    2015-08-01

    Structured review of literature and application of meta-analysis statistical techniques. Review published series describing clinical and radiographic outcomes of patients treated with C1 lateral mass screws (C1LMS), specifically analyzing the impact of starting point and bicortical purchase on successful atlantoaxial arthrodesis. Biomechanical studies suggest posterior arch screws and C1LMS with bicortical purchase are stronger than screws placed within the center of the lateral mass or those with unicortical purchase. Online databases were searched for English-language articles between 1994 and 2012 describing posterior atlantal instrumentation with C1LMS. Thirty-four studies describing 1247 patients having posterior atlantoaxial fusion with C1LMS met inclusion criteria. All studies provided class III evidence. Arthrodesis was quite successful regardless of technique (99.0% overall). Meta-analysis and multivariate regression analyses showed that neither posterior arch starting point nor bicortical screw purchase translated into a higher rate of successful arthrodesis. There were no complications from bicortical screw purchase. The Goel-Harms technique is a very safe and successful technique for achieving atlantoaxial fusion, regardless of minor variations in C1LMS technique. Although biomechanical studies suggest markedly increased rigidity of bicortical and posterior arch C1LMS, the significance of these findings may be minimal in the clinical setting of atlantoaxial fixation and fusion with modern techniques. The decision to use either technique must be made after careful review of the preoperative multiplanar computed tomography imaging, assessment of the unique anatomy of each patient, and the demands of the clinical scenario such as bone quality.

  17. Courses in Modern Physics for Non-science Majors, Future Science Teachers, and Biology Students

    NASA Astrophysics Data System (ADS)

    Zollman, Dean

    2001-03-01

    For the past 15 years Kansas State University has offered a course in modern physics for students who are not majoring in physics. This course carries a prerequisite of one physics course so that the students have a basic introduction in classical topics. The majors of students range from liberal arts to engineering. Future secondary science teachers whose first area of teaching is not physics can use the course as part of their study of science. The course has evolved from a lecture format to one which is highly interactive and uses a combination of hands-on activities, tutorials and visualizations, particularly the Visual Quantum Mechanics materials. Another course encourages biology students to continue their physics learning beyond the introductory course. Modern Miracle Medical Machines introduces the basic physics which underlie diagnosis techniques such as MRI and PET and laser surgical techniques. Additional information is available at http://www.phys.ksu.edu/perg/

  18. [Achievements and enlightenment of modern acupuncture therapy for stroke based on the neuroanatomy].

    PubMed

    Chen, Li-Fang; Fang, Jian-Qiao; Chen, Lu-Ni; Wang, Chao

    2014-04-01

    Up to now, in the treatment of stroke patients with acupuncture therapy, three main representative achievements have emerged, involving scalp acupuncture intervention, the "Xing Nao Kai Qiao" (restoring consciousness and inducing resuscitation) acupuncture technique and nape acupuncture therapy. Regarding their neurobiological mechanisms, scalp acupuncture therapy is based on the functional localization of the cerebral cortex, "Xing Nao Kai Qiao" acupuncture therapy is closely related to nerve stem stimulation, and nape acupuncture therapy obtains its therapeutic effects on the basis of the nerve innervation of the regional neck-nape area. In fact, the effects of these three acupuncture interventions are all closely associated with modern neuroanatomy. In the treatment of post-stroke spastic paralysis, cognitive disorder and depression with acupuncture therapy, modern neuroanatomical knowledge should be one of the key theoretical bases, and new therapeutic techniques should be explored and developed continuously.

  19. General aviation aircraft interior noise problem: Some suggested solutions

    NASA Technical Reports Server (NTRS)

    Roskam, J.; Navaneethan, R.

    1984-01-01

    Laboratory investigation of sound transmission through panels and the use of modern data analysis techniques applied to actual aircraft is used to determine methods to reduce general aviation interior noise. The experimental noise reduction characteristics of stiffened flat and curved panels with damping treatment are discussed. The experimental results of double-wall panels used in the general aviation industry are given. The effects of skin panel material, fiberglass insulation and trim panel material on the noise reduction characteristics of double-wall panels are investigated. With few modifications, the classical sound transmission theory can be used to design the interior noise control treatment of aircraft. Acoustic intensity and analysis procedures are included.

  20. On surface analysis and archaeometallurgy

    NASA Astrophysics Data System (ADS)

    Giumlia-Mair, Alessandra

    2005-09-01

    The tasks and problems which the study of ancient artefacts involves are manifold and almost as numerous as the different classes of materials and objects studied by modern specialists. This happens especially because the conservation of artefacts depends not only on their material and manufacturing techniques, but also very much on the environment and the type of soil in which they were deposited, sometimes for millennia. A number of archaeological materials and objects, dated to different periods, from the earliest use of metals to historical times, and also coming from very different geographical contexts in the whole Ancient World are discussed in this paper. The most common surface analysis methods will be evaluated and discussed in the framework of different applications.

  1. Rich Language Analysis for Counterterrorism

    NASA Astrophysics Data System (ADS)

    Guidère, Mathieu; Howard, Newton; Argamon, Shlomo

    Accurate and relevant intelligence is critical for effective counterterrorism. Too much irrelevant information is as bad or worse than not enough information. Modern computational tools promise to provide better search and summarization capabilities to help analysts filter and select relevant and key information. However, to do this task effectively, such tools must have access to levels of meaning beyond the literal. Terrorists operating in context-rich cultures like fundamentalist Islam use messages with multiple levels of interpretation, which are easily misunderstood by non-insiders. This chapter discusses several kinds of such “encryption” used by terrorists and insurgents in the Arabic language, and how knowledge of such methods can be used to enhance computational text analysis techniques for use in counterterrorism.

  2. Modernization of the Transonic Axial Compressor Test Rig

    DTIC Science & Technology

    2017-12-01

    This work presents the design and simulation process of modernizing the Naval Postgraduate School's transonic compressor test rig (TCR). Stiffness tests and modal analysis were conducted via Finite Element Analysis (FEA) software, and this analysis was used to design and fabricate the materials.

  3. Detection and monitoring of cardiotoxicity-what does modern cardiology offer?

    PubMed

    Jurcut, Ruxandra; Wildiers, Hans; Ganame, Javier; D'hooge, Jan; Paridaens, Robert; Voigt, Jens-Uwe

    2008-05-01

    With new anticancer therapies, many patients can have a long life expectancy. Treatment-related comorbidities become an issue for cancer survivors. Cardiac toxicity remains an important side effect of anticancer therapies. Myocardial dysfunction can become apparent early or long after the end of therapy and may be irreversible. Detection of cardiac injury is crucial since it may facilitate early therapeutic measures. Traditionally, chemotherapy-induced cardiotoxicity has been detected by measuring changes in left ventricular ejection fraction. This parameter is, however, insensitive to subtle changes in myocardial function as they occur in early cardiotoxicity. This review will discuss conventional and modern cardiologic approaches to assessing myocardial function. It will focus on Doppler myocardial imaging, a method which allows sensitive measurement of myocardial function parameters such as myocardial velocity, deformation (strain) and deformation rate (strain rate), and which has been shown to reliably detect abnormalities in both regional and global myocardial function at an early stage. Other newer echocardiographic function estimators are based on automated border detection algorithms and ultrasonic integrated backscatter analysis. A further technique to be discussed is dobutamine stress echocardiography. The use of new biomarkers like B-type natriuretic peptide and troponin, and less often used imaging techniques like magnetic resonance imaging and computed tomography, will also be mentioned.

  4. Modern developments for ground-based monitoring of fire behavior and effects

    Treesearch

    Colin C. Hardy; Robert Kremens; Matthew B. Dickinson

    2010-01-01

    Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics and instrumentation to create a suite of ground based diagnostics that can be used in laboratory (~ 1 m2), field scale...

  5. Developing the professional competence of future doctors in the instructional setting of higher medical educational institutions.

    PubMed

    Morokhovets, Halyna Yu; Lysanets, Yuliia V

    The main objective of higher medical education is the continuous professional improvement of physicians to meet the needs dictated by the modern world, at both undergraduate and postgraduate levels. In this respect, the system of higher medical education has undergone certain changes - from determining the range of professional competences to the adoption of new standards of education in medicine. The article aims to analyze the parameters of a doctor's professionalism in the context of a competence-based approach and to develop practical recommendations for the improvement of instruction techniques. The authors reviewed the psycho-pedagogical materials and summarized the acquired experience of teachers at higher medical institutions as to the development of instruction techniques in the modern educational process. The study is based on the results of testing via the technique developed by T.I. Ilyina. Analytical and biblio-semantic methods were used in the paper. It has been found that the training process at a medical educational institution should be focused on learning outcomes. The authors defined the quality parameters of doctors' training and suggested a model for developing the professional competence of medical students. This model explains the cause-and-effect relationships between the forms of instruction, teaching techniques and specific components of professional competence in future doctors. The paper provides practical recommendations on developing the core competencies which a qualified doctor should master. An analysis of existing interactive media in Ukraine and abroad has been performed. It has been found that teaching the core disciplines with the use of the latest technologies and interactive means keeps abreast of the times, while teaching social studies and humanities to medical students still involves certain difficulties.

  6. Conservation and Preservation of Archives.

    ERIC Educational Resources Information Center

    Kathpalia, Y. P.

    1982-01-01

    Presents concept of preventive conservation of archival records as a new science resulting from the use of modern techniques and chemicals. Various techniques for storage, proper environment, preventive de-acidification, fire prevention, restoration, and staff considerations are described. References are provided. (EJS)

  7. The modernization of cooking techniques in two rural Mayan communities of Yucatán: the case of lard frying.

    PubMed

    Arroyo, Pedro; Pardío-López, Jeanette; Loria, Alvar; Fernández-García, Victoria

    2010-01-01

    The objective of this article is to provide information on cooking techniques used by two rural communities of Yucatán. We used a 24-hour recall method with 275 participants consuming 763 dishes. Dishes were classified according to cooking technique: 205 were lard-fried (27%), 169 oil-fried (22%), and 389 boiled/grilled (51%). The smaller more secluded community (San Rafael) consumed more fried dishes than the larger community (Uci) (54% versus 45%) and used more lard-frying than Uci (65% versus 46%). The more extensive use of lard in the smaller community appears to be due to fewer modernizing influences such as the availability and use of industrialized vegetable oils. Copyright © Taylor & Francis Group, LLC

  8. Timing analysis by model checking

    NASA Technical Reports Server (NTRS)

    Naydich, Dimitri; Guaspari, David

    2000-01-01

    The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed-if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
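
    For context only (the paper's contribution is the model-checking approach, not this test), the classic rate-monotonic schedulability check mentioned above fits in a few lines. The task set below is hypothetical.

```python
# Liu & Layland utilization-bound test for rate-monotonic scheduling.
# Sufficient but not necessary: passing the test guarantees deadlines are met.
def rma_schedulable(tasks):
    """tasks: list of (execution_time, period) pairs."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization <= bound, utilization, bound

print(rma_schedulable([(1, 4), (2, 8), (1, 20)]))   # U = 0.55 <= ~0.78 -> schedulable
```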

  9. Wavelets in music analysis and synthesis: timbre analysis and perspectives

    NASA Astrophysics Data System (ADS)

    Alves Faria, Regis R.; Ruschioni, Ruggero A.; Zuffo, Joao A.

    1996-10-01

    Music is a vital element in the process of comprehending the world where we live and interact. Frequently, it exerts a subtle but expressive influence over a society's line of evolution. Analysis and synthesis of music and musical instruments has always been associated with the forefront technologies available at each period of human history, and there is no surprise in witnessing now the use of digital technologies and sophisticated mathematical tools supporting its development. Fourier techniques have been employed for years as a tool to analyze timbres' spectral characteristics, and to re-synthesize them from these extracted parameters. Recently many modern implementations, based on spectral modeling techniques, have been leading to the development of new generations of music synthesizers, capable of reproducing natural sounds with high fidelity, and producing novel timbres as well. Wavelets are a promising tool for the development of new generations of music synthesizers, given their advantages over Fourier techniques in representing non-periodic and transient signals with complex fine textures, as found in music. In this paper we propose and introduce the use of wavelets, addressing their prospects for musical applications. The central idea is to investigate the capacity of wavelets to analyze, extract features from and alter fine timbre components on a multiresolution time scale, so as to produce high-quality synthesized musical sounds.
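
    A minimal sketch of the kind of multiresolution decomposition discussed above is shown below, assuming the PyWavelets (pywt) package: a short synthetic "note" with a decaying tone and an attack transient is split into wavelet bands whose energies can be inspected or altered. This is illustrative only and is not the authors' implementation.

```python
# Discrete wavelet decomposition of a short synthetic note (assumes PyWavelets).
import numpy as np
import pywt

fs = 8000
t = np.arange(0, 0.5, 1 / fs)
note = np.sin(2 * np.pi * 440 * t) * np.exp(-4 * t)            # decaying 440 Hz tone
note[:40] += 0.5 * np.random.default_rng(0).normal(size=40)    # sharp attack transient

coeffs = pywt.wavedec(note, "db4", level=5)                    # approximation + 5 detail bands
for i, band in enumerate(coeffs):
    print(f"band {i}: {len(band)} coefficients, energy {np.sum(band**2):.2f}")
```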

  10. Modern analytical methods for the detection of food fraud and adulteration by food category.

    PubMed

    Hong, Eunyoung; Lee, Sang Yoo; Jeong, Jae Yun; Park, Jung Min; Kim, Byung Hee; Kwon, Kisung; Chun, Hyang Sook

    2017-09-01

    This review provides current information on the analytical methods used to identify food adulteration in the six most adulterated food categories: animal origin and seafood, oils and fats, beverages, spices and sweet foods (e.g. honey), grain-based food, and others (organic food and dietary supplements). The analytical techniques (both conventional and emerging) used to identify adulteration in these six food categories involve sensory, physicochemical, DNA-based, chromatographic and spectroscopic methods, and have been combined with chemometrics, making these techniques more convenient and effective for the analysis of a broad variety of food products. Despite recent advances, the need remains for suitably sensitive and widely applicable methodologies that encompass all the various aspects of food adulteration. © 2017 Society of Chemical Industry.

  11. Lesion Detection in CT Images Using Deep Learning Semantic Segmentation Technique

    NASA Astrophysics Data System (ADS)

    Kalinovsky, A.; Liauchuk, V.; Tarasau, A.

    2017-05-01

    In this paper, the problem of automatic detection of tuberculosis lesions in 3D lung CT images is considered as a benchmark for testing algorithms based on the modern concept of Deep Learning. For training and testing of the algorithms, a domestic dataset of 338 3D CT scans of tuberculosis patients with manually labelled lesions was used. Algorithms based on Deep Convolutional Networks were implemented and applied in three different ways: slice-wise lesion detection in 2D images using semantic segmentation, slice-wise lesion detection in 2D images using a sliding window technique, and straightforward detection of lesions via semantic segmentation in whole 3D CT scans. The algorithms demonstrate superior performance compared to algorithms based on conventional image analysis methods.
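
    The sliding-window variant mentioned above can be illustrated schematically: a fixed-size window is moved across a 2D slice and each patch is scored, with the highest-scoring patches flagged for review. The stand-in scorer below is just the mean intensity of a random array; a real system would apply a trained convolutional network to genuine CT data.

```python
# Toy sliding-window patch scoring over a stand-in 2D "slice".
import numpy as np

def sliding_window_scores(slice_2d, win=32, stride=16, score_fn=np.mean):
    h, w = slice_2d.shape
    scores = {}
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            scores[(y, x)] = float(score_fn(slice_2d[y:y + win, x:x + win]))
    return scores

slice_2d = np.random.default_rng(0).random((128, 128))          # placeholder for a CT slice
scores = sliding_window_scores(slice_2d)
best = max(scores, key=scores.get)
print("highest-scoring patch at", best, "score", round(scores[best], 3))
```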

  12. Modern Techniques in Acoustical Signal and Image Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Candy, J V

    2002-04-04

    Acoustical signal processing problems can lead to some complex and intricate techniques to extract the desired information from noisy, sometimes inadequate, measurements. The challenge is to formulate a meaningful strategy that is aimed at performing the processing required even in the face of uncertainties. This strategy can be as simple as a transformation of the measured data to another domain for analysis or as complex as embedding a full-scale propagation model into the processor. The aims of both approaches are the same--to extract the desired information and reject the extraneous, that is, to develop a signal processing scheme to achieve this goal. In this paper, we briefly discuss this underlying philosophy from a ''bottom-up'' approach, enabling the problem to dictate the solution rather than vice versa.

  13. Biosignals Analysis for Kidney Function Effect Analysis of Fennel Aromatherapy

    PubMed Central

    Kim, Bong-Hyun; Cho, Dong-Uk; Seo, Ssang-Hee

    2015-01-01

    Human efforts to enjoy a healthy life are diverse, and IT technology has been applied to analyze the results of these efforts. The emphasis has therefore shifted from treatment towards diagnostic health management and prevention. In particular, aromatherapy is easy to use, has few side effects, causes no irritation and is widely used in modern society. In this paper, we measured the effect of aroma by applying biosignal analysis techniques, and an experiment was performed to analyze it. In particular, we designed the research methods and processes based on the theory that aroma affects renal function. Biosignals measured before and after fennel aromatherapy treatment were compared and analyzed in order to assess the effect of fennel aromatherapy on kidney function. PMID:25977696

  14. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Astrophysics Data System (ADS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; McCleary, S. L.

    1991-05-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  15. An accurate nonlinear finite element analysis and test correlation of a stiffened composite wing panel

    NASA Technical Reports Server (NTRS)

    Davis, D. D., Jr.; Krishnamurthy, T.; Stroud, W. J.; Mccleary, S. L.

    1991-01-01

    State-of-the-art nonlinear finite element analysis techniques are evaluated by applying them to a realistic aircraft structural component. A wing panel from the V-22 tiltrotor aircraft is chosen because it is a typical modern aircraft structural component for which there is experimental data for comparison of results. From blueprints and drawings, a very detailed finite element model containing 2284 9-node Assumed Natural-Coordinate Strain elements was generated. A novel solution strategy which accounts for geometric nonlinearity through the use of corotating element reference frames and nonlinear strain-displacement relations is used to analyze this detailed model. Results from linear analyses using the same finite element model are presented in order to illustrate the advantages and costs of the nonlinear analysis as compared with the more traditional linear analysis.

  16. Applications of Mass Spectrometry Imaging for Safety Evaluation.

    PubMed

    Bonnel, David; Stauber, Jonathan

    2017-01-01

    Mass spectrometry imaging (MSI) was first derived from techniques used in physics, which were then incorporated into chemistry followed by application in biology. Developed over 50 years ago, and with different principles to detect and map compounds on a sample surface, MSI supports modern biology questions by detecting biological compounds within tissue sections. MALDI (matrix-assisted laser desorption/ionization) imaging trend analysis in this field shows an important increase in the number of publications since 2005, especially with the development of the MALDI imaging technique and its applications in biomarker discovery and drug distribution. With recent improvements of statistical tools, absolute and relative quantification protocols, as well as quality and reproducibility evaluations, MALDI imaging has become one of the most reliable MSI techniques to support drug discovery and development phases. MSI allows to potentially address important questions in drug development such as "What is the localization of the drug and its metabolites in the tissues?", "What is the pharmacological effect of the drug in this particular region of interest?", or "Is the drug and its metabolites related to an atypical finding?" However, prior to addressing these questions using MSI techniques, expertise needs to be developed to become proficient at histological procedures (tissue preparation with frozen of fixed tissues), analytical chemistry, matrix application, instrumentation, informatics, and mathematics for data analysis and interpretation.

  17. NMR of thin layers using a meanderline surface coil

    DOEpatents

    Cowgill, Donald F.

    2001-01-01

    A miniature meanderline sensor coil which extends the capabilities of nuclear magnetic resonance (NMR) to provide analysis of thin planar samples and surface layer geometries. The sensor coil allows standard NMR techniques to be used to examine thin planar (or curved) layers, extending NMR's utility to many problems of modern interest. This technique can be used to examine contact layers, non-destructively depth profile into films, or image multiple layers in a 3-dimensional sense. It lends itself to high-resolution NMR techniques of magic angle spinning and thus can be used to examine the bonding and electronic structure in layered materials or to observe the chemistry associated with aging coatings. Coupling this sensor coil technology with an arrangement of small magnets will produce a penetrator probe for remote in-situ chemical analysis of groundwater or contaminant sediments. Alternatively, the sensor coil can be further miniaturized to provide sub-micron depth resolution within thin films or to orthoscopically examine living tissue. This thin-layer NMR technique using a stationary meanderline coil in a series-resonant circuit has been demonstrated, and it has been determined that the flat meanderline geometry has about the same detection sensitivity as a solenoidal coil, but is specifically tailored to examine planar material layers while avoiding signals from the bulk.

  18. Grass material as process standard for compound-specific radiocarbon analysis

    NASA Astrophysics Data System (ADS)

    Cisneros-Dozal, Malu; Xu, Xiaomei; Bryant, Charlotte; Pearson, Emma; Dungait, Jennifer

    2015-04-01

    Compound-specific radiocarbon analysis (CSRA) is a powerful tool to study the carbon cycle and/or as a dating technique in paleoclimate reconstructions. The radiocarbon value of individual compounds can provide insight into turnover times and organic matter sources and, in specific cases, can be used to establish chronologies when traditional dating materials (e.g. macrofossils, pollen, charcoal) are not available. The isolation of compounds (or groups of compounds) from parent material (e.g. soil, plant) for radiocarbon analysis can, however, introduce carbon contamination through chemical separation steps and preparative capillary gas chromatography (PCGC). In addition, the compounds of interest are often in low abundance, which amplifies the contamination effect. The extraneous carbon can be of modern 14C age and/or 14C-free, and its amount and 14C value must be determined for a given system/isolation procedure in order to report accurate 14C values. This can be achieved by using adequate standard materials but, by contrast with traditional radiocarbon dating, there are no established reference standards for CSRA work, in part because the type of standard material depends on the compounds of interest and the isolation procedure. Here we evaluate the use of n-alkanes extracted from single-year growth grass as a modern process standard material for CSRA using PCGC isolation. The grass material has a known 14C value of 1.224 ± 0.006 fraction modern (FM) and the individual n-alkanes are expected to have a similar 14C value. In order to correct for the addition of extraneous carbon during PCGC isolation of the n-alkanes, we used commercially available compounds of modern 14C content and 14C-free content (adipic acid, FM = 0.0015 ± 0.0001, and docosane, FM = 1.059 ± 0.003) to evaluate our PCGC procedure. The corrected 14C values of the isolated n-alkanes extracted from the modern grass are within one sigma of the grass bulk 14C value for n-C29 and within two sigma for n-C23-C27, C31 and C33. Our results show that single-year growth grass can be a process standard suitable for quality control of the extraction of n-alkanes (and potentially other compounds) from soil or plant material for CSRA.
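
    The correction for extraneous carbon mentioned above is commonly expressed as an isotopic mass balance; a simplified two-component form is sketched below. The masses and FM values in the example are invented, and the study's full procedure (separately characterizing modern and 14C-free contamination) is more involved.

```python
# Simplified mass-balance correction for extraneous carbon added during isolation.
def correct_fraction_modern(fm_measured, mass_measured_ug, fm_extraneous, mass_extraneous_ug):
    """Solve fm_meas*m_meas = fm_compound*m_compound + fm_ex*m_ex for fm_compound."""
    mass_compound = mass_measured_ug - mass_extraneous_ug
    return (fm_measured * mass_measured_ug - fm_extraneous * mass_extraneous_ug) / mass_compound

# e.g. 50 ug of isolated n-alkane carbon carrying 1 ug of modern-like (FM = 1.0) blank
print(round(correct_fraction_modern(1.210, 50.0, 1.0, 1.0), 4))   # ~1.2143
```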

  19. Recent advances in liquid-phase separations for clinical metabolomics.

    PubMed

    Kohler, Isabelle; Giera, Martin

    2017-01-01

    Over the last decades, several technological improvements have been achieved in liquid-based separation techniques, notably, with the advent of fully porous sub-2 μm particles and superficially porous sub-3 μm particles, the comeback of supercritical fluid chromatography, and the development of alternative chromatographic modes such as hydrophilic interaction chromatography. Combined with mass spectrometry, these techniques have demonstrated their added value, substantially increasing separation efficiency, selectivity, and speed of analysis. These benefits are essential in modern clinical metabolomics typically involving the study of large-scale sample cohorts and the analysis of thousands of metabolites showing extensive differences in physicochemical properties. This review presents a brief overview of the recent developments in liquid-phase separation sciences in the context of clinical metabolomics, focusing on increased throughput as well as metabolite coverage. Relevant metabolomics applications highlighting the benefits of ultra-high performance liquid chromatography, core-shell technology, high-temperature liquid chromatography, capillary electrophoresis, supercritical fluid chromatography, and hydrophilic interaction chromatography are discussed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. [Millenniums update on posterior capsule opacification (PCO) scores, centration, biocompatibility and fixation of foldable intraocular lenses (IOL) - an analysis of 1,221 pseudophakic post mortem globes].

    PubMed

    Schmidbauer, J M; Vargas, L G; Apple, D J; Auffarth, G U; Peng, Q; Arthur, S N; Escobar-Gomez, M

    2001-10-01

    The ongoing and fast evolution of foldable IOL designs established the necessity to evaluate the different abilities of each lens style. The large IOL database (over 16,500 specimens) acquired in our laboratory has permitted us to perform a clinico-pathological analysis on pseudophakic autopsy globes provided by sources worldwide, especially many Lions Eye Banks in the United States. We analyzed 6 foldable IOL styles commonly implanted in the United States, using one type of rigid IOL design (1-piece design rigid PMMA optics) as a comparison group. Posterior capsule opacification (PCO) score, decentration, fixation, presence or absence of a Nd:YAG laser posterior capsulotomy, and area and intensity of Soemmerring's ring formation were discerned by examination under an operating microscope using the Miyake-Apple posterior photographic technique. The four lenses with the lowest rates, ranging between 3.8 % and 21.7 %, are modern designs, mostly implanted after 1992. The two lenses with the higher rates, ranging between 23.1 % and 30.4 %, were older designs, already prevalent prior to 1992. IOL fixation with both haptics in the capsular bag showed the best centration values and PCO scores. Our studies to date have shown in a preliminary fashion that the AcrySof™ IOL displays the lowest (best) biocompatibility score. Entering the new millennium, with the evolution of modern surgical techniques and IOL designs, the incidence of the two major complications of cataract surgery, decentration and PCO, is now finally diminishing.

  1. Lama guanicoe remains from the Chaco ecoregion (Córdoba, Argentina): An osteological approach to the characterization of a relict wild population

    PubMed Central

    Barri, Fernando

    2018-01-01

    Guanacos (Lama guanicoe) are large ungulates that have been valued by human populations in South America since the Late Pleistocene. Even though they were very abundant until the end of the 19th century (before the high deforestation rate of the last decades), guanacos have nearly disappeared in the Gran Chaco ecoregion, with relicts and isolated populations surviving in some areas, such as the shrubland area near the saline depressions of Córdoba province, Argentina. In this report, we present the first data from a locally endangered guanaco wild population, through the study of skeletal remains recovered in La Providencia ranch. Our results showed that most of the elements belonged to adults aged between 36 and 96 months; sex evaluation showed similar numbers of males and females. Statistical analysis of the body size of modern samples from Córdoba demonstrated that guanacos from the Chaco had large dimensions and presented lower size variability than the modern and archaeological specimens in our database. Moreover, they exhibited dimensions similar to those of modern guanacos from Patagonia and San Juan, and to archaeological specimens from Ongamira and Cerro Colorado, although further genetic studies are needed to corroborate a possible phylogenetic relationship. Finally, we used archaeozoological techniques to provide a first characterization of a relict guanaco population from the Chaco ecoregion, demonstrating its value to the study of modern skeletal remains and species conservation biology. PMID:29641579

  2. Application of meandering centreline migration modelling and object-based approach of Long Nab member

    NASA Astrophysics Data System (ADS)

    Saadi, Saad

    2017-04-01

    Characterizing the complexity and heterogeneity of the geometries and deposits in a meandering river system is an important concern for the reservoir modelling of fluvial environments. Re-examination of the Long Nab member in the Scalby formation of the Ravenscar Group (Yorkshire, UK), integrating digital outcrop data and forward modelling approaches, will lead to a geologically realistic numerical model of the meandering river geometry. The methodology is based on extracting geostatistics from modern analogue meandering rivers that exemplify both the confined and non-confined meandering point bar deposits and morphodynamics of the Long Nab member. The parameters derived from the modern systems (i.e. channel width, amplitude, radius of curvature, sinuosity, wavelength, channel length and migration rate) are used as a statistical control for the forward simulation and the resulting object-oriented channel models. The statistical data derived from the modern analogues are multi-dimensional in nature, making analysis difficult. We apply data mining techniques such as parallel coordinates to investigate and identify the important relationships within the modern analogue data, which can then be used to drive the development of, and serve as input to, the forward model. This work will increase our understanding of meandering river morphodynamics, planform architecture and the stratigraphic signature of various fluvial deposits and features. We will then use these forward-modelled channel objects to build reservoir models, and compare the behaviour of the forward-modelled channels with traditional object modelling in hydrocarbon flow simulations.
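
    One of the planform parameters listed above, sinuosity, is easy to compute from a digitized channel centreline, as the small sketch below shows with invented coordinates; measurements of this kind over modern analogue rivers feed the statistical control described in the abstract.

```python
# Channel sinuosity = along-channel length / straight-line (valley) length.
import numpy as np

def sinuosity(centreline_xy):
    xy = np.asarray(centreline_xy, dtype=float)
    along = np.sum(np.hypot(np.diff(xy[:, 0]), np.diff(xy[:, 1])))
    straight = np.hypot(*(xy[-1] - xy[0]))
    return along / straight

centreline = [(0, 0), (100, 80), (200, -40), (300, 90), (400, 0)]   # toy digitized points
print(round(sinuosity(centreline), 2))                               # ~1.46
```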

  3. An integrated study of earth resources in the state of California using remote sensing techniques

    NASA Technical Reports Server (NTRS)

    1973-01-01

    University of California investigations to determine the usefulness of modern remote sensing techniques have concentrated on the water resources of the state. The studies consider in detail the supply, demand, and impact relationships.

  4. Ports modernization and its influence on trade unions.

    PubMed

    Maciel, Regina Heloisa; Lopes, Taise Araújo; Gonçalves, Rosemary Cavalcante

    2012-01-01

    The restructuring of production resulting from the Port Modernization Law (Law 8.630/90) caused significant changes in the work organization of Brazilian ports. In the case of Mucuripe (Fortaleza, Ceará), in particular, the changes were very intense, as Mucuripe is an old port that, before the Law, had its labor regulation governed by Trade Unions. This paper aims to present the perceptions of Union Representatives on the changes brought about by the Law on work organization in the port of Fortaleza, its influence on the organization and on the way the Unions deal with this new reality. Open and exploratory interviews were conducted with representatives of occasional labor workers registered with the Port of Fortaleza OGMO (Orgão Gestor de Mão de Obra, Labor Regulation Management). The analysis of the material collected in the interviews was based on the technique of content analysis proposed by Bardin (1979). Trade Unions have undergone a great loss of power, and this is reflected in a relative inability to perform their function and to fight for the rights of the workers. The obvious weakness of the Trade Unions - a reduction in strikes and fewer unionized workers - reflects the dominating ideology of capital.

  5. PIGE-PIXE analysis of chewing sticks of pharmacological importance

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Makanju, O. V.; Haque, A. M. I.; Buoso, M. C.; Ceccato, D.; Cherubini, R.; Moschini, G.

    1996-06-01

    PIGE and PIXE techniques were employed for the determination of the major, minor and trace elemental concentrations in chewing sticks of pharmacological importance namely: Butyrospermum paradoxum, Garcinia kola, Distemonanthus benthamianus, Bridelia ferruginea, Anogeissus leiocarpus, Terminalia glaucescens and Fagara rubescens, respectively. The concentration of fluorine which is very important for human dental enamel was specially determined using the 19F(p, p'γ) 19F reaction. For decades these chewing sticks when used alone without toothpastes have proven to be very efficient, effective and reliable in cleaning the teeth of many people particularly in Nigeria and some other countries in Africa. The teeth of users are usually very strong, clean, fresh and devoid of germs and caries. Even with the advent of modern toothpastes with special additions of fluorine, the use of these popular and efficient chewing sticks is still unabated. Many people including the elite use them solely, a few others combine their use with modern toothpastes and brush. Proton beams produced by the 7 MV CN and 2.5 MV AN 2000 Van de Graaff accelerators at INFN, Laboratori Nazionali di Legnaro (LNL), Padova, Italy were used for the PIGE and PIXE analysis, respectively. Results of this novel study are presented and discussed.

  6. A novel method for tracing the movement of multiple individual soil particles under rainfall conditions using florescent videography.

    NASA Astrophysics Data System (ADS)

    Hardy, Robert; Pates, Jackie; Quinton, John

    2016-04-01

    The importance of developing new techniques to study soil movement cannot be overstated, especially techniques that integrate new technology. Currently there are limited empirical data available about the movement of individual soil particles, particularly high-quality time-resolved data. Here we present a new technique which allows multiple individual soil particles to be traced in real time under simulated rainfall conditions. The technique utilises fluorescent videography in combination with a fluorescent soil tracer, which is based on natural particles. The system has been successfully used on particles greater than ~130 micrometres in diameter. The technique uses HD video shot at 50 frames per second, providing extremely high temporal (0.02 s) and spatial (sub-millimetre) resolution of a particle's location without the need to perturb the system. Once the tracer has been filmed, the images are processed and analysed using a particle analysis and visualisation toolkit written in Python. The toolkit enables the creation of 2- and 3-D time-resolved graphs showing the location of one or more particles. Quantitative numerical analysis of a pathway (or collection of pathways) is also possible, allowing parameters such as particle speed and displacement to be assessed. Filming the particles removes the need to destructively sample material and has many side benefits, reducing the time, money and effort expended in the collection, transport and laboratory analysis of soils, while delivering data in a digital form which is perfect for modern computer-driven analysis techniques. There are many potential applications for the technique. High-resolution empirical data on how soil particles move could be used to create, parameterise and evaluate soil movement models, particularly those that use the movement of individual particles. As data can be collected while rainfall is occurring, it may offer the ability to study systems under dynamic conditions (rather than rainfall of a constant intensity), which are more realistic; this was one of the motivations behind the development of this technique.
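
    The kind of post-processing the toolkit performs on a single track can be sketched as follows: given time-resolved particle positions at the stated 0.02 s frame interval, frame-to-frame speeds and net displacement are derived. The coordinates below are invented, and the actual toolkit is not reproduced here.

```python
# Per-track statistics from time-resolved particle positions (frame interval 0.02 s).
import numpy as np

def track_statistics(positions_mm, dt=0.02):
    p = np.asarray(positions_mm, dtype=float)
    step = np.hypot(*np.diff(p, axis=0).T)            # distance moved between frames
    return {
        "mean_speed_mm_s": float(np.mean(step) / dt),
        "peak_speed_mm_s": float(np.max(step) / dt),
        "net_displacement_mm": float(np.hypot(*(p[-1] - p[0]))),
    }

track = [(0.0, 0.0), (0.3, 0.1), (0.9, 0.4), (1.2, 0.5), (1.2, 0.5)]  # particle comes to rest
print(track_statistics(track))
```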

  7. Anger and its control in Graeco-Roman and modern psychology.

    PubMed

    Schimmel, S

    1979-11-01

    Modern psychologists have studied the phenomena of anger and hostility with diverse methodologies and from a variety of theoretical orientations. The close relationships between anger and aggression, psychosomatic disorder and personal unhappiness, make the understanding and control of anger an important individual and social goal. For all of its sophistication and accomplishment, however, most of the modern research demonstrates, to its disadvantage, a lack of historical perspective with respect to the analysis and treatment of anger, whether normal or pathological. This attitude has deprived psychology of a rich source of empirical observations, intriguing, testable hypotheses, and ingenious techniques of treatment. Of the literature that has been neglected, the analyses of the emotion of anger in the writings of Greek and Roman moral philosophers, particularly Aristotle (4th century B.C.), Seneca (1st century A.D.) and Plutarch (early 2nd century A.D.) are of particular interest. Although modern analyses and methods of treatment are in some ways more refined and more quantitatively precise, and are often subjected to validation and modification by empirical-experimental tests, scientific psychology has, to date, contributed relatively little to the understanding and control of anger that is novel except for research on its physiological dimensions. We can still benefit from the insight, prescriptions and procedures of the classicists, who in some respects offer more powerful methods of control than the most recently published works. Naturally, the modern psychotherapist or behavior therapist can and must go beyond the ancients, as is inherent in all scientific and intellectual progress, but there are no scientific or rational grounds for ignoring them as has been done for 75 years.

  8. Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.

    PubMed

    Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy

    2014-11-01

    Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require comparative analysis for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used, but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are also prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools which stem from alignment-free methods based on statistical analysis of word frequencies. We provide several clear examples to demonstrate applications and interpretations across several different areas of alignment-free analysis, such as base-base correlations, feature frequency profiles, compositional vectors, improved string composition and the D2 statistic metric. Additionally, we provide a detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press.
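    To make the word-analysis idea concrete, the sketch below (illustrative only, not code from the reviewed tools) counts k-mers ("words") in two sequences and computes the basic D2 statistic, i.e. the sum over shared words of the product of their counts, together with a cosine similarity of the count vectors.

        # Hedged sketch of alignment-free comparison by word (k-mer) counts.
        from collections import Counter
        from math import sqrt

        def word_counts(seq, k=3):
            seq = seq.upper()
            return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

        def d2(seq_a, seq_b, k=3):
            # Basic D2: sum of products of word counts over words shared by both sequences.
            a, b = word_counts(seq_a, k), word_counts(seq_b, k)
            return sum(a[w] * b[w] for w in a.keys() & b.keys())

        def cosine(seq_a, seq_b, k=3):
            # Angle-based similarity of the two word-frequency vectors.
            a, b = word_counts(seq_a, k), word_counts(seq_b, k)
            dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
            norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
            return dot / norm

        s1 = "ATGCGTACGTTAGCGTACGATCG"   # toy sequences
        s2 = "ATGCGTTCGTTAGCGTACGATGG"
        print(d2(s1, s2), round(cosine(s1, s2), 3))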

  9. The iLappSurgery taTME app: a modern adjunct to the teaching of surgical techniques.

    PubMed

    Atallah, S; Brady, R R W

    2016-09-01

    Application-based technology has emerged as a method of modern information communication, and it has been applied to surgical training and education. It gives surgeons portable and instant access to information that is otherwise difficult to deliver. The iLappSurgery Foundation has recently launched the transanal total mesorectal excision educational application (taTME app), which provides a useful adjunct, especially for surgeons interested in mastery of the taTME technique and its principles. This article provides a detailed review of the application, which has achieved a large user base since its debut in June 2016.

  10. Advances in Patellofemoral Arthroplasty.

    PubMed

    Strickland, Sabrina M; Bird, Mackenzie L; Christ, Alexander B

    2018-06-01

    To describe current indications, implants, economic benefits, comparison to TKA, and functional and patient-reported outcomes of patellofemoral arthroplasty. Modern onlay implants and improved patient selection have allowed for recent improvements in short- and long-term outcomes after patellofemoral joint replacement surgery. Patellofemoral arthroplasty has become an increasingly utilized technique for the successful treatment of isolated patellofemoral arthritis. Advances in patient selection, implant design, and surgical technique have resulted in improved performance and longevity of these implants. Although short- and mid-term data for modern patellofemoral arthroplasties appear promising, further long-term clinical studies are needed to evaluate how new designs and technologies will affect patient outcomes and long-term implant performance.

  11. Insecticide ADME for support of early-phase discovery: combining classical and modern techniques.

    PubMed

    David, Michael D

    2017-04-01

    The two factors that determine an insecticide's potency are its binding to a target site (intrinsic activity) and the ability of its active form to reach the target site (bioavailability). Bioavailability is dictated by the compound's stability and transport kinetics, which are determined by both physical and biochemical characteristics. At BASF Global Insecticide Research, we characterize bioavailability in early research with an ADME (Absorption, Distribution, Metabolism and Excretion) approach, combining classical and modern techniques. For biochemical assessment of metabolism, we purify native insect enzymes using classical techniques, and recombinantly express individual insect enzymes that are known to be relevant in insecticide metabolism and resistance. For analytical characterization of an experimental insecticide and its metabolites, we conduct classical radiotracer translocation studies when a radiolabel is available. In discovery, where typically no radiolabel has been synthesized, we utilize modern high-resolution mass spectrometry to probe complex systems for the test compound and its metabolites. By using these combined approaches, we can rapidly compare the ADME properties of sets of new experimental insecticides and aid in the design of structures with an improved potential to advance in the research pipeline. © 2016 Society of Chemical Industry.

  12. SIMMAX: A modern analog technique to deduce Atlantic sea surface temperatures from planktonic foraminifera in deep-sea sediments

    NASA Astrophysics Data System (ADS)

    Pflaumann, Uwe; Duprat, Josette; Pujol, Claude; Labeyrie, Laurent D.

    1996-02-01

    We present a data set of 738 planktonic foraminiferal species counts from sediment surface samples of the eastern North Atlantic and the South Atlantic between 87°N and 40°S, 35°E and 60°W including published Climate: Long-Range Investigation, Mapping, and Prediction (CLIMAP) data. These species counts are linked to Levitus's [1982] modern water temperature data for the four caloric seasons, four depth ranges (0, 30, 50, and 75 m), and the combined means of those depth ranges. The relation between planktonic foraminiferal assemblages and sea surface temperature (SST) data is estimated using the newly developed SIMMAX technique, which is an acronym for a modern analog technique (MAT) with a similarity index, based on (1) the scalar product of the normalized faunal percentages and (2) a weighting procedure of the modern analog's SSTs according to the inverse geographical distances of the most similar samples. Compared to the classical CLIMAP transfer technique and conventional MAT techniques, SIMMAX provides a more confident reconstruction of paleo-SSTs (correlation coefficient is 0.994 for the caloric winter and 0.993 for caloric summer). The standard deviation of the residuals is 0.90°C for caloric winter and 0.96°C for caloric summer at 0-m water depth. The SST estimates reach optimum stability (standard deviation of the residuals is 0.88°C) at the average 0- to 75-m water depth. Our extensive database provides SST estimates over a range of -1.4 to 27.2°C for caloric winter and 0.4 to 28.6°C for caloric summer, allowing SST estimates which are especially valuable for the high-latitude Atlantic during glacial times. An electronic supplement of this material may be obtained on a diskette or via Anonymous FTP from KOSMOS.AGU.ORG. (LOGIN to AGU's FTP account using ANONYMOUS as the username and GUEST as the password. Go to the right directory by typing CD APPEND. Type LS to see what files are available. Type GET and the name of the file to get it. Finally type EXIT to leave the system.) (Paper 95PA01743, SIMMAX: A modern analog technique to deduce Atlantic sea surface temperatures from planktonic foraminifera in deep-sea sediments, Uwe Pflaumann, Josette Duprat, Claude Pujol, and Laurent D. Labeyrie.) Diskette may be ordered from American Geophysical Union, 2000 Florida Avenue, N.W., Washington, DC 20009; payment must accompany order.
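    The core of the SIMMAX idea can be illustrated with a small, hedged sketch (it follows the general description above rather than the authors' published code, and the weighting details are simplified): similarity is the scalar product of length-normalised faunal percentage vectors, and the paleo-SST estimate averages the best modern analogs' temperatures weighted by similarity and inverse geographic distance.

        # Illustrative SIMMAX-style modern analog estimate; invented toy data.
        import numpy as np

        def simmax_estimate(fossil_counts, modern_counts, modern_sst, modern_dist, n_best=10):
            """fossil_counts: (S,) faunal percentages of the fossil sample.
            modern_counts: (M, S) percentages of modern core-top samples.
            modern_sst: (M,) modern sea surface temperatures (degC).
            modern_dist: (M,) geographic distances (km) from the fossil site."""
            f = fossil_counts / np.linalg.norm(fossil_counts)
            m = modern_counts / np.linalg.norm(modern_counts, axis=1, keepdims=True)
            similarity = m @ f                        # scalar products of unit vectors
            best = np.argsort(similarity)[-n_best:]   # most similar modern samples
            weights = similarity[best] / np.maximum(modern_dist[best], 1.0)
            return np.sum(weights * modern_sst[best]) / np.sum(weights)

        rng = np.random.default_rng(0)
        modern = rng.random((30, 6)) * 100            # 30 core tops, 6 species
        sst = rng.uniform(-1.0, 28.0, 30)
        dist = rng.uniform(50.0, 5000.0, 30)
        print(simmax_estimate(modern[0] * 1.05, modern, sst, dist))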

  13. Chemistry Is Dead. Long Live Chemistry!

    PubMed

    Lavis, Luke D

    2017-10-03

    Chemistry, once king of fluorescence microscopy, was usurped by the field of fluorescent proteins. The increased demands of modern microscopy techniques on the "photon budget" require better and brighter fluorophores, causing a renewed interest in synthetic dyes. Here, we review the recent advances in biochemistry, protein engineering, and organic synthesis that have allowed a triumphant return of chemical fluorophores to modern biological imaging.

  14. [Watsu: a modern method in physiotherapy, body regeneration, and sports].

    PubMed

    Weber-Nowakowska, Katarzyna; Gebska, Magdalena; Zyzniewska-Banaszak, Ewelina

    2013-01-01

    Progress in existing methods of physiotherapy and body regeneration and introduction of new methods has made it possible to precisely select the techniques according to patient needs. The modern therapist is capable of improving the physical and mental condition of the patient. Watsu helps the therapist eliminate symptoms from the locomotor system and reach the psychic sphere at the same time.

  15. Graphic Poetry: How to Help Students Get the Most out of Pictures

    ERIC Educational Resources Information Center

    Chiang, River Ya-ling

    2013-01-01

    This paper attempts to give an account of some innovative work in paintings and modern poetry and to show how modern poets, such as Jane Flanders and Anne Sexton, the two American poets in particular, express and develop radically new conventions for their respective arts. Also elaborated are how such changes in artistic techniques are related to…

  16. Analytics and Action in Afghanistan

    DTIC Science & Technology

    2010-09-01

    rests on rational technology, and ultimately on scientific knowledge. No country could be modern without being economically advanced or...backwardness to enlightened modernity. Underdeveloped countries had failed to progress to what Max Weber called rational legalism because of the grip...Douglas Pike, Viet Cong: The Organization and Techniques of the National Liberation Front of South Vietnam (Boston: Massachusetts Institute of Technology

  17. Europe Report, Science and Technology

    DTIC Science & Technology

    1986-09-30

    to certain basic products of the food industry such as beer, vinegar, spirits, starches, etc. It is also assumed that modern biotechnologies...Czechoslovak food production. This is also the objective of innovative and modernizing programs in the fermented food sectors. The program for the...cattle and improves fodder utilization, assuming balanced doses of fodder. The development of fermentation techniques of production will occur within

  18. Virtual 3d City Modeling: Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Singh, S. P.; Jain, K.; Mandla, V. R.

    2013-08-01

    A 3D city model is a digital representation of the Earth's surface and its related objects, such as buildings, trees, vegetation and other man-made features belonging to an urban area. Various terms are used for such models, including "Cybertown", "Cybercity", "Virtual City" and "Digital City". A 3D city model is basically a computerised or digital model of a city containing graphic representations of buildings and other objects in 2.5 or 3D. Three main Geomatics approaches are generally used to generate virtual 3-D city models: in the first, researchers use conventional techniques such as vector map data, DEMs and aerial images; the second is based on high-resolution satellite images combined with laser scanning; and in the third, many researchers use terrestrial images through close-range photogrammetry with DSM and texture mapping. We begin this paper with an introduction to the various Geomatics techniques for 3D city modelling. These techniques can be divided into two main categories: one based on the degree of automation (automatic, semi-automatic and manual methods), and another based on the data input technique (photogrammetry or laser techniques). After a detailed study, we draw the conclusions of this research together with a short view of justification, analysis and present trends in 3D city modelling. The paper gives an overview of the techniques related to the generation of virtual 3-D city models using Geomatics techniques and of the applications of such models. Photogrammetry (close-range, aerial and satellite), lasergrammetry, GPS, or a combination of these modern Geomatics techniques plays a major role in creating a virtual 3-D city model; each technique and method has some advantages and some drawbacks. Point cloud models are a modern trend for virtual 3-D city models. A photo-realistic, scalable, geo-referenced virtual 3-D city model is very useful for various kinds of applications, such as planning in navigation, tourism, disaster management, transportation, municipality, urban environmental management and the real-estate industry. The construction of virtual 3-D city models is therefore one of the most interesting research topics of recent years.

  19. A Study of the Applicability of Scientific Management Techniques for the Administration of Small, Private Church-Related Colleges.

    ERIC Educational Resources Information Center

    Ferguson, Albert S.

    Experiences with various modern management techniques and practices in selected small, private church-related colleges were studied. For comparative purposes, practices in public colleges and universities were also assessed. Management techniques used in small companies were identified through review of the literature and the management seminars…

  20. Remote Sensing the Thermal and Humidity Structure of the Earth's Atmosphere Using the GPS Radio Occultation Technique: Applications in Climate Studies

    NASA Astrophysics Data System (ADS)

    Vergados, P.; Mannucci, A. J.; Ao, C. O.; Verkhoglyadova, O. P.; Iijima, B.

    2017-12-01

    This presentation introduces the fundamentals of the Global Positioning System radio occultation (GPS RO) remote sensing technique in retrieving atmospheric temperature and humidity information and presents the use of these observations in climate research. Our objective is to demonstrate and establish the GPS RO remote sensing technique as a complementary data set to existing state-of-the-art space-based platforms for climate studies. We show how GPS RO measurements at 1.2-1.6 GHz frequency band can be used to infer the upper tropospheric water vapor and temperature feedbacks and we present a decade-long specific humidity (SH) record from January 2007 until December 2015. We cross-compare the GPS RO-estimated climate feedbacks and the SH long-record with independent data sets from the Modern-Era Retrospective Analysis for Research and Applications (MERRA), the European Center for Medium-range Weather Forecasts Re-Analysis Interim (ERA-Interim), and the Atmospheric Infrared Sounder (AIRS) instrument. These cross-comparisons serve as a performance guide for the GPS-RO observations with respect to other data sets by providing an independent measure of climate feedbacks and humidity short-term trends.

  1. Particle-In-Cell Analysis of an Electric Antenna for the BepiColombo/MMO spacecraft

    NASA Astrophysics Data System (ADS)

    Miyake, Yohei; Usui, Hideyuki; Kojima, Hirotsugu

    The BepiColombo/MMO spacecraft is planned to provide the first electric field measurements in Mercury's magnetosphere by mounting two types of electric antennas: WPT and MEFISTO. The sophisticated calibration of such measurements should be based on precise knowledge of the antenna characteristics in space plasma. However, it is difficult to know practical antenna characteristics that account for plasma kinetics and spacecraft-plasma interactions by means of theoretical approaches. Furthermore, some modern antenna design techniques, such as the "hockey puck" principle, are applied to MEFISTO, which introduces much complexity in its overall configuration. Thus a strong demand arises for a numerical method that can handle the complex configuration and plasma dynamics in order to evaluate the electric properties of the modern instrument. For self-consistent antenna analysis, we have developed a particle simulation code named EMSES based on the particle-in-cell technique, including a treatment of antenna conductive surfaces. In this paper, we mainly focus on electrostatic (ES) features and the photoelectron distribution in the vicinity of MEFISTO. Our simulation model includes (1) a photoelectron guard electrode, (2) a bias current provided from the spacecraft body to the sensing element, (3) a floating potential treatment for the spacecraft body, and (4) photoelectron emission from sunlit surfaces of the conductive bodies. Of these, the photoelectron guard electrode is a key technology for producing an optimal plasma environment around MEFISTO. Specifically, we introduced a pre-amplifier housing, called the puck, located between the conductive boom and the sensor wire. The photoelectron guard is then simulated by forcibly fixing the potential difference between the puck surface and the spacecraft body. For the modeling, we use the Capacity Matrix technique in order to ensure conservation of the total charge carried by the entire spacecraft body. We report numerical analyses of the influence of the guard electrode on the surrounding plasma environment using the developed model.

  2. Total Skin Electron Therapy for Cutaneous T-Cell Lymphoma Using a Modern Dual-Field Rotational Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heumann, Thatcher R.; Esiashvili, Natia; Winship Cancer Institute

    2015-05-01

    Purpose: To report our experience with rotational total skin electron irradiation (RTSEI) in cutaneous T-cell lymphoma (CTCL), and to examine response by disease stage and race. Methods and Materials: We reviewed our outcomes for 68 CTCL patients who received RTSEI (≥30 Gy) from 2000 to 2013. Primary outcomes were complete clinical response (CCR), recurrence-free survival (RFS), and overall survival (OS). Using log–rank tests and Cox proportional hazards, OS and RFS were compared across tumor stages at time of RTSEI with further racial subgroup analysis. Results: Median age at diagnosis and at time of radiation was 52 and 56 years, respectively. Median follow-up was 5.1 years, 49% were African American, and 49% were female. At time of treatment, 18, 37, and 13 patients were T stage 2, 3, and 4, respectively. At 6 weeks after RTSEI, overall CCR was 82% (88%, 83%, and 69% for T2, T3, and T4, respectively). Median RFS was 11 months for all patients and 14, 10, and 12 months for stage T2, T3, and T4, respectively. Tumor stage was not associated with RFS or CCR. Maintenance therapy after RTSEI was associated with improved RFS in both crude and multivariable analysis, controlling for T stage. Median OS was 76 months (91 and 59 months for T3 and T4, respectively). With the exception of improved OS in African Americans compared with whites at stage T2, race was not associated with CCR, RFS, or OS. Conclusions: These results represent the largest RTSEI clinical outcomes study in the modern era using a dual-field rotational technique. Our observed response rates match or improve upon the standard set by previous outcome studies using conventional TSEI techniques, despite a large percentage of advanced CTCL lesions in our cohort. We found that clinical response after RTSEI did not seem to be affected by T stage or race.

  3. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient. PMID:27370140

  4. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huq, M. Saiful, E-mail: HUQS@UPMC.EDU

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.

  5. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy safer and more efficient.
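    A minimal sketch of the FMEA scoring underlying this kind of risk analysis is given below. The three-factor risk priority number (occurrence x severity x lack of detectability) follows the TG-100 approach in outline only; the failure modes and scores shown are invented examples, not values from the report.

        # Illustrative FMEA ranking by risk priority number (RPN = O * S * D).
        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            step: str
            description: str
            occurrence: int     # 1 (rare) .. 10 (frequent)
            severity: int       # 1 (negligible) .. 10 (catastrophic)
            detectability: int  # 1 (always caught) .. 10 (rarely caught)

            @property
            def rpn(self) -> int:
                return self.occurrence * self.severity * self.detectability

        modes = [  # invented examples for an IMRT-like workflow
            FailureMode("contouring", "wrong target volume drawn", 4, 8, 6),
            FailureMode("planning", "incorrect dose prescription entered", 3, 9, 4),
            FailureMode("delivery", "patient set-up error", 5, 6, 3),
        ]

        for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
            print(f"RPN {fm.rpn:4d}  {fm.step:11s} {fm.description}")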

  6. Contemporary behavior management techniques in clinical pediatric dentistry: out with the old and in with the new?

    PubMed

    Oliver, Kelly; Manton, David John

    2015-01-01

    Effective behavior management guides children through the complex social context of dentistry utilizing techniques based on a current understanding of the social, emotional, and cognitive development of children. Behavior management techniques facilitate effective communication and establish social and behavioral guidelines for the dental environment. Contemporary parenting styles, expectations, and attitudes of modern parents and society have influenced the use of behavior management techniques with a prevailing emphasis on communicative techniques and pharmacological management over aversive techniques.

  7. Finite Element Based Optimization of Material Parameters for Enhanced Ballistic Protection

    NASA Astrophysics Data System (ADS)

    Ramezani, Arash; Huber, Daniel; Rothe, Hendrik

    2013-06-01

    The threat imposed by terrorist attacks is a major hazard for military installations, vehicles and other items. The large numbers of firearms and projectiles that are available pose serious threats to military forces and even civilian facilities. An important task for international research and development is to avert danger to life and limb. This work will evaluate the effect of modern armor with numerical simulations. It will also provide a brief overview of ballistic tests in order to offer some basic knowledge of the subject, serving as a basis for the comparison of simulation results. The objective of this work is to develop and improve the modern armor used in the security sector. Numerical simulations should replace the expensive ballistic tests and find vulnerabilities of items and structures. By progressively changing the material parameters, the armor is to be optimized. A sensitivity analysis yields information regarding the decisive variables, so that vulnerabilities are easily found and eliminated afterwards. To facilitate the simulation, advanced numerical techniques have been employed in the analyses.

  8. Modern Focused-Ion-Beam-Based Site-Specific Specimen Preparation for Atom Probe Tomography.

    PubMed

    Prosa, Ty J; Larson, David J

    2017-04-01

    Approximately 30 years after the first use of focused ion beam (FIB) instruments to prepare atom probe tomography specimens, this technique has grown to be used by hundreds of researchers around the world. This past decade has seen tremendous advances in atom probe applications, enabled by the continued development of FIB-based specimen preparation methodologies. In this work, we provide a short review of the origin of the FIB method and the standard methods used today for lift-out and sharpening, using the annular milling method as applied to atom probe tomography specimens. Key steps for enabling correlative analysis with transmission electron-beam backscatter diffraction, transmission electron microscopy, and atom probe tomography are presented, and strategies for preparing specimens for modern microelectronic device structures are reviewed and discussed in detail. Examples are used for discussion of the steps for each of these methods. We conclude with examples of the challenges presented by complex topologies such as nanowires, nanoparticles, and organic materials.

  9. FLIC: High-Throughput, Continuous Analysis of Feeding Behaviors in Drosophila

    PubMed Central

    Pletcher, Scott D.

    2014-01-01

    We present a complete hardware and software system for collecting and quantifying continuous measures of feeding behaviors in the fruit fly, Drosophila melanogaster. The FLIC (Fly Liquid-Food Interaction Counter) detects analog electronic signals as brief as 50 µs that occur when a fly makes physical contact with liquid food. Signal characteristics effectively distinguish between different types of behaviors, such as feeding and tasting events. The FLIC system performs as well or better than popular methods for simple assays, and it provides an unprecedented opportunity to study novel components of feeding behavior, such as time-dependent changes in food preference and individual levels of motivation and hunger. Furthermore, FLIC experiments can persist indefinitely without disturbance, and we highlight this ability by establishing a detailed picture of circadian feeding behaviors in the fly. We believe that the FLIC system will work hand-in-hand with modern molecular techniques to facilitate mechanistic studies of feeding behaviors in Drosophila using modern, high-throughput technologies. PMID:24978054
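    The event-detection step can be illustrated with a short sketch (this is not the published FLIC software; the threshold and the taste/feeding duration cut-off are invented): above-threshold runs in the sampled analog signal are extracted and then classified by their duration.

        # Hedged sketch: threshold-based contact-event detection and classification.
        import numpy as np

        def contact_events(signal, threshold, dt):
            """signal: 1-D array of analog samples; dt: sample interval (s).
            Returns (start_time, duration) for each above-threshold run."""
            above = signal > threshold
            edges = np.diff(above.astype(int))
            starts = np.flatnonzero(edges == 1) + 1
            ends = np.flatnonzero(edges == -1) + 1
            if above[0]:
                starts = np.r_[0, starts]
            if above[-1]:
                ends = np.r_[ends, len(signal)]
            return [(s * dt, (e - s) * dt) for s, e in zip(starts, ends)]

        def classify(events, taste_max_s=0.5):
            # Short contacts are called "taste", longer ones "feeding" (cut-off is arbitrary).
            return [("taste" if dur <= taste_max_s else "feeding", t, dur) for t, dur in events]

        t = np.arange(0, 10, 0.001)               # 1 kHz sampling, for illustration
        sig = np.where((t % 3) < 0.2, 5.0, 0.1)   # brief synthetic contacts every 3 s
        print(classify(contact_events(sig, threshold=1.0, dt=0.001)))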

  10. Understanding the Effects of Genotype, Growing Year, and Breeding on Tunisian Durum Wheat Allergenicity. 2. The Celiac Disease Case.

    PubMed

    Boukid, Fatma; Prandi, Barbara; Sforza, Stefano; Sayar, Rhouma; Seo, Yong Weon; Mejri, Mondher; Yacoubi, Ines

    2017-07-19

    The aim of this study was to compare immunogenic and toxic gluten peptides related to celiac disease (CD). 100 accessions of genotypes selected during the 20th century in Tunisia were digested in vitro and then analyzed by the UPLC/ESI-MS technique using an isotopically labeled internal standard. The first MANOVA confirmed high variability in the content of immunogenic and toxic peptides, reflecting high genetic diversity in the germplasm released during the past century in Tunisia, consistent with the PCA and clustering analysis results. Our findings also showed important variability in CD epitopes due to growing-season climate scenarios. Moreover, the second MANOVA revealed significant differences between abandoned and modern cultivars' CD-related peptide amounts. Although we could not conclude that there was an increase in allergens in newly selected durum wheat lines compared with abandoned ones, we demonstrated that modern genotype peptides were less sensitive to climate variation, which is a useful indicator for wheat breeders.

  11. Perceptions of rural women about contraceptive usage in district Khushab, Punjab.

    PubMed

    Tabassum, Aqeela; Manj, Yasir Nawaz; Gunjial, Tahira Rehman; Nazir, Salma

    2016-12-01

    To identify the perceptions of rural women about modern contraceptive methods and to ascertain the psycho-social and economic attitude of women about family planning methods. This cross-sectional study was conducted at the University of Sargodha, Sargodha, Pakistan, from December 2014 to March 2015, and comprised married women. The sample was selected using a multistage sampling technique through the Fitzgibbon table. Participants were interviewed regarding use of family planning methods, and SPSS 16 was used for data analysis. Of the 500 women, 358 (71.6%) were never-users and 142 (28.4%) were past-users of family planning methods. Moreover, 52 (14.5%) of never-users did not know about a single modern contraceptive method. Of the past-users, 43 (30.3%) knew about 1-3 methods and 99 (69.7%) about 4 or more methods. Furthermore, 153 (30.6%) respondents graded condoms as good, 261 (55.2%) agreed that family planning helped in improving one's standard of living to a great extent, while 453 (90.6%) indicated that family planning methods were not expensive. Besides, 366 (71.2%) respondents believed that using a contraceptive method caused infertility. Dissatisfaction with methods, method failure, bad experiences with side effects, privacy concerns and different myths associated with the methods were strongly related to the non-usage of modern contraceptive methods.

  12. An investigative graduate laboratory course for teaching modern DNA techniques.

    PubMed

    de Lencastre, Alexandre; Thomas Torello, A; Keller, Lani C

    2017-07-08

    This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the Drosophila ortholog of a human disease gene of their choosing using Gateway® cloning. In the second part of the course, students examine the expression of their gene of interest in human cell lines by reverse transcription PCR and learn how to analyze data from quantitative reverse transcription PCR (qRT-PCR) experiments. The adaptability of the Gateway® cloning system is ideally suited for students to design and create different types of expression constructs to achieve a particular experimental goal (e.g., protein purification, expression in cell culture, and/or subcellular localization), and the genes chosen can be aligned to the research interests of the instructor and/or ongoing research in a department. Student evaluations indicate that the course fostered a genuine excitement for research and in-depth knowledge of both the techniques performed and the theory behind them. Our long-term goal is to incorporate this DNA methods laboratory as the foundation for an integrated laboratory sequence for the Master of Science degree program in Molecular and Cellular Biology at Quinnipiac University, where students use the reagents and concepts they developed in this course in subsequent laboratory courses, including a protein methods and cell culture laboratory. © 2017 The International Union of Biochemistry and Molecular Biology, 45(4):351-359, 2017.

  13. Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine

    PubMed Central

    Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit

    2017-01-01

    In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of Thai traditional medicine (TTM) current development, and ongoing research activities of TTM related to metabolomics. This review will also focus on three important elements of systems biology analysis of TTM including analytical techniques, statistical approaches and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM will also be discussed. PMID:28769804

  14. Fuzzy risk analysis of a modern γ-ray industrial irradiator.

    PubMed

    Castiglia, F; Giardina, M

    2011-06-01

    Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first-generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by the International Commission on Radiological Protection.
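    A hedged sketch of the kind of fuzzy arithmetic involved is shown below: triangular fuzzy probabilities are propagated through AND/OR gates using the usual component-wise approximation. The gate structure and event probabilities are invented for illustration and are not taken from the paper.

        # Illustrative propagation of triangular fuzzy probabilities (low, mode, high).
        def fuzzy_and(*events):
            # AND gate: multiply component-wise.
            low = mode = high = 1.0
            for l, m, h in events:
                low, mode, high = low * l, mode * m, high * h
            return (low, mode, high)

        def fuzzy_or(*events):
            # OR gate: 1 - product of complements, component-wise.
            comp_low = comp_mode = comp_high = 1.0
            for l, m, h in events:
                comp_low *= (1 - l)
                comp_mode *= (1 - m)
                comp_high *= (1 - h)
            return (1 - comp_low, 1 - comp_mode, 1 - comp_high)

        interlock_a = (1e-4, 5e-4, 1e-3)        # invented per-demand failure probabilities
        interlock_b = (2e-4, 1e-3, 2e-3)
        operator_error = (5e-3, 2e-2, 5e-2)     # e.g. from a HEART-style assessment
        exposure = fuzzy_and(operator_error, fuzzy_or(interlock_a, interlock_b))
        print("fuzzy exposure probability (low, mode, high):", exposure)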

  15. Linear Calibration of Radiographic Mineral Density Using Video-Digitizing Methods

    NASA Technical Reports Server (NTRS)

    Martin, R. Bruce; Papamichos, Thomas; Dannucci, Greg A.

    1990-01-01

    Radiographic images can provide quantitative as well as qualitative information if they are subjected to densitometric analysis. Using modern video-digitizing techniques, such densitometry can be readily accomplished using relatively inexpensive computer systems. However, such analyses are made more difficult by the fact that the density values read from the radiograph have a complex, nonlinear relationship to bone mineral content. This article derives the relationship between these variables from the nature of the intermediate physical processes, and presents a simple mathematical method for obtaining a linear calibration function using a step wedge or other standard.
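    A generic step-wedge calibration can be sketched as follows (an illustration of the general idea, not the article's specific derivation or data): gray levels digitized over wedge steps of known thickness define a calibration curve, here linearized by fitting thickness against the logarithm of gray level, consistent with exponential attenuation.

        # Hedged sketch of step-wedge densitometric calibration; values are invented.
        import numpy as np

        # Known aluminum step-wedge thicknesses (mm) and mean digitized gray levels.
        wedge_thickness = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
        wedge_gray = np.array([210.0, 168.0, 135.0, 108.0, 87.0, 70.0])

        # Exponential attenuation suggests gray level falls roughly exponentially with
        # thickness, so fitting thickness against log(gray) gives a near-linear calibration.
        coeffs = np.polyfit(np.log(wedge_gray), wedge_thickness, deg=1)

        def gray_to_equivalent_thickness(gray):
            """Convert a digitized gray level to aluminum-equivalent thickness (mm)."""
            return np.polyval(coeffs, np.log(np.asarray(gray, dtype=float)))

        print(gray_to_equivalent_thickness([150.0, 95.0]))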

  16. Instruction set commutivity

    NASA Technical Reports Server (NTRS)

    Windley, P.

    1992-01-01

    We present a state property called congruence and show how it can be used to demonstrate commutivity of instructions in a modern load-store architecture. Our analysis is particularly important in pipelined microprocessors where instructions are frequently reordered to avoid costly delays in execution caused by hazards. Our work has significant implications for safety- and security-critical applications, since reordering can easily change the meaning of an instruction sequence and current techniques are largely ad hoc. Our work is done in a mechanical theorem prover and results in a set of trustworthy rules for instruction reordering. The mechanization makes it practical to analyze the entire instruction set.
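    Stripped of the mechanized proof, the reordering intuition reduces to a read/write conflict check. The sketch below is only that informal check, with a deliberately simplistic treatment of memory aliasing; it is not the theorem-prover development described in the abstract, and the instruction encodings are invented.

        # Informal commutation check: two instructions may be reordered if neither
        # writes a location the other reads or writes. Memory locations are treated
        # symbolically here, which sidesteps the aliasing analysis a real tool needs.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Instr:
            name: str
            reads: frozenset
            writes: frozenset

        def commute(a: Instr, b: Instr) -> bool:
            no_waw = not (a.writes & b.writes)
            no_raw = not (a.writes & b.reads) and not (b.writes & a.reads)
            return no_waw and no_raw

        ld = Instr("LD r1, 0(r2)", frozenset({"r2", "mem[r2]"}), frozenset({"r1"}))
        add = Instr("ADD r3, r4, r5", frozenset({"r4", "r5"}), frozenset({"r3"}))
        st = Instr("ST r1, 0(r6)", frozenset({"r1", "r6"}), frozenset({"mem[r6]"}))

        print(commute(ld, add))  # True: disjoint read/write sets, safe to reorder
        print(commute(ld, st))   # False: ST reads r1, which LD writes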

  17. Flexible multibody simulation of automotive systems with non-modal model reduction techniques

    NASA Astrophysics Data System (ADS)

    Shiiba, Taichi; Fehr, Jörg; Eberhard, Peter

    2012-12-01

    The stiffness of the body structure of an automobile has a strong relationship with its noise, vibration, and harshness (NVH) characteristics. In this paper, the effect of the stiffness of the body structure upon ride quality is discussed with flexible multibody dynamics. In flexible multibody simulation, the local elastic deformation of the vehicle has traditionally been described with modal shape functions. Recently, linear model reduction techniques from system dynamics and mathematics have come into focus to find more sophisticated elastic shape functions. In this work, the NVH-relevant states of a racing kart are simulated, whereby the elastic shape functions are calculated with modern model reduction techniques such as moment matching by projection on Krylov subspaces, singular value decomposition-based reduction techniques, and combinations of those. The whole elastic multibody vehicle model consisting of tyres, steering, axle, etc. is considered, and an excitation with vibration characteristics over a wide frequency range is evaluated in this paper. The accuracy and the calculation performance of these modern model reduction techniques are investigated, including a comparison with the modal reduction approach.
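    A minimal sketch of projection-based reduction in this spirit is shown below (illustrative only, not the authors' implementation; the test system is random): snapshots of a linear state-space model are compressed with a singular value decomposition, and the leading left singular vectors form the projection basis V, giving reduced matrices Ar = V^T A V, Br = V^T B, Cr = C V.

        # SVD/POD-style model order reduction of a random stable linear system.
        import numpy as np

        rng = np.random.default_rng(1)
        n, m = 200, 1                        # full order, number of inputs
        A = -np.diag(np.linspace(1.0, 50.0, n)) + 0.01 * rng.standard_normal((n, n))
        B = rng.standard_normal((n, m))
        C = rng.standard_normal((1, n))

        # Collect snapshots of the unit-step response by explicit Euler time stepping.
        dt, steps = 1e-3, 2000
        x = np.zeros((n, 1))
        snapshots = []
        for _ in range(steps):
            x = x + dt * (A @ x + B)         # input u = 1
            snapshots.append(x.copy())
        X = np.hstack(snapshots)

        U, s, _ = np.linalg.svd(X, full_matrices=False)
        r = 10                               # reduced order
        V = U[:, :r]
        Ar, Br, Cr = V.T @ A @ V, V.T @ B, C @ V
        print("reduced system dimensions:", Ar.shape, Br.shape, Cr.shape)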

  18. Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.

    2015-12-01

    Our study describes complications introduced by angular direct ionization events on space error rate predictions. In particular, prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data provided from a modern 28 nm SRAM-based device.
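    For context, the classical effective-LET rule whose limits the paper discusses scales the normal-incidence LET by the secant of the tilt angle, because the path length through a thin sensitive volume grows as 1/cos(theta). The toy calculation below (arbitrary values) shows how quickly steep angles inflate the apparent LET.

        # Classical effective-LET scaling for tilted incidence; values are arbitrary.
        import math

        def effective_let(let_normal, theta_deg):
            return let_normal / math.cos(math.radians(theta_deg))

        for angle in (0, 30, 45, 60, 75):
            print(f"{angle:2d} deg: LET_eff = {effective_let(10.0, angle):6.1f} MeV*cm^2/mg")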

  19. Vids: Version 2.0 Alpha Visualization Engine

    DTIC Science & Technology

    2018-04-25

    fidelity than existing efforts. Vids is a project aimed at producing more dynamic and interactive visualization tools using modern computer game ...move through and interact with the data to improve informational understanding. The Vids software leverages off-the-shelf modern game development...analysis and correlations. Recently, an ARL-pioneered project named Virtual Reality Data Analysis Environment (VRDAE) used VR and a modern game engine

  20. Merchandising Techniques and Libraries.

    ERIC Educational Resources Information Center

    Green, Sylvie A.

    1981-01-01

    Proposes that libraries employ modern booksellers' merchandising techniques to improve circulation of library materials. Using displays in various ways, the methods and reasons for weeding out books, replacing worn book jackets, and selecting new books are discussed. Suggestions for learning how to market and 11 references are provided. (RBF)

  1. Dance Critique as Signature Pedagogy

    ERIC Educational Resources Information Center

    Kearns, Lauren

    2017-01-01

    The curriculum of preprofessional university degree programs in dance typically comprises four components: theory and history, dance technique, creative process, and performance. This article focuses on critique in the modern dance technique and choreography components of the dance curriculum. Bachelor of Fine Arts programs utilize critique as a…

  2. Quantitative proteomics in the field of microbiology.

    PubMed

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. From air to rubber: New techniques for measuring and replicating mouthpieces, bocals, and bores

    NASA Astrophysics Data System (ADS)

    Fuks, Leonardo

    2002-11-01

    The history of musical instruments comprises a long genealogy of models and prototypes that results from a combination of copying existing specimens with the change in constructive parameters, and the addition of new devices. In making wind instruments, several techniques have been traditionally employed for extracting the external and internal dimensions of toneholes, air columns, bells, and mouthpieces. In the twentieth century, methods such as pulse reflectometry, x-ray, magnetic resonance, and ultrasound imaging have been made available for bore measurement. Advantages and drawbacks of the existing methods are discussed and a new method is presented that makes use of the injection and coating of silicon rubber, for accurate molding of the instrument. This technique is harmless to all traditional materials, being indicated also for measurements of historical instruments. The paper presents dimensional data obtained from clarinet and saxophone mouthpieces. A set of replicas of top quality clarinet and saxophone mouthpieces, trombone bocals, and flute headjoints is shown, with comparative acoustical and performance analyses. The application of such techniques for historical and modern instrument analysis, restoration, and manufacturing is proposed.

  4. Development and use of molecular markers: past and present.

    PubMed

    Grover, Atul; Sharma, P C

    2016-01-01

    Molecular markers, due to their stability, cost-effectiveness and ease of use, provide an immensely popular tool for a variety of applications including genome mapping, gene tagging, genetic diversity, phylogenetic analysis and forensic investigations. In the last three decades, a number of molecular marker techniques have been developed and exploited worldwide in different systems. However, only a handful of these techniques, namely RFLPs, RAPDs, AFLPs, ISSRs, SSRs and SNPs, have received global acceptance. A recent revolution in DNA sequencing techniques has taken the discovery and application of molecular markers to high-throughput and ultrahigh-throughput levels. Although the choice of marker will obviously depend on the targeted use, microsatellites, SNPs and genotyping by sequencing (GBS) largely fulfill most user requirements. Further, modern transcriptomic and functional markers, in combination with other high-throughput techniques, will drive high-density genetic map construction, identification of QTLs, and breeding and conservation strategies in times to come. This review presents an overview of different marker technologies and their variants with a comparative account of their characteristic features and applications.

  5. Integration of optical measurement methods with flight parameter measurement systems

    NASA Astrophysics Data System (ADS)

    Kopecki, Grzegorz; Rzucidlo, Pawel

    2016-05-01

    During the AIM (advanced in-flight measurement techniques) and AIM2 projects, innovative modern techniques were developed. The purpose of the AIM project was to develop optical measurement techniques dedicated for flight tests. Such methods give information about aircraft elements deformation, thermal loads or pressure distribution, etc. In AIM2 the development of optical methods for flight testing was continued. In particular, this project aimed at the development of methods that could be easily applied in flight tests in an industrial setting. Another equally important task was to guarantee the synchronization of the classical measuring system with cameras. The PW-6U glider used in flight tests was provided by the Rzeszów University of Technology. The glider had all the equipment necessary for testing the IPCT (image pattern correlation technique) and IRT (infrared thermometry) methods. Additionally, equipment adequate for the measurement of typical flight parameters, registration and analysis has been developed. This article describes the designed system, as well as presenting the system’s application during flight tests. Additionally, the results obtained in flight tests show certain limitations of the IRT method as applied.

  6. The Problem of Reading and Reading Culture Improvement of Students-Bachelors of Elementary Education in Modern High Institution

    ERIC Educational Resources Information Center

    Kamalova, Lera A.; Koletvinova, Natal'ya D.

    2016-01-01

    This article aims to study the problems of reading and the improvement of the reading culture of students (bachelors of elementary education) in modern higher education institutions, and to develop the most effective methods and techniques for improving the reading culture of students in the study of Humanities disciplines. The leading method to the study of this…

  7. Physiological patterns during practice of the Transcendental Meditation technique compared with patterns while reading Sanskrit and a modern language.

    PubMed

    Travis, F; Olson, T; Egenes, T; Gupta, H K

    2001-07-01

    This study tested the prediction that reading Vedic Sanskrit texts, without knowledge of their meaning, produces a distinct physiological state. We measured EEG, breath rate, heart rate, and skin conductance during: (1) 15-min Transcendental Meditation (TM) practice; (2) 15-min reading verses of the Bhagavad Gita in Sanskrit; and (3) 15-min reading the same verses translated in German, Spanish, or French. The two reading conditions were randomly counterbalanced, and subjects filled out experience forms between each block to reduce carryover effects. Skin conductance levels significantly decreased during both reading Sanskrit and TM practice, and increased slightly during reading a modern language. Alpha power and coherence were significantly higher when reading Sanskrit and during TM practice, compared to reading modern languages. Similar physiological patterns when reading Sanskrit and during practice of the TM technique suggests that the state gained during TM practice may be integrated with active mental processes by reading Sanskrit.

  8. [Construction of multiple drug release system based on components of traditional Chinese medicine].

    PubMed

    Liu, Dan; Jia, Xiaobin; Yu, Danhong; Zhang, Zhenhai; Sun, E

    2012-08-01

    With the modernization drive of traditional Chinese medicine (TCM) preparations, research on new TCM dosage forms has become a hot spot in the field. Because of the complexity of TCM components and the uncertainty of their material base, there is still no scientific system for modern TCM dosage forms. Modern TCM preparations must inevitably take into account the multi-component nature of TCM and its general functional characteristics of multiple links and multiple targets. The author suggests building a multiple drug release system for TCM using diverse preparation techniques and drug release methods at several levels, on the basis of the nature and functional characteristics of TCM components. This essay elaborates the ideas behind building such a multiple TCM release system, its theoretical basis, preparation techniques and assessment system, and current problems and solutions, with a view to enhancing the bioavailability of TCM components and providing a new form for TCM preparations.

  9. Sensitivity to imputation models and assumptions in receiver operating characteristic analysis with incomplete data

    PubMed Central

    Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.

    2015-01-01

    Modern statistical methods for incomplete data have been increasingly applied to a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used in evaluating diagnostic tests or biomarkers in medical research, has become an increasingly popular area in both its development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
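    A toy sketch of MI inference for the AUC is given below. It is not the paper's simulation design: the normal-within-class imputation model, the missingness rate and the pooling by simple averaging are invented for illustration of the general workflow (impute m times, analyze each completed data set, combine the estimates).

        # Toy multiple-imputation ROC analysis with a rank-based AUC estimator.
        import numpy as np

        rng = np.random.default_rng(7)

        def auc(scores, labels):
            """Rank-based (Mann-Whitney) AUC."""
            order = np.argsort(scores)
            ranks = np.empty(len(scores), dtype=float)
            ranks[order] = np.arange(1, len(scores) + 1)
            pos = labels == 1
            n1, n0 = pos.sum(), (~pos).sum()
            return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

        # Simulated biomarker: diseased subjects score higher; 30% missing at random.
        labels = np.r_[np.zeros(200, int), np.ones(200, int)]
        x = np.r_[rng.normal(0.0, 1.0, 200), rng.normal(1.0, 1.0, 200)]
        missing = rng.random(400) < 0.3

        m = 20
        aucs = []
        for _ in range(m):
            xi = x.copy()
            for cls in (0, 1):
                obs = (~missing) & (labels == cls)
                mis = missing & (labels == cls)
                xi[mis] = rng.normal(x[obs].mean(), x[obs].std(ddof=1), mis.sum())
            aucs.append(auc(xi, labels))

        print(f"pooled AUC = {np.mean(aucs):.3f} (between-imputation SD {np.std(aucs, ddof=1):.3f})")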

  10. Analysis of Glycosaminoglycans Using Mass Spectrometry

    PubMed Central

    Staples, Gregory O.; Zaia, Joseph

    2015-01-01

    The glycosaminoglycans (GAGs) are linear polysaccharides expressed on animal cell surfaces and in extracellular matrices. Their biosynthesis is under complex control and confers a domain structure that is essential to their ability to bind to protein partners. Key to understanding the functions of GAGs are methods to determine accurately and rapidly patterns of sulfation, acetylation and uronic acid epimerization that correlate with protein binding or other biological activities. Mass spectrometry (MS) is particularly suitable for the analysis of GAGs for biomedical purposes. Using modern ionization techniques it is possible to accurately determine molecular weights of GAG oligosaccharides and their distributions within a mixture. Methods for direct interfacing with liquid chromatography have been developed to permit online mass spectrometric analysis of GAGs. New tandem mass spectrometric methods for fine structure determination of GAGs are emerging. This review summarizes MS-based approaches for analysis of GAGs, including tissue extraction and chromatographic methods compatible with LC/MS and tandem MS. PMID:25705143

  11. Electron-beam-induced-current and active secondary-electron voltage-contrast with aberration-corrected electron probes

    DOE PAGES

    Han, Myung-Geun; Garlow, Joseph A.; Marshall, Matthew S. J.; ...

    2017-03-23

    The ability to map out electrostatic potentials in materials is critical for the development and the design of nanoscale electronic and spintronic devices in modern industry. Electron holography has been an important tool for revealing electric and magnetic field distributions in microelectronics and magnetic-based memory devices, however, its utility is hindered by several practical constraints, such as charging artifacts and limitations in sensitivity and in field of view. In this article, we report electron-beam-induced-current (EBIC) and secondary-electron voltage-contrast (SE-VC) with an aberration-corrected electron probe in a transmission electron microscope (TEM), as complementary techniques to electron holography, to measure electric fields and surface potentials, respectively. These two techniques were applied to ferroelectric thin films, multiferroic nanowires, and single crystals. Electrostatic potential maps obtained by off-axis electron holography were compared with EBIC and SE-VC to show that these techniques can be used as a complementary approach to validate quantitative results obtained from electron holography analysis.

  12. Changing techniques in crop plant classification: molecularization at the National Institute of Agricultural Botany during the 1980s.

    PubMed

    Holmes, Matthew

    2017-04-01

    Modern methods of analysing biological materials, including protein and DNA sequencing, are increasingly the objects of historical study. Yet twentieth-century taxonomic techniques have been overlooked in one of their most important contexts: agricultural botany. This paper addresses this omission by harnessing unexamined archival material from the National Institute of Agricultural Botany (NIAB), a British plant science organization. During the 1980s the NIAB carried out three overlapping research programmes in crop identification and analysis: electrophoresis, near infrared spectroscopy (NIRS) and machine vision systems. For each of these three programmes, contemporary economic, statutory and scientific factors behind their uptake by the NIAB are discussed. This approach reveals significant links between taxonomic practice at the NIAB and historical questions around agricultural research, intellectual property and scientific values. Such links are of further importance given that the techniques developed by researchers at the NIAB during the 1980s remain part of crop classification guidelines issued by international bodies today.

  13. Protein Structural Analysis via Mass Spectrometry-Based Proteomics

    PubMed Central

    Artigues, Antonio; Nadeau, Owen W.; Rimmer, Mary Ashley; Villar, Maria T.; Du, Xiuxia; Fenton, Aron W.; Carlson, Gerald M.

    2017-01-01

    Modern mass spectrometry (MS) technologies have provided a versatile platform that can be combined with a large number of techniques to analyze protein structure and dynamics. These techniques include the three detailed in this chapter: 1) hydrogen/deuterium exchange (HDX), 2) limited proteolysis, and 3) chemical crosslinking (CX). HDX relies on the change in mass of a protein upon its dilution into deuterated buffer, which results in varied deuterium content within its backbone amides. Structural information on surface exposed, flexible or disordered linker regions of proteins can be achieved through limited proteolysis, using a variety of proteases and only small extents of digestion. CX refers to the covalent coupling of distinct chemical species and has been used to analyze the structure, function and interactions of proteins by identifying crosslinking sites that are formed by small multi-functional reagents, termed crosslinkers. Each of these MS applications is capable of revealing structural information for proteins when used either with or without other typical high resolution techniques, including NMR and X-ray crystallography. PMID:27975228

  14. Discovery of Newer Therapeutic Leads for Prostate Cancer

    DTIC Science & Technology

    2009-06-01

    promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this...quantities of the plant extracts using supercritical fluid extraction techniques. Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of

  15. Pattern Discovery in Biomolecular Data – Tools, Techniques, and Applications | Center for Cancer Research

    Cancer.gov

    Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph

  16. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress developing simulation tools to predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users. An unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE Programs, and academia validation efforts.

  17. [Leonardo da Vinci the first human body imaging specialist. A brief communication on the thorax oseum images].

    PubMed

    Cicero, Raúl; Criales, José Luis; Cardoso, Manuel

    2009-01-01

    The impressive development of computed tomography (CT) techniques such as three-dimensional helical CT produces a spatial image of the bony thorax. At the beginning of the 16th century Leonardo da Vinci drew with great precision the thorax oseum. These drawings show an outstanding similarity with the images obtained by three-dimensional helical CT. The cumbersome task of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomic knowledge of the human body in order to generate exact interpretations of images. Leonardo's example is alive for anybody devoted to modern imaging studies.

  18. Diffraction scattering computed tomography: a window into the structures of complex nanomaterials

    PubMed Central

    Birkbak, M. E.; Leemreize, H.; Frølich, S.; Stock, S. R.

    2015-01-01

    Modern functional nanomaterials and devices are increasingly composed of multiple phases arranged in three dimensions over several length scales. Therefore there is a pressing demand for improved methods for structural characterization of such complex materials. An excellent emerging technique that addresses this problem is diffraction/scattering computed tomography (DSCT). DSCT combines the merits of diffraction and/or small angle scattering with computed tomography to allow imaging the interior of materials based on the diffraction or small angle scattering signals. This allows, e.g., one to distinguish the distributions of polymorphs in complex mixtures. Here we review this technique and give examples of how it can shed light on modern nanoscale materials. PMID:26505175
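
    A hedged illustration of the tomographic step described above (not the authors' code): for one diffraction or scattering signal, intensities collected over sample translations and rotations form a sinogram that can be reconstructed into a two-dimensional map of where the corresponding phase sits. The phantom below stands in for real data.

```python
# Minimal DSCT-style reconstruction sketch: build a sinogram from a known phase
# map, then invert it with filtered back projection. Purely illustrative.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

phase_map_true = shepp_logan_phantom()                  # stand-in for a phase distribution
angles = np.linspace(0.0, 180.0, 180, endpoint=False)   # rotation angles in degrees

sinogram = radon(phase_map_true, theta=angles)          # what a DSCT scan would record
phase_map_rec = iradon(sinogram, theta=angles)          # reconstructed spatial distribution

print(phase_map_rec.shape)
```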

  19. Digital pre-compensation techniques enabling high-capacity bandwidth variable transponders

    NASA Astrophysics Data System (ADS)

    Napoli, Antonio; Berenguer, Pablo Wilke; Rahman, Talha; Khanna, Ginni; Mezghanni, Mahdi M.; Gardian, Lennart; Riccardi, Emilio; Piat, Anna Chiadò; Calabrò, Stefano; Dris, Stefanos; Richter, André; Fischer, Johannes Karl; Sommerkorn-Krombholz, Bernd; Spinnler, Bernhard

    2018-02-01

    Digital pre-compensation techniques are among the enablers for cost-efficient high-capacity transponders. In this paper we describe various methods to mitigate the impairments introduced by state-of-the-art components within modern optical transceivers. Numerical and experimental results validate their performance and benefits.

  20. Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment

    ERIC Educational Resources Information Center

    Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James

    2010-01-01

    The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…

  1. Three Contributions of a Spiritual Perspective to Counseling, Psychotherapy, and Behavior Change.

    ERIC Educational Resources Information Center

    Bergin, Allen E.

    1988-01-01

    Describes ways in which a spiritual approach can contribute to the modern applied science of behavior change. Divides approach into three areas: conception of human nature, moral frame of reference, and set of techniques. Discusses and demonstrates the transitional person technique. (Author/BH)

  2. Analysis of 3D printing parameters of gears for hybrid manufacturing

    NASA Astrophysics Data System (ADS)

    Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz

    2018-05-01

    The paper deals with the analysis and selection of rapid prototyping parameters for gears produced by selective sintering of metal powders. Based on an analysis of the market for additive manufacturing across different sectors of industry, the presented results show the wide spectrum of RP system applications in the manufacture of machine elements, and considerable growth of these methods over recent years can be observed. The characteristic errors of the printed model with respect to the ideal one are pointed out for each technique. Special attention is paid to the method of preparing the numerical data (CAD/STL/RP), and the manufacturing processes of gear-type elements are analyzed. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests were performed on prepared specimens and used to compare the real material properties with the nominal ones. To improve surface quality after sintering, the gears were subjected to final machining, and the geometry of the gears after the hybrid manufacturing method was analyzed (fig. 1). The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and results can be applied to machine elements other than gears and constitute a general approach to production processes based on rapid prototyping methods as well as to the design and implementation of production.

  3. PREFACE: 21th International Conference for Students and Young Scientists: Modern Technique and Technologies (MTT'2015)

    NASA Astrophysics Data System (ADS)

    2015-10-01

    Involving young researchers in the scientific process, and allowing them to gain scientific experience, are important issues for scientific development. The International Conference for Students and Young Scientists "Modern Technique and Technologies" is one of a number of scientific events held at National Research Tomsk Polytechnic University aimed at training and forming the scientific elite. During previous years the conference established itself as a serious scientific event at an international level, annually attracting about 400 students and young scientists from Russia and from near and far abroad. An important indicator of this scientific event is the large number of scientific areas covered, such as power engineering, heat power engineering, electronic devices for monitoring and diagnostics, instrumentation, materials and technologies of new generations, methods of research and diagnostics of materials, automatic control and system engineering, physical methods in science and engineering, design and artistic aspects of engineering, social and humanitarian aspects of engineering. The main issues, which are discussed at the conference by young researchers, are connected with analysis of contemporary problems, application of new techniques and technologies, and consideration of their relationship. Over the years, the conference committee has gained a lot of experience in organizing scientific meetings. There are all the necessary conditions: the staff of organizers includes employees of Tomsk Polytechnic University; the auditoriums are equipped with modern demonstration and office equipment; leading scientists are TPU professors; the status of the Tomsk Polytechnic University as a leading research university in Russia also plays an important role. All this allows collaboration between leading scientists from all around the world, who are annually invited to give lectures at the conference. The editorial board expresses gratitude to the Administration of Tomsk Polytechnic University (TPU Rector, Professor P.S. Chubik and Vice Rector for Research and Innovation, Professor A.N. Dyachenko) for financial support of the conference. Also, we heartily thank both chairmen of the conference sections and the organizing committee members for their great, effective, creative work in organizing and developing the conference as well as a significant contribution to the safeguarding and replenishment of the intellectual potential of Russia.

  4. A Modernized Approach to Meet Diversified Earth Observing System (EOS) AM-1 Mission Requirements

    NASA Technical Reports Server (NTRS)

    Newman, Lauri Kraft; Hametz, Mark E.; Conway, Darrel J.

    1998-01-01

    From a flight dynamics perspective, the EOS AM-1 mission design and maneuver operations present a number of interesting challenges. The mission design itself is relatively complex for a low Earth mission, requiring a frozen, Sun-synchronous, polar orbit with a repeating ground track. Beyond the need to design an orbit that meets these requirements, the recent focus on low-cost, "lights out" operations has encouraged a shift to more automated ground support. Flight dynamics activities previously performed in special facilities created solely for that purpose and staffed by personnel with years of design experience are now being shifted to the mission operations centers (MOCs) staffed by flight operations team (FOT) operators. These operators' responsibilities include flight dynamics as a small subset of their work; therefore, FOT personnel often do not have the experience to make critical maneuver design decisions. Thus, streamlining the analysis and planning work required for such a complicated orbit design and preparing FOT personnel to take on the routine operation of such a spacecraft both necessitated increasing the automation level of the flight dynamics functionality. The FreeFlyer(trademark) software developed by AI Solutions provides a means to achieve both of these goals. The graphic interface enables users to interactively perform analyses that previously required many parametric studies and much data reduction to achieve the same result. In addition, the fuzzy logic engine enables the simultaneous evaluation of multiple conflicting constraints, removing the analyst from the loop and allowing the FOT to perform more of the operations without much background in orbit design. Modernized techniques were implemented for EOS AM-1 flight dynamics support in several areas, including launch window determination, orbit maintenance maneuver control strategies, and maneuver design and calibration automation. The benefits of implementing these techniques include increased fuel available for on-orbit maneuvering, a simplified orbit maintenance process to minimize science data downtime, and an automated routine maneuver planning process. This paper provides an examination of the modernized techniques implemented for EOS AM-1 to achieve these benefits.

  5. A modernized approach to meet diversified earth observing system (EOS) AM-1 mission requirements

    NASA Technical Reports Server (NTRS)

    Newman, Lauri Kraft; Hametz, Mark E.; Conway, Darrel J.

    1998-01-01

    From a flight dynamics perspective, the EOS AM-1 mission design and maneuver operations present a number of interesting challenges. The mission design itself is relatively complex for a low Earth mission, requiring a frozen, Sun-synchronous, polar orbit with a repeating ground track. Beyond the need to design an orbit that meets these requirements, the recent focus on low-cost, 'lights out' operations has encouraged a shift to more automated ground support. Flight dynamics activities previously performed in special facilities created solely for that purpose and staffed by personnel with years of design experience are now being shifted to the mission operations centers (MOCs) staffed by flight operations team (FOT) operators. These operators' responsibilities include flight dynamics as a small subset of their work; therefore, FOT personnel often do not have the experience to make critical maneuver design decisions. Thus, streamlining the analysis and planning work required for such a complicated orbit design and preparing FOT personnel to take on the routine operation of such a spacecraft both necessitated increasing the automation level of the flight dynamics functionality. The FreeFlyer(TM) software developed by AI Solutions provides a means to achieve both of these goals. The graphic interface enables users to interactively perform analyses that previously required many parametric studies and much data reduction to achieve the same result. In addition, the fuzzy logic engine enables the simultaneous evaluation of multiple conflicting constraints, removing the analyst from the loop and allowing the FOT to perform more of the operations without much background in orbit design. Modernized techniques were implemented for EOS AM-1 flight dynamics support in several areas, including launch window determination, orbit maintenance maneuver control strategies, and maneuver design and calibration automation. The benefits of implementing these techniques include increased fuel available for on-orbit maneuvering, a simplified orbit maintenance process to minimize science data downtime, and an automated routine maneuver planning process. This paper provides an examination of the modernized techniques implemented for EOS AM-1 to achieve these benefits.

  6. From middens to modern estuaries, oyster shells sequester source-specific nitrogen

    NASA Astrophysics Data System (ADS)

    Darrow, Elizabeth S.; Carmichael, Ruth H.; Andrus, C. Fred T.; Jackson, H. Edwin

    2017-04-01

    Oysters (Crassostrea virginica) were an important food resource for native peoples of the northern Gulf of Mexico, who deposited waste shells in middens. Nitrogen (N) stable isotopes (δ15N) in bivalve shells have been used as modern proxies for estuarine N sources because they approximate δ15N in suspended particulate matter. We tested the use of midden shell δ15N as a proxy for ancient estuarine N sources. We hypothesized that isotopic signatures in ancient shells from coastal Mississippi would differ from modern shells due to increased anthropogenic N sources, such as wastewater, through time. We decalcified shells using an acidification technique previously developed for modern bivalves, but modified to determine δ15N, δ13C, %N, and % organic C of these low-N, high-C specimens. The modified method resulted in the greatest percentage of usable data from midden shells. Our results showed that oyster shell δ15N did not significantly differ between ancient (500-2100 years old) and modern oysters from the same locations where the sites had undergone relatively little land-use change. δ15N values in modern shells, however, were positively correlated with water column nitrate concentrations associated with urbanization. When N content and total shell mass were combined, we estimated that middens sequestered 410-39,000 kg of relic N, buried at a rate of up to 5 kg N m-2 yr-1. This study provides a relatively simple technique to assess baseline conditions in ecosystems over long time scales by demonstrating that midden shells can be an indicator of pre-historic N source to estuaries and are a potentially significant but previously uncharacterized estuarine N sink.

  7. Seismic experiment ross ice shelf 1990/91: Characteristics of the seismic reflection data

    USGS Publications Warehouse

    1993-01-01

    The Transantarctic Mountains, with a length of 3000-3500 km and elevations of up to 4500 m, are one of the major Cenozoic mountain ranges in the world and are by far the most striking example of rift-shoulder mountains. Over the 1990-1991 austral summer Seismic Experiment Ross Ice Shelf (SERIS) was carried out across the Transantarctic Mountain front, between latitudes 82 degrees to 83 degrees S, in order to investigate the transition zone between the rifted area of the Ross Embayment and the uplifted Transantarctic Mountains. This experiment involved a 140 km long seismic reflection profile together with a 96 km long coincident wide-angle reflection/refraction profile. Gravity and relative elevation (using barometric pressure) were also measured along the profile. The primary purpose was to examine the boundary between the rift system and the uplifted rift margin (represented by the Transantarctic Mountains) using modern multi-channel crustal reflection/refraction techniques. The results provide insight into crustal structure across the plate boundary. SERIS also represented one of the first large-scale and modern multi-channel seismic experiments in the remote interior of Antarctica. As such, the project was designed to test different seismic acquisition techniques which will be involved in future seismic exploration of the continent. This report describes the results from the analysis of the acquisition tests as well as detailing some of the characteristics of the reflection seismic data. (auths.)

  8. Role of Green Spaces in Favorable Microclimate Creating in Urban Environment (Exemplified by Italian Cities)

    NASA Astrophysics Data System (ADS)

    Finaeva, O.

    2017-11-01

    The article presents a brief analysis of factors that influence the development of an urban green space system: territorial and climatic conditions, cultural and historical background, as well as the modern strategy of historic city development. The introduction defines the concepts of urban greening, green spaces and green space distribution. The environmental parameters influenced by green spaces are determined. Using Italian cities as examples, the principles of urban greening system development are considered: the historical aspects of the formation of the urban greening system in Italian cities are analyzed, the role of green spaces in the formation of the urban environment structure and the creation of a favorable microclimate is determined, and a set of measures aimed at its improvement is highlighted. The modern principles of urban greening system development and their characteristic features are considered. Special attention is paid to the interrelation of architectural and green structures in the formation of a favorable microclimate and psychological comfort in the urban environment; various methods of greening are considered through examples of existing architectural complexes, depending on the climate of the area and the landscape features. Examples of plant selection and the application of compositional techniques are given. The results set out the basic principles for developing an urban green space system. The conclusion summarizes the techniques aimed at microclimate improvement in the urban environment.

  9. Was Jack the Ripper a Slaughterman? Human-Animal Violence and the World’s Most Infamous Serial Killer

    PubMed Central

    Knight, Andrew; Watson, Katherine D.

    2017-01-01

    Simple Summary The identity of Jack the Ripper remains one of the greatest unsolved crime mysteries in history. Jack was notorious both for the brutality of his murders and also for his habit of stealing organs from his victims. His speed and skill in doing so, in conditions of poor light and haste, fueled theories he was a surgeon. However, re-examination of a mortuary sketch from one of his victims has revealed several key aspects that strongly suggest he had no professional surgical training. Instead, the technique used was more consistent with that of a slaughterhouse worker. There were many small-scale slaughterhouses in East London in the 1880s, within which conditions were harsh for animals and workers alike. The brutalizing effects of such work only add to concerns highlighted by modern research that those who commit violence on animals are more likely to target people. Modern slaughterhouses are more humane in some ways but more desensitizing in others, and sociological research has indicated that communities with slaughterhouses are more likely to experience the most violent of crimes. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. Abstract Hundreds of theories exist concerning the identity of “Jack the Ripper”. His propensity for anatomical dissection with a knife—and in particular the rapid location and removal of specific organs—led some to speculate that he must have been surgically trained. However, re-examination of a mortuary sketch of one of his victims has revealed several aspects of incisional technique highly inconsistent with professional surgical training. Related discrepancies are also apparent in the language used within the only letter from Jack considered to be probably authentic. The techniques he used to dispatch his victims and retrieve their organs were, however, highly consistent with techniques used within the slaughterhouses of the day. East London in the 1880s had a large number of small-scale slaughterhouses, within which conditions for both animals and workers were exceedingly harsh. Modern sociological research has highlighted the clear links between the infliction of violence on animals and that inflicted on humans, as well as increased risks of violent crimes in communities surrounding slaughterhouses. Conditions within modern slaughterhouses are more humane in some ways but more desensitising in others. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. PMID:28394281

  10. Modern radiosurgical and endovascular classification schemes for brain arteriovenous malformations.

    PubMed

    Tayebi Meybodi, Ali; Lawton, Michael T

    2018-05-04

    Stereotactic radiosurgery (SRS) and endovascular techniques are commonly used for treating brain arteriovenous malformations (bAVMs). They are usually used as ancillary techniques to microsurgery but may also be used as solitary treatment options. Careful patient selection requires a clear estimate of the treatment efficacy and complication rates for the individual patient. As such, classification schemes are an essential part of patient selection paradigm for each treatment modality. While the Spetzler-Martin grading system and its subsequent modifications are commonly used for microsurgical outcome prediction for bAVMs, the same system(s) may not be easily applicable to SRS and endovascular therapy. Several radiosurgical- and endovascular-based grading scales have been proposed for bAVMs. However, a comprehensive review of these systems including a discussion on their relative advantages and disadvantages is missing. This paper is dedicated to modern classification schemes designed for SRS and endovascular techniques.

  11. Surface texture measurement for dental wear applications

    NASA Astrophysics Data System (ADS)

    Austin, R. S.; Mullen, F.; Bartlett, D. W.

    2015-06-01

    The application of surface topography measurement and characterization within dental materials science is highly active and rapidly developing, in line with many modern industries. Surface measurement and structuring is used extensively within oral and dental science to optimize the optical, tribological and biological performance of natural and biomimetic dental materials. Although there has historically been little standardization in the use and reporting of surface metrology instrumentation and software, the dental industry is beginning to adopt modern areal measurement and characterization techniques, especially as the dental industry is increasingly adopting digital impressioning techniques in order to leverage CAD/CAM technologies for the design and construction of dental restorations. As dental treatment becomes increasingly digitized and reliant on advanced technologies such as dental implants, wider adoption of standardized surface topography and characterization techniques will become evermore essential. The dental research community welcomes the advances that are being made in surface topography measurement science towards realizing this ultimate goal.

  12. Genealogical approaches to ethical implications of informational assimilative integrated discovery systems (AIDS) in business

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pharhizgar, K.D.; Lunce, S.E.

    1994-12-31

    Development of knowledge-based technological acquisition techniques and customers' information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access through processing to both deep and broad domains of information in modern societies. Through these systems organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques which produce new information which informants can use without the help of the knowledge sources because of the existence of highly sophisticated computerized networks. This paper has analyzed the danger and side effects of misuse of information through the illegal, unethical and immoral access to the data-base in an integrated and assimilative information system as described above. Cognivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can be easily misused by businesses when researching the firm's customers.

  13. [Professional divers: analysis of critical issues and proposal of a health protocol for work fitness].

    PubMed

    Pedata, Paola; Corvino, Anna Rita; Napolitano, Raffaele Carmine; Garzillo, Elpidio Maria; Furfaro, Ciro; Lamberti, Monica

    2016-01-20

    For many years now, thanks to the development of modern diving techniques, there has been a rapid spread of diving activities everywhere. In fact, divers are ever more numerous, both among the Armed Forces and among civilians who dive for work in fields such as fishing, biological research and archeology. The aim of the study was to propose a health protocol for assessing the fitness for work of professional divers, keeping in mind the peculiarities of this work activity, existing Italian legislation that is almost out of date, and the technical and scientific evolution in this occupational field. We performed an analysis of the most frequently occurring diseases among professional divers and of the clinical investigation and imaging techniques used for their fitness-for-work assessment. From analysis of the health protocol recommended by D.M. 13 January 1979 (Ministerial Decree), the one most used by occupational health physicians, several critical issues emerged. Very often the clinical investigation and imaging techniques still used are almost obsolete, while simple and inexpensive investigations that are more useful for fitness-for-work assessment are neglected. Considering the out-dated legislation concerning diving disciplines, it is necessary to draw up a common health protocol that takes into account the clinical and scientific knowledge and skills acquired in this area. This protocol aims to provide a useful tool for occupational health physicians who work in this sector.

  14. Practical Guidance for Conducting Mediation Analysis With Multiple Mediators Using Inverse Odds Ratio Weighting

    PubMed Central

    Nguyen, Quynh C.; Osypuk, Theresa L.; Schmidt, Nicole M.; Glymour, M. Maria; Tchetgen Tchetgen, Eric J.

    2015-01-01

    Despite the recent flourishing of mediation analysis techniques, many modern approaches are difficult to implement or applicable to only a restricted range of regression models. This report provides practical guidance for implementing a new technique utilizing inverse odds ratio weighting (IORW) to estimate natural direct and indirect effects for mediation analyses. IORW takes advantage of the odds ratio's invariance property and condenses information on the odds ratio for the relationship between the exposure (treatment) and multiple mediators, conditional on covariates, by regressing exposure on mediators and covariates. The inverse of the covariate-adjusted exposure-mediator odds ratio association is used to weight the primary analytical regression of the outcome on treatment. The treatment coefficient in such a weighted regression estimates the natural direct effect of treatment on the outcome, and indirect effects are identified by subtracting direct effects from total effects. Weighting renders treatment and mediators independent, thereby deactivating indirect pathways of the mediators. This new mediation technique accommodates multiple discrete or continuous mediators. IORW is easily implemented and is appropriate for any standard regression model, including quantile regression and survival analysis. An empirical example is given using data from the Moving to Opportunity (1994–2002) experiment, testing whether neighborhood context mediated the effects of a housing voucher program on obesity. Relevant Stata code (StataCorp LP, College Station, Texas) is provided. PMID:25693776
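
    Because the report is explicitly practical, a minimal Python sketch of the IORW recipe it describes may help; this is not the authors' Stata code, and the column names (A for a binary exposure, M1/M2 for mediators, C1/C2 for covariates, Y for a continuous outcome) are assumptions.

```python
# Hedged sketch of inverse odds ratio weighting (IORW) for mediation analysis.
import numpy as np
import statsmodels.formula.api as smf

def iorw_effects(df):
    # Step 1: logistic regression of the exposure on mediators and covariates.
    exp_model = smf.logit("A ~ M1 + M2 + C1 + C2", data=df).fit(disp=0)
    # Covariate-adjusted exposure-mediator odds ratio for each subject.
    or_mediators = np.exp(exp_model.params["M1"] * df["M1"]
                          + exp_model.params["M2"] * df["M2"])
    # Step 2: exposed subjects are weighted by the inverse of that odds ratio,
    # which deactivates the mediated pathways; unexposed subjects keep weight 1.
    weights = np.where(df["A"] == 1, 1.0 / or_mediators, 1.0)
    # Step 3: the weighted regression gives the natural direct effect; the
    # unweighted regression gives the total effect; indirect = total - direct.
    direct = smf.wls("Y ~ A + C1 + C2", data=df, weights=weights).fit()
    total = smf.ols("Y ~ A + C1 + C2", data=df).fit()
    return {"total": total.params["A"],
            "direct": direct.params["A"],
            "indirect": total.params["A"] - direct.params["A"]}
```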

  15. Recent developments in minimal processing: a tool to retain nutritional quality of food.

    PubMed

    Pasha, Imran; Saeed, Farhan; Sultan, M Tauseef; Khan, Moazzam Rafiq; Rohi, Madiha

    2014-01-01

    The modernization of the last century resulted in urbanization coupled with modifications in lifestyles and dietary habits. In the same era, industrial developments made it easier to meet the requirements for processed foods. However, consumers are now interested in minimally processed foods owing to increased awareness of fruits and vegetables with superior quality and natural integrity, and with fewer additives. Food products deteriorate as a consequence of physiological aging, biochemical changes, high respiration rate, and high ethylene production. These factors contribute substantially to discoloration, loss of firmness, development of off-flavors, acidification, and microbial spoilage. Simultaneously, food processors are using emerging approaches to process perishable commodities while enhancing nutritional and sensorial quality. The present review article surveys modern approaches that minimize processing and deterioration. The techniques discussed in this paper include chlorination, ozonation, irradiation, photosensitization, edible coating, natural preservatives, high-pressure processing, microwave heating, ohmic heating, and hurdle technology. The consequences of these techniques for shelf-life stability, microbial safety, preservation of organoleptic and nutritional quality, and residue avoidance are the focus of the paper. Moreover, the feasibility and operability of these techniques in modern-day processing are discussed.

  16. Risk factors for musculoskeletal injury in elite pre-professional modern dancers: A prospective cohort prognostic study.

    PubMed

    Bronner, Shaw; Bauer, Naomi G

    2018-05-01

    To examine risk factors for injury in pre-professional modern dancers. With prospectively designed screening and injury surveillance, we evaluated four risk factors as categorical predictors of injury: i) hypermobility; ii) dance technique motor-control; iii) muscle tightness; iv) previous injury. Screening and injury data of 180 students enrolled in a university modern dance program were reviewed over 4-yrs of training. Dancers were divided into 3-groups based on predictor scores. Dance exposure was based on hours of technique classes/wk. Negative binomial log-linear analyses were conducted with the four predictors, p < 0.05. Dancers with low and high Beighton scores were 1.43 and 1.22 times more likely to sustain injury than dancers with mid-range scores (p ≤ 0.03). Dancers with better technique (low or medium scores) were 0.86 and 0.63 times less likely to sustain injury (p = 0.013 and p < 0.001) compared to those with poor technique. Dancers with one or 2-4 tight muscles were 2.7 and 4.0 times more likely to sustain injury (p ≤ 0.046). Dancers who sustained 2-4 injuries in the previous year were 1.38 times more likely to sustain subsequent injury (p < 0.001). This contributes new information on the value of preseason screening. Dancers with these risk factors may benefit from prevention programs. Copyright © 2018 Elsevier Ltd. All rights reserved.
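
    A minimal sketch (not the authors' analysis code) of the kind of negative binomial log-linear model described, with technique-class hours entering as an exposure offset; the data frame and its column names are assumptions.

```python
# Negative binomial injury-rate model with dance exposure as an offset.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

def fit_injury_model(dancers):
    """dancers: DataFrame with injury counts, categorical risk groups, exposure hours."""
    model = smf.glm(
        "injuries ~ C(beighton_group) + C(technique_group)"
        " + C(tight_muscles) + C(prior_injuries)",
        data=dancers,
        family=sm.families.NegativeBinomial(),
        offset=np.log(dancers["exposure_hours"]),
    ).fit()
    # Exponentiated coefficients are incidence rate ratios, analogous to the
    # relative risks reported in the abstract.
    return np.exp(model.params)
```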

  17. An Image-Based Modeling Experience about Social Facilities, Built during the Fascist Period in Middle Italy

    NASA Astrophysics Data System (ADS)

    Rossi, D.

    2011-09-01

    The main focus of this article is to explain a teaching activity. This experience follows research aimed at testing innovative systems for the formal and digital analysis of architectural buildings. In particular, the field of investigation is analytical drawing. An analytical drawing allows the development of interpretative models that resemble reality; these models are built using photomodeling techniques and are designed to re-write modern and contemporary architecture. The buildings surveyed belong to a cultural period, called the Modern Movement, historically placed between the two world wars. The Modern Movement aimed to renew existing architectural principles and to redefine them functionally. In Italy these principles arrived during the Fascist period. The heritage of public social buildings (case del Balilla, G.I.L., recreation centers...) built during the Fascist period in central Italy is remarkable for its quantity and, in many cases, for its architectural quality. These buildings are composed of pure shapes: large cubes (gyms) alternate with long rectangular blocks containing offices, creating compositions of big volumes and high towers. These features are well suited to a surveying process based on photomodeling, where the role of photography is central and where certain, easily distinguishable points must be identified in every picture, lying on the edges of the volumes or on texture discontinuities. The goal is documentation that preserves and develops buildings and urban complexes of modern architecture, directed at encouraging their artistic preservation.

  18. NDS modernization project - requirements analysis report

    DOT National Transportation Integrated Search

    1997-04-09

    The National Distress System (NDS) Modernization Project envisions replacing/modernizing the present VHF-FM based system with an integrated state-of-the-art commercial/government-off-the-shelf (COTS/GOTS) or non-developmental item (NDI) solution. ...

  19. Laser assisted microdissection, an efficient technique to understand tissue specific gene expression patterns and functional genomics in plants.

    PubMed

    Gautam, Vibhav; Sarkar, Ananda K

    2015-04-01

    Laser assisted microdissection (LAM) is an advanced technology used to perform tissue- or cell-specific expression profiling of genes and proteins, owing to its ability to isolate the desired tissue or cell type from a heterogeneous population. Due to the specificity and high efficiency acquired during its pioneering use in medical science, the LAM technique has quickly been adopted in many areas of biological research. Today, it has become a potent tool to address a wide range of questions in diverse fields of plant biology. Beginning with comparative transcriptome analysis of different tissues such as reproductive parts, meristems, lateral organs, and roots, LAM has also been extensively used in plant-pathogen interaction studies, proteomics, and metabolomics. In combination with next generation sequencing and proteomics analysis, LAM has opened up promising opportunities in the area of large scale functional studies in plants. Ever since the advent of this technique, significant improvements have been achieved in terms of its instrumentation and methods, which have made LAM a more efficient tool applicable to wider research areas. Here, we discuss the advancement of the LAM technique with special emphasis on its methodology and highlight its scope in modern research areas of plant biology. Although we emphasize the use of LAM in transcriptome studies, where it has mostly been applied, we also discuss its recent application and scope in proteome and metabolome studies.

  20. Pastoral Techniques in the Modern Danish Educational System

    ERIC Educational Resources Information Center

    Nielsen, Klaus; Dalgaard, Susanne; Madsen, Sarah

    2011-01-01

    In recent years, therapeutic techniques have played an increasingly significant role in Danish educational thinking. One way in which this therapeutic thinking discloses itself is in the ever-growing use of educational-therapeutic games as part of the educational practice. Inspired by Foucault, we argue that educational-therapeutic games can be…

  1. Estimating Solar PV Output Using Modern Space/Time Geostatistics (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, S. J.; George, R.; Bush, B.

    2009-04-29

    This presentation describes a project that uses mapping techniques to predict solar output at subhourly resolution at any spatial point, to develop a methodology that is applicable to natural resources in general, and to demonstrate the capability of geostatistical techniques to predict the output of a potential solar plant.

  2. Making an Old Measurement Experiment Modern and Exciting!

    ERIC Educational Resources Information Center

    Schulze, Paul D.

    1996-01-01

    Presents a new approach for the determination of the temperature coefficient of resistance of a resistor and a thermistor. Advantages include teaching students how to linearize data in order to utilize least-squares techniques, continuously taking data over desired temperature range, using up-to-date data-acquisition techniques, teaching the use…

  3. Manual Solid-Phase Peptide Synthesis of Metallocene-Peptide Bioconjugates

    ERIC Educational Resources Information Center

    Kirin, Srecko I.; Noor, Fozia; Metzler-Nolte, Nils; Mier, Walter

    2007-01-01

    A simple and relatively inexpensive procedure for preparing a biologically active peptide using solid phase peptide synthesis (SPPS) is described. Fourth-year undergraduate students have gained firsthand experience from the solid-phase synthesis techniques and they have become familiar with modern analytical techniques based on the particular…

  4. A New Dialogue in Ballet Pedagogy: Improving Learner Self-Sufficiency through Reflective Methodology

    ERIC Educational Resources Information Center

    Weidmann, Chelsea

    2018-01-01

    Current research into reflective pedagogy in dance almost exclusively discusses the tertiary education population. Additionally, the research is primarily focused on concert modern dance and creative dance pedagogies, techniques, and choreography. Ballet technique programs in precollegiate populations have, so far, been left out of the…

  5. An active antenna for ELF magnetic fields

    NASA Technical Reports Server (NTRS)

    Sutton, John F.; Spaniol, Craig

    1994-01-01

    The work of Nikola Tesla, especially that directed toward world-wide electrical energy distribution via excitation of the earth-ionosphere cavity resonances, has stimulated interest in the study of these resonances. Not only are they important for their potential use in the transmission of intelligence and electrical power, they are important because they are an integral part of our natural environment. This paper describes the design of a sensitive, untuned, low noise active antenna which is uniquely suited to modern earth-ionosphere cavity resonance measurements employing fast-Fourier transform techniques for near-real-time data analysis. It capitalizes on a little known field-antenna interaction mechanism. Recently, the authors made preliminary measurements of the magnetic fields in the earth-ionosphere cavity. During the course of this study, the problem of designing an optimized ELF magnetic field sensor presented itself. The sensor would have to be small, light weight (for portable use), and capable of detecting the 5-50 Hz picoTesla-level signals generated by the natural excitations of the earth-ionosphere cavity resonances. A review of the literature revealed that past researchers had employed very large search coils, both tuned and untuned. Hill and Bostick, for example, used coils of 30,000 turns wound on high permeability cores of 1.83 m length, weighing 40 kg. Tuned coils are unsuitable for modern fast-Fourier transform data analysis techniques which require a broad spectrum input. 'Untuned' coils connected to high input impedance voltage amplifiers exhibit resonant responses at the resonant frequency determined by the coil inductance and the coil distributed winding capacitance. Also, considered as antennas, they have effective areas equal only to their geometrical areas.
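
    As a hedged aside (standard circuit theory, not a result from the paper), the self-resonance of such an untuned coil is set by its inductance L and distributed winding capacitance C_d:

```latex
f_r = \frac{1}{2\pi\sqrt{L\,C_d}}
```

    For illustration only, a 1000 H search coil with 100 pF of distributed capacitance would self-resonate near 500 Hz, an order of magnitude above the 5-50 Hz signals of interest.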

  6. Spreadsheet WATERSHED modeling for nonpoint-source pollution management in a Wisconsin basin

    USGS Publications Warehouse

    Walker, J.F.; Pickard, S.A.; Sonzogni, W.C.

    1989-01-01

    Although several sophisticated nonpoint pollution models exist, few are available that are easy to use, cover a variety of conditions, and integrate a wide range of information to allow managers and planners to assess different control strategies. Here, a straightforward pollutant input accounting approach is presented in the form of an existing model (WATERSHED) that has been adapted to run on modern electronic spreadsheets. As an application, WATERSHED is used to assess options to improve the quality of highly eutrophic Delavan Lake in Wisconsin. WATERSHED is flexible in that several techniques, such as the Universal Soil Loss Equation or unit-area loadings, can be used to estimate nonpoint-source inputs. Once the model parameters are determined (and calibrated, if possible), the spreadsheet features can be used to conduct a sensitivity analysis of management options. In the case of Delavan Lake, it was concluded that, although some nonpoint controls were cost-effective, the overall reduction in phosphorus would be insufficient to measurably improve water quality.
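
    To make the loading-estimate step concrete, here is a hedged Python sketch (not the WATERSHED spreadsheet itself) of the two techniques named above; every coefficient value shown is a placeholder.

```python
# Two common nonpoint-source loading estimates: the Universal Soil Loss
# Equation and simple unit-area (export coefficient) loadings.

def usle_soil_loss(R, K, LS, C, P):
    """Annual soil loss A = R*K*LS*C*P (e.g., tons/acre/yr)."""
    return R * K * LS * C * P

def unit_area_load(area_ha, export_coeff_kg_per_ha_yr):
    """Annual nutrient load from a land-use area and an export coefficient."""
    return area_ha * export_coeff_kg_per_ha_yr

if __name__ == "__main__":
    # Hypothetical phosphorus loads from two land uses in a subwatershed.
    cropland_p = unit_area_load(1200, 1.1)   # 1200 ha of cropland
    urban_p = unit_area_load(300, 1.5)       # 300 ha of urban land
    print(f"Total P load: {cropland_p + urban_p:.0f} kg/yr")
    print(f"Soil loss: {usle_soil_loss(R=170, K=0.3, LS=1.2, C=0.2, P=0.5):.1f} tons/acre/yr")
```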

  7. The R-package eseis - A toolbox to weld geomorphic, seismologic, spatial, and time series analysis

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2017-04-01

    Environmental seismology is the science of investigating the seismic signals that are emitted by Earth surface processes. This emerging field provides unique opportunities to identify, locate, track and inspect a wide range of the processes that shape our planet. Modern broadband seismometers are sensitive enough to detect signals from sources as weak as wind interacting with the ground and as powerful as collapsing mountains. This places the field of environmental seismology at the seams of many geoscientific disciplines and requires integration of a series of specialised analysis techniques. R provides the perfect environment for this challenge. The package eseis uses the foundations laid by a series of existing packages and data types tailored to solve specialised problems (e.g., signal, sp, rgdal, Rcpp, matrixStats) and thus provides access to efficiently handling large streams of seismic data (> 300 million samples per station and day). It supports standard data formats (mseed, sac), preparation techniques (deconvolution, filtering, rotation), processing methods (spectra, spectrograms, event picking, migration for localisation) and data visualisation. Thus, eseis provides a seamless approach to the entire workflow of environmental seismology and passes the output to related analysis fields with temporal, spatial and modelling focus in R.
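
    eseis itself is an R package; purely as a hedged, language-neutral illustration, the Python snippet below reproduces one processing step of the workflow it automates (band-pass filtering a trace and computing its spectrogram) on synthetic data.

```python
# Filter-then-spectrogram step on a synthetic seismic trace.
import numpy as np
from scipy import signal

fs = 200.0                                # sampling rate in Hz (assumed)
t = np.arange(0, 600, 1 / fs)             # ten minutes of data
trace = np.random.randn(t.size)           # stand-in for a broadband record

# 4th-order Butterworth band-pass between 1 and 45 Hz, applied zero-phase.
sos = signal.butter(4, [1.0, 45.0], btype="bandpass", fs=fs, output="sos")
filtered = signal.sosfiltfilt(sos, trace)

freqs, times, Sxx = signal.spectrogram(filtered, fs=fs, nperseg=1024, noverlap=512)
print(Sxx.shape)  # (frequency bins, time bins)
```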

  8. Femtosecond laser ablation-based mass spectrometry. An ideal tool for stoichiometric analysis of thin films

    DOE PAGES

    LaHaye, Nicole L.; Kurian, Jose; Diwakar, Prasoon K.; ...

    2015-08-19

    An accurate and routinely available method for stoichiometric analysis of thin films is a desideratum of modern materials science where a material's properties depend sensitively on elemental composition. We thoroughly investigated femtosecond laser ablation-inductively coupled plasma-mass spectrometry (fs-LA-ICP-MS) as an analytical technique for determination of the stoichiometry of thin films down to the nanometer scale. The use of femtosecond laser ablation allows for precise removal of material with high spatial and depth resolution that can be coupled to an ICP-MS to obtain elemental and isotopic information. We used molecular beam epitaxy-grown thin films of LaPd(x)Sb2 and T′-La2CuO4 to demonstrate the capacity of fs-LA-ICP-MS for stoichiometric analysis and the spatial and depth resolution of the technique. Here we demonstrate that the stoichiometric information of thin films with a thickness of ~10 nm or lower can be determined. Furthermore, our results indicate that fs-LA-ICP-MS provides precise information on the thin film-substrate interface and is able to detect the interdiffusion of cations.

  9. A review of post-modern management techniques as currently applied to Turkish forestry.

    PubMed

    Dölarslan, Emre Sahin

    2009-01-01

    This paper reviews the effects of six post-modern management concepts as applied to Turkish forestry. Up to now, Turkish forestry has been constrained, both in terms of its operations and internal organization, by a highly bureaucratic system. The application of new thinking in forestry management, however, has recently resulted in new organizational and production concepts that promise to address problems specific to this Turkish industry and bring about positive changes. This paper will elucidate these specific issues and demonstrate how post-modern management thinking is influencing the administration and operational capacity of Turkish forestry within its current structure.

  10. Analysis of Multi-Arm Caliper Data for the U.S. Strategic Petroleum Reserve

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, Barry L.

    The U.S. Strategic Petroleum Reserve (SPR) has an increasing reliance on multi-arm caliper surveys to assess the integrity of casing for cavern access wells and to determine priorities for casing remediation. Multi-arm caliper (MAC) surveys provide a view of well casing deformation by reporting radial measurements of the inner casing wall as the tool is drawn through the casing. Over the last several years the SPR has collected a large number of modern MAC surveys. In total, these surveys account for over 100 million individual measurements. The surveys were collected using differing survey vendors and survey hardware. This has resulted in a collection of disparate data sets which confound attempts to make well-to-well or time-dependent evaluations. In addition, the vendor-supplied MAC interpretations often involve variables which are not well defined or which may not be applicable to casings for cavern access wells. These factors reduce the usability of these detailed data sets. In order to address this issue and provide an independent analysis of multi-arm caliper survey data, Sandia National Labs has developed processing techniques and analysis variables which allow for the comparison of MAC survey data regardless of the source of the survey data. These techniques use the raw radial arm information and newly developed analysis variables to assess the casing status and provide a means for well-to-well and time-dependent analyses. Well-to-well and time-dependent investigation of the MAC survey data provides information to prioritize well remediation activities and identify wells with integrity issues. This paper presents the challenges in using disparate MAC survey data, the techniques developed to address these challenges, and some of the insights gained from these new techniques.
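
    A minimal sketch, under stated assumptions, of the kind of vendor-independent analysis variables described: each row of `radii` holds one depth station's arm readings, and simple summary variables flag out-of-round or inwardly deformed casing. The variable definitions and values are illustrative, not Sandia's.

```python
# Vendor-independent summary variables from raw multi-arm caliper radii.
import numpy as np

def casing_metrics(radii, nominal_r):
    """radii: (n_depths, n_arms) inner-radius measurements; nominal_r: design radius."""
    mean_r = radii.mean(axis=1)
    ovality = (radii.max(axis=1) - radii.min(axis=1)) / nominal_r   # out-of-roundness
    penetration = (nominal_r - radii.min(axis=1)) / nominal_r       # worst inward deformation
    return mean_r, ovality, penetration

# Synthetic example: 1000 depth stations measured by a 40-arm tool.
rng = np.random.default_rng(0)
radii = 4.5 + 0.02 * rng.standard_normal((1000, 40))   # inches, illustrative
mean_r, ovality, penetration = casing_metrics(radii, nominal_r=4.5)
print(f"Max inward deformation: {100 * penetration.max():.2f}% of nominal radius")
```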

  11. Inertial Confinement fusion targets

    NASA Technical Reports Server (NTRS)

    Hendricks, C. D.

    1982-01-01

    Inertial confinement fusion (ICF) targets are made as simple flat discs, as hollow shells or as complicated multilayer structures. Many techniques were devised for producing the targets. Glass and metal shells are made by using drop and bubble techniques. Solid hydrogen shells are also produced by adapting old methods to the solution of modern problems. Some of these techniques, problems, and solutions are discussed. In addition, the applications of many of the techniques to fabrication of ICF targets is presented.

  12. Precision cosmological parameter estimation

    NASA Astrophysics Data System (ADS)

    Fendt, William Ashton, Jr.

    2009-09-01

    Experimental efforts of the last few decades have brought a golden age to mankind's endeavor to understand the physical properties of the Universe throughout its history. Recent measurements of the cosmic microwave background (CMB) provide strong confirmation of the standard big bang paradigm, as well as introducing new mysteries unexplained by current physical models. In the following decades, even more ambitious scientific endeavours will begin to shed light on the new physics by looking at the detailed structure of the Universe both at very early and recent times. Modern data have allowed us to begin to test inflationary models of the early Universe, and the near future will bring higher precision data and much stronger tests. Cracking the codes hidden in these cosmological observables is a difficult and computationally intensive problem. The challenges will continue to increase as future experiments bring larger and more precise data sets. Because of the complexity of the problem, we are forced to use approximate techniques and make simplifying assumptions to ease the computational workload. While this has been reasonably sufficient until now, hints of the limitations of our techniques have begun to come to light. For example, the likelihood approximation used for analysis of CMB data from the Wilkinson Microwave Anisotropy Probe (WMAP) satellite was shown to have shortfalls, leading to premature conclusions about current cosmological theories. Also, it can be shown that an approximate method used by all current analysis codes to describe the recombination history of the Universe will not be sufficiently accurate for future experiments. With a new CMB satellite scheduled for launch in the coming months, it is vital that we develop techniques to improve the analysis of cosmological data. This work develops a novel technique that both avoids the use of approximate computational codes and allows the application of new, more precise analysis methods. These techniques will help in the understanding of new physics contained in current and future data sets as well as benefit the research efforts of the cosmology community. Our idea is to shift the computationally intensive pieces of the parameter estimation framework to a parallel training step. We then provide a machine learning code that uses this training set to learn the relationship between the underlying cosmological parameters and the function we wish to compute. This code is very accurate and simple to evaluate. It can provide incredible speed-ups of parameter estimation codes. For some applications this provides the convenience of obtaining results faster, while in other cases this allows the use of codes that would be impossible to apply in the brute force setting. In this thesis we provide several examples where our method allows more accurate computation of functions important for data analysis than is currently possible. As the techniques developed in this work are very general, there is no doubt a wide array of applications both inside and outside of cosmology. We have already seen this interest as other scientists have presented ideas for using our algorithm to improve their computational work, indicating its importance as modern experiments push forward. In fact, our algorithm will play an important role in the parameter analysis of Planck, the next generation CMB space mission.
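
    A minimal sketch, assuming a precomputed training set, of the emulator idea the thesis describes: learn the map from cosmological parameters to an expensive observable so the fitted surrogate can stand in for the slow code during parameter estimation. The parameters and "spectra" below are synthetic placeholders.

```python
# Train a cheap surrogate for an expensive parameter-to-observable map.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
params = rng.uniform(size=(2000, 5))                  # e.g. (Omega_m, h, sigma_8, ...)
spectra = np.sin(params @ rng.uniform(size=(5, 50)))  # stand-in for the slow code's output

emulator = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=2000, random_state=0),
)
emulator.fit(params, spectra)

# Inside an MCMC loop the emulator is evaluated instead of the full calculation.
print(emulator.predict(params[:1]).shape)   # (1, 50)
```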

  13. cudaMap: a GPU accelerated program for gene expression connectivity mapping

    PubMed Central

    2013-01-01

    Background: Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2 h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. Results: cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Conclusion: Emerging ‘omics’ technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap. PMID:24112435
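
    As a rough illustration of the computational core of connectivity mapping, the sketch below scores a gene signature against a ranked reference expression profile and estimates significance by permutation. This is a simplified Python analogue of the workload involved, not the sscMap or cudaMap algorithm, and the gene counts and profiles are arbitrary.

        import numpy as np

        def connection_score(ref_ranks, signature):
            # signature: dict of gene index -> +1 (up-regulated) or -1 (down-regulated)
            return sum(sign * ref_ranks[g] for g, sign in signature.items())

        rng = np.random.default_rng(1)
        n_genes = 10_000
        ref_profile = rng.normal(size=n_genes)                        # synthetic reference profile
        ref_ranks = ref_profile.argsort().argsort() - n_genes / 2.0   # centred ranks

        signature = {g: rng.choice([-1, 1]) for g in rng.choice(n_genes, 50, replace=False)}
        observed = connection_score(ref_ranks, signature)

        # permutation null: random signatures of the same size
        null = np.array([
            connection_score(ref_ranks, {g: rng.choice([-1, 1])
                                         for g in rng.choice(n_genes, 50, replace=False)})
            for _ in range(2000)
        ])
        p_value = (np.abs(null) >= abs(observed)).mean()
        print("score:", observed, "permutation p-value:", p_value)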

  14. A Century of Enzyme Kinetic Analysis, 1913 to 2013

    PubMed Central

    Johnson, Kenneth A.

    2013-01-01

    This review traces the history and logical progression of methods for quantitative analysis of enzyme kinetics from the 1913 Michaelis and Menten paper to the application of modern computational methods today. Following a brief review of methods for fitting steady state kinetic data, modern methods are highlighted for fitting full progress curve kinetics based upon numerical integration of rate equations, including a re-analysis of the original Michaelis-Menten full time course kinetic data. Finally, several illustrations of modern transient state kinetic methods of analysis are shown which enable the elucidation of reactions occurring at the active sites of enzymes in order to relate structure and function. PMID:23850893
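
    A minimal sketch of the progress-curve fitting mentioned above: the Michaelis-Menten rate equation dS/dt = -Vmax*S/(Km + S) is integrated numerically and its parameters fitted directly to a full time course. The rate constants and synthetic data below are assumptions for illustration, not the 1913 measurements.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import curve_fit

        def progress_curve(t, vmax, km, s0=1.0):
            # substrate depletion obtained by numerical integration of the rate equation
            sol = solve_ivp(lambda _t, s: -vmax * s / (km + s), (t[0], t[-1]), [s0],
                            t_eval=t, rtol=1e-8)
            return sol.y[0]

        t = np.linspace(0.0, 30.0, 60)
        true_vmax, true_km = 0.12, 0.35
        data = progress_curve(t, true_vmax, true_km) + np.random.default_rng(2).normal(0, 0.01, t.size)

        (fit_vmax, fit_km), _ = curve_fit(progress_curve, t, data, p0=[0.1, 0.5])
        print("fitted Vmax, Km:", fit_vmax, fit_km)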

  15. A Chromosome-Scale Assembly of the Bactrocera cucurbitae Genome Provides Insight to the Genetic Basis of white pupae

    PubMed Central

    Sim, Sheina B.; Geib, Scott M.

    2017-01-01

    Genetic sexing strains (GSS) used in sterile insect technique (SIT) programs are textbook examples of how classical Mendelian genetics can be directly implemented in the management of agricultural insect pests. Although the foundation of traditionally developed GSS is single-locus, autosomal recessive traits, their genetic basis is largely unknown. With the advent of modern genomic techniques, the genetic basis of sexing traits in GSS can now be further investigated. This study is the first of its kind to integrate traditional genetic techniques with emerging genomics to characterize a GSS using the tephritid fruit fly pest Bactrocera cucurbitae as a model. These techniques include whole-genome sequencing, the development of a mapping population and linkage map, and quantitative trait analysis. The experiment designed to map the genetic sexing trait in B. cucurbitae, white pupae (wp), also enabled the generation of a chromosome-scale genome assembly by integrating the linkage map with the assembly. Quantitative trait loci analysis revealed SNP loci near position 42 MB on chromosome 3 to be tightly linked to wp. Gene annotation and synteny analysis show a near perfect relationship between chromosomes in B. cucurbitae and Muller elements A–E in Drosophila melanogaster. This chromosome-scale genome assembly is complete, has high contiguity, was generated using minimal input DNA, and will be used to further characterize the genetic mechanisms underlying wp. Knowledge of the genetic basis of genetic sexing traits can be used to improve SIT in this species and expand it to other economically important Diptera. PMID:28450369

  16. Modern control techniques in active flutter suppression using a control moment gyro

    NASA Technical Reports Server (NTRS)

    Buchek, P. M.

    1974-01-01

    The development of organized synthesis techniques using concepts of modern control theory was studied for the design of active flutter suppression systems for two- and three-dimensional lifting surfaces, utilizing a control moment gyro (CMG) to generate the required control torques. Incompressible flow theory is assumed, with the unsteady aerodynamic forces and moments for arbitrary airfoil motion obtained by using the convolution integral based on Wagner's indicial lift function. Linear optimal control theory is applied to find particular optimal sets of gain values which minimize a quadratic performance function. The closed-loop system's response to impulsive gust disturbances and the resulting control power requirements are investigated, and the system eigenvalues necessary to minimize the maximum value of control power are determined.
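
    The linear-optimal gain synthesis described above can be sketched as a standard linear-quadratic regulator problem: solve the algebraic Riccati equation for a quadratic performance index and form the state-feedback gain. The two-state system below is an invented placeholder, not the paper's aeroelastic model with CMG actuation.

        import numpy as np
        from scipy.linalg import solve_continuous_are

        A = np.array([[0.0, 1.0],
                      [-4.0, 0.1]])          # lightly damped, slightly unstable (flutter-like) mode, assumed
        B = np.array([[0.0],
                      [1.0]])                # control torque input channel
        Q = np.diag([10.0, 1.0])             # state weighting in the quadratic cost
        R = np.array([[0.1]])                # control-effort weighting

        P = solve_continuous_are(A, B, Q, R)
        K = np.linalg.solve(R, B.T @ P)      # optimal feedback gain, u = -K x
        print("gain:", K)
        print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))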

  17. [Analysis of images in Japanese book Fukusho-Kiran (Medical Book Focusing on Abdominal Palpation) and Fukusho-Kiran yoku (Supplement to Medical Book Focusing on Abdominal Palpation)].

    PubMed

    Zhang, Lijun; DI, Kan; Song, Yuanliang

    2014-09-01

    Hukusyo-kiran (Medical Book Focusing on Abdominal Palpation) and Hukusyo-kiran yoku (Supplement to Medical Book Focusing on Abdominal Palpation) are two typical monographs of Fukushin (abdominal palpation), containing a total of 148 images about abdominal palpation. These images can be divided into 5 kinds: locations, theories, techniques, diseases and medicines, forming a system of its own covering the theories, principles, prescriptions and medicines of abdominal palpation. It can be used as a guide for clinicians to differentiate the locations and qualities of diseases, confirm the principles of treatment, guide the usage of medicines, and predict the prognosis. With rather high theoretical and academic value, these works deserve further research and analysis by modern scholars.

  18. A software tool for automatic classification and segmentation of 2D/3D medical images

    NASA Astrophysics Data System (ADS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-02-01

    Modern medical diagnosis utilizes techniques of visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of acquired images made by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.

  19. A strategy for analysis of (molecular) equilibrium simulations: Configuration space density estimation, clustering, and visualization

    NASA Astrophysics Data System (ADS)

    Hamprecht, Fred A.; Peter, Christine; Daura, Xavier; Thiel, Walter; van Gunsteren, Wilfred F.

    2001-02-01

    We propose an approach for summarizing the output of long simulations of complex systems, affording a rapid overview and interpretation. First, multidimensional scaling techniques are used in conjunction with dimension reduction methods to obtain a low-dimensional representation of the configuration space explored by the system. A nonparametric estimate of the density of states in this subspace is then obtained using kernel methods. The free energy surface is calculated from that density, and the configurations produced in the simulation are then clustered according to the topography of that surface, such that all configurations belonging to one local free energy minimum form one class. This topographical cluster analysis is performed using basin spanning trees which we introduce as subgraphs of Delaunay triangulations. Free energy surfaces obtained in dimensions lower than four can be visualized directly using iso-contours and -surfaces. Basin spanning trees also afford a glimpse of higher-dimensional topographies. The procedure is illustrated using molecular dynamics simulations on the reversible folding of peptide analogues. Finally, we emphasize the intimate relation of density estimation techniques to modern enhanced sampling algorithms.
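
    The density-to-free-energy step can be sketched as follows: estimate the density of configurations in a low-dimensional projection with a kernel method and convert it to a free energy surface via F = -kT ln(rho). The random two-basin coordinates below stand in for the scaled configuration-space projection, and the basin-spanning-tree clustering itself is not reproduced here.

        import numpy as np
        from scipy.stats import gaussian_kde

        kT = 2.494  # kJ/mol at roughly 300 K (assumed)

        rng = np.random.default_rng(3)
        # two artificial "conformational basins" in a 2D projection
        projected = np.vstack([rng.normal([0, 0], 0.3, size=(500, 2)),
                               rng.normal([2, 1], 0.4, size=(500, 2))])

        density = gaussian_kde(projected.T)            # kernel density estimate of configurations
        xx, yy = np.meshgrid(np.linspace(-1, 3, 80), np.linspace(-1, 2, 60))
        rho = density(np.vstack([xx.ravel(), yy.ravel()])).reshape(xx.shape)

        free_energy = -kT * np.log(rho / rho.max())    # surface relative to the global minimum
        print("free-energy range (kJ/mol):", free_energy.min(), free_energy.max())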

  20. Comparison of rapid methods for chemical analysis of milligram samples of ultrafine clays

    USGS Publications Warehouse

    Rettig, S.L.; Marinenko, J.W.; Khoury, Hani N.; Jones, B.F.

    1983-01-01

    Two rapid methods for the decomposition and chemical analysis of clays were adapted for use with 20–40-mg size samples, typical amounts of ultrafine products (≤0.5-µm diameter) obtained by modern separation methods for clay minerals. The results of these methods were compared with those of “classical” rock analyses. The two methods consisted of mixed lithium metaborate fusion and heated decomposition with HF in a closed vessel. The latter technique was modified to include subsequent evaporation with concentrated H2SO4 and re-solution in HCl, which reduced the interference of the fluoride ion in the determination of Al, Fe, Ca, Mg, Na, and K. Results from the two methods agree sufficiently well with those of the “classical” techniques to minimize error in the calculation of clay mineral structural formulae. Representative maximum variations, in atoms per unit formula of the smectite type based on 22 negative charges, are 0.09 for Si, 0.03 for Al, 0.015 for Fe, 0.07 for Mg, 0.03 for Na, and 0.01 for K.

  1. Longitudinal changes in the visual field and optic disc in glaucoma.

    PubMed

    Artes, Paul H; Chauhan, Balwantray C

    2005-05-01

    The nature and mode of functional and structural progression in open-angle glaucoma is a subject of considerable debate in the literature. While there is a traditionally held viewpoint that optic disc and/or nerve fibre layer changes precede visual field changes, there is surprisingly little published evidence from well-controlled prospective studies in this area, specifically with modern perimetric and imaging techniques. In this paper, we report on clinical data from both glaucoma patients and normal controls collected prospectively over several years, to address the relationship between visual field and optic disc changes in glaucoma using standard automated perimetry (SAP), high-pass resolution perimetry (HRP) and confocal scanning laser tomography (CSLT). We use several methods of analysis of longitudinal data and describe a new technique called "evidence of change" analysis which facilitates comparison between different tests. We demonstrate that current clinical indicators of visual function (SAP and HRP) and measures of optic disc structure (CSLT) provide largely independent measures of progression. We discuss the reasons for these findings as well as several methodological issues that pose challenges to elucidating the true structure-function relationship in glaucoma.

  2. The Modern Design of Experiments: A Technical and Marketing Framework

    NASA Technical Reports Server (NTRS)

    DeLoach, R.

    2000-01-01

    A new wind tunnel testing process under development at NASA Langley Research Center, called Modern Design of Experiments (MDOE), differs from conventional wind tunnel testing techniques on a number of levels. Chief among these is that MDOE focuses on the generation of adequate prediction models rather than high-volume data collection. Some cultural issues attached to this and other distinctions between MDOE and conventional wind tunnel testing are addressed in this paper.

  3. Modern Instrumental Methods in Forensic Toxicology*

    PubMed Central

    Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.

    2009-01-01

    This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968

  4. Quantitative estimation of climatic parameters from vegetation data in North America by the mutual climatic range technique

    USGS Publications Warehouse

    Anderson, Katherine H.; Bartlein, Patrick J.; Strickland, Laura E.; Pelltier, Richard T.; Thompson, Robert S.; Shafer, Sarah L.

    2012-01-01

    The mutual climatic range (MCR) technique is perhaps the most widely used method for estimating past climatic parameters from fossil assemblages, largely because it can be conducted on a simple list of the taxa present in an assemblage. When applied to plant macrofossil data, this unweighted approach (MCRun) will frequently identify a large range for a given climatic parameter where the species in an assemblage can theoretically live together. To narrow this range, we devised a new weighted approach (MCRwt) that employs information from the modern relations between climatic parameters and plant distributions to lessen the influence of the "tails" of the distributions of the climatic data associated with the taxa in an assemblage. To assess the performance of the MCR approaches, we applied them to a set of modern climatic data and plant distributions on a 25-km grid for North America, and compared observed and estimated climatic values for each grid point. In general, MCRwt was superior to MCRun in providing smaller anomalies, less bias, and better correlations between observed and estimated values. However, by the same measures, the results of Modern Analog Technique (MAT) approaches were superior to MCRwt. Although this might be reason to favor MAT approaches, they are based on assumptions that may not be valid for paleoclimatic reconstructions, including that: 1) the absence of a taxon from a fossil sample is meaningful, 2) plant associations were largely unaffected by past changes in either levels of atmospheric carbon dioxide or in the seasonal distributions of solar radiation, and 3) plant associations of the past are adequately represented on the modern landscape. To illustrate the application of these MCR and MAT approaches to paleoclimatic reconstructions, we applied them to a Pleistocene paleobotanical assemblage from the western United States. From our examinations of the estimates of modern and past climates from vegetation assemblages, we conclude that the MCRun technique provides reliable and unbiased estimates of the ranges of possible climatic conditions that can reasonably be associated with these assemblages. The application of MCRwt and MAT approaches can further constrain these estimates and may provide a systematic way to assess uncertainty. The data sets required for MCR analyses in North America are provided in a parallel publication.
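
    The unweighted MCR estimate is simply the overlap of the modern climatic ranges of the taxa present in an assemblage, as in the sketch below; the taxon names and tolerance limits are invented for illustration only.

        def mcr_unweighted(assemblage, tolerances):
            """Return the (lower, upper) overlap of the taxa's climatic ranges, or None if disjoint."""
            lows, highs = zip(*(tolerances[taxon] for taxon in assemblage))
            lower, upper = max(lows), min(highs)
            return (lower, upper) if lower <= upper else None

        # hypothetical mean-annual-temperature tolerances (deg C) for three taxa
        tolerances = {"Pinus_A": (-2.0, 14.0), "Artemisia_B": (1.0, 18.0), "Quercus_C": (4.0, 22.0)}
        print(mcr_unweighted(["Pinus_A", "Artemisia_B", "Quercus_C"], tolerances))  # -> (4.0, 14.0)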

  5. Elemental analyses of modern dust in southern Nevada and California

    USGS Publications Warehouse

    Reheis, M.C.; Budahn, J.R.; Lamothe, P.J.

    1999-01-01

    Selected samples of modern dust collected in marble traps at sites in southern Nevada and California (Reheis and Kihl, 1995; Reheis, 1997) have been analyzed for elemental composition using instrumental neutron activation analysis (INAA) and inductively coupled plasma atomic emission spectroscopy (ICP-AES) and inductively coupled plasma mass spectroscopy (ICP-MS). For information on these analytical techniques and their levels of precision and accuracy, refer to Baedecker and McKown (1987) for INAA, to Briggs (1996) for ICP-AES, and to Briggs and Meier (1999) for ICP-MS. This report presents the elemental compositions obtained using these techniques on dust samples collected from 1991 through 1997. The dust-trap sites were established at varying times; some have been maintained since 1984, others since 1991. For details on site location, dust-trap construction, and collection techniques, see Reheis and Kihl (1995) and Reheis (1997). Briefly, the trap consists of a coated angel-food cake pan painted black on the outside and mounted on a post about 2 m above the ground. Glass marbles rest on a circular piece of galvanized hardware cloth (now replaced by stainless-steel mesh), which is fitted into the pan so that it rests 3-4 cm below the rim. The 2-m height eliminates most saltating sand-sized particles. The marbles simulate the effect of a gravelly fan surface and prevent dust that has filtered or washed into the bottom of the pan from being blown back out. The dust traps are fitted with two metal straps looped in an inverted basket shape; the top surfaces of the straps are coated with a sticky material that effectively discourages birds from roosting.

  6. Lightweight composites for modular panelized construction

    NASA Astrophysics Data System (ADS)

    Vaidya, Amol S.

    Rapid advances in construction materials technology have enabled civil engineers to achieve impressive gains in the safety, economy, and functionality of structures built to serve the common needs of society. Modular building systems are a fast-growing, modern form of construction gaining recognition for their increased efficiency and ability to apply modern technology to the needs of the market place. In the modular construction technique, a single structural panel can perform a number of functions such as providing thermal insulation, vibration damping, and structural strength. These multifunctional panels can be prefabricated in a manufacturing facility and then transferred to the construction site. A system that uses prefabricated panels for construction is called a "panelized construction system". This study focuses on the development of pre-cast, lightweight, multifunctional sandwich composite panels to be used for panelized construction. Two thermoplastic composite panels are proposed in this study, namely Composite Structural Insulated Panels (CSIPs) for exterior walls, floors and roofs, and Open Core Sandwich composite for multifunctional interior walls of a structure. Special manufacturing techniques are developed for manufacturing these panels. The structural behavior of these panels is analyzed based on various building design codes. Detailed descriptions of the design, cost analysis, manufacturing, finite element modeling and structural testing of these proposed panels are included in this study in the form of five peer-reviewed journal articles. The structural testing of the proposed panels included flexural testing, axial compression testing, and low- and high-velocity impact testing. Based on the current study, the proposed CSIP wall and floor panels were found satisfactory according to building design codes ASCE-7-05 and ACI-318-05. Joining techniques are proposed in this study for connecting the precast panels on the construction site. Keywords: Modular panelized construction, sandwich composites, composite structural insulated panels (CSIPs).

  7. Isoquinoline alkaloids and their binding with DNA: calorimetry and thermal analysis applications.

    PubMed

    Bhadra, Kakali; Kumar, Gopinatha Suresh

    2010-11-01

    Alkaloids are a group of natural products with unmatched chemical diversity and biological relevance forming potential quality pools in drug screening. The molecular aspects of their interaction with many cellular macromolecules like DNA, RNA and proteins are being currently investigated in order to evolve the structure-activity relationship. Isoquinolines constitute an important group of alkaloids. They have extensive utility in cancer therapy and a large volume of data is now emerging in the literature on their mode, mechanism and specificity of binding to DNA. Thermodynamic characterization of the binding of these alkaloids to DNA may offer key insights into the molecular aspects that drive complex formation and these data can provide valuable information about the balance of driving forces. Various thermal techniques have been conveniently used for this purpose and modern calorimetric instrumentation provides direct and quick estimation of thermodynamic parameters. Thermal melting studies and calorimetric techniques like isothermal titration calorimetry and differential scanning calorimetry have further advanced the field by providing authentic, reliable and sensitive data on various aspects of temperature dependent structural analysis of the interaction. In this review we present the application of various thermal techniques, viz. isothermal titration calorimetry, differential scanning calorimetry and optical melting studies in the characterization of drug-DNA interactions with particular emphasis on isoquinoline alkaloid-DNA interaction.

  8. Single-energy intensity modulated proton therapy

    NASA Astrophysics Data System (ADS)

    Farace, Paolo; Righetto, Roberto; Cianchetti, Marco

    2015-09-01

    In this note, an intensity modulated proton therapy (IMPT) technique, based on the use of high single-energy (SE-IMPT) pencil beams, is described. The method uses only the highest system energy (226 MeV) and only lateral penumbra to produce dose gradient, as in photon therapy. In the study, after a preliminary analysis of the width of proton pencil beam penumbras at different depths, SE-IMPT was compared with conventional IMPT in a phantom containing titanium inserts and in a patient, affected by a spinal chordoma with fixation rods. It was shown that SE-IMPT has the potential to produce a sharp dose gradient and that it is not affected by the uncertainties produced by metal implants crossed by the proton beams. Moreover, in the chordoma patient, target coverage and organ at risk sparing of the SE-IMPT plan resulted comparable to that of the less reliable conventional IMPT technique. Robustness analysis confirmed that SE-IMPT was not affected by range errors, which can drastically affect the IMPT plan. When accepting a low-dose spread as in modern photon techniques, SE-IMPT could be an option for the treatment of lesions (e.g. cervical bone tumours) where steep dose gradient could improve curability, and where range uncertainty, due for example to the presence of metal implants, hampers conventional IMPT.

  9. Single-energy intensity modulated proton therapy.

    PubMed

    Farace, Paolo; Righetto, Roberto; Cianchetti, Marco

    2015-10-07

    In this note, an intensity modulated proton therapy (IMPT) technique, based on the use of high single-energy (SE-IMPT) pencil beams, is described. The method uses only the highest system energy (226 MeV) and only lateral penumbra to produce dose gradient, as in photon therapy. In the study, after a preliminary analysis of the width of proton pencil beam penumbras at different depths, SE-IMPT was compared with conventional IMPT in a phantom containing titanium inserts and in a patient, affected by a spinal chordoma with fixation rods. It was shown that SE-IMPT has the potential to produce a sharp dose gradient and that it is not affected by the uncertainties produced by metal implants crossed by the proton beams. Moreover, in the chordoma patient, target coverage and organ at risk sparing of the SE-IMPT plan resulted comparable to that of the less reliable conventional IMPT technique. Robustness analysis confirmed that SE-IMPT was not affected by range errors, which can drastically affect the IMPT plan. When accepting a low-dose spread as in modern photon techniques, SE-IMPT could be an option for the treatment of lesions (e.g. cervical bone tumours) where steep dose gradient could improve curability, and where range uncertainty, due for example to the presence of metal implants, hampers conventional IMPT.

  10. [Premedication visits in departments of anesthesiology in Hessen. Compilation of organizational and performance portfolios].

    PubMed

    Aust, H; Veltum, B; Wächtershäuser, T; Wulf, H; Eberhart, L

    2014-02-01

    Many anesthesia departments operate a pre-anesthesia assessment clinic (PAAC). Data regarding organization, equipment and structure of such clinics are not yet available. Information about modern anesthesiology techniques and procedures contributes to a reduction in emotional stress of the patients but such modern techniques often require additional technical hardware and costs and are not equally available. This survey examined the current structures of PAAC in the state of Hessen, demonstrated current concepts and associated these with the performance and the portfolio of procedures in these departments. An online survey was carried out. Data on structure, equipment, organization and available methods were compiled. In addition, anesthesia department personnel were asked to give individual subjective attitudes toward the premedication work. Of the anesthesia departments in Hessen 84 % participated in the survey of which 91 % operated a PAAC. A preoperative contact with the anesthesiologist who would perform anesthesia existed in only 19 % of the departments. Multimedia processing concepts for informed consent in a PAAC setting were in general rare. Many modern procedures and anesthesia techniques were broadly established independent of the hospital size. Regarding the individual and subjective attitudes of anesthetists towards the work, the psychological and medical importance of the pre-medication visit was considered to be very high. The PAACs are now well established. This may make economic sense but is accompanied by an anonymization of care in anesthesiology. The high quality, safety and availability of modern anesthesiology procedures and monitoring concepts should be communicated to patients all the more as an expression of trust and high patient safety. These factors can be facilitated in particular by multimedia tools which have as yet only been sparsely implemented in PAACs.

  11. Analysis Methodologies and Ameliorative Techniques for Mitigation of the Risk in Churches with Drum Domes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zingone, Gaetano; Licata, Vincenzo; Calogero, Cucchiara

    2008-07-08

    The present work fits into the interesting theme of seismic prevention for protection of the monumental patrimony made up of churches with drum domes. Specifically, with respect to a church in the historic area of Catania, chosen as a monument exemplifying the typology examined, the seismic behavior is analyzed in the linear field using modern dynamic identification techniques. The dynamically identified computational model made it possible to identify the macro-element most at risk, the dome-drum system. For this system, the behavior in the nonlinear field is analyzed through dynamic tests on large-scale models in the presence of various types of improving reinforcement. The results are used to appraise the ameliorative contribution afforded by each of them and to choose the most suitable type of reinforcement, optimizing the stiffness/ductility ratio of the system.

  12. Research on data from the ATLAS experiment at CERN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purohit, Milind V.

    2015-07-31

    In this report senior investigator Prof. Milind V. Purohit describes research done with data from the ATLAS experiment at CERN. This includes preparing papers on the performance of the CSC detector, searches for SUSY using a new, modern "big data" technique, and a search for supersymmetry (SUSY) using the "zero leptons razor" (0LRaz) technique. The prediction of the W/Z+jets background processes by the ATLAS simulation prior to the fit is found to be overestimated in the phase space of interest. In all new signal regions presented in this analysis the number of events observed is consistent with the post-fit SM expectations. Assuming R-parity conservation, the limit on the gluino mass exceeds 1150 GeV at 95% confidence level, for an LSP mass smaller than 100 GeV. Other USC personnel who participated in this project during the period of this grant were a graduate student, Anton Kravchenko.

  13. NASA's program on icing research and technology

    NASA Technical Reports Server (NTRS)

    Reinmann, John J.; Shaw, Robert J.; Ranaudo, Richard J.

    1989-01-01

    NASA's program in aircraft icing research and technology is reviewed. The program relies heavily on computer codes and modern applied physics technology in seeking icing solutions on a finer scale than those offered in earlier programs. Three major goals of this program are to offer new approaches to ice protection, to improve our ability to model the response of an aircraft to an icing encounter, and to provide improved techniques and facilities for ground and flight testing. This paper reviews the following program elements: (1) new approaches to ice protection; (2) numerical codes for deicer analysis; (3) measurement and prediction of ice accretion and its effect on aircraft and aircraft components; (4) special wind tunnel test techniques for rotorcraft icing; (5) improvements of icing wind tunnels and research aircraft; (6) ground de-icing fluids used in winter operation; (7) fundamental studies in icing; and (8) droplet sizing instruments for icing clouds.

  14. Ethnographies of pain: culture, context and complexity

    PubMed Central

    2015-01-01

    This article briefly introduces and discusses the value of ethnographic research, particularly research hailing from the discipline of social and cultural anthropology. After an introduction to ethnography in general, key ethnographic studies of pain are described. These show that ethnography provides a set of techniques for data collection and analysis, as well as a way of thinking about pain as socially and culturally embedded. Modern ethnographic writing is far removed from the literature of the past that sometimes described stereotypes rather than process and complexity. Ethnography provides the chance to describe the complexity and nuance of culture, which serves to counter stereotypes. The article concludes with an example from pain research conducted in a clinical setting. Through the use of ethnographic techniques, the research study provided greater insight than other methods alone might have achieved. The article includes references for further reading should readers be interested in developing their engagement with ethnographic method and interpretation. PMID:26516554

  15. Ridge Regression Signal Processing

    NASA Technical Reports Server (NTRS)

    Kuhl, Mark R.

    1990-01-01

    The introduction of the Global Positioning System (GPS) into the National Airspace System (NAS) necessitates the development of Receiver Autonomous Integrity Monitoring (RAIM) techniques. In order to guarantee a certain level of integrity, a thorough understanding of modern estimation techniques applied to navigational problems is required. The extended Kalman filter (EKF) is derived and analyzed under poor geometry conditions. It was found that the performance of the EKF is difficult to predict, since the EKF is designed for a Gaussian environment. A novel approach is implemented which incorporates ridge regression to explain the behavior of an EKF in the presence of dynamics under poor geometry conditions. The basic principles of ridge regression theory are presented, followed by the derivation of a linearized recursive ridge estimator. Computer simulations are performed to confirm the underlying theory and to provide a comparative analysis of the EKF and the recursive ridge estimator.
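
    The ridge estimator at the core of the approach, beta = (X'X + kI)^(-1) X'y, can be sketched on an ill-conditioned design matrix analogous to the poor-geometry conditions discussed above; the data and the ridge parameter k below are synthetic and arbitrary.

        import numpy as np

        rng = np.random.default_rng(4)
        x1 = rng.normal(size=200)
        X = np.column_stack([x1, x1 + 1e-3 * rng.normal(size=200)])   # nearly collinear columns (poor geometry)
        beta_true = np.array([1.0, 2.0])
        y = X @ beta_true + 0.1 * rng.normal(size=200)

        def ridge(X, y, k):
            # ridge estimate: (X'X + k I)^(-1) X'y
            p = X.shape[1]
            return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

        print("OLS-like (k=0):", ridge(X, y, 0.0))     # unstable under near-collinearity
        print("ridge   (k=1):", ridge(X, y, 1.0))      # shrunk but better conditioned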

  16. Proteomics of Plant Pathogenic Fungi

    PubMed Central

    González-Fernández, Raquel; Prats, Elena; Jorrín-Novo, Jesús V.

    2010-01-01

    Plant pathogenic fungi cause important yield losses in crops. In order to develop efficient and environmentally friendly crop protection strategies, molecular studies of the fungal biological cycle, virulence factors, and interaction with its host are necessary. For that reason, several approaches have been performed using both classical genetic, cell biology, and biochemistry techniques and modern, holistic, high-throughput omic techniques. This work briefly overviews the tools available for studying Plant Pathogenic Fungi and is mainly focused on MS-based Proteomics analysis, based on original papers published up to December 2009. At a methodological level, different steps in a proteomic workflow experiment are discussed. Separate sections are devoted to fungal descriptive (intracellular, subcellular, extracellular) and differential expression proteomics and interactomics. From the work published we can conclude that Proteomics, in combination with other techniques, constitutes a powerful tool for providing important information about pathogenicity and virulence factors, thus opening up new possibilities for crop disease diagnosis and crop protection. PMID:20589070

  17. Proteomics of plant pathogenic fungi.

    PubMed

    González-Fernández, Raquel; Prats, Elena; Jorrín-Novo, Jesús V

    2010-01-01

    Plant pathogenic fungi cause important yield losses in crops. In order to develop efficient and environmentally friendly crop protection strategies, molecular studies of the fungal biological cycle, virulence factors, and interaction with its host are necessary. For that reason, several approaches have been performed using both classical genetic, cell biology, and biochemistry techniques and modern, holistic, high-throughput omic techniques. This work briefly overviews the tools available for studying Plant Pathogenic Fungi and is mainly focused on MS-based Proteomics analysis, based on original papers published up to December 2009. At a methodological level, different steps in a proteomic workflow experiment are discussed. Separate sections are devoted to fungal descriptive (intracellular, subcellular, extracellular) and differential expression proteomics and interactomics. From the work published we can conclude that Proteomics, in combination with other techniques, constitutes a powerful tool for providing important information about pathogenicity and virulence factors, thus opening up new possibilities for crop disease diagnosis and crop protection.

  18. Surface analysis of stone and bone tools

    NASA Astrophysics Data System (ADS)

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.

    2016-03-01

    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  19. A Case of Problematic Diffusion: The Use of Sex Determination Techniques in India.

    ERIC Educational Resources Information Center

    Luthra, Rashmi

    1994-01-01

    Discussion of model shifts in diffusion research focuses on the growth in the use of sex determination techniques in India and their consequences relating to gender and power. Topics addressed include development, underdevelopment, and modernization; the adoption of innovations; and meanings of innovations within particular social systems.…

  20. PSYOP and Persuasion: Applying Social Psychology and Becoming an Informed Citizen

    ERIC Educational Resources Information Center

    King, Sara B.

    2004-01-01

    This project teaches students about persuasion techniques, especially as governments use them. Most project examples came from the work of the U.S. military's modern Psychological Operations division. Social psychology students (a) reviewed influence techniques; (b) examined posters, leaflets, and other persuasion tools used in World War II, the…

  1. Modern Language Classroom Techniques. A Handbook.

    ERIC Educational Resources Information Center

    Allen, Edward David; Valette, Rebecca M.

    The aim of this handbook is to show the teacher ways of implementing and supplementing existing materials. The suggested teaching procedures may be used in classes of varying sizes and levels, and with any method. Although the emphasis is on teacher-made materials, many of the techniques suggested may be implemented with commercial programs,…

  2. Embedding Mixed-Reality Laboratories into E-Learning Systems for Engineering Education

    ERIC Educational Resources Information Center

    Al-Tikriti, Munther N.; Al-Aubidy, Kasim M.

    2013-01-01

    E-learning, virtual learning and mixed reality techniques are now a global integral part of the academic and educational systems. They provide easier access to educational opportunities to a very wide spectrum of individuals to pursue their educational and qualification objectives. These modern techniques have the potentials to improve the quality…

  3. Eukaryotic cell flattening

    NASA Astrophysics Data System (ADS)

    Bae, Albert; Westendorf, Christian; Erlenkamper, Christoph; Galland, Edouard; Franck, Carl; Bodenschatz, Eberhard; Beta, Carsten

    2010-03-01

    Eukaryotic cell flattening is valuable for improving microscopic observations, ranging from bright field to total internal reflection fluorescence microscopy. In this talk, we will discuss traditional overlay techniques, and more modern, microfluidic based flattening, which provides a greater level of control. We demonstrate these techniques on the social amoebae Dictyostelium discoideum, comparing the advantages and disadvantages of each method.

  4. Rapid purification of fluorescent enzymes by ultrafiltration

    NASA Technical Reports Server (NTRS)

    Benjaminson, M. A.; Satyanarayana, T.

    1983-01-01

    In order to expedite the preparation of fluorescently tagged enzymes for histo-cytochemistry, a previously developed method employing gel column purification was compared with a more rapid modern technique using the Millipore Immersible CX-ultrafilter. Microscopic evaluation of the resulting conjugates showed comparable products. Much time and effort is saved using the new technique.

  5. Rapid purification of fluorescent enzymes by ultrafiltration

    NASA Technical Reports Server (NTRS)

    Benjaminson, M. A.; Satyanarayana, T.

    1983-01-01

    In order to expedite the preparation of fluorescently tagged enzymes for histo/cytochemistry, a previously developed method employing gel column purification was compared with a more rapid modern technique using the Millipore Immersible CX-ultrafilter. Microscopic evaluation of the resulting conjugates showed comparable products. Much time and effort is saved using the new technique.

  6. The Throws: Contemporary Theory, Technique and Training.

    ERIC Educational Resources Information Center

    Wilt, Fred, Ed.

    This compilation of articles covers the subject of four throwing events--shot put, discus throw, hammer throw, and javelin throw. The history of the art and science of throwing is traced from ancient to modern times in the introduction. Theories on training and techniques of throwing are presented in essays contributed by coaches from the United…

  7. Technologies of Student Testing for Learning Quality Evaluation in the System of Higher Education

    ERIC Educational Resources Information Center

    Bayukova, Nadezhda Olegovna; Kareva, Ludmila Alexandrovna; Rudometova, Liliya Tarasovna; Shlangman, Marina Konstantinovna; Yarantseva, Natalia Vladislavovna

    2015-01-01

    The paper deals with technology of students' achievement in the area of educational activities, methods, techniques, forms and conditions of monitoring knowledge quality in accordance with the requirements of Russian higher education system modernization. The authors propose methodic techniques of students' training for testing based on innovative…

  8. Bridging the Gap between Basic and Clinical Sciences: A Description of a Radiological Anatomy Course

    ERIC Educational Resources Information Center

    Torres, Anna; Staskiewicz, Grzegorz J.; Lisiecka, Justyna; Pietrzyk, Lukasz; Czekajlo, Michael; Arancibia, Carlos U.; Maciejewski, Ryszard; Torres, Kamil

    2016-01-01

    A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images…

  9. Seismic Source Identification Techniques

    DTIC Science & Technology

    various fields of endeavor in theoretical and experimental seismology and the establishment of a modern geophysical observatory near Eilat, Israel, which includes strainmeters, tiltmeters and high-gain displacement-meters.

  10. Methods of applied dynamics

    NASA Technical Reports Server (NTRS)

    Rheinfurth, M. H.; Wilson, H. B.

    1991-01-01

    The monograph was prepared to give the practicing engineer a clear understanding of dynamics with special consideration given to the dynamic analysis of aerospace systems. It is conceived to be both a desk-top reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject. Beginning with the basic concepts of kinematics and dynamics, the discussion proceeds to treat the dynamics of a system of particles. Both classical and modern formulations of the Lagrange equations, including constraints, are discussed and applied to the dynamic modeling of aerospace structures using the modal synthesis technique.
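
    For reference, the classical Lagrange equations with holonomic constraints take the standard textbook form below (notation assumed here for illustration, not quoted from the monograph):

        \frac{d}{dt}\left(\frac{\partial L}{\partial \dot{q}_j}\right)
          - \frac{\partial L}{\partial q_j}
          = Q_j + \sum_{k=1}^{m} \lambda_k \frac{\partial f_k}{\partial q_j},
        \qquad f_k(q_1,\dots,q_n,t) = 0, \quad k = 1,\dots,m,

    where L = T - V is the Lagrangian, Q_j are the generalized non-conservative forces, and the multipliers lambda_k enforce the constraint equations f_k.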

  11. Closed-Loop Analysis of Soft Decisions for Serial Links

    NASA Technical Reports Server (NTRS)

    Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlensinger, Adam M.

    2012-01-01

    Modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more overhead through noisier channels, and software-defined radios use error-correction techniques that approach Shannon's theoretical limit of performance. The authors describe the benefit of closed-loop measurements for a receiver when paired with a counterpart transmitter and representative channel conditions. We also describe a real-time Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in real-time during the development of software defined radios.

  12. A numerical analysis of the aortic blood flow pattern during pulsed cardiopulmonary bypass.

    PubMed

    Gramigna, V; Caruso, M V; Rossi, M; Serraino, G F; Renzulli, A; Fragomeni, G

    2015-01-01

    In the modern era, stroke remains a main cause of morbidity after cardiac surgery despite continuing improvements in the cardiopulmonary bypass (CPB) techniques. The aim of the current work was to numerically investigate the blood flow in aorta and epiaortic vessels during standard and pulsed CPB, obtained with the intra-aortic balloon pump (IABP). A multi-scale model, realized coupling a 3D computational fluid dynamics study with a 0D model, was developed and validated with in vivo data. The presence of IABP improved the flow pattern directed towards the epiaortic vessels with a mean flow increase of 6.3% and reduced flow vorticity.

  13. 3D-printed and CNC milled flow-cells for chemiluminescence detection.

    PubMed

    Spilstead, Kara B; Learey, Jessica J; Doeven, Egan H; Barbante, Gregory J; Mohr, Stephan; Barnett, Neil W; Terry, Jessica M; Hall, Robynne M; Francis, Paul S

    2014-08-01

    Herein we explore modern fabrication techniques for the development of chemiluminescence detection flow-cells with features not attainable using the traditional coiled tubing approach. This includes the first 3D-printed chemiluminescence flow-cells, and a milled flow-cell designed to split the analyte stream into two separate detection zones within the same polymer chip. The flow-cells are compared to conventional detection systems using flow injection analysis (FIA) and high performance liquid chromatography (HPLC), with the fast chemiluminescence reactions of an acidic potassium permanganate reagent with morphine and a series of adrenergic phenolic amines. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Modern digital flight control system design for VTOL aircraft

    NASA Technical Reports Server (NTRS)

    Broussard, J. R.; Berry, P. W.; Stengel, R. F.

    1979-01-01

    Methods for and results from the design and evaluation of a digital flight control system (DFCS) for a CH-47B helicopter are presented. The DFCS employed proportional-integral control logic to provide rapid, precise response to automatic or manual guidance commands while following conventional or spiral-descent approach paths. It contained altitude- and velocity-command modes, and it adapted to varying flight conditions through gain scheduling. Extensive use was made of linear systems analysis techniques. The DFCS was designed, using linear-optimal estimation and control theory, and the effects of gain scheduling are assessed by examination of closed-loop eigenvalues and time responses.
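
    A minimal sketch of the proportional-integral, gain-scheduled command-tracking structure described above is given below; the plant model, gains, and airspeed schedule are invented placeholders, not the CH-47B design values.

        import numpy as np

        def scheduled_gains(airspeed):
            # crude linear interpolation between two assumed design points
            kp_lo, ki_lo, kp_hi, ki_hi = 2.0, 0.8, 1.2, 0.5
            w = np.clip((airspeed - 20.0) / 40.0, 0.0, 1.0)
            return (1 - w) * kp_lo + w * kp_hi, (1 - w) * ki_lo + w * ki_hi

        dt, x, integ = 0.02, 0.0, 0.0         # simple first-order "plant" state, assumed
        command, airspeed = 1.0, 35.0
        kp, ki = scheduled_gains(airspeed)    # gains adapted to the flight condition

        for _ in range(500):                  # 10 s of simulated closed-loop response
            error = command - x
            integ += error * dt
            u = kp * error + ki * integ       # proportional-integral control law
            x += dt * (-0.5 * x + 0.5 * u)    # assumed plant dynamics x_dot = -0.5 x + 0.5 u
        print("final state (tracks the command):", x)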

  15. Propulsion system/flight control integration for supersonic aircraft

    NASA Technical Reports Server (NTRS)

    Reukauf, P. J.; Burcham, F. W., Jr.

    1976-01-01

    Digital integrated control systems are studied. Such systems allow minimization of undesirable interactions while maximizing performance at all flight conditions. One such program is the YF-12 cooperative control program. The existing analog air data computer, autothrottle, autopilot, and inlet control systems are converted to digital systems by using a general purpose airborne computer and interface unit. Existing control laws are programed and tested in flight. Integrated control laws, derived using accurate mathematical models of the airplane and propulsion system in conjunction with modern control techniques, are tested in flight. Analysis indicates that an integrated autothrottle autopilot gives good flight path control and that observers are used to replace failed sensors.

  16. The 'relics of Joan of Arc': a forensic multidisciplinary analysis.

    PubMed

    Charlier, P; Poupon, J; Eb, A; De Mazancourt, P; Gilbert, T; Huynh-Charlier, I; Loublier, Y; Verhille, A M; Moulheirat, C; Patou-Mathis, M; Robbiola, L; Montagut, R; Masson, F; Etcheberry, A; Brun, L; Willerslev, E; de la Grandmaison, G Lorin; Durigon, M

    2010-01-30

    Archaeological remains can provide concrete cases, making it possible to develop, refine or validate medico-legal techniques. In the case of the so-called 'Joan of Arc's relics' (a group of bone and archaeological remains known as the 'Bottle of Chinon'), 14 specialists analysed the samples as they would an unidentified cadaver of carbonised appearance: forensic anthropologist, medical examiners, pathologists, geneticists, radiologist, biochemists, palynologists, zoologist and archaeologist. Materials, methods and results of this study are presented here. This study aims to offer an exploitable methodology for modern medico-legal cases involving small quantities of human bones of carbonised appearance. 2009 Elsevier Ireland Ltd. All rights reserved.

  17. The first modern Europeans.

    PubMed

    Benazzi, Stefano

    2012-01-01

    The discovery of new human fossil remains is one of the most obvious ways to improve our understanding of the dynamics of human evolution. The reanalysis of existing fossils using newer methods is also crucial, and may lead to a reconsideration of the biological and taxonomical status of some specimens, and improve our understanding of highly debated periods in human prehistory. This is particularly true for those remains that have previously been studied using traditional approaches, with only morphological descriptions and standard calliper measurements available. My own interest in the Uluzzian, and its associated human remains grew from my interest in applying recently developed analytical techniques to quantify morphological variation. Discovered more than 40 years ago, the two deciduous molars from Grotta del Cavallo (Apulia, Italy) are the only human remains associated with the Uluzzian culture (one of the main three European "transitional" cultures). These teeth were previously attributed to Neanderthals. This attribution contributed to a consensus view that the Uluzzian, with its associated ornament and tool complexes, was produced by Neanderthals. A reassessment of these deciduous teeth by means of digital morphometric analysis revealed that these remains belong to anatomically modern humans (AMHs). This finding contradicts previous assumptions and suggests that modern humans, and not Neanderthals, created the Uluzzian culture. Of equal importance, new chronometric analyses date these dental remains to 43,000-45,000 cal BP. Thus, the teeth from Grotta del Cavallo represent the oldest European AMH currently known.

  18. Differentiation of modern and ancient varieties of common wheat by quantitative capillary electrophoretic profile of phenolic acids.

    PubMed

    Gotti, Roberto; Amadesi, Elisa; Fiori, Jessica; Bosi, Sara; Bregola, Valeria; Marotti, Ilaria; Dinelli, Giovanni

    2018-01-12

    Phenolic compounds have received great attention among the health promoting phytochemicals in common wheat (Triticum aestivum L.), mainly because of their strong antioxidant properties. In the present study a simple Capillary Zone Electrophoresis (CZE) method with UV detection was optimized and validated for the quantitation of six of the most important phenolic acids in whole grain, i.e., sinapic, ferulic, syringic, p-coumaric, vanillic and p-hydroxybenzoic acid. The separation was achieved in a running buffer composed of sodium phosphate solution (50 mM) in water/methanol 80:20 (v/v) at pH 6.0 and using a fused-silica capillary at the temperature of 30 °C under application of 27 kV. By means of a diode array detector, and made possible by the favorable characteristic UV spectra, the quantitation of the solutes was carried out at 200, 220 and 300 nm, in the complex matrices represented by the soluble and bound fractions of wheat flours. The validation parameters of the method, i.e., linearity, sensitivity, precision, accuracy and robustness, were in line with those obtained by consolidated separation techniques applied for the same purposes (e.g., HPLC-UV), with a significant advantage in terms of analysis time (less than 12 min). Ten varieties of soft wheat (five modern Italian and five old Italian genotypes) were analysed and the data were subjected to Principal Components Analysis (PCA). Interestingly, significant differences in the quantitative phenolic acid profiles were observed between the modern and the ancient genotypes, with the latter showing higher amounts of the main phenolic acids. Copyright © 2017 Elsevier B.V. All rights reserved.
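
    The chemometric step can be sketched with a small principal component analysis of phenolic acid concentrations; the concentration matrix below is fabricated for demonstration and does not reproduce the paper's measurements.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        acids = ["sinapic", "ferulic", "syringic", "p-coumaric", "vanillic", "p-hydroxybenzoic"]
        rng = np.random.default_rng(5)
        modern = rng.normal(loc=[40, 300, 15, 25, 20, 10], scale=5, size=(5, 6))    # assumed mg/kg values
        ancient = rng.normal(loc=[55, 380, 20, 35, 28, 14], scale=5, size=(5, 6))
        X = np.vstack([modern, ancient])

        # standardize the six acid concentrations, then project onto two principal components
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for label, pc in zip(["modern"] * 5 + ["old"] * 5, scores):
            print(label, pc.round(2))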

  19. Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henneke, Dennis W.; Robinson, James

    In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development, modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus areas for future R&D, and conclusions about the PRISM design.

  20. Effective approach to spectroscopy and spectral analysis techniques using Matlab

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Lv, Yong

    2017-08-01

    With the development of electronic information, computers and networks, modern education technology has entered a new era, which has had a great impact on the teaching process. Spectroscopy and spectral analysis is an elective course for Optoelectronic Information Science and Engineering. The teaching objective of this course is to master the basic concepts and principles of spectroscopy and spectral analysis and the basic technical means of testing, and then to let the students learn how the principles and technology of spectroscopy are used to study the structure and state of materials, together with the developing process of the technology. MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment and fourth-generation programming language. A proprietary programming language developed by MathWorks, MATLAB allows matrix manipulations and plotting of functions and data. Based on the teaching practice, this paper summarizes the new situation of applying Matlab to the teaching of spectroscopy, an approach suitable for most current multimedia-assisted teaching.

  1. [Sanitation of the health service centre in Warsaw (Samodzielny Zespół Publicznych Zakładów Lecznictwa Otwartego Warszawa-Mokotów). Financial and economic analysis].

    PubMed

    Buczak-Stec, Elzbieta

    2010-01-01

    The aim of the financial and economic analysis, conducted in March 2010, was to identify all significant factors that had a positive influence on the restructuring process in the health service centre (Samodzielny Zespół Publicznych Zakładów Lecznictwa Otwartego Warszawa-Mokotów) in Warsaw. Within the framework of the analysis, financial data from the period 1999-2009 were analyzed, and the managing director and financial director were interviewed. The results indicate that it was not a single factor but a combination of purposeful efforts that improved the condition of the health service centre. Apart from the public aid received, the most significant factors include a rational restructuring process, management of personnel development, a professionally managed financial department, cooperation between departments, good internal communication and the use of modern management techniques.

  2. Methods for simulation-based analysis of fluid-structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for analysis of fluid-structure interaction using high fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extension of the methodology to allow construction of ROMs based on data generated from ALE simulations.
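    The POD step mentioned in this record (though not the ALE coupling itself) can be sketched in a few lines of Python: build a snapshot matrix, extract a basis by singular value decomposition, and check how well a truncated basis reconstructs the data. The travelling-wave snapshots and all parameter values below are invented for illustration.

    ```python
    # POD of a synthetic snapshot matrix: a travelling, decaying wave sampled in time.
    import numpy as np

    nx, nt = 200, 80
    x = np.linspace(0.0, 1.0, nx)
    t = np.linspace(0.0, 2.0, nt)
    # Columns are snapshots of the field u(x, t_k).
    U = np.array([np.exp(-0.5 * tk) * np.sin(2 * np.pi * (x - 0.4 * tk)) for tk in t]).T

    u_mean = U.mean(axis=1, keepdims=True)
    Phi, s, _ = np.linalg.svd(U - u_mean, full_matrices=False)   # Phi: spatial POD modes

    energy = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(energy, 0.999) + 1)                  # modes for 99.9% energy
    print(f"{r} POD modes capture {energy[r-1]:.4%} of the fluctuation energy")

    # Reduced coordinates: project the snapshots onto the truncated basis,
    # then reconstruct to check the truncation error.
    a = Phi[:, :r].T @ (U - u_mean)                              # r x nt coefficient matrix
    U_rom = u_mean + Phi[:, :r] @ a
    print("max reconstruction error:", np.abs(U - U_rom).max())
    ```

    A Galerkin ROM would go one step further and project the governing equations onto the same truncated basis; the sketch stops at basis construction and reconstruction.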

  3. Control of flexible structures

    NASA Technical Reports Server (NTRS)

    Russell, R. A.

    1985-01-01

    The requirements for future space missions indicate that many of these spacecraft will be large, flexible, and in some applications, require precision geometries. A technology program that addresses the issues associated with the structure/control interactions for these classes of spacecraft is discussed. The goal of the NASA control of flexible structures technology program is to generate a technology data base that will provide the designer with options and approaches to achieve spacecraft performance such as maintaining geometry and/or suppressing undesired spacecraft dynamics. This technology program will define the appropriate combination of analysis, ground testing, and flight testing required to validate the structural/controls analysis and design tools. This work was motivated by a recognition that large minimum weight space structures will be required for many future missions. The tools necessary to support such design included: (1) improved structural analysis; (2) modern control theory; (3) advanced modeling techniques; (4) system identification; and (5) the integration of structures and controls.

  4. Non-linear optical flow cytometry using a scanned, Bessel beam light-sheet.

    PubMed

    Collier, Bradley B; Awasthi, Samir; Lieu, Deborah K; Chan, James W

    2015-05-29

    Modern flow cytometry instruments have become vital tools for high-throughput analysis of single cells. However, as issues with the cellular labeling techniques often used in flow cytometry have become more of a concern, the development of label-free modalities for cellular analysis is increasingly desired. Non-linear optical phenomena (NLO) are of growing interest for label-free analysis because of the ability to measure the intrinsic optical response of biomolecules found in cells. We demonstrate that a light-sheet consisting of a scanned Bessel beam is an optimal excitation geometry for efficiently generating NLO signals in a microfluidic environment. The balance of photon density and cross-sectional area provided by the light-sheet allowed significantly larger two-photon fluorescence intensities to be measured in a model polystyrene microparticle system compared to measurements made using other excitation focal geometries, including a relaxed Gaussian excitation beam often used in conventional flow cytometers.

  5. The Fourier analysis of biological transients.

    PubMed

    Harris, C M

    1998-08-31

    With modern computing technology the digital implementation of the Fourier transform is widely available, mostly in the form of the fast Fourier transform (FFT). Although the FFT has become almost synonymous with the Fourier transform, it is a fast numerical technique for computing the discrete Fourier transform (DFT) of a finite sequence of sampled data. The DFT is not directly equivalent to the continuous Fourier transform of the underlying biological signal, which becomes important when analyzing biological transients. Although this distinction is well known by some, for many it leads to confusion in how to interpret the FFT of biological data, and in how to precondition data so as to yield a more accurate Fourier transform using the FFT. We review here the fundamentals of Fourier analysis with emphasis on the analysis of transient signals. As an example of a transient, we consider the human saccade to illustrate the pitfalls and advantages of various Fourier analyses.
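    The DFT-versus-continuous-transform distinction drawn in this record can be made concrete with a short Python sketch: for a well-sampled transient that has decayed to zero within the record, the FFT multiplied by the sampling interval (with a half-weight on the first sample to handle the jump at t = 0) approximates the continuous Fourier transform at frequencies well below Nyquist, while aliasing dominates near Nyquist. The decaying exponential is a generic stand-in for a biological transient, not data from the paper.

    ```python
    # Approximating the continuous Fourier transform of a transient using the FFT.
    import numpy as np

    a = 5.0        # decay rate of the model transient x(t) = exp(-a t) for t >= 0
    dt = 1e-3      # sampling interval (s); Nyquist frequency is 500 Hz
    n = 8192       # record long enough for the transient to decay essentially to zero
    t = np.arange(n) * dt
    x = np.exp(-a * t)

    f = np.fft.rfftfreq(n, dt)
    # dt * DFT approximates the Fourier integral; weighting the first sample by 1/2
    # (trapezoid rule) compensates for the jump of the transient at t = 0.
    X_fft = dt * (np.fft.rfft(x) - 0.5 * x[0])
    X_true = 1.0 / (a + 1j * 2 * np.pi * f)          # analytic continuous FT of exp(-a t)

    low = f <= 50.0                                  # well below Nyquist, aliasing is small
    rel_err = np.abs(X_fft[low] - X_true[low]) / np.abs(X_true[low])
    print(f"max relative error below 50 Hz: {rel_err.max():.3e}")
    print(f"relative error at Nyquist ({f[-1]:.0f} Hz): "
          f"{abs(X_fft[-1] - X_true[-1]) / abs(X_true[-1]):.2f}  (aliasing dominates)")
    ```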

  6. On-call service of neurosurgeons in Germany: organization, use of communication services, and personal acceptance of modern technologies.

    PubMed

    Brenke, Christopher; Lassel, Elke A; Terris, Darcey; Kurt, Aysel; Schmieder, Kirsten; Schoenberg, Stefan O; Weisser, Gerald

    2014-05-01

    A significant proportion of acute care neurosurgical patients present to hospital outside regular working hours. The objective of our study was to evaluate the structure of neurosurgical on-call services in Germany, the use of modern communication devices and teleradiology services, and the personal acceptance of modern technologies by neurosurgeons. A nationwide survey of all 141 neurosurgical departments in Germany was performed. The questionnaire consisted of two parts: one for neurosurgical departments and one for individual neurosurgeons. The questionnaire, available online and mailed in paper form, included 21 questions about on-call service structure; the availability and use of communication devices, teleradiology services, and other information services; and neurosurgeons' personal acceptance of modern technologies. The questionnaire return rate from departments was 63.1% (89/141), whereas 187 individual neurosurgeons responded. For 57.3% of departments, teleradiology services were available and were frequently used by 62.2% of neurosurgeons. A further 23.6% of departments described using smartphone screenshots of computed tomography (CT) images transmitted by multimedia messaging service (MMS), and 8.6% of images were described as sent by unencrypted email. Although 47.0% of neurosurgeons reported owning a smartphone, only 1.1% used their phone for on-call image communication. Teleradiology services were observed to be widely used by on-call neurosurgeons in Germany. Nevertheless, a significant number of departments appear to use outdated techniques or techniques that leave patient data unprotected. On-call neurosurgeons in Germany report a willingness to adopt more modern approaches, utilizing readily available smartphones or tablet technology. Georg Thieme Verlag KG Stuttgart · New York.

  7. Documenting Modern Mexican Architectural Heritage for Posterity: Barragan's Casa Cristo, in Guadalajara, Mexico

    NASA Astrophysics Data System (ADS)

    Mezzino, D.; Pei, W.; Santana Quintero, M.; Reyes Rodriguez, R.

    2015-08-01

    This contribution describes the results of an International workshop on documentation of historic and cultural heritage developed jointly between Universidad de Guadalajara's Centro Universitario de Arte, Arquitectura y Diseño (CUAAD) and Carleton University's Architectural Conservation and Sustainability Program. The objective of the workshop was to create a learning environment for emerging heritage professionals through the use of advanced recording techniques for the documentation of modern architectural heritage in Guadalajara, Mexico. The selected site was Casa Cristo, one of the several architectural projects by Luis Barragán in Guadalajara. The house was built between 1927 and 1929 for Gustavo R. Cristo, mayor of the city. The style of the building reflects the European influences derived from the architect's travel experience, as well as the close connection with local craftsmanship. All of these make the house an outstanding example of modern regional architecture. A systematic documentation strategy was developed for the site, using different survey equipment and techniques to capture the shape, colour, spatial configuration, and current conditions of Casa Cristo for its eventual rehabilitation and conservation.

  8. Incorporating modern neuroscience findings to improve brain-computer interfaces: tracking auditory attention.

    PubMed

    Wronkiewicz, Mark; Larson, Eric; Lee, Adrian Kc

    2016-10-01

    Brain-computer interface (BCI) technology allows users to generate actions based solely on their brain signals. However, current non-invasive BCIs generally classify brain activity recorded from surface electroencephalography (EEG) electrodes, which can hinder the application of findings from modern neuroscience research. In this study, we use source imaging, a neuroimaging technique that projects EEG signals onto the surface of the brain, in a BCI classification framework. This allowed us to incorporate prior research from functional neuroimaging to target activity from a cortical region involved in auditory attention. Classifiers trained to detect attention switches performed better with source imaging projections than with EEG sensor signals. Within source imaging, including subject-specific anatomical MRI information (instead of using a generic head model) further improved classification performance. This source-based strategy also reduced accuracy variability across three dimensionality reduction techniques, a major design choice in most BCIs. Our work shows that source imaging provides clear quantitative and qualitative advantages to BCIs and highlights the value of incorporating modern neuroscience knowledge and methods into BCI systems.
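    The classification stage discussed in this record can be sketched with a generic scikit-learn pipeline (standardisation, PCA as the dimensionality-reduction step, logistic regression). The feature matrix below is synthetic and merely stands in for source-imaged EEG features; every name and number is illustrative rather than taken from the study.

    ```python
    # Illustrative BCI-style classification with a dimensionality-reduction step.
    # The features are synthetic stand-ins for source-space EEG activity.
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_trials, n_features = 200, 500            # e.g. trials x (sources * time points)
    y = rng.integers(0, 2, n_trials)           # 0 = hold attention, 1 = switch attention

    X = rng.normal(size=(n_trials, n_features))
    X[y == 1, :20] += 0.4                      # weak class-dependent signal in a few features

    clf = make_pipeline(StandardScaler(),
                        PCA(n_components=30),  # dimensionality reduction before the classifier
                        LogisticRegression(max_iter=1000))
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
    ```

    Swapping the PCA step for another reduction technique is a one-line change in such a pipeline, which is the kind of design choice whose impact the study reports being reduced by source imaging.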

  9. Socio-cultural inhibitors to use of modern contraceptive techniques in rural Uganda: a qualitative study.

    PubMed

    Kabagenyi, Allen; Reid, Alice; Ntozi, James; Atuyambe, Lynn

    2016-01-01

    Family planning is one of the cost-effective strategies in reducing maternal and child morbidity and mortality rates. Yet in Uganda, the contraceptive prevalence rate is only 30% among married women in conjunction with a persistently high fertility rate of 6.2 children per woman. These demographic indicators have contributed to a high population growth rate of over 3.2% annually. This study examines the role of socio-cultural inhibitions in the use of modern contraceptives in rural Uganda. This was a qualitative study conducted in 2012 among men aged 15-64 and women aged 15-49 in the districts of Mpigi and Bugiri in rural Uganda. Eighteen selected focus group discussions (FGDs), each internally homogeneous, and eight in-depth interviews (IDIs) were conducted among men and women. Data were collected on sociocultural beliefs and practices, barriers to modern contraceptive use and perceptions of and attitudes to contraceptive use. All interviews were tape recorded, translated and transcribed verbatim. All the transcripts were coded, arranged into categories and later analyzed using a latent content analysis approach, with the support of ATLAS.ti qualitative software. Suitable quotations were used to provide in-depth explanations of the findings. Three themes central in hindering the uptake of modern contraceptives emerged: (i) persistence of socio-cultural beliefs and practices promoting births (such as polygamy, extending family lineage, replacement of the dead, gender-based violence, power relations and twin myths); (ii) continued reliance on traditional family planning practices; and (iii) misconceptions and fears about modern contraception. Sociocultural expectations and values attached to marriage, women and child bearing remain an impediment to using family planning methods. The study suggests a need to eradicate the cultural beliefs and practices that hinder people from using contraceptives, as well as a need to scale up family planning services and sensitization at the grassroots.

  10. Socio-cultural inhibitors to use of modern contraceptive techniques in rural Uganda: a qualitative study

    PubMed Central

    Kabagenyi, Allen; Reid, Alice; Ntozi, James; Atuyambe, Lynn

    2016-01-01

    Introduction Family planning is one of the cost-effective strategies in reducing maternal and child morbidity and mortality rates. Yet in Uganda, the contraceptive prevalence rate is only 30% among married women in conjunction with a persistently high fertility rate of 6.2 children per woman. These demographic indicators have contributed to a high population growth rate of over 3.2% annually. This study examines the role of socio-cultural inhibitions in the use of modern contraceptives in rural Uganda. Methods This was a qualitative study conducted in 2012 among men aged 15-64 and women aged 15-49 in the districts of Mpigi and Bugiri in rural Uganda. Eighteen selected focus group discussions (FGDs), each internally homogeneous, and eight in-depth interviews (IDIs) were conducted among men and women. Data were collected on sociocultural beliefs and practices, barriers to modern contraceptive use and perceptions of and attitudes to contraceptive use. All interviews were tape recorded, translated and transcribed verbatim. All the transcripts were coded, arranged into categories and later analyzed using a latent content analysis approach, with the support of ATLAS.ti qualitative software. Suitable quotations were used to provide in-depth explanations of the findings. Results Three themes central in hindering the uptake of modern contraceptives emerged: (i) persistence of socio-cultural beliefs and practices promoting births (such as polygamy, extending family lineage, replacement of the dead, gender-based violence, power relations and twin myths); (ii) continued reliance on traditional family planning practices; and (iii) misconceptions and fears about modern contraception. Conclusion Sociocultural expectations and values attached to marriage, women and child bearing remain an impediment to using family planning methods. The study suggests a need to eradicate the cultural beliefs and practices that hinder people from using contraceptives, as well as a need to scale up family planning services and sensitization at the grassroots. PMID:28292041

  11. [Research on compatibility of prescriptions including Ginseng Radix et Rhizoma and Trogopterus Dung based on complex network analysis].

    PubMed

    Li, Meng-Wen; Fan, Xin-Sheng; Zhang, Ling-Shan; Wang, Cong-Jun

    2017-09-01

    The applications of prescriptions including Ginseng Radix et Rhizoma and Trogopterus Dung in the contemporary literature from 1949 to 2016 were compiled, and data mining techniques, including a scale-free complex network method, were used to explore their practical characteristics, with a comparison between modern and ancient prescriptions. The results indicate that malignant neoplasms and coronary heart disease presenting with Qi deficiency and blood stasis patterns are the main diseases treated by prescriptions including Ginseng Radix et Rhizoma and Trogopterus Dung according to the reports from 1949 to 2016. The complex network connections show that Glycyrrhizae Radix et Rhizoma, Angelicae Sinensis Radix, Astragali Radix, Typhae Pollen and Salviae Miltiorrhizae Radix et Rhizoma are the primary drugs related to Ginseng Radix et Rhizoma and Trogopterus Dung, followed by Paeoniae Radix Alba, Atractylodis Macrocephalae Rhizoma, Persicae Semen, Poria and others, while Carthami Flos, Notoginseng Radix et Rhizoma, Cyperi Rhizoma and Bupleuri Radix are peripheral. Ginseng Radix et Rhizoma-Glycyrrhizae Radix et Rhizoma, Trogopterus Dung-Glycyrrhizae Radix et Rhizoma, Ginseng Radix et Rhizoma-Angelicae Sinensis Radix, Trogopterus Dung-Angelicae Sinensis Radix, Ginseng Radix et Rhizoma-Astragali Radix and Trogopterus Dung-Astragali Radix are the main drug pairs, and the combinations Ginseng Radix et Rhizoma-Trogopterus Dung-Glycyrrhizae Radix et Rhizoma, Ginseng Radix et Rhizoma-Trogopterus Dung-Angelicae Sinensis Radix, Ginseng Radix et Rhizoma-Trogopterus Dung-Astragali Radix and Ginseng Radix et Rhizoma-Trogopterus Dung-Typhae Pollen have a higher degree of support. The main compatible drugs differ between ancient and modern prescriptions including Ginseng Radix et Rhizoma and Trogopterus Dung: Notoginseng Radix et Rhizoma, Typhae Pollen, Salviae Miltiorrhizae Radix et Rhizoma and Astragali Radix are used frequently in modern prescriptions but less so in ancient ones. The comparison also shows that more attention is paid in modern times to drugs that invigorate Qi and promote blood circulation. Copyright© by the Chinese Pharmaceutical Association.
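    The kind of herb co-occurrence network underlying results like these can be built in a few lines with Python and networkx. The tiny prescription list below is hypothetical (with shortened drug names) and is meant only to show the mechanics of counting pairwise co-occurrences as edge weights.

    ```python
    # Build a weighted herb co-occurrence network from a list of prescriptions.
    # The prescriptions below are made-up examples, not data from the study.
    from itertools import combinations
    import networkx as nx

    prescriptions = [
        ["Ginseng", "Trogopterus Dung", "Glycyrrhizae", "Angelicae Sinensis"],
        ["Ginseng", "Trogopterus Dung", "Astragali", "Typhae Pollen"],
        ["Ginseng", "Trogopterus Dung", "Glycyrrhizae", "Astragali"],
        ["Ginseng", "Trogopterus Dung", "Salviae Miltiorrhizae", "Angelicae Sinensis"],
    ]

    G = nx.Graph()
    for herbs in prescriptions:
        for a, b in combinations(sorted(set(herbs)), 2):
            w = G.get_edge_data(a, b, {"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)          # edge weight = co-occurrence count

    # Herbs most strongly connected to the core pair, by summed edge weight.
    core = {"Ginseng", "Trogopterus Dung"}
    strength = {
        n: sum(G[u][n]["weight"] for u in core if G.has_edge(u, n))
        for n in G.nodes if n not in core
    }
    for herb, s in sorted(strength.items(), key=lambda kv: -kv[1]):
        print(f"{herb:22s} co-occurrence with core pair: {s}")
    ```

    Support for larger combinations (pairs plus a third drug, as reported above) would be counted the same way, over triples instead of pairs.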

  12. Identification and imaging of modern paints using Secondary Ion Mass Spectrometry with MeV ions

    NASA Astrophysics Data System (ADS)

    Bogdanović Radović, Iva; Siketić, Zdravko; Jembrih-Simbürger, Dubravka; Marković, Nikola; Anghelone, Marta; Stoytschew, Valentin; Jakšić, Milko

    2017-09-01

    Secondary Ion Mass Spectrometry using MeV ion excitation was applied to analyse modern paint materials containing synthetic organic pigments and binders. It was demonstrated that synthetic organic pigments and binder components with molecular masses in the m/z range from 1 to 1200 could be identified in different paint samples with high efficiency and in a single measurement. Different ways of mounting the mostly insulating paint samples were tested prior to the analysis in order to achieve the highest possible yield of the pigments' main molecular ions. As the Time-of-Flight mass spectrometer for MeV Secondary Ion Mass Spectrometry is attached to the heavy ion microprobe, molecular imaging of cross-sections of small paint fragments was performed using focused ions. Because molecules are extracted from the uppermost layer of the sample, and to avoid surface contamination, the paint samples were not embedded in resin, as is usually done when imaging paint samples with other techniques in the field of cultural heritage.

  13. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission, drawing on more than 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting, detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.

  14. Correlative weighted stacking for seismic data in the wavelet domain

    USGS Publications Warehouse

    Zhang, S.; Xu, Y.; Xia, J.; ,

    2004-01-01

    Horizontal stacking plays a crucial role in modern seismic data processing: it not only suppresses random noise and multiple reflections, but also provides foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Both average stacking and weighted stacking based on the conventional correlation function can produce false events caused by such noise. Wavelet transforms and higher-order statistics are very useful tools in modern signal processing: the multiresolution analysis of wavelet theory can decompose a signal on different scales, and higher-order correlation functions can suppress correlated noise against which the conventional correlation function is of no use. Based on the theory of the wavelet transform and higher-order statistics, a high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal moveout correction using weights calculated from higher-order correlative statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
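    A much simplified version of this workflow can be sketched in Python with PyWavelets: decompose each NMO-corrected trace, weight traces per wavelet scale by their correlation with a pilot stack, and reconstruct. Note that the weights here use ordinary (second-order) correlation, whereas the paper's HOCWS method uses high-order statistics; the gather below is synthetic.

    ```python
    # Simplified wavelet-domain correlative weighted stacking of an NMO-corrected gather.
    # Weights use ordinary correlation per scale; the paper's method uses high-order
    # statistics, so this is only a structural sketch of the workflow.
    import numpy as np
    import pywt

    rng = np.random.default_rng(2)
    n_traces, n_samples = 24, 512
    t = np.arange(n_samples)
    signal = np.exp(-((t - 256) / 12.0) ** 2) * np.cos(0.35 * (t - 256))   # a flattened event
    noise_scale = rng.uniform(0.3, 1.5, size=(n_traces, 1))                # uneven trace quality
    gather = signal + noise_scale * rng.normal(size=(n_traces, n_samples))

    wavelet, level = "db4", 4
    coeffs = [pywt.wavedec(tr, wavelet, level=level) for tr in gather]     # per-trace coefficients
    pilot = [np.mean([c[k] for c in coeffs], axis=0) for k in range(level + 1)]

    stacked_coeffs = []
    for k in range(level + 1):
        band = np.array([c[k] for c in coeffs])                            # traces x coefficients
        # Correlation of each trace with the pilot in this band -> stacking weight.
        w = np.array([max(np.corrcoef(b, pilot[k])[0, 1], 0.0) for b in band])
        w = w / w.sum() if w.sum() > 0 else np.full(n_traces, 1.0 / n_traces)
        stacked_coeffs.append(w @ band)

    stack = pywt.waverec(stacked_coeffs, wavelet)[:n_samples]
    plain = gather.mean(axis=0)

    def snr_db(est):
        return 10 * np.log10(np.sum(signal ** 2) / np.sum((est - signal) ** 2))

    print(f"plain stack SNR: {snr_db(plain):5.1f} dB   weighted stack SNR: {snr_db(stack):5.1f} dB")
    ```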

  15. A Survey of the Teaching and Learning of Modern Foreign Languages in a Sample of Inner City and Urban Schools, Spring Term 1989. A Report by HMI.

    ERIC Educational Resources Information Center

    Department of Education and Science, London (England).

    A study investigated techniques and practices for teaching second languages (French, German, Spanish) in 25 urban schools in different areas of England. It was found that the overall quality of work in modern languages was very good in 1 school, good in 5, satisfactory in 7, less than satisfactory in 10, and poor in 2. Three of 10 lessons seen…

  16. A New Approach to Business Writing.

    ERIC Educational Resources Information Center

    Egan, Michael

    1998-01-01

    Explains how business writing can be taught using examples from modern literature and the analytical tools of literary criticism. Uses Michener, Hemingway, Faulkner, and Steinbeck to illustrate techniques. (SK)

  17. Normative Value Conceptions of Modern Parents, Teachers, and Educators (Analysis of Moral Value Judgments)

    ERIC Educational Resources Information Center

    Shelina, S. L.; Mitina, O. V.

    2015-01-01

    The article presents the results of an analysis of the moral value judgments of adults (parents, teachers, educators) that directly concern the socialization process of the young generation in the modern metropolis. This paper follows the model study by Jean Piaget that investigated the moral value judgments of children. A comparative analysis of…

  18. Applications of hyperspectral imaging in chicken meat safety and quality detection and evaluation: a review.

    PubMed

    Xiong, Zhenjie; Xie, Anguo; Sun, Da-Wen; Zeng, Xin-An; Liu, Dan

    2015-01-01

    Currently, food safety and quality are of great public concern. In order to satisfy the demands of consumers and obtain superior food quality, non-destructive and fast methods are required for quality evaluation. As one of these methods, the hyperspectral imaging (HSI) technique has emerged as a smart and promising analytical tool for quality evaluation and has attracted much interest for the non-destructive analysis of different food products. With the main advantage of combining spectroscopic and imaging techniques, HSI has shown convincing potential for detecting and evaluating chicken meat quality objectively. Moreover, developing a quality evaluation system based on HSI technology would bring economic benefits to the chicken meat industry. Therefore, in recent years, many studies have been conducted on using HSI technology for the safety and quality detection and evaluation of chicken meat. The aim of this review is thus to give a detailed overview of HSI and to focus on recently developed HSI methods for microbiological spoilage detection and quality classification of chicken meat. Moreover, the usefulness of the HSI technique for detecting fecal contamination and bone fragments in chicken carcasses is presented. Finally, some viewpoints on its future research and applicability in the modern poultry industry are proposed.

  19. Two-Person Control: A Brief History and Modern Industry Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedersen, Robert Douglas

    Physical asset protection is the principal objective of many security and safeguard measures. One well-known means of asset protection is two-person control. This paper reviews literature regarding two-person control to gain insight into its origin, its first demonstrated uses, and its presence in several modern industries. This literature review of two-person control is intended to benefit people and organizations with a desire to understand its origins and how the practice has evolved over time, as well as to give some insight into the flexibility of this safeguarding technique. The literature review is focused in four main sections: (1) defining two-person control, (2) early history, (3) two-person control in modern industry, and (4) a theory on how two-person control entered modern industry.

  20. A modern approach to the authentication and quality assessment of thyme using UV spectroscopy and chemometric analysis.

    PubMed

    Gad, Haidy A; El-Ahmady, Sherweit H; Abou-Shoer, Mohamed I; Al-Azizi, Mohamed M

    2013-01-01

    Recently, the fields of chemometrics and multivariate analysis have been widely implemented in the quality control of herbal drugs to produce precise results, which is crucial in the field of medicine. Thyme represents an essential medicinal herb that is constantly adulterated due to its resemblance to many other plants with similar organoleptic properties. To establish a simple model for the quality assessment of Thymus species using UV spectroscopy together with known chemometric techniques. The success of this model may also serve as a technique for the quality control of other herbal drugs. The model was constructed using 30 samples of authenticated Thymus vulgaris and challenged with 20 samples of different botanical origins. The methanolic extracts of all samples were assessed using UV spectroscopy together with chemometric techniques: principal component analysis (PCA), soft independent modeling of class analogy (SIMCA) and hierarchical cluster analysis (HCA). The model was able to discriminate T. vulgaris from other Thymus, Satureja, Origanum, Plectranthus and Eriocephalus species, all traded in the Egyptian market as different types of thyme. The model was also able to classify closely related species in clusters using PCA and HCA. The model was finally used to classify 12 commercial thyme varieties into clusters of species incorporated in the model as thyme or non-thyme. The model constructed is highly recommended as a simple and efficient method for distinguishing T. vulgaris from other related species as well as the classification of marketed herbs as thyme or non-thyme. Copyright © 2013 John Wiley & Sons, Ltd.
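    The PCA/HCA portion of such a workflow can be sketched in Python; SIMCA is omitted because it has no standard scikit-learn implementation. The "spectra" generated below are smooth synthetic curves standing in for methanolic-extract UV spectra of two hypothetical species groups, and no wavelengths or class labels come from the study.

    ```python
    # PCA and hierarchical clustering of synthetic UV "spectra" for two herb groups.
    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(3)
    wl = np.arange(230, 401)                       # wavelengths, nm

    def band(center, width, height):
        return height * np.exp(-0.5 * ((wl - center) / width) ** 2)

    # Group A ("thyme-like") and group B ("non-thyme") differ in band positions.
    group_a = [band(275, 12, 1.0) + band(330, 20, 0.4) + 0.02 * rng.normal(size=wl.size)
               for _ in range(15)]
    group_b = [band(265, 12, 0.9) + band(345, 20, 0.6) + 0.02 * rng.normal(size=wl.size)
               for _ in range(15)]
    X = np.vstack(group_a + group_b)
    labels = ["A"] * 15 + ["B"] * 15

    scores = PCA(n_components=2).fit_transform(X - X.mean(axis=0))
    Z = linkage(scores, method="ward")             # HCA on the PCA scores
    clusters = fcluster(Z, t=2, criterion="maxclust")

    for lab in "AB":
        members = clusters[[i for i, l in enumerate(labels) if l == lab]]
        print(f"group {lab}: assigned to clusters {sorted(set(members.tolist()))}")
    ```

    With well-separated spectral features, each group falls into its own cluster, which is the behaviour a model of the kind described above relies on when classifying market samples as thyme or non-thyme.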

  1. Behaviour change techniques and contraceptive use in low and middle income countries: a review.

    PubMed

    Phiri, Mwelwa; King, R; Newell, J N

    2015-10-30

    We aimed to identify effective behaviour change techniques to increase modern contraceptive use in low and middle income countries (LMICs). Literature was identified in Global Health, Web of Science, MEDLINE, PsycINFO and Popline, as well as peer reviewed journals. Articles were included if they were written in English, had an outcome evaluation of contraceptive use, modern contraceptive use, contraceptive initiation/uptake, contraceptive adherence or continuation of contraception, were a systematic review or randomised controlled trial, and were conducted in a low or middle income country. We assessed the behaviour change techniques used in each intervention and included a new category of male partner involvement. We identified six studies meeting the inclusion criteria. The most effective interventions were those that involved male partners in the decision to initiate contraceptive use. The findings also suggest that providing access to contraceptives in the community promotes their use. The interventions that had positive effects on contraceptive use employed a combination of behaviour change techniques. Performance techniques were not used in any of the interventions, and social support techniques, which are meant to improve wider social acceptability, appeared in only two of the interventions. Our findings suggest that when information and contraceptives are provided, contraceptive use improves. We recommend that reports of behaviour change studies include more details of the intervention and the techniques employed. There is also a need for further research to understand which techniques are especially effective.

  2. [Multifocal intraocular lenses. A review].

    PubMed

    Auffarth, G U; Dick, H B

    2001-02-01

    Modern cataract surgery has developed tremendously during the past 10-15 years. Improved surgical techniques, as well as improved implant materials and designs, have enlarged patient profiles and indications for cataract surgery. This has also created much higher expectations on the patients' side. The loss of accommodation is a loss of quality of life for presbyopic and especially young pseudophakic patients. Therefore, cataract surgery with multifocal IOL (MIOL) implantation is not only of academic interest but reflects the demands and expectations of our patients. Multifocal IOLs have been implanted since 1986, starting with 2-3 zone refractive and diffractive designs. With the surgical techniques of that time, MIOL decentration and surgically induced astigmatism were possible complications. In addition, reduced contrast sensitivity and increased glare were common problems of MIOLs because of their optical principles. New developments in this field in recent years, such as multizonal, progressive refractive MIOLs, in combination with improved surgical techniques, have overcome those initial problems. Therefore, modern multifocal IOLs can be considered not only for the correction of aphakia but also for refractive purposes.

  3. The Political Persuaders; The Techniques of Modern Election Campaigns.

    ERIC Educational Resources Information Center

    Nimmo, Dan

    Over the last 20 years, a successful election campaign has come to depend in large part on successful use of the broadcast media. As a result, media experts are part of most politicians' teams, and their strategies help determine the results of the election. Usually, themes or "images" are more important than issues. The techniques of mass…

  4. [MLPA technique--principles and use in practice].

    PubMed

    Rusu, Cristina; Sireteanu, Adriana; Puiu, Maria; Skrypnyk, Cristina; Tomescu, E; Csep, Katalin; Creţ, Victoria; Barbarii, Ligia

    2007-01-01

    MLPA (Multiplex Ligation-dependent Probe Amplification) is a recently introduced method, based on the PCR principle, useful for the detection of different genetic abnormalities (aneuploidies, gene deletions/duplications, subtelomeric rearrangements, methylation status, etc.). The technique is simple, reliable and cheap. We present this method, discuss its importance for a modern genetic service and underline its multiple advantages.

  5. Benefits of advanced software techniques for mission planning systems

    NASA Technical Reports Server (NTRS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-01-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  6. Benefits of advanced software techniques for mission planning systems

    NASA Astrophysics Data System (ADS)

    Gasquet, A.; Parrod, Y.; Desaintvincent, A.

    1994-10-01

    The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.

  7. Classroom Techniques: Foreign Languages and English as Second Language.

    ERIC Educational Resources Information Center

    Allen, Edward David; Valette, Rebecca M.

    The aim of the handbook, which is a revised and expanded edition of "Modern Language Classroom Techniques" (1972), is to show the teacher ways of implementing and supplementing existing materials. The suggested teaching procedures may be used with classes of varying sizes and levels, and with any method. Part One of this handbook presents an…

  8. Complementary approaches to diagnosing marine diseases: a union of the modern and the classic

    PubMed Central

    Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman; House, Marcia; Mydlarz, Laura D.; Prager, Katherine C.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca

    2016-01-01

    Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease. PMID:26880839

  9. Complementary approaches to diagnosing marine diseases: a union of the modern and the classic

    USGS Publications Warehouse

    Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca

    2016-01-01

    Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.

  10. Cometary Astrometry

    NASA Technical Reports Server (NTRS)

    Yeomans, D. K. (Editor); West, R. M. (Editor); Harrington, R. S. (Editor); Marsden, B. G. (Editor)

    1984-01-01

    Modern techniques for making cometary astrometric observations, reducing these observations, using accurate reference star catalogs, and computing precise orbits and ephemerides are discussed in detail and recommendations and suggestions are given in each area.

  11. Molecules in the mud: Combining ancient DNA and lipid biomarkers to reconstruct vegetation response to climate variability during the Last Interglacial and the Holocene on Baffin Island, Arctic Canada

    NASA Astrophysics Data System (ADS)

    Crump, S. E.; Sepúlveda, J.; Bunce, M.; Miller, G. H.

    2017-12-01

    Modern ecological studies are revealing that the "greening" of the Arctic, resulting from a poleward shift in woody vegetation ranges, is already underway. The increasing abundance of shrubs in tundra ecosystems plays an important role in the global climate system through multiple positive feedbacks, yet uncertainty in future predictions of terrestrial vegetation means that climate models are likely not capturing these feedbacks accurately. Recently developed molecular techniques for reconstructing past vegetation and climate allow for a closer look at the paleo-record in order to improve our understanding of tundra community responses to climate variability; our current research focus is to apply these tools to both Last Interglacial and Holocene warm times. Here we present initial results from a small lake on southern Baffin Island spanning the last 7.2 ka. We reconstruct climate with both bulk geochemical and biomarker proxies, primarily using biogenic silica and branched glycerol dialkyl glycerol tetraethers (brGDGTs) as temperature indicators. We assess shifts in plant community using multivariate analysis of sedimentary ancient DNA (sedaDNA) metabarcoding data. This combination of approaches reveals that the vegetation community has responded sensitively to early Holocene warmth, Neoglacial cooling, and possibly modern anthropogenic warming. To our knowledge, this represents the first combination of a quantitative, biomarker-based climate reconstruction with a sedaDNA-based paleoecological reconstruction, and offers a glimpse at the potential of these molecular techniques used in tandem.

  12. Role of veterinarians in modern food hygiene

    PubMed Central

    Matyáš, Z.

    1978-01-01

    Veterinary services and veterinary education and training must keep pace with the constantly changing patterns of agriculture and food processing. Changes in methods of animal production are associated with many problems of food processing and food quality. Veterinary supervision of the animal feed industry and of meat production and distribution is essential. Quality testing of meat, milk, and eggs requires the introduction of suitable routine sampling systems, laboratory procedures, and complex evaluation procedures. Food hygiene problems have changed in recent years not only as a result of new methods of animal production, but also because of changes in food processing technology and in the presentation of food to the consumer, increased environmental pollution, increased international trade, and increased tourist travel. Food hygienists must adopt an active and progressive policy and change the scope of food control from a purely negative measure into a positive force working towards improved food quality and the avoidance of losses during production. A modern food hygiene programme should cover all stages of production, processing, and distribution of food and also other ingredients, additives and the water used for production and processing. Veterinarians should also be involved in the registration and licensing of enterprises and this should take into account the premises, the procedures to be used, new techniques in animal husbandry, machines and equipment, etc. In order to facilitate the microbiological analysis of foodstuffs, new mechanized or automated laboratory methods are required, and consideration must be given to adequate sampling techniques. PMID:310716

  13. The modern role of transoesophageal echocardiography in the assessment of valvular pathologies

    PubMed Central

    Bull, Sacha; Newton, James

    2017-01-01

    Despite significant advancements in the field of cardiovascular imaging, transoesophageal echocardiography remains the key imaging modality in the management of valvular pathologies. This paper provides echocardiographers with an overview of the modern role of TOE in the diagnosis and management of valvular disease. We describe how the introduction of 3D techniques has changed the detection and grading of valvular pathologies and concentrate on its role as a monitoring tool in interventional cardiology. In addition, we focus on the echocardiographic and Doppler techniques used in the assessment of prosthetic valves and provide guidance for the evaluation of prosthetic valves. Finally, we summarise quantitative methods used for the assessment of valvular stenosis and regurgitation and highlight the key areas where echocardiography remains superior over other novel imaging modalities. PMID:28096184

  14. The modern role of transoesophageal echocardiography in the assessment of valvular pathologies.

    PubMed

    Wamil, Malgorzata; Bull, Sacha; Newton, James

    2017-01-17

    Despite significant advancements in the field of cardiovascular imaging, transoesophageal echocardiography remains the key imaging modality in the management of valvular pathologies. This paper provides echocardiographers with an overview of the modern role of TOE in the diagnosis and management of valvular disease. We describe how the introduction of 3D techniques has changed detection and grading of valvular pathologies and concentrate on its role as a monitoring tool in interventional cardiology. In addition, we focus on the echocardiographic and Doppler techniques used in the assessment of prosthetic valves, and provide guidance for evaluation of prosthetic valves. Finally, we summarise quantitative methods used for the assessment of valvular stenosis and regurgitation and highlight the key areas where echocardiography remains superior over other novel imaging modalities. © 2017 The authors.

  15. The use of modern measurement techniques for designing pro ecological constructions

    NASA Astrophysics Data System (ADS)

    Wieczorowski, Michał; Gapiński, Bartosz; Szymański, Maciej; Rękas, Artur

    2017-10-01

    In this paper, some possibilities of applying modern length and angle metrology techniques to the design of environmentally friendly constructions are presented. The paper is based on a project in which a lighter bus and train car seat was developed. Different options were considered, including static and dynamic photogrammetry, computed tomography and thermography. Research on the dynamic behaviour of the designed structures provided input for determining the deformation of a seat, and of the passengers sitting on it, during traffic accidents. Work on the strength of construction elements made it possible to optimize their dimensions while maintaining adequate durability. Metrological measurements of production machines and equipment made it possible to better understand the phenomena that take place during the manufacturing process and to correct its parameters, which in turn also contributed to slimming down the construction.

  16. PET/CT in Radiation Therapy Planning.

    PubMed

    Specht, Lena; Berthelsen, Anne Kiil

    2018-01-01

    Radiation therapy (RT) is an important component of the management of lymphoma patients. Most lymphomas are metabolically active and accumulate 18F-fluorodeoxyglucose (FDG). Positron emission tomography with computed tomography (PET/CT) imaging using FDG is used routinely in staging and treatment evaluation. FDG-PET/CT imaging is now also used routinely for contouring the target for RT and has been shown to change the irradiated volume significantly compared with CT imaging alone. Modern advanced imaging techniques with image fusion and motion management, in combination with modern highly conformal RT techniques, have increased the precision of RT and have made it possible to dramatically reduce the risks of long-term side effects of treatment while maintaining the high cure rates for these diseases. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Fractures of modern high nitrogen stainless steel cemented stems: cause, mechanism, and avoidance in 14 cases.

    PubMed

    Yates, Piers J; Quraishi, Nasir A; Kop, Allan; Howie, Donald W; Marx, Clare; Swarts, Eric

    2008-02-01

    We present 14 cases of fracture of modern, high-nitrogen, stainless steel stems. Our clinical and radiological data suggest that heavy patients with small stems and poor proximal support are at risk for fracturing their implants. "Champagne-glass" canals can lead to the use of smaller stems often placed in varus, which can lead to cantilever bending and fatigue failure in the distal half of the stem. Metallurgical assessment of the retrieved high-nitrogen, stainless steel stems reveals microstructural inconsistencies that may contribute to their failure. Based on our findings, careful consideration and attention to technique is required when using stainless steel stems in patients with high body mass index or high weight. Technique is particularly important in femurs with champagne-glass canals.

  18. The use of biochemical methods in extraterrestrial life detection

    NASA Astrophysics Data System (ADS)

    McDonald, Gene

    2006-08-01

    Instrument development for in situ extraterrestrial life detection focuses primarily on the ability to distinguish between biological and non-biological material, mostly through chemical analysis for potential biosignatures (e.g., biogenic minerals, enantiomeric excesses). In contrast, biochemical analysis techniques commonly applied to Earth life focus primarily on the exploration of cellular and molecular processes, not on the classification of a given system as biological or non-biological. This focus has developed because of the relatively large functional gap between life and non-life on Earth today. Life on Earth is very diverse from an environmental and physiological point of view, but is highly conserved from a molecular point of view. Biochemical analysis techniques take advantage of this similarity of all terrestrial life at the molecular level, particularly through the use of biologically derived reagents (e.g., DNA polymerases, antibodies), to enable analytical methods with enormous sensitivity and selectivity. These capabilities encourage consideration of such reagents and methods for use in extraterrestrial life detection instruments. The utility of this approach depends in large part on the (currently unknown) degree of molecular compositional difference between extraterrestrial and terrestrial life. The greater these differences, the less useful laboratory biochemical techniques will be without significant modification. Biochemistry and molecular biology methods may need to be "de-focused" in order to produce instruments capable of unambiguously detecting a sufficiently wide range of extraterrestrial biochemical systems. Modern biotechnology tools may make that possible in some cases.

  19. [Application of analytical transmission electron microscopy techniques for the detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in mammalian cells].

    PubMed

    Shebanova, A S; Bogdanov, A G; Ismagulova, T T; Feofanov, A V; Semenyuk, P I; Muronets, V I; Erokhina, M V; Onishchenko, G E; Kirpichnikov, M P; Shaitan, K V

    2014-01-01

    This work presents the results of a study on the applicability of modern analytical transmission electron microscopy methods for the detection, identification and visualization of the localization of titanium and cerium oxide nanoparticles in A549 cells, a human lung adenocarcinoma cell line. A comparative analysis was performed of images of the nanoparticles in the cells obtained in the bright-field mode of transmission electron microscopy, by dark-field scanning transmission electron microscopy, and by high-angle annular dark-field scanning transmission electron microscopy. For the identification of nanoparticles in the cells, the analytical techniques of energy-dispersive X-ray spectroscopy and electron energy loss spectroscopy were compared, both for acquiring energy spectra from individual particles and for element mapping. It was shown that electron tomography can be used to confirm that nanoparticles are localized within the sample rather than in surface contamination. The possibilities and fields of application of different analytical transmission electron microscopy techniques for the detection, visualization and identification of nanoparticles in biological samples are discussed.

  20. Conceptual Model Evaluation using Advanced Parameter Estimation Techniques with Heat as a Tracer

    NASA Astrophysics Data System (ADS)

    Naranjo, R. C.; Morway, E. D.; Healy, R. W.

    2016-12-01

    Temperature measurements made at multiple depths beneath the sediment-water interface have proven useful for estimating seepage rates from surface-water channels and the corresponding subsurface flow direction. Commonly, parsimonious zonal representations of the subsurface structure are defined a priori by interpretation of temperature envelopes, slug tests or analysis of soil cores. However, combining multiple observations into a single zone may limit the inverse model solution and does not take full advantage of the information content of the measured data. Further, simulation of the correct thermal gradient, flow paths, and transient behavior of solutes may be biased by inadequacies in the spatial description of subsurface hydraulic properties. The use of pilot points in PEST offers a more sophisticated approach to estimating the structure of subsurface heterogeneity. This presentation evaluates seepage estimation in a cross-sectional model of a trapezoidal canal with intermittent flow, representing four typical sedimentary environments. Recent improvements in heat-as-a-tracer measurement techniques (e.g., multi-depth temperature probes), along with the use of modern calibration techniques (i.e., pilot points), provide opportunities for improved calibration of flow models and, subsequently, improved model predictions.
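    As a minimal illustration of the heat-as-a-tracer idea (not of the PEST pilot-point calibration this record describes), the Python sketch below fits a steady-state one-dimensional conduction-advection temperature profile (Bredehoeft and Papadopulos, 1965) to synthetic multi-depth temperature observations and recovers a vertical seepage flux by least squares. All parameter values are invented for the example.

    ```python
    # Recover a vertical seepage flux from a temperature-depth profile by inverse fitting.
    # Steady 1-D conduction-advection model; all values are invented for illustration.
    import numpy as np
    from scipy.optimize import least_squares

    rho_c_w = 4.18e6           # volumetric heat capacity of water, J/(m^3 K)
    kappa_e = 1.4              # effective thermal conductivity of saturated sediment, W/(m K)
    L = 2.0                    # depth of the lower temperature boundary, m
    T_top, T_bot = 16.0, 9.0   # boundary temperatures at z = 0 and z = L, deg C

    def temperature(z, q):
        """Steady temperature profile for vertical Darcy flux q (m/s, positive downward)."""
        pe = q * rho_c_w * L / kappa_e                      # thermal Peclet number
        if abs(pe) < 1e-9:                                  # conduction-only limit
            return T_top + (T_bot - T_top) * z / L
        return T_top + (T_bot - T_top) * (np.expm1(pe * z / L) / np.expm1(pe))

    # Synthetic "multi-depth probe" observations from a known flux, plus sensor noise.
    z_obs = np.array([0.1, 0.3, 0.6, 1.0, 1.5])
    q_true = 5e-7                                           # roughly 4.3 cm/day downward
    rng = np.random.default_rng(4)
    T_obs = temperature(z_obs, q_true) + rng.normal(0, 0.02, z_obs.size)

    fit = least_squares(lambda q: temperature(z_obs, q[0]) - T_obs, x0=[1e-8])
    print(f"true q: {q_true:.2e} m/s   estimated q: {fit.x[0]:.2e} m/s")
    ```

    A pilot-point calibration generalizes this idea from one lumped parameter to a spatially distributed hydraulic-property field estimated at many interpolation points.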
