ERIC Educational Resources Information Center
Wilcox, Rand R.; Serang, Sarfaraz
2017-01-01
The article provides perspectives on p values, null hypothesis testing, and alternative techniques in light of modern robust statistical methods. Null hypothesis testing and "p" values can provide useful information provided they are interpreted in a sound manner, which includes taking into account insights and advances that have…
Wilcox, Rand; Carlson, Mike; Azen, Stan; Clark, Florence
2013-03-01
Recently, there have been major advances in statistical techniques for assessing central tendency and measures of association. The practical utility of modern methods has been documented extensively in the statistics literature, but they remain underused and relatively unknown in clinical trials. Our objective was to address this issue. STUDY DESIGN AND PURPOSE: The first purpose was to review common problems associated with standard methodologies (low power, lack of control over type I errors, and incorrect assessments of the strength of the association). The second purpose was to summarize some modern methods that can be used to circumvent such problems. The third purpose was to illustrate the practical utility of modern robust methods using data from the Well Elderly 2 randomized controlled trial. In multiple instances, robust methods uncovered differences among groups and associations among variables that were not detected by classic techniques. In particular, the results demonstrated that details of the nature and strength of the association were sometimes overlooked when using ordinary least squares regression and Pearson correlation. Modern robust methods can make a practical difference in detecting and describing differences between groups and associations between variables. Such procedures should be applied more frequently when analyzing trial-based data. Copyright © 2013 Elsevier Inc. All rights reserved.
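The contrast the authors draw between classical and robust summaries is easy to demonstrate. The sketch below uses made-up numbers, and the 20% trimmed mean is just one standard robust measure of central tendency, not necessarily the estimator used in the Well Elderly 2 analysis; it shows how a single outlier distorts the ordinary mean but barely moves its trimmed counterpart:

```python
import numpy as np

def trimmed_mean(x, prop=0.20):
    # A 20% trimmed mean: sort, drop the lowest and highest 20% of
    # observations, and average the rest. Robust to heavy tails.
    x = np.sort(np.asarray(x, dtype=float))
    g = int(prop * x.size)
    return x[g:x.size - g].mean()

# One extreme value drags the ordinary mean far from the bulk of the data.
sample = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 50], dtype=float)
print(sample.mean())          # 9.5 -- pulled up by the outlier
print(trimmed_mean(sample))   # 5.5 -- reflects the bulk of the data
```

Robust analogues exist for association as well, for example computing correlation on winsorized data rather than relying on raw Pearson correlation.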
Strategies for Fermentation Medium Optimization: An In-Depth Review
Singh, Vineeta; Haque, Shafiul; Niwas, Ram; Srivastava, Akansha; Pasupuleti, Mukesh; Tripathi, C. K. M.
2017-01-01
Optimization of the production medium is required to maximize metabolite yield. This can be achieved by using a wide range of techniques, from the classical “one-factor-at-a-time” approach to modern statistical and mathematical techniques, viz. artificial neural networks (ANN), genetic algorithms (GA), etc. Every technique comes with its own advantages and disadvantages, and despite their drawbacks some techniques are applied to obtain the best results. Using various optimization techniques in combination can also provide desirable results. In this article an attempt has been made to review the media optimization techniques currently applied during the fermentation process of metabolite production. A comparative analysis of the merits and demerits of various conventional as well as modern optimization techniques has been carried out, and a logical basis for the design of fermentation media has been given. Overall, this review provides a rationale for the selection of a suitable optimization technique for media design during the fermentation process of metabolite production. PMID:28111566
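A genetic algorithm of the kind this review covers can be sketched in a few lines. Everything below is illustrative: the smooth yield surface, its optimum at glucose = 20 g/L and nitrogen = 5 g/L, and the GA settings (truncation selection, averaging crossover, Gaussian mutation) are invented for the example, not taken from the review.

```python
import numpy as np

rng = np.random.default_rng(0)

def yield_model(c):
    # Hypothetical smooth yield surface with an optimum at
    # glucose = 20 g/L, nitrogen = 5 g/L (illustration only).
    glucose, nitrogen = c[..., 0], c[..., 1]
    return -((glucose - 20.0) ** 2) / 100.0 - ((nitrogen - 5.0) ** 2) / 4.0

# Minimal genetic algorithm: truncation selection, blend crossover via
# parent averaging, and Gaussian mutation, within feasible bounds.
pop = rng.uniform([0, 0], [40, 10], size=(30, 2))
for generation in range(60):
    fitness = yield_model(pop)
    parents = pop[np.argsort(fitness)[-10:]]               # keep the 10 best
    mates = parents[rng.integers(0, 10, size=(30, 2))]     # random pairings
    children = mates.mean(axis=1)                          # crossover
    children += rng.normal(0.0, 0.5, size=children.shape)  # mutation
    pop = np.clip(children, [0, 0], [40, 10])

best = pop[np.argmax(yield_model(pop))]
print(best)  # converges close to the optimum (20, 5)
```

In a real medium-optimization study the fitness function would be an experimentally fitted response surface (or an ANN trained on fermentation runs) rather than a known analytic formula.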
A comparison of linear and nonlinear statistical techniques in performance attribution.
Chan, N H; Genovese, C R
2001-01-01
Performance attribution is usually conducted under the linear framework of multifactor models. Although commonly used by practitioners in finance, linear multifactor models are known to be less than satisfactory in many situations. After a brief survey of nonlinear methods, nonlinear statistical techniques are applied to performance attribution of a portfolio constructed from a fixed universe of stocks using factors derived from some commonly used cross sectional linear multifactor models. By rebalancing this portfolio monthly, the cumulative returns for procedures based on standard linear multifactor model and three nonlinear techniques-model selection, additive models, and neural networks-are calculated and compared. It is found that the first two nonlinear techniques, especially in combination, outperform the standard linear model. The results in the neural-network case are inconclusive because of the great variety of possible models. Although these methods are more complicated and may require some tuning, toolboxes are developed and suggestions on calibration are proposed. This paper demonstrates the usefulness of modern nonlinear statistical techniques in performance attribution.
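The gain from moving beyond a linear factor model can be illustrated on synthetic data. The quadratic fit below is a stand-in for the paper's additive-model and model-selection machinery; the factor exposures and the curved exposure-return relation are invented for illustration, not drawn from the paper's stock universe.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical factor exposure and returns with a curved relationship:
# moderate exposures help, extreme exposures hurt (illustration only).
x = rng.uniform(-2, 2, size=200)
returns = 0.5 * x - 0.4 * x**2 + rng.normal(0, 0.1, size=200)

def fit_rss(deg):
    # Residual sum of squares for a polynomial fit of given degree.
    coeffs = np.polyfit(x, returns, deg)
    resid = returns - np.polyval(coeffs, x)
    return float(resid @ resid)

linear_rss = fit_rss(1)     # straight-line "multifactor" fit
nonlinear_rss = fit_rss(2)  # simple nonlinear alternative
print(linear_rss, nonlinear_rss)  # the nonlinear fit leaves far less residual
```

When the true exposure-return relation is curved, the linear model attributes the unexplained curvature to noise, which is exactly the misattribution the nonlinear techniques in the paper are designed to avoid.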
NASA Astrophysics Data System (ADS)
Pollard, David; Chang, Won; Haran, Murali; Applegate, Patrick; DeConto, Robert
2016-05-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ˜ 20 000 yr. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. The analyses provide sea-level-rise envelopes with well-defined parametric uncertainty bounds, but the simple averaging method only provides robust results with full-factorial parameter sampling in the large ensemble. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree well with the more advanced techniques. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds.
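The simple score-weighted averaging method is straightforward to sketch. The misfit scores and sea-level contributions below are invented toy numbers, not values from the 625-member ensemble; the point is only the mechanics of weighting runs by model-data misfit.

```python
import numpy as np

# Hypothetical ensemble: each run has an aggregate model-data misfit
# score and a predicted equivalent sea-level rise (illustrative values).
misfit = np.array([0.2, 0.5, 1.0, 3.0, 6.0])
slr_m  = np.array([3.1, 3.4, 2.8, 5.0, 0.5])

# Simple averaging weighted by the aggregate score: runs that fit the
# observations poorly receive exponentially less weight.
weights = np.exp(-misfit)
weights /= weights.sum()

best_estimate = float(weights @ slr_m)
print(best_estimate)  # dominated by the low-misfit runs near ~3 m
```

The Bayesian emulation approach in the paper goes further by interpolating between ensemble members, which is why the simple weighted average only stays reliable under full-factorial parameter sampling.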
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Learning for Semantic Parsing Using Statistical Syntactic Parsing Techniques
2010-05-01
Interactive Visualization of Assessment Data: The Software Package Mondrian
ERIC Educational Resources Information Center
Unlu, Ali; Sargin, Anatol
2009-01-01
Mondrian is state-of-the-art statistical data visualization software featuring modern interactive visualization techniques for a wide range of data types. This article reviews the capabilities, functionality, and interactive properties of this software package. Key features of Mondrian are illustrated with data from the Programme for International…
Liao, Xing; Xie, Yan-ming
2014-10-01
The impact of evidence-based medicine and clinical epidemiology on clinical research has contributed to the development of Chinese medicine in modern times over the past two decades. Many concepts and methods of modern science and technology are emerging in Chinese medicine research, resulting in constant progress. Systematic reviews, randomized controlled trials and other advanced mathematical approaches and statistical analysis methods have brought reform to Chinese medicine. In this new era, Chinese medicine researchers have many opportunities and challenges. On the one hand, Chinese medicine researchers need to dedicate themselves to providing enough evidence to the world through rigorous studies, whilst on the other hand, they also need to keep up with the speed of modern medicine research. For example, real-world studies, comparative effectiveness research, propensity score techniques, and registry studies have recently emerged. This article aims to inspire Chinese medicine researchers to explore new areas by introducing these new ideas and new techniques.
Applications of modern statistical methods to analysis of data in physical science
NASA Astrophysics Data System (ADS)
Wicker, James Eric
Modern methods of statistical and computational analysis offer solutions to dilemmas confronting researchers in physical science. Although the ideas behind modern statistical and computational analysis methods were originally introduced in the 1970s, most scientists still rely on methods written during the early era of computing. These researchers, who analyze increasingly voluminous and multivariate data sets, need modern analysis methods to extract the best results from their studies. The first section of this work showcases applications of modern linear regression. Since the 1960s, many researchers in spectroscopy have used classical stepwise regression techniques to derive molecular constants. However, problems with thresholds of entry and exit for model variables plague this analysis method. Other criticisms of this kind of stepwise procedure include its inefficient searching method, the order in which variables enter or leave the model, and problems with overfitting the data. We implement an information scoring technique that overcomes the assumptions inherent in the stepwise regression process to calculate molecular model parameters. We believe that this kind of information-based model evaluation can be applied to more general analysis situations in physical science. The second section proposes new methods of multivariate cluster analysis. The K-means algorithm and the EM algorithm, introduced in the 1960s and 1970s respectively, formed the basis of multivariate cluster analysis methodology for many years. However, shortcomings of these methods include a strong dependence on initial seed values and inaccurate results when the data seriously depart from hypersphericity. We propose new cluster analysis methods based on genetic algorithms that overcome the strong dependence on initial seed values.
In addition, we propose a generalization of the Genetic K-means algorithm which can accurately identify clusters with complex hyperellipsoidal covariance structures. We then use this new algorithm in a genetic algorithm based Expectation-Maximization process that can accurately calculate parameters describing complex clusters in a mixture model routine. Using the accuracy of this GEM algorithm, we assign information scores to cluster calculations in order to best identify the number of mixture components in a multivariate data set. We will showcase how these algorithms can be used to process multivariate data from astronomical observations.
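The seed dependence that motivates the genetic-algorithm approach is easy to reproduce with plain K-means. The sketch below uses random restarts, the simplest remedy, rather than the author's Genetic K-means, but it illustrates the same failure mode and the same best-of-many-initializations idea; the two-cluster data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def kmeans(X, k, seeds, iters=50):
    # Lloyd's algorithm; the final answer depends on the initial seeds,
    # which is the weakness genetic/restart strategies try to overcome.
    centers = X[seeds].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    inertia = ((X - centers[labels]) ** 2).sum()
    return centers, inertia

# Two well-separated clusters around (0, 0) and (10, 10).
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(10, 0.5, (50, 2))])

# Multiple random restarts; keep the run with the lowest inertia.
best = min((kmeans(X, 2, rng.choice(len(X), 2, replace=False))
            for _ in range(10)), key=lambda r: r[1])
centers = best[0][np.argsort(best[0][:, 0])]
print(centers)  # rows near (0, 0) and (10, 10)
```

A genetic algorithm replaces the independent restarts with a population of candidate partitions that recombine and mutate, searching the space of initializations far more systematically than blind restarting.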
van der Ploeg, Tjeerd; Nieboer, Daan; Steyerberg, Ewout W
2016-10-01
Prediction of medical outcomes may potentially benefit from using modern statistical modeling techniques. We aimed to externally validate modeling strategies for prediction of 6-month mortality of patients suffering from traumatic brain injury (TBI) with predictor sets of increasing complexity. We analyzed individual patient data from 15 different studies including 11,026 TBI patients. We consecutively considered a core set of predictors (age, motor score, and pupillary reactivity), an extended set with computed tomography scan characteristics, and a further extension with two laboratory measurements (glucose and hemoglobin). With each of these sets, we predicted 6-month mortality using default settings with five statistical modeling techniques: logistic regression (LR), classification and regression trees, random forests (RFs), support vector machines (SVMs) and neural nets. For external validation, a model developed on one of the 15 data sets was applied to each of the 14 remaining sets. This process was repeated 15 times for a total of 630 validations. The area under the receiver operating characteristic curve (AUC) was used to assess the discriminative ability of the models. For the most complex predictor set, the LR models performed best (median validated AUC value, 0.757), followed by RF and support vector machine models (median validated AUC value, 0.735 and 0.732, respectively). With each predictor set, the classification and regression trees models showed poor performance (median validated AUC value, <0.7). The variability in performance across the studies was smallest for the RF- and LR-based models (interquartile range for validated AUC values from 0.07 to 0.10). In the area of predicting mortality from TBI, nonlinear and nonadditive effects are not pronounced enough to make modern prediction methods beneficial. Copyright © 2016 Elsevier Inc. All rights reserved.
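The validation metric used throughout, the AUC, has a convenient rank-based definition worth seeing in code. The sketch below uses invented risk scores, not TBI data: AUC is the proportion of (positive, negative) pairs that the model orders correctly, with ties counted half.

```python
import numpy as np

def auc(scores_pos, scores_neg):
    # AUC equals the probability that a randomly chosen positive case
    # receives a higher score than a randomly chosen negative case
    # (ties count half) -- the Mann-Whitney U statistic, rescaled.
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return float(wins) / (pos.size * neg.size)

# Predicted mortality risks for died vs. survived patients (toy numbers).
print(auc([0.9, 0.8], [0.1, 0.2]))   # 1.0  -- perfect discrimination
print(auc([0.8, 0.3], [0.5, 0.1]))   # 0.75 -- 3 of 4 pairs ordered correctly
```

This pairwise view makes clear why an AUC below 0.7 (as for the tree models here) signals weak discrimination: nearly a third of case pairs are ranked the wrong way round.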
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full-field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide highly detailed qualitative and quantitative insight into archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
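The core of an SSM-style analysis, PCA of registered specimen data, can be sketched with synthetic landmarks. The landmark layout and the single dominant mode of variation below are invented for illustration; real micro-CT tooth data would be far higher dimensional and carry the enamel/dentin parameter as well.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical registered landmark data: 40 specimens, 5 (x, y) landmarks
# flattened to 10 coordinates, varying mainly along one "size" direction.
mean_shape = np.array([0, 0, 1, 0, 1, 1, 0, 1, 0.5, 1.5], dtype=float)
size_mode = np.full(10, 0.1)
shapes = (mean_shape
          + rng.normal(0, 1, (40, 1)) * size_mode   # one true shape mode
          + rng.normal(0, 0.01, (40, 10)))          # small digitising noise

# PCA via SVD of the centred data matrix, as in a statistical shape model.
centred = shapes - shapes.mean(axis=0)
_, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s**2 / np.sum(s**2)
print(explained[0])  # first mode captures nearly all shape variation
```

Inter-sample comparison then amounts to projecting each specimen onto the rows of `vt` and comparing the resulting PC weights between populations, as the paper does.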
NASA Astrophysics Data System (ADS)
Brennan, Kevin F.
1999-02-01
Modern fabrication techniques have made it possible to produce semiconductor devices whose dimensions are so small that quantum mechanical effects dominate their behavior. This book describes the key elements of quantum mechanics, statistical mechanics, and solid-state physics that are necessary in understanding these modern semiconductor devices. The author begins with a review of elementary quantum mechanics, and then describes more advanced topics, such as multiple quantum wells. He then discusses equilibrium and nonequilibrium statistical mechanics. Following this introduction, he provides a thorough treatment of solid-state physics, covering electron motion in periodic potentials, electron-phonon interaction, and recombination processes. The final four chapters deal exclusively with real devices, such as semiconductor lasers, photodiodes, flat panel displays, and MOSFETs. The book contains many homework exercises and is suitable as a textbook for electrical engineering, materials science, or physics students taking courses in solid-state device physics. It will also be a valuable reference for practicing engineers in optoelectronics and related areas.
Advances in the microrheology of complex fluids
NASA Astrophysics Data System (ADS)
Waigh, Thomas Andrew
2016-07-01
New developments in the microrheology of complex fluids are considered. Firstly, the requirements for a simple modern particle-tracking microrheology experiment are introduced, together with the associated error analysis methods and the mathematical techniques required to calculate the linear viscoelasticity. Progress in microrheology instrumentation is then described with respect to detectors, light sources, colloidal probes, magnetic tweezers, optical tweezers, diffusing wave spectroscopy, optical coherence tomography, fluorescence correlation spectroscopy, elastic- and quasi-elastic scattering techniques, 3D tracking, single molecule methods, modern microscopy methods and microfluidics. New theoretical techniques are also reviewed such as Bayesian analysis, oversampling, inversion techniques, alternative statistical tools for tracks (angular correlations, first passage probabilities, the kurtosis, motor protein step segmentation, etc.), issues in micro/macro rheological agreement and two particle methodologies. Applications where microrheology has begun to make some impact are also considered including semi-flexible polymers, gels, microorganism biofilms, intracellular methods, high frequency viscoelasticity, comb polymers, active motile fluids, blood clots, colloids, granular materials, polymers, liquid crystals and foods. Two large emergent areas of microrheology, non-linear microrheology and surface microrheology, are also discussed.
NASA Astrophysics Data System (ADS)
Stock, Michala K.; Stull, Kyra E.; Garvin, Heather M.; Klales, Alexandra R.
2016-10-01
Forensic anthropologists are routinely asked to estimate a biological profile (i.e., age, sex, ancestry and stature) from a set of unidentified remains. In contrast to the abundance of collections and techniques associated with adult skeletons, there is a paucity of modern, documented subadult skeletal material, which limits the creation and validation of appropriate forensic standards. Many practitioners are forced to use antiquated methods derived from small sample sizes, which, given documented secular changes in the growth and development of children, are not appropriate for application in the medico-legal setting. Therefore, the aim of this project is to use multi-slice computed tomography (MSCT) data from a large, diverse sample of modern subadults to develop new methods to estimate subadult age and sex for practical forensic applications. The research sample will consist of over 1,500 full-body MSCT scans of modern subadult individuals (aged birth to 20 years) obtained from two U.S. medical examiner's offices. Statistical analysis of epiphyseal union scores, long bone osteometrics, and os coxae landmark data will be used to develop modern subadult age and sex estimation standards. This project will result in a database of information gathered from the MSCT scans, as well as the creation of modern, statistically rigorous standards for skeletal age and sex estimation in subadults. Furthermore, the research and methods developed in this project will be applicable to dry bone specimens, MSCT scans, and radiographic images, thus providing both tools and continued access to data for forensic practitioners in a variety of settings.
NASA Astrophysics Data System (ADS)
Pollard, D.; Chang, W.; Haran, M.; Applegate, P.; DeConto, R.
2015-11-01
A 3-D hybrid ice-sheet model is applied to the last deglacial retreat of the West Antarctic Ice Sheet over the last ~ 20 000 years. A large ensemble of 625 model runs is used to calibrate the model to modern and geologic data, including reconstructed grounding lines, relative sea-level records, elevation-age data and uplift rates, with an aggregate score computed for each run that measures overall model-data misfit. Two types of statistical methods are used to analyze the large-ensemble results: simple averaging weighted by the aggregate score, and more advanced Bayesian techniques involving Gaussian process-based emulation and calibration, and Markov chain Monte Carlo. Results for best-fit parameter ranges and envelopes of equivalent sea-level rise with the simple averaging method agree quite well with the more advanced techniques, but only for a large ensemble with full factorial parameter sampling. Best-fit parameter ranges confirm earlier values expected from prior model tuning, including large basal sliding coefficients on modern ocean beds. Each run is extended 5000 years into the "future" with idealized ramped climate warming. In the majority of runs with reasonable scores, this produces grounding-line retreat deep into the West Antarctic interior, and the analysis provides sea-level-rise envelopes with well defined parametric uncertainty bounds.
Correlative weighted stacking for seismic data in the wavelet domain
Zhang, S.; Xu, Y.; Xia, J.; ,
2004-01-01
Horizontal stacking plays a crucial role in modern seismic data processing, for it not only compresses random noise and multiple reflections, but also provides foundational data for subsequent migration and inversion. However, a number of examples have shown that random noise in adjacent traces exhibits correlation and coherence. Average stacking and weighted stacking based on the conventional correlation function both result in false events, which are caused by such noise. Wavelet transforms and high-order statistics are very useful methods for modern signal processing. The multiresolution analysis in wavelet theory can decompose signals on different scales, and high-order correlation functions can suppress correlated noise, for which the conventional correlation function is of no use. Based on the theory of wavelet transforms and high-order statistics, a high-order correlative weighted stacking (HOCWS) technique is presented in this paper. Its essence is to stack common-midpoint gathers after normal-moveout correction by weights that are calculated through high-order correlative statistics in the wavelet domain. Synthetic examples demonstrate its advantages in improving the signal-to-noise (S/N) ratio and suppressing correlated random noise.
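A first-order version of the weighting idea can be sketched on a synthetic gather. The sketch below weights traces by simple correlation with a pilot stack rather than by high-order statistics in the wavelet domain, so it is a deliberate simplification of the HOCWS technique, but it shows why weighted stacking improves the match to the underlying reflection signal.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic NMO-corrected gather: 8 traces sharing one reflection pulse,
# each contaminated by independent random noise.
t = np.linspace(0, 1, 200)
signal = np.exp(-((t - 0.5) ** 2) / 0.002)          # Gaussian pulse
gather = signal + rng.normal(0, 0.5, (8, t.size))

# Weight each trace by its correlation with the straight (average) stack,
# so noisier traces contribute less to the final section.
pilot = gather.mean(axis=0)
w = np.array([np.corrcoef(tr, pilot)[0, 1] for tr in gather])
w = np.clip(w, 0, None) / np.clip(w, 0, None).sum()
stack = w @ gather

def fidelity(trace):
    # Correlation with the known clean signal as a quality proxy.
    return np.corrcoef(trace, signal)[0, 1]

print(fidelity(gather[0]), fidelity(stack))  # the stack tracks the signal better
```

The paper's refinement is to compute the weights from high-order correlative statistics of wavelet coefficients, which stays informative even when the noise itself is correlated across traces and ordinary correlation-based weights break down.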
ERIC Educational Resources Information Center
Haberman, Shelby J.; Lee, Yi-Hsuan
2017-01-01
In investigations of unusual testing behavior, a common question is whether a specific pattern of responses occurs unusually often within a group of examinees. In many current tests, modern communication techniques can permit quite large numbers of examinees to share keys, or common response patterns, to the entire test. To address this issue,…
Ferreira, Mayra Soares; Mangussi-Gomes, João; Ximendes, Roberta; Evangelista, Anne Rosso; Miranda, Eloá Lumi; Garcia, Leonardo Bomediano; Stamm, Aldo C
2018-01-01
Pharyngeal tonsil hyperplasia is the most frequent cause of nasal obstruction and chronic mouth breathing during childhood. Adenoidectomy is the procedure of choice for the resolution of these symptoms. It is not yet known, however, whether the conventional technique ("blind curettage") has been surpassed by more modern adenoidectomy techniques (video-assisted, with the aid of instruments). This study aimed to compare the conventional adenoidectomy technique with two other emerging techniques, performed in a reference otorhinolaryngology center. This is a prospective and observational study of 33 children submitted to adenoidectomy using 3 different techniques, who were followed up for a period of 3 months after surgery. The patients were divided into 3 groups, according to the adenoidectomy technique: Group A (conventional technique, "blind curettage"); Group B (video-assisted adenoidectomy with microdebrider); Group C (video-assisted adenoidectomy with radiofrequency, Coblation®). The surgical time of each procedure was measured from the insertion of the mouth gag until complete hemostasis was achieved. The OSA-18 quality-of-life questionnaire was administered to all caregivers on the day of the surgery and 30-90 days after the procedure. Postoperative complications were also analyzed. For the entire patient sample, there was an improvement in quality of life after the surgery (p < 0.05). When analyzing the evolution of the OSA-18 index, all groups showed statistically significant improvement in all assessed domains. There were no statistically significant differences between the 3 techniques in quality-of-life improvement after the surgery (p > 0.05). Regarding the duration of the procedure, the conventional technique had the shortest surgical time (p < 0.05). No postoperative complications were noted for any patient.
Adenoidectomy resulted in improved quality of life, with no major postoperative complications, for all operated children, regardless of the technique used. The conventional technique was faster than the more modern adenoidectomy techniques. Copyright © 2017 Elsevier B.V. All rights reserved.
Damron, T A; McBeath, A A
1995-04-01
With the increasing duration of follow up on total knee arthroplasties, more revision arthroplasties are being performed. When revision is not advisable, a salvage procedure such as arthrodesis or resection arthroplasty is indicated. This article provides a comprehensive review of the literature regarding arthrodesis following failed total knee arthroplasty. In addition, a statistical meta-analysis of five studies using modern arthrodesis techniques is presented. A statistically significant greater fusion rate with intramedullary nail arthrodesis compared to external fixation is documented. Gram negative and mixed infections are found to be significant risk factors for failure of arthrodesis.
[The development of hospital medical supplies information management system].
Cao, Shaoping; Gu, Hongqing; Zhang, Peng; Wang, Qiang
2010-05-01
The aim was to manage medical materials information with modern computer technology, in order to improve the efficiency of medical supplies consumption and to develop a new technical approach to hospital material support. Using C#.NET and JAVA techniques, a hospital material management information system was developed, with separate management modules, generation of various statistical reports, and standard operating procedures. The system is convenient and fully functional, with fluent statistical reporting. It allows staff to grasp dynamic information on hospital supplies at any time, serving as a modern and effective tool for hospital materials management.
Mathematical Optimization Techniques
NASA Technical Reports Server (NTRS)
Bellman, R. (Editor)
1963-01-01
The papers collected in this volume were presented at the Symposium on Mathematical Optimization Techniques held in the Santa Monica Civic Auditorium, Santa Monica, California, on October 18-20, 1960. The objective of the symposium was to bring together, for the purpose of mutual education, mathematicians, scientists, and engineers interested in modern optimization techniques. Some 250 persons attended. The techniques discussed included recent developments in linear, integer, convex, and dynamic programming as well as the variational processes surrounding optimal guidance, flight trajectories, statistical decisions, structural configurations, and adaptive control systems. The symposium was sponsored jointly by the University of California, with assistance from the National Science Foundation, the Office of Naval Research, the National Aeronautics and Space Administration, and The RAND Corporation, through Air Force Project RAND.
Statistics without Tears: Complex Statistics with Simple Arithmetic
ERIC Educational Resources Information Center
Smith, Brian
2011-01-01
One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…
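The decomposition step the author laments being skipped really does reduce to simple arithmetic. For a quarterly series, a centred 2x4 moving average cancels an additive period-4 seasonal pattern exactly and passes a linear trend through unchanged; the toy series below is invented to make that visible.

```python
import numpy as np

# Quarterly toy series: linear trend plus an additive seasonal pattern
# that sums to zero over each year (illustration only).
t = np.arange(24, dtype=float)
trend = 10 + 0.5 * t
seasonal = np.tile([2.0, 0.0, -2.0, 0.0], 6)
series = trend + seasonal

# Classical decomposition step: the centred 2x4 moving average
# (weights 1,2,2,2,1 over 8) removes a period-4 seasonal exactly
# and leaves a linear trend untouched.
weights = np.array([1, 2, 2, 2, 1], dtype=float) / 8.0
estimated_trend = np.convolve(series, weights, mode="valid")
detrended = series[2:-2] - estimated_trend  # recovers the seasonal pattern

print(np.max(np.abs(estimated_trend - trend[2:-2])))  # ~0
```

Because the window spans a whole year symmetrically, every seasonal cycle contributes zero to the average, which is why this "complex statistic" needs nothing beyond simple arithmetic.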
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
NASA Astrophysics Data System (ADS)
Saadi, Saad
2017-04-01
Characterizing the complexity and heterogeneity of the geometries and deposits in meandering river systems is an important concern for the reservoir modelling of fluvial environments. Re-examination of the Long Nab member in the Scalby formation of the Ravenscar Group (Yorkshire, UK), integrating digital outcrop data and forward modelling approaches, will lead to a geologically realistic numerical model of the meandering river geometry. The methodology is based on extracting geostatistics from modern analogues: meandering rivers that exemplify both the confined and non-confined meandering point-bar deposits and morphodynamics of the Long Nab member. The parameters derived from the modern systems (i.e. channel width, amplitude, radius of curvature, sinuosity, wavelength, channel length and migration rate) are used as a statistical control for the forward simulation and the resulting object-oriented channel models. The statistical data derived from the modern analogues are multi-dimensional in nature, making analysis difficult. We apply data mining techniques such as parallel coordinates to investigate and identify the important relationships within the modern analogue data, which can then be used to drive the development of, and serve as input to, the forward model. This work will increase our understanding of meandering river morphodynamics, planform architecture and the stratigraphic signature of various fluvial deposits and features. We will then use these forward-modelled channel objects to build reservoir models, and compare the behaviour of the forward-modelled channels with traditional object modelling in hydrocarbon flow simulations.
A performance model for GPUs with caches
Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...
2014-06-24
To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates with an error of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
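The idea behind a sampling-based linear model can be sketched generically: time the kernel at a few small launch sizes, fit runtime = a * size + b, and extrapolate to the full launch. The timings below are invented, and the paper's model chooses GPU-specific sampling points based on its scheduler analysis, which this sketch omits.

```python
import numpy as np

# Hypothetical profile: runtimes (ms) of a kernel sampled at a few small
# workload sizes (number of work-groups); values are illustrative only.
workgroups = np.array([64, 128, 256, 512], dtype=float)
runtime_ms = np.array([1.9, 3.1, 5.4, 10.1])

# Sampling-based linear model: least-squares fit of
# runtime = a * workgroups + b on the sampled points.
A = np.vstack([workgroups, np.ones_like(workgroups)]).T
(a, b), *_ = np.linalg.lstsq(A, runtime_ms, rcond=None)

# Extrapolate to the full launch size.
full_launch = 4096
predicted = a * full_launch + b
print(predicted)  # extrapolated runtime in ms for the full kernel launch
```

The ML extension in the paper augments exactly this kind of baseline with compiler statistics and hardware performance counters as extra features, to capture the coalescing and caching effects a straight line cannot.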
Multilayer Statistical Intrusion Detection in Wireless Networks
NASA Astrophysics Data System (ADS)
Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine
2008-12-01
The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of these protection mechanisms, an important line of research focuses on intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is well suited to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies enhances the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm makes it possible to control the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.
Efficient Parameter Searches for Colloidal Materials Design with Digital Alchemy
NASA Astrophysics Data System (ADS)
Dodd, Paul, M.; Geng, Yina; van Anders, Greg; Glotzer, Sharon C.
Optimal colloidal materials design is challenging, even for high-throughput or genomic approaches, because the design space provided by modern colloid synthesis techniques can easily have dozens of dimensions. In this talk we present the methodology of an inverse approach we term "digital alchemy" to perform rapid searches of design-parameter spaces with up to 188 dimensions that yield thermodynamically optimal colloid parameters for target crystal structures with up to 20 particles in a unit cell. The method relies only on fundamental principles of statistical mechanics and Metropolis Monte Carlo techniques, and yields particle attribute tolerances via analogues of familiar stress-strain relationships.
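The Metropolis Monte Carlo machinery underlying the method can be illustrated with a one-dimensional toy (a generic sketch, not the authors' code; digital alchemy applies moves like this to particle-shape parameters treated as extra thermodynamic dimensions):

```python
import math, random

random.seed(0)

def U(x):
    return 0.5 * x * x  # harmonic potential with unit spring constant

kT, x, step = 1.0, 0.0, 0.5
samples = []
for _ in range(20000):
    x_trial = x + random.uniform(-step, step)
    # Metropolis criterion: always accept downhill moves, accept uphill
    # moves with probability exp(-dU/kT).
    if random.random() < math.exp(-(U(x_trial) - U(x)) / kT):
        x = x_trial
    samples.append(x)

mean_x2 = sum(s * s for s in samples) / len(samples)
print(f"<x^2> = {mean_x2:.2f} (analytic equilibrium value: kT = 1.0)")
```

The sampled chain reproduces the Boltzmann average of x^2, which is the kind of equilibrium property the full method optimizes over its alchemical dimensions.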
NASA Astrophysics Data System (ADS)
Chlebda, Damian K.; Majda, Alicja; Łojewski, Tomasz; Łojewska, Joanna
2016-11-01
Differentiation of written text can be performed with a non-invasive and non-contact tool that connects conventional imaging methods with spectroscopy. Hyperspectral imaging (HSI) is a relatively new and rapid analytical technique that can be applied in forensic science disciplines. It allows an image of the sample to be acquired, with full spectral information within every pixel. For this paper, HSI and three statistical methods (hierarchical cluster analysis, principal component analysis, and spectral angle mapper) were used to distinguish between traces of modern black gel pen inks. Non-invasiveness and high efficiency are among the unquestionable advantages of ink differentiation using HSI. It is also less time-consuming than traditional methods such as chromatography. In this study, a set of 45 modern gel pen ink marks deposited on a paper sheet was recorded. The spectral characteristics embodied in every pixel were extracted from the image and analysed using statistical methods, externally and directly on the hypercube. As a result, different black gel inks deposited on paper can be distinguished and classified into several groups in a non-invasive manner.
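Of the three statistical methods, the spectral angle mapper (SAM) is the simplest to sketch: each pixel's spectrum is treated as a vector, and two inks are compared by the angle between their spectral vectors. The five-band spectra below are synthetic placeholders, not measured ink data.

```python
import numpy as np

def spectral_angle(s1, s2):
    """Angle (radians) between two spectra treated as vectors."""
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return np.arccos(np.clip(cos, -1.0, 1.0))

ink_a = np.array([0.12, 0.10, 0.09, 0.20, 0.45])
ink_b = 2.0 * ink_a                               # same shape, brighter: same ink
ink_c = np.array([0.50, 0.45, 0.10, 0.05, 0.02])  # different spectral shape

print(spectral_angle(ink_a, ink_b))  # ~0: intensity scaling does not matter
print(spectral_angle(ink_a, ink_c))  # large angle: distinguishable ink
```

Because the angle ignores overall intensity, SAM is robust to illumination differences across the scanned sheet, one reason it is popular for hyperspectral classification.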
A quality improvement management model for renal care.
Vlchek, D L; Day, L M
1991-04-01
The purpose of this article is to explore the potential for applying the theory and tools of quality improvement (total quality management) in the renal care setting. We believe that the coupling of the statistical techniques used in the Deming method of quality improvement, with modern approaches to outcome and process analysis, will provide the renal care community with powerful tools, not only for improved quality (i.e., reduced morbidity and mortality), but also for technology evaluation and resource allocation.
Random sequences generation through optical measurements by phase-shifting interferometry
NASA Astrophysics Data System (ADS)
François, M.; Grosges, T.; Barchiesi, D.; Erra, R.; Cornet, A.
2012-04-01
The development of new techniques for producing random sequences with a high level of security is a challenging topic of research in modern cryptography. The proposed method is based on the measurement by phase-shifting interferometry of the speckle signals of the interaction between light and structures. We show how the combination of amplitude and phase distributions (maps) under a numerical process can produce random sequences. The produced sequences satisfy all the statistical requirements of randomness and can be used in cryptographic schemes.
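One of the "statistical requirements of randomness" can be made concrete with the NIST-style monobit frequency test (a sketch only; the bit stream here comes from Python's PRNG, standing in for the interferometric source):

```python
import math, random

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: p-value for the hypothesis
    that ones and zeros are equally likely in the sequence."""
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))

random.seed(1)
bits = [random.getrandbits(1) for _ in range(100000)]
p = monobit_p_value(bits)
print(f"monobit p-value: {p:.3f}")  # the test is passed when p > 0.01
```

A strongly biased sequence (e.g., all zeros) drives the p-value to essentially zero, so the test rejects it immediately.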
Inverse statistical physics of protein sequences: a key issues review.
Cocco, Simona; Feinauer, Christoph; Figliuzzi, Matteo; Monasson, Rémi; Weigt, Martin
2018-03-01
In the course of evolution, proteins undergo important changes in their amino acid sequences, while their three-dimensional folded structure and their biological function remain remarkably conserved. Thanks to modern sequencing techniques, sequence data accumulate at unprecedented pace. This provides large sets of so-called homologous, i.e. evolutionarily related protein sequences, to which methods of inverse statistical physics can be applied. Using sequence data as the basis for the inference of Boltzmann distributions from samples of microscopic configurations or observables, it is possible to extract information about evolutionary constraints and thus protein function and structure. Here we give an overview over some biologically important questions, and how statistical-mechanics inspired modeling approaches can help to answer them. Finally, we discuss some open questions, which we expect to be addressed over the next years.
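The inference logic can be seen in miniature with two coupled ±1 spins (an illustrative sketch far below protein scale): for P(s1, s2) ∝ exp(J·s1·s2) the pair correlation obeys <s1·s2> = tanh(J), so the coupling can be read off from sampled configurations, just as residue-residue couplings are inferred from alignments of homologous sequences.

```python
import math, random

random.seed(5)

J_true = 0.8
products = []
for _ in range(200000):
    s1 = random.choice([-1, 1])  # marginal of s1 is uniform by symmetry
    # Exact conditional sampling of s2 given s1 under P ∝ exp(J*s1*s2).
    p_up = math.exp(J_true * s1) / (math.exp(J_true * s1) + math.exp(-J_true * s1))
    s2 = 1 if random.random() < p_up else -1
    products.append(s1 * s2)

corr = sum(products) / len(products)
J_inferred = math.atanh(corr)    # invert <s1*s2> = tanh(J)
print(f"true J = {J_true}, inferred J = {J_inferred:.2f}")
```

Real applications replace this two-spin toy with Potts models over hundreds of residues, where the same "match model statistics to data statistics" principle requires approximate inference.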
Yu, Daxiong; Ma, Ruijie; Fang, Jianqiao
2015-05-01
Modern times have produced many eminent acupuncture masters in Zhejiang province, giving rise to acupuncture schools with numerous distinctive characteristics and an important influence at home and abroad. Through collection of the literature on the acupuncture schools of Zhejiang and interviews with the parties involved, it has been found that the manipulation techniques of these modern acupuncture masters are distinctively featured. The techniques are developed on the basis of Neijing (Internal Classic), Jinzhenfu (Ode to Gold Needle) and Zhenjiu Dacheng (Great Compendium of Acupuncture and Moxibustion). Whether trained under the old maxims or self-taught, every master lays emphasis on the research and interpretation of the classical theories and integrates the traditional with the modern. In this paper, the manipulation techniques of the modern Zhejiang acupuncture masters are described under four headings: the needling techniques of the Internal Classic, the feijingzouqi needling technique, the penetrating needling technique, and innovations in acupuncture manipulation.
Increasing the reliability of ecological models using modern software engineering techniques
Robert M. Scheller; Brian R. Sturtevant; Eric J. Gustafson; Brendan C. Ward; David J. Mladenoff
2009-01-01
Modern software development techniques are largely unknown to ecologists. Typically, ecological models and other software tools are developed for limited research purposes, and additional capabilities are added later, usually in an ad hoc manner. Modern software engineering techniques can substantially increase scientific rigor and confidence in ecological models and...
Statistics and Informatics in Space Astrophysics
NASA Astrophysics Data System (ADS)
Feigelson, E.
2017-12-01
The interest in statistical and computational methodology has seen rapid growth in space-based astrophysics, parallel to the growth seen in Earth remote sensing. There is widespread agreement that scientific interpretation of the cosmic microwave background, the discovery of exoplanets, and the classification of multiwavelength surveys are too complex to be accomplished with traditional techniques. NASA operates several well-functioning Science Archive Research Centers providing 0.5-PBy datasets to the research community. These databases are integrated with full-text journal articles in the NASA Astrophysics Data System (200K pageviews/day). Data products use interoperable formats and protocols established by the International Virtual Observatory Alliance. NASA supercomputers also support complex astrophysical models of systems such as accretion disks and planet formation. Academic researcher interest in methodology has grown significantly in areas such as Bayesian inference and machine learning, and statistical research is underway to treat problems such as irregularly spaced time series and astrophysical model uncertainties. Several scholarly societies have created interest groups in astrostatistics and astroinformatics. Improvements are needed on several fronts. Community education in advanced methodology is not sufficiently rapid to meet research needs. Statistical procedures within NASA science analysis software are sometimes not optimal, and pipeline development may not use modern software engineering techniques. NASA offers few grant opportunities supporting research in astroinformatics and astrostatistics.
Four Bad Habits of Modern Psychologists
Grice, James; Cota, Lisa; Taylor, Zachery; Garner, Samantha; Medellin, Eliwid; Vest, Adam
2017-01-01
Four data sets from studies included in the Reproducibility Project were re-analyzed to demonstrate a number of flawed research practices (i.e., “bad habits”) of modern psychology. Three of the four studies were successfully replicated, but re-analysis showed that in one study most of the participants responded in a manner inconsistent with the researchers’ theoretical model. In the second study, the replicated effect was shown to be an experimental confound, and in the third study the replicated statistical effect was shown to be entirely trivial. The fourth study was an unsuccessful replication, yet re-analysis of the data showed that questioning the common assumptions of modern psychological measurement can lead to novel techniques of data analysis and potentially interesting findings missed by traditional methods of analysis. Considered together, these new analyses show that while it is true replication is a key feature of science, causal inference, modeling, and measurement are equally important and perhaps more fundamental to obtaining truly scientific knowledge of the natural world. It would therefore be prudent for psychologists to confront the limitations and flaws in their current analytical methods and research practices. PMID:28805739
Proton Upset Monte Carlo Simulation
NASA Technical Reports Server (NTRS)
O'Neill, Patrick M.; Kouba, Coy K.; Foster, Charles C.
2009-01-01
The Proton Upset Monte Carlo Simulation (PROPSET) program calculates the frequency of on-orbit upsets in computer chips (for given orbits such as Low Earth Orbit, Lunar Orbit, and the like) from proton bombardment, based on the results of heavy ion testing alone. The software simulates the bombardment of modern microelectronic components (computer chips) with high-energy (>200 MeV) protons. The nuclear interaction of the proton with the silicon of the chip is modeled, and nuclear fragments from this interaction are tracked using Monte Carlo techniques to produce statistically accurate predictions.
Agarwal, Chitra; Deora, Savita; Abraham, Dennis; Gaba, Rohini; Kumar, Baron Tarun; Kudva, Praveen
2015-01-01
Context: Nowadays esthetics plays an important role in dentistry, along with the function of the prosthesis. Various soft tissue augmentation procedures are available to correct ridge defects in the anterior region. A newer technique, the vascularized interpositional periosteal connective tissue (VIP-CT) flap, has been introduced; it has the potential to augment a predictable amount of tissue and has many benefits compared to other techniques. Aim: The study was designed to determine the efficacy of the VIP-CT flap in augmenting ridge defects. Materials and Methods: Ten patients with Class III (Seibert's) ridge defects were treated with the VIP-CT flap technique before fabrication of a fixed partial denture. Height and width of the ridge defects were measured before and after the procedure. Subsequent follow-up was done every 3 months for 1 year. Statistical Analysis Used: A paired t-test was performed to detect the significance of the procedure. Results: The surgical site healed uneventfully. A predictable amount of soft tissue augmentation was achieved with the procedure. The increase in height and width of the ridge was statistically highly significant. Conclusion: The VIP-CT flap technique was effective in augmenting soft tissue in the esthetic area, and the result remained stable over a long period. PMID:25810597
A BAYESIAN APPROACH TO DERIVING AGES OF INDIVIDUAL FIELD WHITE DWARFS
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Malley, Erin M.; Von Hippel, Ted; Van Dyk, David A., E-mail: ted.vonhippel@erau.edu, E-mail: dvandyke@imperial.ac.uk
2013-09-20
We apply a self-consistent and robust Bayesian statistical approach to determine the ages, distances, and zero-age main sequence (ZAMS) masses of 28 field DA white dwarfs (WDs) with ages of approximately 4-8 Gyr. Our technique requires only quality optical and near-infrared photometry to derive ages with <15% uncertainties, generally with little sensitivity to our choice of modern initial-final mass relation. We find that age, distance, and ZAMS mass are correlated in a manner that is too complex to be captured by traditional error propagation techniques. We further find that the posterior distributions of age are often asymmetric, indicating that the standard approach to deriving WD ages can yield misleading results.
You can run, you can hide: The epidemiology and statistical mechanics of zombies
NASA Astrophysics Data System (ADS)
Alemi, Alexander A.; Bierbaum, Matthew; Myers, Christopher R.; Sethna, James P.
2015-11-01
We use a popular fictional disease, zombies, in order to introduce techniques used in modern epidemiology modeling, and ideas and techniques used in the numerical study of critical phenomena. We consider variants of zombie models, from fully connected continuous time dynamics to a full scale exact stochastic dynamic simulation of a zombie outbreak on the continental United States. Along the way, we offer a closed form analytical expression for the fully connected differential equation, and demonstrate that the single person per site two dimensional square lattice version of zombies lies in the percolation universality class. We end with a quantitative study of the full scale US outbreak, including the average susceptibility of different geographical regions.
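The fully connected dynamics can be sketched as a small ODE integration (parameter values are illustrative, not the authors'): in an SZR-type model, susceptibles S are bitten at a rate proportional to S·Z while zombies Z are destroyed at a smaller rate, so the outbreak runs to completion.

```python
# Euler integration of a fully connected SZR-style zombie model.
beta, kappa = 1.0e-3, 0.8e-3   # bite rate > kill rate: the zombies win
S, Z, R = 1000.0, 1.0, 0.0     # susceptibles, zombies, removed (destroyed)
dt = 0.05
for _ in range(100000):
    bites = beta * S * Z
    kills = kappa * S * Z
    S += -bites * dt
    Z += (bites - kills) * dt
    R += kills * dt

# With beta > kappa the susceptible pool is exhausted: S -> 0.
print(f"final S = {S:.0f}, Z = {Z:.0f}, R = {R:.0f}")
```

The total population S + Z + R is conserved by construction, which makes a handy sanity check on the integration.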
Current role of modern radiotherapy techniques in the management of breast cancer
Ozyigit, Gokhan; Gultekin, Melis
2014-01-01
Breast cancer is the most common type of malignancy in females. Advances in systemic therapies and radiotherapy (RT) have provided long survival rates in breast cancer patients. RT has a major role in the management of breast cancer. During the past 15 years, several developments took place in the fields of imaging and irradiation techniques, intensity-modulated RT, hypofractionation, and partial-breast irradiation. Currently, improvements in RT technology allow a subsequent decrease in treatment-related complications, such as fibrosis and long-term cardiac toxicity, while improving loco-regional control rates and cosmetic results. Thus, it is crucial that modern radiotherapy techniques be carried out with maximum care and efficiency. Several randomized trials provide evidence for the feasibility of modern radiotherapy techniques in the management of breast cancer. However, their role will continue to be defined by the mature results of randomized trials. This review provides up-to-date, evidence-based information on the role of modern radiotherapy techniques in the management of breast cancer. PMID:25114857
A guide to missing data for the pediatric nephrologist.
Larkins, Nicholas G; Craig, Jonathan C; Teixeira-Pinto, Armando
2018-03-13
Missing data is an important and common source of bias in clinical research. Readers should be alert to and consider the impact of missing data when reading studies. Beyond preventing missing data in the first place, through good study design and conduct, there are different strategies available to handle data containing missing observations. Complete case analysis is often biased unless data are missing completely at random. Better methods of handling missing data include multiple imputation and models using likelihood-based estimation. With advancing computing power and modern statistical software, these methods are within the reach of clinician-researchers under guidance of a biostatistician. As clinicians reading papers, we need to continue to update our understanding of statistical methods, so that we understand the limitations of these techniques and can critically interpret literature.
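The bias of complete-case analysis when data are not missing completely at random can be demonstrated with a toy simulation (all numbers are synthetic, and regression-based single imputation stands in for a full multiple-imputation workflow):

```python
import random, statistics

random.seed(42)

# Synthetic example: weight is missing more often for taller children
# (missing at random given height), so the complete-case mean of weight is
# biased low; imputing from the height-weight regression removes most bias.
n = 5000
height = [random.gauss(140, 10) for _ in range(n)]
weight = [0.5 * h - 35 + random.gauss(0, 3) for h in height]  # true mean ~ 35
miss = [h > 140 and random.random() < 0.8 for h in height]

obs_h = [h for h, m in zip(height, miss) if not m]
obs_w = [w for w, m in zip(weight, miss) if not m]
cc_mean = statistics.mean(obs_w)  # complete-case estimate (biased low)

# Fit weight = a*height + b on the complete cases, impute missing weights.
hb, wb = statistics.mean(obs_h), statistics.mean(obs_w)
a = sum((h - hb) * (w - wb) for h, w in zip(obs_h, obs_w)) / \
    sum((h - hb) ** 2 for h in obs_h)
b = wb - a * hb
filled = [a * h + b if m else w for h, w, m in zip(height, weight, miss)]
imp_mean = statistics.mean(filled)

print(f"true mean ~35.0, complete-case {cc_mean:.1f}, imputed {imp_mean:.1f}")
```

Proper multiple imputation would repeat the imputation with random draws and pool the estimates to propagate uncertainty; this sketch shows only why ignoring the missingness mechanism misleads.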
Current Developments in Machine Learning Techniques in Biological Data Mining.
Dumancas, Gerard G; Adrianto, Indra; Bello, Ghalib; Dozmorov, Mikhail
2017-01-01
This supplement is intended to focus on the use of machine learning techniques to generate meaningful information from biological data. This supplement under Bioinformatics and Biology Insights aims to provide scientists and researchers working in this rapidly evolving field with online, open-access articles authored by leading international experts. Advances in the field of biology have generated massive opportunities for the application of modern computational and statistical techniques. Machine learning methods in particular, a subfield of computer science, have evolved into an indispensable tool applied to a wide spectrum of bioinformatics applications. Thus, machine learning is broadly used to investigate the underlying mechanisms leading to a specific disease, as well as in the biomarker discovery process. With growth in this area of science comes the need for up-to-date, high-quality scholarly articles that leverage the knowledge of scientists and researchers in the various applications of machine learning techniques to mining biological data.
Niumsawatt, Vachara; Debrotwir, Andrew N; Rozen, Warren Matthew
2014-01-01
Computed tomographic angiography (CTA) has become a mainstay in preoperative perforator flap planning in the modern era of reconstructive surgery. However, the increased use of CTA does raise the concern of radiation exposure to patients. Several techniques have been developed to decrease radiation dosage without compromising image quality, with varying results. The most recent advance is in the improvement of image reconstruction using an adaptive statistical iterative reconstruction (ASIR) algorithm. We sought to evaluate the image quality of ASIR in preoperative deep inferior epigastric perforator (DIEP) flap surgery, through a direct comparison with conventional filtered back projection (FBP) images. A prospective review of 60 consecutive ASIR and 60 consecutive FBP CTA images using similar protocol (except for radiation dosage) was undertaken, analyzed by 2 independent reviewers. In both groups, we were able to accurately identify axial arteries and their perforators. Subjective analysis of image quality demonstrated no statistically significant difference between techniques. ASIR can thus be used for preoperative imaging with similar image quality to FBP, but with a 60% reduction in radiation delivery to patients.
Forensic analysis of dyed textile fibers.
Goodpaster, John V; Liszewski, Elisa A
2009-08-01
Textile fibers are a key form of trace evidence, and the ability to reliably associate or discriminate them is crucial for forensic scientists worldwide. While microscopic and instrumental analysis can be used to determine the composition of the fiber itself, additional specificity is gained by examining fiber color. This is particularly important when the bulk composition of the fiber is relatively uninformative, as it is with cotton, wool, or other natural fibers. Such analyses pose several problems, including extremely small sample sizes, the desire for nondestructive techniques, and the vast complexity of modern dye compositions. This review will focus on more recent methods for comparing fiber color by using chromatography, spectroscopy, and mass spectrometry. The increasing use of multivariate statistics and other data analysis techniques for the differentiation of spectra from dyed fibers will also be discussed.
NASA Astrophysics Data System (ADS)
Shirzaei, M.; Walter, T. R.
2009-10-01
Modern geodetic techniques provide valuable and near real-time observations of volcanic activity. Characterizing the source of deformation based on these observations has become of major importance in related monitoring efforts. We investigate two random search approaches, simulated annealing (SA) and genetic algorithm (GA), and utilize them in an iterated manner. The iterated approach helps to prevent GA in general and SA in particular from getting trapped in local minima, and it also increases redundancy for exploring the search space. We apply a statistical competency test for estimating the confidence interval of the inversion source parameters, considering their internal interaction through the model, the effect of the model deficiency, and the observational error. Here, we present and test this new randomly iterated search and statistical competency (RISC) optimization method together with GA and SA for the modeling of data associated with volcanic deformations. Following synthetic and sensitivity tests, we apply the improved inversion techniques to two episodes of activity in the Campi Flegrei volcanic region in Italy, observed by the interferometric synthetic aperture radar technique. Inversion of these data allows derivation of deformation source parameters and their associated quality so that we can compare the two inversion methods. The RISC approach was found to be an efficient method in terms of computation time and search results and may be applied to other optimization problems in volcanic and tectonic environments.
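As a minimal illustration of the simulated annealing ingredient (a generic sketch; the paper's actual inversion operates on InSAR-derived deformation data and volcanic source models), here SA minimizes a two-dimensional test misfit with several local minima:

```python
import math, random

random.seed(3)

def misfit(x, y):
    # Smooth test function with local minima; stands in for the data misfit
    # of a deformation source model.
    return (x - 2) ** 2 + (y + 1) ** 2 + 2 * math.sin(3 * x) ** 2

x, y = 8.0, 8.0                     # deliberately poor starting model
best = (x, y, misfit(x, y))
T = 5.0
while T > 1e-3:
    xn, yn = x + random.gauss(0, 0.5), y + random.gauss(0, 0.5)
    d = misfit(xn, yn) - misfit(x, y)
    # Metropolis acceptance: uphill moves are allowed while T is high,
    # which is what lets the search escape local minima.
    if d < 0 or random.random() < math.exp(-d / T):
        x, y = xn, yn
        if misfit(x, y) < best[2]:
            best = (x, y, misfit(x, y))
    T *= 0.999                      # geometric cooling schedule

print(f"best model ({best[0]:.2f}, {best[1]:.2f}), misfit {best[2]:.3f}")
```

The RISC approach of the paper restarts searches like this iteratively and adds a statistical competency test to attach confidence intervals to the recovered source parameters.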
An application of artificial intelligence theory to reconfigurable flight control
NASA Technical Reports Server (NTRS)
Handelman, David A.
1987-01-01
Artificial intelligence techniques were used, along with statistical hypothesis testing and modern control theory, to help the pilot cope with the issues of information, knowledge, and capability in the event of a failure. An intelligent flight control system is being developed which utilizes knowledge of cause-and-effect relationships between all aircraft components. It will screen the information available to the pilot, supplement the pilot's knowledge, and, most importantly, utilize the remaining flight capability of the aircraft following a failure. The failure types the control system will accommodate include sensor failures, actuator failures, and structural failures.
Innovative Teaching Practice: Traditional and Alternative Methods (Challenges and Implications)
ERIC Educational Resources Information Center
Nurutdinova, Aida R.; Perchatkina, Veronika G.; Zinatullina, Liliya M.; Zubkova, Guzel I.; Galeeva, Farida T.
2016-01-01
The relevance of the present issue is caused by the strong need for alternative methods of learning a foreign language and for language training and retraining of modern professionals. The aim of the article is to identify the basic techniques and skills in using various modern techniques in the context of modern educational tasks. The…
Corradini, Stefanie; Ballhausen, Hendrik; Weingandt, Helmut; Freislederer, Philipp; Schönecker, Stephan; Niyazi, Maximilian; Simonetto, Cristoforo; Eidemüller, Markus; Ganswindt, Ute; Belka, Claus
2018-03-01
Modern breast cancer radiotherapy techniques, such as respiratory-gated radiotherapy in deep-inspiration breath-hold (DIBH) or volumetric-modulated arc radiotherapy (VMAT) have been shown to reduce the high dose exposure of the heart in left-sided breast cancer. The aim of the present study was to comparatively estimate the excess relative and absolute risks of radiation-induced secondary lung cancer and ischemic heart disease for different modern radiotherapy techniques. Four different treatment plans were generated for ten computed tomography data sets of patients with left-sided breast cancer, using either three-dimensional conformal radiotherapy (3D-CRT) or VMAT, in free-breathing (FB) or DIBH. Dose-volume histograms were used for organ equivalent dose (OED) calculations using linear, linear-exponential, and plateau models for the lung. A linear model was applied to estimate the long-term risk of ischemic heart disease as motivated by epidemiologic data. Excess relative risk (ERR) and 10-year excess absolute risk (EAR) for radiation-induced secondary lung cancer and ischemic heart disease were estimated for different representative baseline risks. The DIBH maneuver resulted in a significant reduction of the ERR and estimated 10-year excess absolute risk for major coronary events compared to FB in 3D-CRT plans (p = 0.04). In VMAT plans, the mean predicted risk reduction through DIBH was less pronounced and not statistically significant (p = 0.44). The risk of radiation-induced secondary lung cancer was mainly influenced by the radiotherapy technique, with no beneficial effect through DIBH. VMAT plans correlated with an increase in 10-year EAR for radiation-induced lung cancer as compared to 3D-CRT plans (DIBH p = 0.007; FB p = 0.005, respectively). However, the EARs were affected more strongly by nonradiation-associated risk factors, such as smoking, as compared to the choice of treatment technique. 
The results indicate that 3D-CRT plans in DIBH pose the lowest risk for both major coronary events and secondary lung cancer.
Impact of Different Surgeons on Dental Implant Failure.
Chrcanovic, Bruno Ramos; Kisch, Jenö; Albrektsson, Tomas; Wennerberg, Ann
To assess the influence of several factors on the prevalence of dental implant failure, with special consideration of the placement of implants by different dental surgeons. This retrospective study is based on 2,670 patients who received 10,096 implants at one specialist clinic. Only the data of patients and implants treated by surgeons who had inserted a minimum of 200 implants at the clinic were included. Kaplan-Meier curves were stratified with respect to the individual surgeon. A generalized estimating equation (GEE) method was used to account for the fact that repeated observations (several implants) were placed in a single patient. The factors bone quantity, bone quality, implant location, implant surface, and implant system were analyzed with descriptive statistics separately for each individual surgeon. A total of 10 surgeons were eligible. The differences between the survival curves of each individual were statistically significant. The multivariate GEE model showed the following variables to be statistically significant: surgeon, bruxism, intake of antidepressants, location, implant length, and implant system. The surgeon with the highest absolute number of failures was also the one who inserted the most implants in sites of poor bone and used turned implants in most cases, whereas the surgeon with the lowest absolute number of failures used mainly modern implants. Separate survival analyses of turned and modern implants stratified for the individual surgeon showed statistically significant differences in cumulative survival. Different levels of failure incidence could be observed between the surgeons, occasionally reaching significant levels. Although a direct causal relationship could not be ascertained, the results of the present study suggest that the surgeons' technique, skills, and/or judgment may negatively influence implant survival rates.
Applications of optical coherence tomography in the non-contact assessment of automotive paints
NASA Astrophysics Data System (ADS)
Lawman, Samuel; Zhang, Jinke; Williams, Bryan M.; Zheng, Yalin; Shen, Yao-Chun
2017-06-01
The multiple-layer paint systems on modern cars serve two end purposes: they protect against corrosion and give the desired visual appearance. To ensure consistent corrosion protection and appearance, suitable Quality Assurance (QA) measures on the final product are required. Various parameters (layer thickness and consistency, layer composition, flake statistics, surface profile and layer dryness) are of importance, each with specific techniques that can measure one or some of them, but no technique can measure all or most of them. Optical Coherence Tomography (OCT) is a 3D imaging technique with micrometre resolution. Since 2016, OCT measurements of layer thickness and consistency, layer composition fingerprint and flake statistics have been reported. In this paper we demonstrate two more novel applications of OCT to automotive paints. First, we use OCT to quantify unwanted surface texture, which leads to an "orange peel" visual defect. This was done by measuring the surface profiles of automotive paints, with an unoptimised precision of 37 nm over a lateral range of 7 mm, to quantify texture of less than 500 nm. Second, we demonstrate that OCT can measure how dry a coating layer is by measuring how fast it is still shrinking quasi-instantaneously, using Fourier phase sensitivity.
NASA Astrophysics Data System (ADS)
Schulz, Hans Martin; Thies, Boris; Chang, Shih-Chieh; Bendix, Jörg
2016-03-01
The mountain cloud forest of Taiwan can be delimited from other forest types using a map of the ground fog frequency. In order to create such a frequency map from remotely sensed data, an algorithm able to detect ground fog is necessary. Common techniques for ground fog detection based on weather satellite data cannot be applied to fog occurrences in Taiwan as they rely on several assumptions regarding cloud properties. Therefore a new statistical method for the detection of ground fog in mountainous terrain from MODIS Collection 051 data is presented. Due to the sharpening of input data using MODIS bands 1 and 2, the method provides fog masks in a resolution of 250 m per pixel. The new technique is based on negative correlations between optical thickness and terrain height that can be observed if a cloud that is relatively plane-parallel is truncated by the terrain. A validation of the new technique using camera data has shown that the quality of fog detection is comparable to that of another modern fog detection scheme developed and validated for the temperate zones. The method is particularly applicable to optically thinner water clouds. Beyond a cloud optical thickness of ≈ 40, classification errors significantly increase.
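The core idea — flagging a window as ground fog when cloud optical thickness correlates negatively with terrain height — can be sketched as follows; the threshold and the synthetic values are assumptions for illustration, not the paper's calibrated parameters:

```python
import numpy as np

def ground_fog_score(optical_thickness, terrain_height):
    """Pearson correlation between cloud optical thickness and terrain
    height inside a pixel window; a strongly negative value is consistent
    with a roughly plane-parallel cloud truncated by the terrain."""
    tau = np.asarray(optical_thickness, float).ravel()
    z = np.asarray(terrain_height, float).ravel()
    return np.corrcoef(tau, z)[0, 1]

# Synthetic window: the cloud thins where the terrain rises into it.
z = np.linspace(500.0, 1500.0, 50)                      # terrain height (m)
tau = 40.0 - 0.02 * z + np.random.default_rng(0).normal(0, 0.5, 50)
r = ground_fog_score(tau, z)
is_fog = r < -0.8   # hypothetical decision threshold
```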
Worku, Abebaw Gebeyehu; Tessema, Gizachew Assefa; Zeleke, Atinkut Alamirrew
2015-01-01
Introduction Accessing family planning can reduce a significant proportion of maternal, infant, and childhood deaths. In Ethiopia, use of modern contraceptive methods is low but increasing. This study aimed to analyze the trends and determinants of changes in modern contraceptive use over time among young married women in Ethiopia. Methods The study used data from the three Demographic and Health Surveys conducted in Ethiopia in 2000, 2005, and 2011. Young married women aged 15–24 years, with sample sizes of 2,157 in 2000, 1,904 in 2005, and 2,146 in 2011, were included. A logit-based decomposition analysis technique was used to analyze the factors contributing to the recent changes. STATA 12 was employed for data management and analyses. All calculations presented in this paper were weighted for the sampling probabilities and non-response. Complex sampling procedures were also considered during testing of statistical significance. Results Among young married women, modern contraceptive prevalence increased from 6% in 2000 to 16% in 2005 and to 36% in 2011. The decomposition analysis indicated that 34% of the overall change in modern contraceptive use was due to differences in women’s characteristics. Changes in the composition of young women’s characteristics according to age, educational status, religion, couple concordance on family size, and fertility preference were the major sources of this increase. Two-thirds of the increase in modern contraceptive use was due to differences in coefficients. Most importantly, the increase was due to change in contraceptive use behavior among the rural population (33%), among Orthodox Christians (16%), and among Protestants (4%). Conclusions Modern contraceptive use among young married women has shown a remarkable increase over the last decade in Ethiopia. Programmatic interventions targeting poor, younger (adolescent), illiterate, and Muslim women would help to maintain the increasing trend in modern contraceptive use.
PMID:25635389
ERIC Educational Resources Information Center
Gambrill, Eileen
2014-01-01
The "Diagnostic and Statistical Manual of Mental Disorders" (DSM) is one of the most successful technologies in modern times. In spite of well-argued critiques, the DSM and the idea of "mental illness" on which it is based flourish, with ever more (mis)behaviors labeled as brain diseases. Problems in living and related distress…
cudaMap: a GPU accelerated program for gene expression connectivity mapping.
McArt, Darragh G; Bankhead, Peter; Dunne, Philip D; Salto-Tellez, Manuel; Hamilton, Peter; Zhang, Shu-Dong
2013-10-11
Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Emerging 'omics' technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap.
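The flavour of the connectivity-mapping computation can be conveyed with a simplified signed-rank connection score; this is a hedged sketch, not sscMap's or cudaMap's exact statistic:

```python
def connection_score(reference_ranking, signature):
    """reference_ranking: genes ordered from most up- to most down-regulated
    in a reference (drug) expression profile.
    signature: {gene: +1 (up-regulated) or -1 (down-regulated)} query.
    Returns a score in [-1, 1]; a strongly negative score suggests the
    reference perturbation reverses the signature (a therapeutic lead)."""
    n = len(reference_ranking)
    # Signed rank value: +max at the top of the list, -max at the bottom.
    value = {g: (n - 1) / 2 - i for i, g in enumerate(reference_ranking)}
    raw = sum(sign * value[g] for g, sign in signature.items())
    # Normalize by the best achievable magnitude for a signature this size.
    max_raw = sum(sorted((abs(v) for v in value.values()),
                         reverse=True)[:len(signature)])
    return raw / max_raw
```

The GPU speed-up described in the paper comes from evaluating such scores (and their permutation-based significance) for thousands of reference profiles in parallel.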
Flexible use and technique extension of logistics management
NASA Astrophysics Data System (ADS)
Xiong, Furong
2011-10-01
As is well known, modern logistics originated in the United States, developed in Japan, matured in Europe, and is now expanding in China; this is the generally recognized track of modern logistics development. Owing to China's economic and technological development, and with the construction of the Shanghai International Shipping Center and the Shanghai Yangshan international deepwater port, China's modern logistics industry will develop at a strong, leap-forward pace and catch up with the level of modern logistics in developed Western countries. In this paper, the author explores the flexible use and extension of China's modern logistics management techniques, which has practical and guiding significance.
Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel
2017-06-15
Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
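The item response theory machinery such a test relies on can be sketched with the Rasch (1PL) model and maximum-information item selection, the core of computerised adaptive testing; the item difficulties below are illustrative only:

```python
import math

def p_correct(theta, b):
    """Rasch (1PL) model: probability that a person of ability theta
    answers an item of difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, item_bank):
    """Adaptive testing step: choose the item with maximum Fisher
    information I(theta) = p(1 - p), i.e. difficulty closest to the
    current ability estimate."""
    def info(b):
        p = p_correct(theta, b)
        return p * (1 - p)
    return max(item_bank, key=info)
```

After each response the ability estimate is updated and `next_item` is called again, so the test converges on items near the examinee's level.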
Data mining: childhood injury control and beyond.
Tepas, Joseph J
2009-08-01
Data mining is defined as the automatic extraction of useful, often previously unknown information from large databases or data sets. It has become a major part of modern life and is extensively used in industry, banking, government, and health care delivery. The process requires a data collection system that integrates input from multiple sources containing critical elements that define outcomes of interest. Appropriately designed data mining processes identify and adjust for confounding variables. The statistical modeling used to manipulate accumulated data may involve any number of techniques. As predicted results are periodically analyzed against those observed, the model is consistently refined to optimize precision and accuracy. Whether applying integrated sources of clinical data to inferential probabilistic prediction of risk of ventilator-associated pneumonia or population surveillance for signs of bioterrorism, it is essential that modern health care providers have at least a rudimentary understanding of what the concept means, how it basically works, and what it means to current and future health care.
[Attitude of pregnant women towards labour--study of forms of preparation and preferences].
Kosińska, Katarzyna; Krychowska, Alina; Wielgoś, Mirosław; Myszewska, Aleksandra; Przyboś, Andrzej
2005-12-01
The aim of this study was to assess the knowledge of alternative delivery techniques among pregnant women and their preferences concerning the course of labour. 275 women hospitalized in obstetric wards in Puck and in the 1st Clinic in Warsaw were surveyed between July 2003 and February 2004. The mean age of the women was 26 +/- 4.9 years; 55.7% were nulliparous and 44.3% multiparous. Student's t-test was used for statistical analysis. The majority of the surveyed women knew about alternative positions during delivery and possible analgesic techniques. 25.1% of the women had attended a labour school. 81.2% wanted to give birth in the hospital, 10% at home and 8.8% in the delivery room. 51.1% preferred waterbirth and 22.5% an obstetric chair--most of these women came from big cities, were better educated and had attended a labour school. Almost half of all the women were in favour of epidural anaesthesia during delivery. Caesarean section on request was supported by 13.8%. For 67.4% the presence of close relatives during labour was important. Labour school has a significant influence on women's knowledge and their preferences. Waterbirth and other modern delivery techniques are very popular among better educated women from big cities, while those with lower education from small cities and villages prefer "classic" labour. Therefore, promotion of modern delivery methods and active participation in labour should be concentrated on these groups of women. Nowadays, obstetric departments should ensure not only the safety of giving birth but also complete personal comfort for pregnant women.
State of the art in treatment of facial paralysis with temporalis tendon transfer.
Sidle, Douglas M; Simon, Patrick
2013-08-01
Temporalis tendon transfer is a technique for dynamic facial reanimation. Since its inception, nearly 80 years ago, it has undergone a wealth of innovation to produce the modern operation. The purpose of this review is to update the literature as to the current techniques and perioperative management of patients undergoing temporalis tendon transfer. The modern technique focuses on the minimally invasive approaches and aesthetic refinements to enhance the final product of the operation. The newest techniques as well as preoperative assessment and postoperative rehabilitation are discussed. When temporalis tendon transfer is indicated for facial reanimation, the modern operation offers a refined technique that produces an aesthetically acceptable outcome. Preoperative smile assessment and postoperative smile rehabilitation are necessary and are important adjuncts to a successful operation.
Network science in Egyptology.
Coulombe, Patrick; Qualls, Clifford; Kruszynski, Robert; Nerlich, Andreas; Bianucci, Raffaella; Harris, Richard; Mermier, Christine; Appenzeller, Otto
2012-01-01
Egyptology relies on traditional descriptive methods. Here we show that modern, Internet-based science and statistical methods can be applied to Egyptology. Two four-thousand-year-old sarcophagi in one tomb, one within the other, with the skeletal remains of a woman, gave us the opportunity to diagnose a congenital nervous system disorder in the absence of a living nervous system. The sarcophagi were discovered near Thebes, Egypt. They were well preserved and meticulously restored. The skeletal remains suggested that the woman, aged between 50 and 60 years, was Black, possibly of Nubian descent, and suffered from syringobulbia, a congenital cyst in the brain stem and upper spinal cord. We employed crowdsourcing, the anonymous responses of 204 Facebook users who performed a task of matching living persons' iris color with the iris color of the Udjat eyes, a decoration found on Egyptian sarcophagi, to confirm the ethnicities of the sarcophagus occupants. We used modern fMRI techniques to illustrate the putative extent of her lesion in the brain stem and upper spinal cord deduced from her skeletal remains. We compared, statistically, the right/left ratios (non-dimensional numbers) of the orbit height, orbit width, malar height and the infraorbital foramina with the same measures obtained from 32 ancient skulls excavated from the Fayum, north of Thebes. We found that these ratios were significantly different in this skull, indicating atrophy of cranial bones on the left. In this instance, Internet science and the use of modern neurologic research tools showed that ancient sarcophagus makers shaped and decorated their wares to fit the ethnicity of the prospective occupants of the sarcophagi. We also showed that, occasionally, human nervous system disease may be recognizable in the absence of a living nervous system.
Vecchiato, Giovanni; Astolfi, Laura; Tabarrini, Alessandro; Salinari, Serenella; Mattia, Donatella; Cincotti, Febo; Bianchi, Luigi; Sorrentino, Domenica; Aloise, Fabio; Soranzo, Ramon; Babiloni, Fabio
2010-01-01
The use of modern brain imaging techniques could be useful to understand which brain areas are involved in the observation of video clips related to commercial advertising, the support of political campaigns, and Public Service Announcements (PSAs). In this paper we describe the capability of tracking brain activity during the observation of commercials, political spots, and PSAs with advanced high-resolution EEG statistical techniques in the time and frequency domains in a group of normal subjects. We analyzed the statistically significant cortical spectral power activity in different frequency bands during the observation of a commercial video clip related to the use of a beer in a group of 13 normal subjects. In addition, a TV speech of the Prime Minister of Italy was analyzed in two groups of swing and "supporter" voters. Results suggested that the cortical activity during the observation of commercial spots could vary considerably across the spot. This suggests the possibility of removing the parts of the spot that are not particularly attractive by using those cerebral indexes. The cortical activity during the observation of the political speech indicated greater cortical activity in the supporter group when compared to the swing voters. In this case, it is possible to conclude that the proposed communication failed to raise attention or interest in swing voters. In conclusion, high-resolution EEG statistical techniques have proved able to generate useful insights about the particular fruition of TV messages, in both the commercial and political fields.
An introduction to real-time graphical techniques for analyzing multivariate data
NASA Astrophysics Data System (ADS)
Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner
1987-08-01
Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".
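The motion-graphics display of three dimensional scatterplots rests on repeatedly rotating and projecting the point cloud; a minimal sketch of that per-frame computation (not Orion I's actual implementation) is:

```python
import numpy as np

def rotate_y(points, angle):
    """Rotate an (n, 3) point cloud about the y axis; animating over small
    angle increments supplies the motion cue that conveys depth."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    return points @ R.T

def project(points):
    """Orthographic projection to 2D screen coordinates (drop z)."""
    return points[:, :2]
```

Each displayed frame is `project(rotate_y(cloud, angle))` for a slowly advancing `angle`, which is exactly the kind of real-time computation the paper attributes to raster graphics and microprocessor technology.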
The use of applied software for the professional training of students studying humanities
NASA Astrophysics Data System (ADS)
Sadchikova, A. S.; Rodin, M. M.
2017-01-01
Research practice is an integral part of the training of humanities students. In this regard, the training process for students studying humanities should include modern information techniques. This paper examines the most popular applied software products used for data processing in the social sciences. For testing purposes we selected the most commonly preferred professional packages: MS Excel, IBM SPSS Statistics, STATISTICA, and STADIA. Moreover, the article contains testing results for Prikladnoy Sotsiolog, a specialized software package applicable to the preparation stage of research. The specialized software was tested during one term in groups of students studying humanities.
Zeng, Irene Sui Lan; Lumley, Thomas
2018-01-01
Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For completeness of the review, a table of currently available software and packages for omics from 23 publications is summarized in the appendix.
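Penalization for sparsity induction in the p > n setting can be sketched with the lasso solved by iterative soft-thresholding (ISTA); the toy data and regularization strength below are illustrative assumptions:

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Lasso via iterative soft-thresholding (ISTA): minimizes
    0.5 * ||X b - y||^2 + lam * ||b||_1, giving exact zeros in b."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        z = beta - step * grad
        # Proximal (soft-thresholding) step induces sparsity.
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta

# p > n toy problem: 100 features, 30 observations, two true signals.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 100))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.1, 30)
beta = lasso_ista(X, y, lam=5.0)
```

The penalty drives most of the 100 coefficients exactly to zero while retaining the two informative features, which is the behaviour the review highlights for high-dimensional omics data.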
Dagnew, Amare Belachew; Tewabe, Tilahun; Murugan, Rajalakshmi
2018-05-29
Health seeking behavior is an action taken by an individual who perceives a health problem. In most developing countries, including Ethiopia, the health of children is strongly dependent on maternal health care behavior. Most childhood morbidities and mortalities are associated with a low level of mothers' health care seeking behavior. Therefore, the objective of this study was to assess the level of modern health care seeking behavior among mothers of under-five children in Dangila town, North West Ethiopia. A community based quantitative cross-sectional study was conducted from April 15 to May 15, 2016. A systematic random sampling technique was used to select study participants. A total of 273 mothers with children less than five years old were included in this study. The data were collected from all five Kebeles using an interviewer administered questionnaire. Descriptive and inferential statistics were used to present the data. Both bivariate and multivariate logistic regression analyses were used to identify factors associated with the level of modern health care seeking behavior. The prevalence of modern health care seeking behavior was 82.1%. Age of the mother (AOR = 2.4 (1.1, 5.3)), age of the child (AOR = 6.7 (2.8, 22.2)), severity of illness (AOR = 5.2 (1.2, 22.6)) and family size (AOR = 6.4 (2.1, 20.2)) were predictors of modern health care seeking behavior among mothers. The majority of the mothers preferred to take their children to modern health care when they became ill. Age of the child, age of the mother, family size and severity of illness were the determinant factors for modern health care seeking behavior. Therefore, health care services should be strengthened at the community level through community integrated management of childhood illness and information, education and communication / behavioral change communication strategies to improve mothers' health care seeking behavior.
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis
Steele, Joe; Bastola, Dhundy
2014-01-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base–base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel–Ziv techniques from data compression. PMID:23904502
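The D2 statistic mentioned above follows directly from word (k-mer) counts; a minimal sketch:

```python
from collections import Counter

def word_counts(seq, k):
    """Counts of all overlapping words (k-mers) in a sequence."""
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k):
    """D2 statistic: the inner product of the two k-mer count vectors,
    i.e. the number of matching k-mer occurrence pairs between sequences."""
    a, b = word_counts(seq_a, k), word_counts(seq_b, k)
    return sum(a[w] * b[w] for w in a.keys() & b.keys())
```

In practice D2 is usually normalized (e.g. D2* or D2S) to correct for background word frequencies, but the raw inner product above is the alignment-free core the review discusses.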
Woods and Russell, Hill, and the emergence of medical statistics
Farewell, Vern; Johnson, Tony
2010-01-01
In 1937, Austin Bradford Hill wrote Principles of Medical Statistics (Lancet: London, 1937) that became renowned throughout the world and is widely associated with the birth of modern medical statistics. Some 6 years earlier Hilda Mary Woods and William Thomas Russell, colleagues of Hill at the London School of Hygiene and Tropical Medicine, wrote a similar book An Introduction to Medical Statistics (PS King and Son: London, 1931) that is little known today. We trace the origins of these two books from the foundations of early demography and vital statistics, and make a detailed examination of some of their chapters. It is clear that these texts mark a watershed in the history of medical statistics that demarcates the vital statistics of the nineteenth and early twentieth centuries from the modern discipline. Moreover, we consider that the book by Woods and Russell is of some importance in the development of medical statistics and we describe and acknowledge their place in the history of this discipline. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20535761
NASA Astrophysics Data System (ADS)
Broothaerts, Nils; López-Sáez, José Antonio; Verstraeten, Gert
2017-04-01
Reconstructing and quantifying human impact is an important step towards understanding human-environment interactions in the past. Quantitative measures of human impact on the landscape are needed to fully understand the long-term influence of anthropogenic land cover changes on the global climate, ecosystems and geomorphic processes. Nevertheless, quantifying past human impact is not straightforward. Recently, multivariate statistical analysis of fossil pollen records has been proposed to characterize vegetation changes and to gain insights into past human impact. Although statistical analysis of fossil pollen data can provide useful insights into anthropogenically driven vegetation changes, it cannot by itself be used as an absolute quantification of past human impact. To overcome this shortcoming, in this study fossil pollen records were included in a multivariate statistical analysis (cluster analysis and non-metric multidimensional scaling (NMDS)) together with modern pollen data and modern vegetation data. The information in the modern pollen and vegetation dataset allows a better interpretation of the representativeness of the fossil pollen records, and can result in a full quantification of human impact in the past. This methodology was applied in two contrasting environments: SW Turkey and Central Spain. For each region, fossil pollen data from different study sites were integrated, together with modern pollen data and information on modern vegetation. In this way, arboreal cover, grazing pressure and agricultural activities in the past were reconstructed and quantified. The data from SW Turkey provide new integrated information on changing human impact through time in the Sagalassos territory, and show that human impact was most intense during the Hellenistic and Roman Period (ca. 2200-1750 cal a BP) and decreased and changed in nature afterwards.
The data from central Spain show, for several sites, that arboreal cover decreases below 5% from the Feudal period onwards (ca. 850 cal a BP), related to increasing human impact on the landscape. At other study sites arboreal cover remained above 25% despite significant human impact. Overall, the presented examples from two contrasting environments show how cluster analysis and NMDS of modern and fossil pollen data can help to provide quantitative insights into anthropogenic land cover changes. Our study extensively discusses and illustrates the possibilities and limitations of statistical analysis of pollen data for quantifying human-induced land use changes.
van der Ploeg, Tjeerd; Austin, Peter C; Steyerberg, Ewout W
2014-12-22
Modern modelling techniques may potentially provide more accurate predictions of binary outcomes than classical techniques. We aimed to study the predictive performance of different modelling techniques in relation to the effective sample size ("data hungriness"). We performed simulation studies based on three clinical cohorts: 1282 patients with head and neck cancer (with 46.9% 5 year survival), 1731 patients with traumatic brain injury (22.3% 6 month mortality) and 3181 patients with minor head injury (7.6% with CT scan abnormalities). We compared three relatively modern modelling techniques: support vector machines (SVM), neural nets (NN), and random forests (RF) and two classical techniques: logistic regression (LR) and classification and regression trees (CART). We created three large artificial databases with 20 fold, 10 fold and 6 fold replication of subjects, where we generated dichotomous outcomes according to different underlying models. We applied each modelling technique to increasingly larger development parts (100 repetitions). The area under the ROC-curve (AUC) indicated the performance of each model in the development part and in an independent validation part. Data hungriness was defined by plateauing of AUC and small optimism (difference between the mean apparent AUC and the mean validated AUC <0.01). We found that a stable AUC was reached by LR at approximately 20 to 50 events per variable, followed by CART, SVM, NN and RF models. Optimism decreased with increasing sample sizes and the same ranking of techniques. The RF, SVM and NN models showed instability and a high optimism even with >200 events per variable. Modern modelling techniques such as SVM, NN and RF may need over 10 times as many events per variable to achieve a stable AUC and a small optimism than classical modelling techniques such as LR. This implies that such modern techniques should only be used in medical prediction problems if very large data sets are available.
Barri, Fernando
2018-01-01
Guanacos (Lama guanicoe) are large ungulates that have been valued by human populations in South America since the Late Pleistocene. Even though they were very abundant until the end of the 19th century (before the high deforestation rate of the last decades), guanacos have nearly disappeared in the Gran Chaco ecoregion, with relicts and isolated populations surviving in some areas, such as the shrubland area near the saline depressions of Córdoba province, Argentina. In this report, we present the first data from a locally endangered guanaco wild population, through the study of skeletal remains recovered in La Providencia ranch. Our results showed that most of the elements belonged to adults aged between 36 and 96 months; sex evaluation showed similar numbers of males and females. Statistical analysis of the body size of modern samples from Córdoba demonstrated that guanacos from the Chaco had large dimensions and presented lower size variability than the modern and archaeological specimens in our database. Moreover, they exhibited dimensions similar to those of modern guanacos from Patagonia and San Juan, and to archaeological specimens from Ongamira and Cerro Colorado, although further genetic studies are needed to corroborate a possible phylogenetic relationship. Finally, we used archaeozoological techniques to provide a first characterization of a relict guanaco population from the Chaco ecoregion, demonstrating its value to the study of modern skeletal remains and species conservation biology. PMID:29641579
Problematizing Statistical Literacy: An Intersection of Critical and Statistical Literacies
ERIC Educational Resources Information Center
Weiland, Travis
2017-01-01
In this paper, I problematize traditional notions of statistical literacy by juxtaposing it with critical literacy. At the school level statistical literacy is vitally important for students who are preparing to become citizens in modern societies that are increasingly shaped and driven by data based arguments. The teaching of statistics, which is…
Teaching Statistics Online: A Decade's Review of the Literature about What Works
ERIC Educational Resources Information Center
Mills, Jamie D.; Raju, Dheeraj
2011-01-01
A statistics course can be a very challenging subject to teach. To enhance learning, today's modern course in statistics might incorporate many different aspects of technology. Due to advances in technology, teaching statistics online has also become a popular course option. Although researchers are studying how to deliver statistics courses in…
A Survey of Image Encryption Algorithms
NASA Astrophysics Data System (ADS)
Kumari, Manju; Gupta, Shailender; Sardana, Pranshul
2017-12-01
Security of data/images is one of the crucial aspects of the gigantic and still expanding domain of digital transfer. Encryption of images is one of the well-known mechanisms for preserving the confidentiality of images sent over unrestricted public media, which are vulnerable to attack; efficient encryption algorithms are therefore a necessity for secure data transfer. Various techniques have been proposed in the literature to date, each with an edge over the others, to keep up with the ever growing need for security. This paper is an effort to compare the most popular techniques on the basis of various performance metrics, such as differential, statistical and quantitative attack analyses. To measure their efficacy, all the modern and mature techniques were implemented in MATLAB-2015. The results show that the chaotic schemes used in the study produce highly scrambled encrypted images with uniform histogram distributions. In addition, the encrypted images exhibited very low correlation coefficient values in the horizontal, vertical and diagonal directions, proving their resistance to statistical attacks. These schemes are also able to resist differential attacks, as they showed a high sensitivity to initial conditions, i.e. pixel and key values. Finally, the schemes provide a large key space, and hence can resist brute force attacks, and required very little computational time for image encryption/decryption in comparison to other schemes available in the literature.
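The adjacent-pixel correlation metric mentioned above is straightforward to compute. The sketch below is illustrative only, using synthetic "plain" and "cipher" images rather than outputs of any of the surveyed schemes: a smooth image shows correlation near 1 between neighbouring pixels, while an ideally scrambled image shows correlation near 0.

```python
import numpy as np

def adjacent_corr(img):
    # Pearson correlation between horizontally adjacent pixel pairs
    a = img[:, :-1].ravel().astype(float)
    b = img[:, 1:].ravel().astype(float)
    return float(np.corrcoef(a, b)[0, 1])

rng = np.random.default_rng(1)
plain = np.cumsum(rng.normal(size=(64, 64)), axis=1)  # smooth rows: high correlation
cipher = rng.integers(0, 256, size=(64, 64))          # ideal cipher: uniform noise
print(f"plain: {adjacent_corr(plain):.3f}, cipher: {adjacent_corr(cipher):.3f}")
```

The same computation on vertically or diagonally shifted pairs gives the other two directions reported in such surveys.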
Research on an innovative design model
NASA Astrophysics Data System (ADS)
Fu, Y.; Fang, H.
2018-03-01
Furniture design methods differ between East and West, and this contrast has long been a focus of scholarly attention. However, in terms of the theory of modern design innovation, neither the early creation theories, modern design theory, nor the widely applied TRIZ theory can fully accommodate modern furniture design innovation, so studying modern furniture design theory is an urgent task. Based on the ideas of TRIZ theory and using a large body of literature as data, this paper applies the method of statistical stratification to analyze and organize research on modern seating, and finally puts forward a modern furniture design model, which provides new ideas and perspectives for the modern design of Chinese furniture.
An Introduction to Modern Missing Data Analyses
ERIC Educational Resources Information Center
Baraldi, Amanda N.; Enders, Craig K.
2010-01-01
A great deal of recent methodological research has focused on two modern missing data analysis methods: maximum likelihood and multiple imputation. These approaches offer advantages over traditional techniques (e.g., deletion and mean imputation) because they require less stringent assumptions and mitigate the pitfalls of traditional…
Population growth rates: issues and an application.
Godfray, H Charles J; Rees, Mark
2002-01-01
Current issues in population dynamics are discussed in the context of The Royal Society Discussion Meeting 'Population growth rate: determining factors and role in population regulation'. In particular, different views on the centrality of population growth rates to the study of population dynamics and the role of experiments and theory are explored. Major themes emerging include the role of modern statistical techniques in bringing together experimental and theoretical studies, the importance of long-term experimentation and the need for ecology to have model systems, and the value of population growth rate as a means of understanding and predicting population change. The last point is illustrated by the application of a recently introduced technique, integral projection modelling, to study the population growth rate of a monocarpic perennial plant, its elasticities to different life-history components and the evolution of an evolutionarily stable strategy size at flowering. PMID:12396521
Uncertainty Management for Diagnostics and Prognostics of Batteries using Bayesian Techniques
NASA Technical Reports Server (NTRS)
Saha, Bhaskar; Goebel, Kai
2007-01-01
Uncertainty management has always been the key hurdle faced by diagnostics and prognostics algorithms. A Bayesian treatment of this problem provides an elegant and theoretically sound approach to the modern Condition-Based Maintenance (CBM)/Prognostic Health Management (PHM) paradigm. The application of Bayesian techniques to regression and classification in the form of the Relevance Vector Machine (RVM), and to state estimation as in Particle Filters (PF), provides a powerful tool for integrating the diagnosis and prognosis of battery health. The RVM, which is a Bayesian treatment of the Support Vector Machine (SVM), is used for model identification, while the PF framework uses the learnt model, statistical estimates of noise and anticipated operational conditions to provide estimates of remaining useful life (RUL) in the form of a probability density function (PDF). This type of prognostics adds significant value to the management of any operation involving electrical systems.
NASA Astrophysics Data System (ADS)
Lawman, A. E.; Quinn, T. M.; Partin, J. W.; Taylor, F. W.; Thirumalai, K.; Wu, C. C.; Shen, C. C.
2017-12-01
The Medieval Climate Anomaly (MCA: 950-1250 CE) is identified as a period during the last 2 millennia with Northern Hemisphere surface temperatures similar to the present. However, our understanding of tropical climate variability during the MCA is poorly constrained due to a lack of sub-annually resolved proxy records. We investigate seasonal and interannual variability during the MCA using geochemical records developed from two well preserved Porites lutea fossilized corals from the tropical southwest Pacific (Tasmaloum, Vanuatu; 15.6°S, 166.9°E). Absolute U/Th dates of 1127.1 ± 2.7 CE and 1105.1 ± 3.0 CE indicate that the selected fossil corals lived during the MCA. We use paired coral Sr/Ca and δ18O measurements to reconstruct sea surface temperature (SST) and the δ18O of seawater (a proxy for salinity). To provide context for the fossil coral records and test whether the mean state and climate variability at Vanuatu during the MCA is similar to the modern climate, our analysis also incorporates two modern coral records from Sabine Bank (15.9°S, 166.0°E) and Malo Channel (15.7°S, 167.2°E), Vanuatu for comparison. We quantify the uncertainty in our modern and fossil coral SST estimates via replication with multiple, overlapping coral records. Both the modern and fossil corals reproduce their respective mean SST value over their common period of overlap, which is 25 years in both cases. Based on over 100 years of monthly Sr/Ca data from each time period, we find that SSTs at Vanuatu during the MCA are 1.3 ± 0.7°C cooler relative to the modern. We also find that the median amplitude of the annual cycle is 0.8 ± 0.3°C larger during the MCA relative to the modern. Multiple data analysis techniques, including the standard deviation and the difference between the 95th and 5th percentiles of the annual SST cycle estimates, also show that the MCA has greater annual SST variability relative to the modern. 
Stable isotope data acquisition is ongoing, and when complete we will have a suite of records of paired coral Sr/Ca and δ18O measurements. We will apply similar statistical techniques developed for the Sr/Ca-SST record to also investigate variability in the δ18O of seawater (salinity). Modern salinity variability at Vanuatu arises due to hydrological anomalies associated with the El Niño-Southern Oscillation in the tropical Pacific.
Statistics: Can We Get beyond Terminal?
ERIC Educational Resources Information Center
Green, Suzy; Carney, JoLynn V.
Recent articles in behavioral sciences statistics literature address the need for modernizing graduate statistics programs and courses. This paper describes the development of one such course and evaluates student background for a class designed to provide a more consumer-oriented type of statistics instruction by focusing on the needs of students…
Yoga and mental health: A dialogue between ancient wisdom and modern psychology
Vorkapic, Camila Ferreira
2016-01-01
Background: Many yoga texts make reference to the importance of mental health and the use of specific techniques in the treatment of mental disorders. Several concepts utilized in modern psychology may not originate from contemporary ideas; instead, they seem to share a common root with ancient wisdom. Aims: The goal of this perspective article is to correlate modern techniques used in psychology and psychiatry with yogic practices in the treatment of mental disorders. Materials and Methods: The current article presents a dialogue between the yogic approach to the treatment of mental disorders and concepts used in modern psychology, such as meta-cognition, disidentification, deconditioning and interoceptive exposure. Conclusions: Contemplative research has found that modern interventions in psychology might not derive from modern concepts after all, but share great similarity with ancient yogic knowledge, giving us the opportunity to integrate the psychological wisdom of both East and West. PMID:26865774
Advances in Modern Botnet Understanding and the Accurate Enumeration of Infected Hosts
ERIC Educational Resources Information Center
Nunnery, Christopher Edward
2011-01-01
Botnets remain a potent threat due to evolving modern architectures, inadequate remediation methods, and inaccurate measurement techniques. In response, this research exposes the architectures and operations of two advanced botnets, techniques to enumerate infected hosts, and pursues the scientific refinement of infected-host enumeration data by…
Experiments with recursive estimation in astronomical image processing
NASA Technical Reports Server (NTRS)
Busko, I.
1992-01-01
Recursive estimation concepts have been applied to image enhancement problems since the 1970s. However, very few applications in the particular area of astronomical image processing are known. These concepts were derived, for 2-dimensional images, from the well-known theory of Kalman filtering in one dimension. The historic reasons for applying these techniques to digital images are related to the images' scanned nature, in which the temporal output of a scanner device can be processed on-line by techniques borrowed directly from 1-dimensional recursive signal analysis. However, recursive estimation has particular properties that make it attractive even in modern days, when large computer memories make the full scanned image available to the processor at any given time. One particularly important aspect is the ability of recursive techniques to deal with non-stationary phenomena, that is, phenomena whose statistical properties vary in time (or position in a 2-D image). Many image processing methods make underlying stationarity assumptions, either about the stochastic field being imaged, about the imaging system properties, or both. They will underperform, or even fail, when applied to images that deviate significantly from stationarity. Recursive methods, on the contrary, make it feasible to perform adaptive processing, that is, to process the image with a processor whose properties are tuned to the image's local statistical properties. Recursive estimation can be used to build estimates of images degraded by such phenomena as noise and blur. We show examples of recursive adaptive processing of astronomical images, using several local statistical properties to drive the adaptive processor, such as average signal intensity, signal-to-noise ratio and the autocorrelation function. The software was developed under IRAF, and as such will be made available to interested users.
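As one concrete instance of adaptive processing driven by local statistics, the sketch below implements a Lee-style filter: pixels in regions whose local variance is near an assumed noise variance are pulled toward the local mean, while high-variance (detail) regions are left largely untouched. This is an illustration of the general idea only, not the IRAF software described in the abstract, and the window size and noise variance are assumed parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_smooth(img, size=5, noise_var=1.0):
    # Lee-style adaptive filter: gain -> 0 where local variance ~ noise floor
    # (heavy smoothing), gain -> 1 where local variance is high (detail kept)
    mean = uniform_filter(img, size)
    var = uniform_filter(img * img, size) - mean * mean
    gain = np.clip((var - noise_var) / np.maximum(var, 1e-12), 0.0, 1.0)
    return mean + gain * (img - mean)

rng = np.random.default_rng(2)
flat = 100.0 + rng.normal(scale=1.0, size=(128, 128))  # flat field + unit noise
smoothed = adaptive_smooth(flat, noise_var=1.0)
print(f"variance before: {flat.var():.3f}, after: {smoothed.var():.3f}")
```

Replacing the local variance with, say, a local signal-to-noise estimate or autocorrelation measure gives the other adaptive drivers mentioned in the abstract.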
Kriz, J; Baues, C; Engenhart-Cabillic, R; Haverkamp, U; Herfarth, K; Lukas, P; Schmidberger, H; Marnitz-Schulze, S; Fuchs, M; Engert, A; Eich, H T
2017-02-01
Field designs in radiotherapy (RT) of Hodgkin's lymphoma (HL) have changed substantially, from extended-field RT (EF-RT) to involved-field RT (IF-RT) and now to involved-node RT (IN-RT) and involved-site RT (IS-RT), as have treatment techniques. The purpose of this article is to describe the establishment of a quality assurance program (QAP) covering modern RT techniques and field designs within the German Hodgkin Study Group (GHSG). In the era of modern conformal RT, this QAP had to be fundamentally adapted, and a new evaluation process was intensively discussed by the radiotherapeutic expert panel of the GHSG. The expert panel developed guidelines and criteria to analyse "modern" field designs and treatment techniques. This work is based on a dataset of 11 patients treated within the sixth study generation (HD16-17). To develop a QAP of "modern" RT, the expert panel defined criteria for analysing current RT procedures. The consensus on a modified QAP for ongoing and future trials is presented. With this schedule, the QAP of the GHSG could serve as a model for other study groups.
Multivariate analysis techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bendavid, Josh; Fisher, Wade C.; Junk, Thomas R.
2016-01-01
The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
Professional Competence of a Teacher in Higher Educational Institution
ERIC Educational Resources Information Center
Abykanova, Bakytgul; Tashkeyeva, Gulmira; Idrissov, Salamat; Bilyalova, Zhupar; Sadirbekova, Dinara
2016-01-01
Modern reality brings certain corrections to the understanding of forms and methods of teaching various courses in higher educational institution. A special role among the educational techniques and means in the college educational environment is taken by the modern technologies, such as using the techniques, means and ways, which are aimed at…
An Investigative Graduate Laboratory Course for Teaching Modern DNA Techniques
ERIC Educational Resources Information Center
de Lencastre, Alexandre; Torello, A. Thomas; Keller, Lani C.
2017-01-01
This graduate-level DNA methods laboratory course is designed to model a discovery-based research project and engages students in both traditional DNA analysis methods and modern recombinant DNA cloning techniques. In the first part of the course, students clone the "Drosophila" ortholog of a human disease gene of their choosing using…
ERIC Educational Resources Information Center
Fitzgerald, Mary
2017-01-01
This article reflects on the ways in which socially engaged arts practices can contribute to reconceptualizing the contemporary modern dance technique class as a powerful site of social change. Specifically, the author considers how incorporating socially engaged practices into pedagogical models has the potential to foster responsible citizenship…
New Trends in E-Science: Machine Learning and Knowledge Discovery in Databases
NASA Astrophysics Data System (ADS)
Brescia, Massimo
2012-11-01
Data mining, or Knowledge Discovery in Databases (KDD), as the main methodology for extracting the scientific information contained in Massive Data Sets (MDS), needs to tackle crucial problems, since it has to orchestrate the complex challenges posed by transparent access to different computing environments, scalability of algorithms and reusability of resources. To achieve a leap forward for the progress of e-science in the data avalanche era, the community needs to implement an infrastructure capable of performing data access, processing and mining in a distributed but integrated context. The increasing complexity of modern technologies has driven a huge production of data, and the associated warehouse management, together with the need to optimize analysis and mining procedures, has led to a change of concept in modern science. Classical data exploration, based on a user's own local data storage and limited computing infrastructure, is no longer efficient in the case of MDS spread worldwide over inhomogeneous data centres and requiring teraflop processing power. In this context, modern experimental and observational science requires a good understanding of computer science, network infrastructures, data mining, etc., i.e. of all those techniques which fall into the domain of so-called e-science (recently assessed also by the Fourth Paradigm of Science). Such understanding is almost completely absent in the older generations of scientists, and this is reflected in the inadequacy of most academic and research programs. A paradigm shift is needed: statistical pattern recognition, object-oriented programming, distributed computing and parallel programming need to become an essential part of the scientific background. A possible practical solution is to provide the research community with easy-to-understand, easy-to-use tools, based on Web 2.0 technologies and machine learning methodology.
Tools where almost all the complexity is hidden from the final user, but which are still flexible and able to produce efficient and reliable scientific results. All these considerations will be described in detail in the chapter. Moreover, examples of modern applications offering a wide variety of e-science communities a large spectrum of computational facilities to exploit the wealth of available massive data sets and powerful machine learning and statistical algorithms will also be introduced.
Mohammed, Abdurahman; Woldeyohannes, Desalegn; Feleke, Amsalu; Megabiaw, Berihun
2014-02-03
Ethiopia is the second most populous country in Africa, with high fertility and a fast population growth rate. It is also one of the countries with high maternal and child mortality rates in sub-Saharan Africa. Family planning is a crucial strategy to halt the fast population growth, to reduce child mortality and to improve maternal health (Millennium Development Goals 4 and 5). Therefore, this study aimed to assess the prevalence and determinants of modern contraceptive utilization among married women of reproductive age. A community based cross-sectional study was conducted from August 15 to September 1, 2010 among married women aged 15-49 years in Debre Birhan District. A multistage sampling technique was used to select a total of 851 study participants. A pre-tested structured questionnaire was used for gathering data. Bivariate and multivariate logistic regression analyses were performed using the SPSS version 16.0 statistical package. The modern contraceptive prevalence rate among currently married women was 46.9%. Injectable contraceptives were the most frequently used method (62.9%), followed by the intrauterine device (16.8%), pills (14%), Norplant (4.3%), male condom (1.2%) and female sterilization (0.8%). A multiple logistic regression model revealed that the need for more children (AOR 9.27, 95% CI 5.43-15.84), husband's approval (AOR 2.82, 95% CI 1.67-4.80) and couple's discussion of family planning issues (AOR 7.32, 95% CI 3.60-14.86) were significantly associated with modern contraceptive use. Similarly, monthly family income and number of living children were significantly associated with the use of modern contraceptives. Modern contraceptive use was high in the district. Couple's discussion and husband's approval of contraceptive use were significantly associated with the use of modern contraceptives. Therefore, the district health office and concerned stakeholders should focus on couples to encourage communication and male involvement in family planning.
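The adjusted odds ratios (AOR) reported above come from multiple logistic regression, where each AOR is the exponential of a fitted coefficient. The sketch below illustrates the mechanics on synthetic data: the exposure, true odds ratio and sample size are invented assumptions, and scikit-learn stands in for SPSS.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 20000
true_or = 3.0  # hypothetical odds ratio for a binary exposure

x = rng.integers(0, 2, size=n)          # binary exposure (e.g., approval yes/no)
logit = -1.0 + np.log(true_or) * x      # assumed logistic data-generating model
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

# near-unpenalized fit so exp(coefficient) estimates the odds ratio
model = LogisticRegression(C=1e6).fit(x.reshape(-1, 1), y)
est_or = float(np.exp(model.coef_[0, 0]))
print(f"estimated odds ratio: {est_or:.2f}")
```

With several predictors fitted jointly, each exponentiated coefficient becomes an odds ratio adjusted for the others, which is what the AOR values in the abstract denote.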
NASA Astrophysics Data System (ADS)
Golmohammadi, A.; Jafarpour, B.; M Khaninezhad, M. R.
2017-12-01
Calibration of heterogeneous subsurface flow models leads to ill-posed nonlinear inverse problems, where too many unknown parameters are estimated from limited response measurements. When the underlying parameters form complex (non-Gaussian) structured spatial connectivity patterns, classical variogram-based geostatistical techniques cannot describe the underlying connectivity patterns. Modern pattern-based geostatistical methods that incorporate higher-order spatial statistics are more suitable for describing such complex spatial patterns. Moreover, when the underlying unknown parameters are discrete (geologic facies distribution), conventional model calibration techniques that are designed for continuous parameters cannot be applied directly. In this paper, we introduce a novel pattern-based model calibration method to reconstruct discrete and spatially complex facies distributions from dynamic flow response data. To reproduce complex connectivity patterns during model calibration, we impose a feasibility constraint to ensure that the solution follows the expected higher-order spatial statistics. For model calibration, we adopt a regularized least-squares formulation, involving data mismatch, pattern connectivity, and feasibility constraint terms. Using an alternating directions optimization algorithm, the regularized objective function is divided into a continuous model calibration problem, followed by mapping the solution onto the feasible set. The feasibility constraint to honor the expected spatial statistics is implemented using a supervised machine learning algorithm. The two steps of the model calibration formulation are repeated until the convergence criterion is met. Several numerical examples are used to evaluate the performance of the developed method.
[Applications of the hospital statistics management system].
Zhai, Hong; Ren, Yong; Liu, Jing; Li, You-Zhang; Ma, Xiao-Long; Jiao, Tao-Tao
2008-01-01
The Hospital Statistics Management System is built on an office automation platform of the Shandong provincial hospital system. Its workflow, role and permission technologies are used to standardize and optimize the statistics management program in the total quality control of hospital statistics. The system's applications have combined the office automation platform with statistics management in a hospital, and this provides a practical example of a modern hospital statistics management model.
Write-Skewed: Writing in an Introductory Statistics Course
ERIC Educational Resources Information Center
Delcham, Hendrick; Sezer, Renan
2010-01-01
Statistics is used in almost every facet of our daily lives: crime reports, election results, environmental/climate change, advances in business, financial planning, and progress in multifarious research. Although understanding statistics is essential for efficient functioning in the modern world (Cerrito 1996), students often do not grasp…
[Current status and trends in the health of the Moscow population].
Tishuk, E A; Plavunov, N F; Soboleva, N P
1997-01-01
Based on a vast and comprehensive medical statistical database, the authors analyze the health status of the population and the efficacy of the public health service in Moscow. The pre-crisis tendencies and the current status of public health under present-day socioeconomic conditions are noted.
ERIC Educational Resources Information Center
Ramamurthy, Karthikeyan Natesan; Hinnov, Linda A.; Spanias, Andreas S.
2014-01-01
Modern data collection in the Earth Sciences has propelled the need for understanding signal processing and time-series analysis techniques. However, there is an educational disconnect in the lack of instruction of time-series analysis techniques in many Earth Science academic departments. Furthermore, there are no platform-independent freeware…
NASA Astrophysics Data System (ADS)
Figueroa, M. C.; Gregory, D. D.; Lyons, T. W.; Williford, K. H.
2017-12-01
Life processes affect trace element abundances in pyrite such that sedimentary and hydrothermal pyrite have significantly different trace element signatures. Thus, we propose that these biogeochemical data could be used to identify pyrite that formed biogenically, either early in our planet's history or on other planets, particularly Mars. The potential for this approach is elevated because pyrite is common in diverse sedimentary settings, and its trace element content can be preserved despite secondary overprints up to greenschist facies, thus minimizing the concerns about remobilization that can plague traditional whole-rock studies. We are also including in-situ sulfur isotope analysis to further refine our understanding of the complex signatures of ancient pyrite. Sulfur isotope data can point straightforwardly to the involvement of life, because pyrite in sediments is inextricably linked to bacterial sulfate reduction and its diagnostic isotopic expressions. In addition to analyzing pyrite of known biological origin formed in the modern and ancient oceans under a range of conditions, we are building a data set for pyrite formed by hydrothermal and metamorphic processes to minimize the risk of false positives in life detection. We have used Random Forests (RF), a machine learning statistical technique with proven efficiency for classifying large geological datasets, to classify pyrite into biotic and abiotic end members. Coupling the trace element and sulfur isotope data from our analyses with a large existing dataset from diverse settings has yielded 4500 analyses with 18 different variables. Our initial results reveal the promise of the RF approach, correctly identifying biogenic pyrite 97 percent of the time. We will continue to couple new in-situ S-isotope and trace element analyses of biogenic pyrite grains from modern and ancient environments, using cutting-edge microanalytical techniques, with new data from high temperature settings.
Our ultimate goal is a refined search tool with straightforward application in the search for early life on Earth and for distant life recorded in meteorites, returned samples and in situ measurements.
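A minimal sketch of the Random Forests classification step described above, using synthetic trace-element data: the number of elements, their distributions and the class separation are assumptions for illustration, not the authors' 4500-analysis, 18-variable dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 400
# hypothetical trace-element concentrations (4 elements, e.g. Co, Ni, As, Se),
# log-normally distributed with shifted means between the two origins
sedimentary = rng.lognormal(mean=1.0, sigma=0.5, size=(n, 4))   # "biotic"
hydrothermal = rng.lognormal(mean=2.0, sigma=0.5, size=(n, 4))  # "abiotic"
X = np.vstack([sedimentary, hydrothermal])
y = np.r_[np.zeros(n), np.ones(n)]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(clf, X, y, cv=5).mean()
print(f"cross-validated accuracy: {acc:.3f}")
```

With real data, `clf.feature_importances_` would additionally indicate which trace elements carry the most discriminating power between end members.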
cudaMap: a GPU accelerated program for gene expression connectivity mapping
2013-01-01
Background Modern cancer research often involves large datasets and the use of sophisticated statistical techniques. Together these add a heavy computational load to the analysis, which is often coupled with issues surrounding data accessibility. Connectivity mapping is an advanced bioinformatic and computational technique dedicated to therapeutics discovery and drug re-purposing around differential gene expression analysis. On a normal desktop PC, it is common for the connectivity mapping task with a single gene signature to take > 2h to complete using sscMap, a popular Java application that runs on standard CPUs (Central Processing Units). Here, we describe new software, cudaMap, which has been implemented using CUDA C/C++ to harness the computational power of NVIDIA GPUs (Graphics Processing Units) to greatly reduce processing times for connectivity mapping. Results cudaMap can identify candidate therapeutics from the same signature in just over thirty seconds when using an NVIDIA Tesla C2050 GPU. Results from the analysis of multiple gene signatures, which would previously have taken several days, can now be obtained in as little as 10 minutes, greatly facilitating candidate therapeutics discovery with high throughput. We are able to demonstrate dramatic speed differentials between GPU assisted performance and CPU executions as the computational load increases for high accuracy evaluation of statistical significance. Conclusion Emerging ‘omics’ technologies are constantly increasing the volume of data and information to be processed in all areas of biomedical research. Embracing the multicore functionality of GPUs represents a major avenue of local accelerated computing. cudaMap will make a strong contribution in the discovery of candidate therapeutics by enabling speedy execution of heavy duty connectivity mapping tasks, which are increasingly required in modern cancer research. 
cudaMap is open source and can be freely downloaded from http://purl.oclc.org/NET/cudaMap. PMID:24112435
NASA Astrophysics Data System (ADS)
Li, J. D.; Spasojevic, M.; Inan, U. S.
2015-10-01
Wave injection experiments provide an opportunity to explore and quantify aspects of nonlinear wave-particle phenomena in a controlled manner. Waves are injected into space from ground-based ELF/VLF transmitters, and the modified waves are measured by radio receivers on the ground in the conjugate hemisphere. These experiments are expensive and challenging projects to build and to operate, and the transmitted waves are not always detected in the conjugate region. Even the powerful transmitter located at Siple Station, Antarctica in 1986, estimated to radiate over 1 kW, reported a reception rate of only ˜40%, indicating that a significant number of transmissions served no observable scientific purpose and reflecting the difficulty of determining suitable conditions for transmission and reception. Leveraging modern machine-learning classification techniques, we apply two statistical classifiers, a Bayes classifier and a support vector machine, to predict the occurrence of detectable one-hop transmissions from Siple data with accuracies on the order of 80%-90%. Applying these classifiers to our 1986 Siple data set, we detect 406 receptions of Siple transmissions, which we analyze to generate more robust statistics on nonlinear growth rates, 3 dB/s-270 dB/s, and nonlinear total amplification, 3 dB-41 dB.
Vexler, Albert; Yu, Jihnhee
2018-04-13
A common statistical doctrine, supported by many introductory courses and textbooks, is that t-test-type procedures based on normally distributed data points provide a standard for decision-making. To motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose a p-value-based method that takes into account the stochastic nature of p-values. Drawing on the modern statistical literature, we address the expected p-value (EPV) as a measure of the performance of decision-making rules. We then extend the EPV concept in terms of the ROC curve technique, which provides expressive evaluations and visualizations of a wide spectrum of properties of testing mechanisms. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope this exposition convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-value-based applications.
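The expected p-value can be sketched by Monte Carlo for a simple one-sided z-test; the test setup and effect sizes below are assumptions for illustration, not the authors' exact formulation.

```python
import math, random

# Monte Carlo sketch of the expected p-value (EPV): E[p] when the test
# statistic is drawn under an alternative with effect size delta.

def one_sided_p(z):
    # P(Z >= z) for a standard normal, via the complementary error function.
    return 0.5 * math.erfc(z / math.sqrt(2))

def expected_p_value(delta, n_sim=20000, seed=1):
    rng = random.Random(seed)
    total = sum(one_sided_p(rng.gauss(delta, 1.0)) for _ in range(n_sim))
    return total / n_sim

print(expected_p_value(0.0))   # ~0.5: under the null, p is uniform on (0, 1)
print(expected_p_value(2.0))   # well below 0.5: smaller EPV = better test
```

Because a smaller EPV means p-values concentrate near zero under the alternative, EPV summarizes test performance across all significance levels at once, which is what links it naturally to the ROC-curve view.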
Alignment-free genetic sequence comparisons: a review of recent approaches by word analysis.
Bonham-Carter, Oliver; Steele, Joe; Bastola, Dhundy
2014-11-01
Modern sequencing and genome assembly technologies have provided a wealth of data, which will soon require an analysis by comparison for discovery. Sequence alignment, a fundamental task in bioinformatics research, may be used but with some caveats. Seminal techniques and methods from dynamic programming are proving ineffective for this work owing to their inherent computational expense when processing large amounts of sequence data. These methods are prone to giving misleading information because of genetic recombination, genetic shuffling and other inherent biological events. New approaches from information theory, frequency analysis and data compression are available and provide powerful alternatives to dynamic programming. These new methods are often preferred, as their algorithms are simpler and are not affected by synteny-related problems. In this review, we provide a detailed discussion of computational tools, which stem from alignment-free methods based on statistical analysis from word frequencies. We provide several clear examples to demonstrate applications and the interpretations over several different areas of alignment-free analysis such as base-base correlations, feature frequency profiles, compositional vectors, an improved string composition and the D2 statistic metric. Additionally, we provide detailed discussion and an example of analysis by Lempel-Ziv techniques from data compression. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
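The D2 statistic discussed above is, in its basic form, the inner product of the k-mer (word) count vectors of two sequences; a minimal sketch, with toy sequences and k = 3 chosen arbitrarily:

```python
from collections import Counter

# Alignment-free similarity sketch: the D2 word-count statistic is the
# inner product of the k-mer count vectors of two sequences.

def kmer_counts(seq, k):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2(seq_a, seq_b, k=3):
    ca, cb = kmer_counts(seq_a, k), kmer_counts(seq_b, k)
    return sum(ca[w] * cb[w] for w in ca if w in cb)

a = "ACGTACGTAC"
b = "ACGTTTACGT"
print(d2(a, b, k=3))   # shared 3-mer counts multiplied and summed -> 10
```

The published variants normalize or center these counts (e.g., D2*, D2S) to correct for background word frequencies; the raw inner product shown here is the common core.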
Validating LES for Jet Aeroacoustics
NASA Technical Reports Server (NTRS)
Bridges, James
2011-01-01
Engineers charged with making jet aircraft quieter have long dreamed of being able to see exactly how turbulent eddies produce sound and this dream is now coming true with the advent of large eddy simulation (LES). Two obvious challenges remain: validating the LES codes at the resolution required to see the fluid-acoustic coupling, and the interpretation of the massive datasets that result in having dreams come true. This paper primarily addresses the former, the use of advanced experimental techniques such as particle image velocimetry (PIV) and Raman and Rayleigh scattering, to validate the computer codes and procedures used to create LES solutions. It also addresses the latter problem in discussing what are relevant measures critical for aeroacoustics that should be used in validating LES codes. These new diagnostic techniques deliver measurements and flow statistics of increasing sophistication and capability, but what of their accuracy? And what are the measures to be used in validation? This paper argues that the issue of accuracy be addressed by cross-facility and cross-disciplinary examination of modern datasets along with increased reporting of internal quality checks in PIV analysis. Further, it is argued that the appropriate validation metrics for aeroacoustic applications are increasingly complicated statistics that have been shown in aeroacoustic theory to be critical to flow-generated sound.
Ojelabi, Rapheal A; Afolabi, Adedeji O; Oyeyipo, Opeyemi O; Tunji-Olayeni, Patience F; Adewale, Bukola A
2018-06-01
Integrating social client relationship management (CRM 2.0) in the built environment can enhance the relationship between construction organizations and clients, sustaining a long-lasting collaboration. The data exploration analyzed the e-readiness of contracting and consulting construction firms in the uptake of CRM 2.0 and the barriers encountered in adopting this modern business tool. The targeted organizations consist of seventy-five (75) construction businesses operating in Lagos State, selected from a pool of registered contracting and consulting construction firms using a random sampling technique. Descriptive statistics of the e-readiness of contracting and consulting construction firms for CRM 2.0 adoption and the barriers limiting its uptake were analyzed. In addition, inferential analysis using the Mann-Whitney U test and an independent-samples t-test was performed on the dataset obtained. The data generated will support construction firms on the necessity of engaging in social client relationship management to ensure sustainable client relationship management in the built environment.
Robust Statistics: What They Are, and Why They Are So Important
ERIC Educational Resources Information Center
Corlu, Sencer M.
2009-01-01
The problem with "classical" statistics, all of which invoke the mean, is that these estimates are notoriously influenced by atypical scores (outliers), partly because the mean itself is differentially influenced by outliers. In theory, "modern" statistics may generate more replicable characterizations of data, because at least in some…
ERIC Educational Resources Information Center
Petocz, Peter; Sowey, Eric
2008-01-01
In this article, the authors focus on hypothesis testing--that peculiarly statistical way of deciding things. Statistical methods for testing hypotheses were developed in the 1920s and 1930s by some of the most famous statisticians, in particular Ronald Fisher, Jerzy Neyman and Egon Pearson, who laid the foundations of almost all modern methods of…
Detecting most influencing courses on students grades using block PCA
NASA Astrophysics Data System (ADS)
Othman, Osama H.; Gebril, Rami Salah
2014-12-01
One of the modern solutions adopted in dealing with the problem of a large number of variables in statistical analyses is Block Principal Component Analysis (Block PCA). This modified technique can be used to reduce the vertical dimension (variables) of the data matrix Xn×p by selecting a smaller number of variables (say m) containing most of the statistical information. These selected variables can then be employed in further investigations and analyses. Block PCA is an adapted multistage version of the original PCA. It involves the application of Cluster Analysis (CA) and variable selection through sub-principal component scores (PCs). The application of Block PCA in this paper is a modified version of the original work of Liu et al. (2002). The main objective was to apply PCA to each group of variables (established using cluster analysis), instead of involving the whole large set of variables, which was shown to be unreliable. In this work, Block PCA is used to reduce the size of a huge data matrix ((n = 41) × (p = 251)) consisting of the Grade Point Averages (GPAs) of students in 251 courses (variables) in the Faculty of Science at Benghazi University. In other words, we construct a smaller analytical data matrix of student GPAs with fewer variables containing most of the variation (statistical information) in the original database. By applying Block PCA, 12 courses were found to 'absorb' most of the variation or influence in the original data matrix, and hence are worth keeping for future statistical exploration and analytical studies. In addition, the course Independent Study (Math.) was found to be the most influential course on students' GPA among the 12 selected courses.
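The Block PCA workflow described above can be sketched roughly as follows. This is a simplified illustration, not the paper's procedure: blocks are formed by simple index chunks standing in for the cluster-analysis step, the first-PC loadings come from plain power iteration, and the data are synthetic.

```python
import random

# Block PCA sketch: split variables into blocks, run PCA within each block,
# keep the variable with the largest |loading| on the block's first PC.

def first_pc_loadings(rows, iters=200):
    """First-PC loadings of column-centred data via power iteration."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    x = [[r[j] - means[j] for j in range(p)] for r in rows]
    # Covariance matrix (p x p).
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / (n - 1)
            for b in range(p)] for a in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(p)) for a in range(p)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v

def block_pca_select(rows, block_size):
    """Return one representative column index per block of variables."""
    p = len(rows[0])
    chosen = []
    for start in range(0, p, block_size):
        cols = list(range(start, min(start + block_size, p)))
        sub = [[r[j] for j in cols] for r in rows]
        loadings = first_pc_loadings(sub)
        best = max(range(len(cols)), key=lambda j: abs(loadings[j]))
        chosen.append(cols[best])
    return chosen

rng = random.Random(0)
# 30 observations x 6 variables; columns 1 and 4 carry most of the variance.
data = []
for _ in range(30):
    s1, s2 = rng.gauss(0, 3), rng.gauss(0, 3)
    data.append([rng.gauss(0, 1), s1, rng.gauss(0, 1),
                 rng.gauss(0, 1), s2, rng.gauss(0, 1)])
print(block_pca_select(data, block_size=3))   # one column index per block
```

Per-block selection keeps the covariance matrices small, which is what makes the approach tractable when p (here 251 courses) is large relative to n.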
Rodgers, Joseph Lee
2016-01-01
The Bayesian-frequentist debate typically portrays these statistical perspectives as opposing views. However, both Bayesian and frequentist statisticians have expanded their epistemological basis away from a singular focus on the null hypothesis, to a broader perspective involving the development and comparison of competing statistical/mathematical models. For frequentists, statistical developments such as structural equation modeling and multilevel modeling have facilitated this transition. For Bayesians, the Bayes factor has facilitated this transition. The Bayes factor is treated in articles within this issue of Multivariate Behavioral Research. The current presentation provides brief commentary on those articles and more extended discussion of the transition toward a modern modeling epistemology. In certain respects, Bayesians and frequentists share common goals.
Combining Feature Extraction Methods to Assist the Diagnosis of Alzheimer's Disease.
Segovia, F; Górriz, J M; Ramírez, J; Phillips, C
2016-01-01
Neuroimaging data such as (18)F-FDG PET are widely used to assist the diagnosis of Alzheimer's disease (AD). Looking for regions with hypoperfusion/hypometabolism, clinicians may predict or corroborate the diagnosis of the patients. Modern computer-aided diagnosis (CAD) systems based on the statistical analysis of whole neuroimages are more accurate than classical systems based on quantifying the uptake of some predefined regions of interest (ROIs). In addition, these new systems allow new ROIs to be determined and take advantage of the huge amount of information contained in neuroimaging data. A major branch of modern CAD systems for AD is based on multivariate techniques, which analyse a neuroimage as a whole, considering not only the voxel intensities but also the relations among them. In order to deal with the vast dimensionality of the data, a number of feature extraction methods have been successfully applied. In this work, we propose a CAD system based on the combination of several feature extraction techniques. First, some commonly used feature extraction methods based on the analysis of variance (such as principal component analysis), on the factorization of the data (such as non-negative matrix factorization) and on classical magnitudes (such as Haralick features) were simultaneously applied to the original data. These feature sets were then combined by means of two different approaches: i) using a single classifier and a multiple kernel learning approach and ii) using an ensemble of classifiers and selecting the final decision by majority voting. The proposed approach was evaluated using a labelled neuroimaging database along with a cross-validation scheme. In conclusion, the proposed CAD system performed better than approaches using only one feature extraction technique. We also provide a fair comparison (using the same database) of the selected feature extraction methods.
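The second combination strategy, an ensemble with majority voting, is simple to sketch. The three "classifiers" below are stand-in threshold rules on an invented two-number feature vector, not the paper's trained models.

```python
from collections import Counter

# Ensemble-by-majority-vote sketch: each classifier votes a label, and the
# most common label wins.

def majority_vote(predictions):
    """predictions: list of labels from the individual classifiers."""
    return Counter(predictions).most_common(1)[0][0]

classifiers = [
    lambda x: "AD" if x[0] > 0.5 else "control",
    lambda x: "AD" if x[1] > 0.3 else "control",
    lambda x: "AD" if x[0] + x[1] > 0.9 else "control",
]

sample = (0.7, 0.1)   # hypothetical feature vector from one scan
votes = [clf(sample) for clf in classifiers]
print(votes, "->", majority_vote(votes))   # ['AD', 'control', 'control'] -> control
```

In the paper's setting each voter would be a classifier trained on a different feature set (PCA, NMF, Haralick), so the vote aggregates complementary views of the same scan.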
Metabolomics and Integrative Omics for the Development of Thai Traditional Medicine
Khoomrung, Sakda; Wanichthanarak, Kwanjeera; Nookaew, Intawat; Thamsermsang, Onusa; Seubnooch, Patcharamon; Laohapand, Tawee; Akarasereenont, Pravit
2017-01-01
In recent years, interest in studies of traditional medicine in Asian and African countries has gradually increased due to its potential to complement modern medicine. In this review, we provide an overview of the current development of Thai traditional medicine (TTM) and of ongoing TTM research activities related to metabolomics. This review also focuses on three important elements of systems biology analysis of TTM: analytical techniques, statistical approaches and bioinformatics tools for handling and analyzing untargeted metabolomics data. The main objective of this data analysis is to gain a comprehensive understanding of the system-wide effects that TTM has on individuals. Furthermore, potential applications of metabolomics and systems medicine in TTM are also discussed. PMID:28769804
Advances in analytical chemistry
NASA Technical Reports Server (NTRS)
Arendale, W. F.; Congo, Richard T.; Nielsen, Bruce J.
1991-01-01
Implementation of computer programs based on multivariate statistical algorithms makes possible obtaining reliable information from long data vectors that contain large amounts of extraneous information, for example, noise and/or analytes that we do not wish to control. Three examples are described. Each of these applications requires the use of techniques characteristic of modern analytical chemistry. The first example, using a quantitative or analytical model, describes the determination of the acid dissociation constant for 2,2'-pyridyl thiophene using archived data. The second example describes an investigation to determine the active biocidal species of iodine in aqueous solutions. The third example is taken from a research program directed toward advanced fiber-optic chemical sensors. The second and third examples require heuristic or empirical models.
Modern morphometry: new perspectives in physical anthropology.
Mantini, Simone; Ripani, Maurizio
2009-06-01
Over the past one hundred years, physical anthropology has had recourse to ever more efficient methods, which have provided much new information regarding human evolution and biology. Apart from the molecular approach, the introduction of new computer-assisted techniques gave rise to a new concept of morphometry. Computed tomography and 3D imaging allow anatomical description of external and inner structures, overcoming the problems encountered with traditional morphometric methods. Furthermore, the support of geometric morphometrics allows geometric models to be created to investigate morphological variation in terms of evolution, ontogeny and variability. The integration of these new tools gave rise to virtual anthropology and to a new image of the anthropologist, in which anatomical, biological, mathematical, statistical and data-processing information are fused in a multidisciplinary approach.
Statistical Research of Investment Development of Russian Regions
ERIC Educational Resources Information Center
Burtseva, Tatiana A.; Aleshnikova, Vera I.; Dubovik, Mayya V.; Naidenkova, Ksenya V.; Kovalchuk, Nadezda B.; Repetskaya, Natalia V.; Kuzmina, Oksana G.; Surkov, Anton A.; Bershadskaya, Olga I.; Smirennikova, Anna V.
2016-01-01
This article is concerned with substantiating procedures for the implementation of statistical research and monitoring of the investment development of the Russian regions, which would be pertinent to the modern development of state statistics. The aim of the study is to develop a methodological framework in order to estimate…
ERIC Educational Resources Information Center
Romeu, Jorge Luis
2008-01-01
This article discusses our teaching approach in graduate level Engineering Statistics. It is based on the use of modern technology, learning groups, contextual projects, simulation models, and statistical and simulation software to entice student motivation. The use of technology to facilitate group projects and presentations, and to generate,…
ERIC Educational Resources Information Center
Papadopoulos, Ioannis
2010-01-01
The issue of the area of irregular shapes is absent from the modern mathematical textbooks in elementary education in Greece. However, there exists a collection of books written for educational purposes by famous Greek scholars dating from the eighteenth century, which propose certain techniques concerning the estimation of the area of such…
ERIC Educational Resources Information Center
Singamsetti, Rao
2007-01-01
In this paper an attempt is made to highlight some issues of interpretation of statistical concepts and interpretation of results as taught in undergraduate Business statistics courses. The use of modern technology in the class room is shown to have increased the efficiency and the ease of learning and teaching in statistics. The importance of…
Modern Empirical Statistical Spectral Analysis.
1980-05-01
NASA Astrophysics Data System (ADS)
Xu, Y.; Pearson, S. P.; Kilbourne, K.
2013-12-01
Tropical sea surface temperature (SST) has been implicated as a driver of climate changes during the Medieval Climate Anomaly (MCA, 950-1300 A.D.) but little data exists from the tropical oceans during this time period. We collected three modern and seven sub-fossil Diploria strigosa coral colonies from an overwash deposit on Anegada, British Virgin Islands (18.73 °N, 63.33 °W) in order to reconstruct climate in the northeastern Caribbean and Tropical North Atlantic during the MCA. The first step in our reconstruction was to verify the climate signal from this species at this site. We sub-sampled the modern corals along thecal walls with an average sampling resolution of 11-13 samples per year. Sr/Ca ratios measured in the sub-samples were calibrated to temperature using three different calibration techniques (ordinary least squares, reduced major axis, and weighted least squares (WLS)) on the monthly data that includes the seasonal cycles and on the monthly anomaly data. WLS regression accounts for unequal errors in the x and y terms, so we consider it the most robust technique. The WLS regression slope between gridded SST and coral Sr/Ca is similar to the previous two calibrations of this species. Mean Sr/Ca for each of the three modern corals is 8.993 ± 0.004 mmol/mol, 9.127 ± 0.003 mmol/mol, and 8.960 ± 0.007 mmol/mol. These straddle the mean Diploria strigosa Sr/Ca found by Giry et al., (2010), 9.080 mmol/mol, at a site with nearly the same mean SST as Anegada (27.4 °C vs. 27.5 °C). The climatological seasonal cycles for SST derived from the modern corals are statistically indistinguishable from the seasonal cycles in the instrumental SST data. The coral-based seasonal cycles have ranges of 2.70 ± 0.31 °C, 2.65 ± 0.08 °C and 2.71 ± 0.53 °C. These results indicate that this calibration can be applied to our sub-fossil coral data.
We applied the WLS calibration to monthly-resolution Sr/Ca data from multiple sub-fossil corals dating to the medieval period with initial U-series dates near the top of the cores ranging from 1277 ± 5 A.D. to 1327 ± 5 A.D. Initial Sr/Ca results from the first sub-fossil coral have a seasonal range of 2.65 ± 0.27 °C when converted to temperature units with our modern calibration, indicating no significant change from modern times. However, the mean Sr/Ca for this coral is very high (9.388 mmol/mol) compared to the modern corals. We explore the potential causes for this discrepancy in our study. Because reconstructing the mean SST during the Medieval Climate Anomaly may be difficult without temporal overlap with modern corals, our focus is on interannual variability. The coral Sr/Ca based monthly SST anomalies for both modern and sub-fossil corals have larger interannual variances than the instrumental record. One explanation for this is that the SSTs derived from sub-fossil corals are local data for which one expects larger variances than the instrumental data averaged over a 2° × 2° grid. This species shows great promise for future paleoclimate reconstructions.
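A calibration line of the weighted-least-squares kind can be sketched as below. Note the simplification: this sketch weights the y-residuals by 1/variance only, whereas the WLS the abstract describes also accounts for errors in x; all numbers are synthetic, not the Anegada coral data.

```python
# Weighted least squares (weights on y only) fit of a Sr/Ca vs SST line.

def wls_fit(x, y, var):
    """Fit y = intercept + slope * x with per-point variances var."""
    w = [1.0 / v for v in var]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    sxy = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
    sxx = sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    slope = sxy / sxx
    intercept = ybar - slope * xbar
    return slope, intercept

# Synthetic points lying exactly on Sr/Ca = 10.2 - 0.06 * SST.
sst  = [24.0, 25.0, 26.0, 27.0, 28.0]
srca = [10.2 - 0.06 * t for t in sst]
var  = [0.01, 0.02, 0.01, 0.03, 0.01]    # unequal measurement variances
slope, intercept = wls_fit(sst, srca, var)
print(round(slope, 4), round(intercept, 4))   # -0.06 10.2
```

The negative slope reflects the thermodynamics of Sr incorporation in coral aragonite: warmer water gives lower Sr/Ca, which is what allows the ratio to serve as a paleothermometer.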
Tegegn, Masresha; Arefaynie, Mastewal; Tiruye, Tenaw Yimer
2017-01-01
The contraceptive use of women in the extended postpartum period is usually different from other times in a woman's life cycle due to the additional roles and the presence of emotional changes. However, there is a lack of evidence regarding women's contraceptive need during this period and the extent to which that need is met. Therefore, the objective of this study was to assess unmet need for modern contraceptives and associated factors among women during the extended postpartum period in Dessie Town, Northeast Ethiopia, in December 2014. A community-based cross-sectional study was conducted among women who gave birth in the year before the study period. A systematic random sampling technique was employed to recruit a total of 383 study participants. For data collection, a structured and pretested standard questionnaire was used. Descriptive statistics were computed to characterize the study population using different variables. Bivariate and multiple logistic regression models were fitted to control for confounding factors. Odds ratios with 95% confidence intervals were computed to identify factors associated with unmet need. This study revealed that 44% of the extended postpartum women had an unmet need for modern contraceptives, of which 57% was for spacing and 43% for limiting. Education of women (being illiterate) (AOR (adjusted odds ratio) = 3.37, 95% CI (confidence interval) 1.22-7.57), no antenatal care service (AOR = 2.41, 95% CI 1.11-5.79), no postnatal care service (AOR = 3.63, CI 2.13-6.19) and knowledge of the lactational amenorrhea method (AOR = 7.84, 95% CI 4.10-15.02) were the factors positively associated with unmet need for modern contraceptives in the extended postpartum period. The unmet need for modern contraception is high in the study area. There is a need to improve the quality of maternal health services, girls' education, and information on postpartum pregnancy risk and recommended postpartum contraceptives to enable mothers to make informed choices.
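The odds-ratio-with-95%-CI computation underlying such results can be sketched as follows; the 2x2 counts are invented for illustration, not the Dessie data.

```python
import math

# Crude odds ratio and Wald 95% CI from a 2x2 table.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed & outcome, b = exposed & no outcome,
       c = unexposed & outcome, d = unexposed & no outcome."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR), Woolf's method.
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# e.g. unmet need (outcome) by "no antenatal care" (exposure), toy counts:
or_, lo, hi = odds_ratio_ci(40, 60, 20, 80)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An interval excluding 1 (as here) indicates a statistically significant association; the study's adjusted odds ratios come from a logistic regression rather than this crude 2x2 form.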
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
21 CFR 820.250 - Statistical techniques.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...
ERIC Educational Resources Information Center
Guler, Mustafa; Gursoy, Kadir; Guven, Bulent
2016-01-01
Understanding and interpreting biased data, decision-making in accordance with the data, and critically evaluating situations involving data are among the fundamental skills necessary in the modern world. To develop these required skills, emphasis on statistical literacy in school mathematics has been gradually increased in recent years. The…
Feasibility of Tactical Air Delivery Resupply Using Gliders
2016-12-01
using modern design and manufacturing techniques including AutoCAD, 3D printing, laser cutting and CorelDraw, and conducting field testing and..."Sparrow,"...the desired point(s) of impact due to the atmospheric three-dimensional (3D) wind and density field encountered by the descending load under canopy
Modern Education in China. Bulletin, 1919, No. 44
ERIC Educational Resources Information Center
Edmunds, Charles K.
1919-01-01
The Chinese conception of life's values is so different from that of western peoples that they have failed to develop modern technique and scientific knowledge. Now that they have come to see the value of these, rapid and fundamental changes are taking place. When modern scientific knowledge is added to the skill which the Chinese already have in…
A profile of the demographics and training characteristics of professional modern dancers.
Weiss, David S; Shah, Selina; Burchette, Raoul J
2008-01-01
Modern dancers are a unique group of artists, performing a diverse repertoire in dance companies of various sizes. In this study, 184 professional modern dancers in the United States (males N=49, females N=135), including members of large and small companies as well as freelance dancers, were surveyed regarding their demographics and training characteristics. The mean age of the dancers was 30.1 +/- 7.3 years, and they had danced professionally for 8.9 +/- 7.2 years. The average Body Mass Index (BMI) was 23.6 +/- 2.4 for males and 20.5 +/- 1.7 for females. Females had started taking dance class earlier (age 6.5 +/- 4.2 years) as compared to males (age 15.6 +/- 6.2 years). Females were more likely to have begun their training in ballet, while males more often began with modern classes (55% and 51% respectively, p < 0.0001). The professional modern dancers surveyed spent 8.3 +/- 6.0 hours in class and 17.2 +/- 12.6 hours in rehearsal each week. Eighty percent took modern technique class and 67% reported that they took ballet technique class. The dancers who specified what modern technique they studied (N=84) reported between two and four different techniques. The dancers also participated in a multitude of additional exercise regimens for a total of 8.2 +/- 6.6 hours per week, with the most common types being Pilates, yoga, and upper body weightlifting. The dancers wore many different types of footwear, depending on the style of dance being performed. For modern dance alone, dancers wore 12 different types of footwear. Reflecting the diversity of the dancers and companies surveyed, females reported performing for 23.3 +/- 14.0 weeks (range: 2-52 weeks) per year; males reported performing 20.4 +/- 13.9 weeks (range: 1-40) per year. Only 18% of the dancers did not have any health insurance, with 54% having some type of insurance provided by their employer. However, 23% of the dancers purchased their own insurance, and 22% had insurance provided by their families. 
Only 16% of dancers reported that they had Workers' Compensation coverage, despite the fact that they were all professionals, including many employed by major modern dance companies across the United States. It is concluded that understanding the training profile of the professional modern dancer should assist healthcare providers in supplying appropriate medical care for these performers.
Looking ahead in systems engineering
NASA Technical Reports Server (NTRS)
Feigenbaum, Donald S.
1966-01-01
Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.
Semiserin, V A; Khritinin, D F; Maev, I V; Karakozov, A T; Eremin, M N; Olenicheva, E L
2012-01-01
This paper synthesizes current and recent data on the problem of metabolic syndrome (MS) in combination with toxic liver injury (CCI). Statistical parameters of the last 15 years and the dynamics of alimentary-constitutional obesity (ABC) in officer patients under contracted service with the Defense Ministry of Russia are presented. Two years' experience in applying modern non-invasive methods for diagnosing liver fibrosis, with its dynamics against the background of complex treatment of patients with MS combined with CCI, is shown using the example of 57 patients. Great attention is paid to the psychological and emotional adjustment of patients with ABC, given the complexity of survey design and treatment where motivational and behavioral responses are impaired. The high clinical efficiency of combination drug therapy for MS and CCI, and the diagnostic value of modern non-invasive methods for diagnosing hepatic fibrosis, are reliably demonstrated. The elastography technique significantly improves the clinical evaluation of therapy effectiveness in the liver, allows early detection of initial degrees of hepatic fibrosis, and makes it possible to choose the optimal treatment regimen and to evaluate the results dynamically.
Mixed-use development in a high-rise context
NASA Astrophysics Data System (ADS)
Generalova, Elena M.; Generalov, Viktor P.; Kuznetsova, Anna A.; Bobkova, Oksana N.
2018-03-01
The article deals with the topical problem of finding techniques and methods to create a comfortable urban environment. The authors emphasize that under existing conditions of intensive urban development, greater attention should be given to spatial concentration and a more compact distribution of the population in urban space. It is stressed that including mixed-use facilities in the urban realm results in a significant improvement in the qualitative characteristics of the living environment. The paper also examines modern approaches to constructing a «compact city» for comfortable and convenient living with mixed-use tall building development. The authors explore the world's experience in designing tall mixed-use buildings and reveal modern trends in their construction. The statistics given are based on an analysis of data on a group of more than 400 tall mixed-use buildings constructed in 2007-2016. The research shows the functional and architectural peculiarities of this typology of tall buildings and investigates a mechanism for creating zones of mixed-use tall building development in the urban structure. In conclusion, the authors consider prospects for development and major directions for improving mixed-use tall building parameters for reasonable territorial urban growth and the creation of high-density, comfortable building development.
Stansfield Bulygina, Ekaterina; Rasskasova, Anna; Berezina, Natalia; Soficaru, Andrei D
2017-09-01
Remains from several Eastern European and Siberian Mesolithic and Neolithic sites are analysed to clarify their biological relationships. We assume that groups' geographical distances correlate with genetic and, therefore, morphological distances between them. Material includes complete male crania from several Mesolithic and Neolithic burial sites across Northern Eurasia and from several modern populations. Geometric morphometrics and multivariate statistical techniques are applied to explore morphological trends, group distances, and correlations with their geographical position, climate, and the time of origin. Despite an overlap in the morphology among the modern and archeological groups, some of them show significant morphological distances. Geographical parameters account for only a small proportion of cranial variation in the sample, with larger variance explained by geography and age together. Expectations of isolation by distance are met in some but not in all cases. Climate accounts for a large proportion of autocorrelation with geography. Nearest-neighbor joining trees demonstrate group relationships predicted by the regression on geography and on climate. The obtained results are discussed in application to relationships between particular groups. Unlike the Ukrainian Mesolithic, the Yuzhny Oleni Ostrov Mesolithic displays a high morphological affinity with several groups from Northern Eurasia of both European and Asian origin. A possibility of a common substrate for the Yuzhny Oleni Ostrov Mesolithic and Siberian Neolithic groups is reviewed. The Siberian Neolithic is shown to have morphological connection with both modern Siberian groups and the Native North Americans. © 2017 Wiley Periodicals, Inc.
[Aerobic methylobacteria as promising objects of modern biotechnology].
Doronina, N V; Toronskava, L; Fedorov, D N; Trotsenko, Yu A
2015-01-01
The experimental data of the past decade concerning the metabolic peculiarities of aerobic methylobacteria and the prospects for their use in different fields of modern biotechnology, including genetic engineering techniques, have been summarized.
Dorfman, Kevin D
2018-02-01
The development of bright bisintercalating dyes for deoxyribonucleic acid (DNA) in the 1990s, most notably YOYO-1, revolutionized the field of polymer physics in the ensuing years. These dyes, in conjunction with modern molecular biology techniques, permit the facile observation of polymer dynamics via fluorescence microscopy and thus direct tests of different theories of polymer dynamics. At the same time, they have played a key role in advancing an emerging next-generation method known as genome mapping in nanochannels. The effect of intercalation on the bending energy of DNA as embodied by a change in its statistical segment length (or, alternatively, its persistence length) has been the subject of significant controversy. The precise value of the statistical segment length is critical for the proper interpretation of polymer physics experiments and controls the phenomena underlying the aforementioned genomics technology. In this perspective, we briefly review the model of DNA as a wormlike chain and a trio of methods (light scattering, optical or magnetic tweezers, and atomic force microscopy (AFM)) that have been used to determine the statistical segment length of DNA. We then outline the disagreement in the literature over the role of bisintercalation on the bending energy of DNA, and how a multiscale biomechanical approach could provide an important model for this scientifically and technologically relevant problem.
Applications of Mass Spectrometry Imaging for Safety Evaluation.
Bonnel, David; Stauber, Jonathan
2017-01-01
Mass spectrometry imaging (MSI) was first derived from techniques used in physics, which were then incorporated into chemistry, followed by application in biology. Developed over 50 years ago, and with different principles for detecting and mapping compounds on a sample surface, MSI supports modern biology questions by detecting biological compounds within tissue sections. Trend analysis of MALDI (matrix-assisted laser desorption/ionization) imaging in this field shows an important increase in the number of publications since 2005, especially with the development of the MALDI imaging technique and its applications in biomarker discovery and drug distribution. With recent improvements in statistical tools, absolute and relative quantification protocols, as well as quality and reproducibility evaluations, MALDI imaging has become one of the most reliable MSI techniques to support drug discovery and development phases. MSI can potentially address important questions in drug development such as "What is the localization of the drug and its metabolites in the tissues?", "What is the pharmacological effect of the drug in this particular region of interest?", or "Are the drug and its metabolites related to an atypical finding?" However, prior to addressing these questions using MSI techniques, expertise needs to be developed in histological procedures (tissue preparation with frozen or fixed tissues), analytical chemistry, matrix application, instrumentation, informatics, and mathematics for data analysis and interpretation.
WE-A-201-02: Modern Statistical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niemierko, A.
Chris Marshall: Memorial Introduction Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the “big tent” vision of medical physics, specializing in Applied Statistics and Dynamical Systems theory. He saw, more clearly than most, that “Making models is the difference between doing science and just fooling around [ref Woodworth, 2004]”. Don developed an interest in chemistry at school by “reading a book” - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University) where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to Medical Physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL, where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit, working in part on sickle-cell disease. After retirement he remained active as Professor Emeritus.
Don served for several years as a consultant to the Nuclear Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology “imposes a Delphic quality to the .. risk estimates”). This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the Proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM Lifetime Achievement Award in 2004. Don’s second wife of 46 years, Ann, predeceased him and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies. Learning Objectives: To learn the basics of statistical modeling methodology. To discuss statistical models that are frequently used in radiation oncology. To discuss advanced modern statistical modeling methods and applications.
WE-A-201-00: Anne and Donald Herbert Distinguished Lectureship On Modern Statistical Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Impact of macroeconomic factors on university budgeting in the US and Russia
NASA Astrophysics Data System (ADS)
Bogomolova, Arina; Balk, Igor; Ivachenko, Natalya; Temkin, Anatoly
2017-10-01
This paper discusses the impact of macroeconomic factors on university budgeting. Modern developments in the areas of data science and machine learning have made it possible to utilise automated techniques to address several problems of humankind, ranging from genetic engineering and particle physics to sociology and economics. This paper is the first step toward creating a robust toolkit which will help universities withstand macroeconomic challenges utilising modern predictive analytics techniques.
Identification of Microorganisms by Modern Analytical Techniques.
Buszewski, Bogusław; Rogowska, Agnieszka; Pomastowski, Paweł; Złoch, Michał; Railean-Plugaru, Viorica
2017-11-01
Rapid detection and identification of microorganisms is a challenging and important aspect in a wide range of fields, from medical to industrial, affecting human lives. Unfortunately, classical methods of microorganism identification are based on time-consuming and labor-intensive approaches. Screening techniques require the rapid and cheap grouping of bacterial isolates; however, modern bioanalytics demand comprehensive bacterial studies at a molecular level. Modern approaches for the rapid identification of bacteria use molecular techniques, such as 16S ribosomal RNA gene sequencing based on polymerase chain reaction or electromigration, especially capillary zone electrophoresis and capillary isoelectric focusing. However, there are still several challenges with the analysis of microbial complexes using electromigration technology, such as uncontrolled aggregation and/or adhesion to the capillary surface. Thus, an approach using capillary electrophoresis of microbial aggregates with UV and matrix-assisted laser desorption ionization time-of-flight MS detection is presented.
ERIC Educational Resources Information Center
Vaughn, Brandon K.; Wang, Pei-Yu
2009-01-01
The emergence of technology has led to numerous changes in mathematical and statistical teaching and learning which has improved the quality of instruction and teacher/student interactions. The teaching of statistics, for example, has shifted from mathematical calculations to higher level cognitive abilities such as reasoning, interpretation, and…
ERIC Educational Resources Information Center
Rousson, Valentin
2014-01-01
It is well known that dichotomizing continuous data has the effect to decrease statistical power when the goal is to test for a statistical association between two variables. Modern researchers however are focusing not only on statistical significance but also on an estimation of the "effect size" (i.e., the strength of association…
Evaluating Application Resilience with XRay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Sui; Bronevetsky, Greg; Li, Bin
2015-05-07
The rising count and shrinking feature size of transistors within modern computers is making them increasingly vulnerable to various types of soft faults. This problem is especially acute in high-performance computing (HPC) systems used for scientific computing, because these systems include many thousands of compute cores and nodes, all of which may be utilized in a single large-scale run. The increasing vulnerability of HPC applications to errors induced by soft faults is motivating extensive work on techniques to make these applications more resilient to such faults, ranging from generic techniques such as replication or checkpoint/restart to algorithm-specific error detection and tolerance techniques. Effective use of such techniques requires a detailed understanding of how a given application is affected by soft faults to ensure that (i) efforts to improve application resilience are spent in the code regions most vulnerable to faults and (ii) the appropriate resilience technique is applied to each code region. This paper presents XRay, a tool to view an application's vulnerability to soft errors, and illustrates how XRay can be used in the context of a representative application. In addition to providing actionable insights into application behavior, XRay automatically selects the number of fault injection experiments required to provide an informative view of application behavior, ensuring that the information is statistically well-grounded without performing unnecessary experiments.
NASA Astrophysics Data System (ADS)
Nikolopoulos, E. I.; Destro, E.; Bhuiyan, M. A. E.; Borga, M., Sr.; Anagnostou, E. N.
2017-12-01
Fire disasters affect modern societies at global scale, inducing significant economic losses and human casualties. In addition to their direct impacts, they have various adverse effects on the hydrologic and geomorphologic processes of a region due to the tremendous alteration of landscape characteristics (vegetation, soil properties, etc.). As a consequence, wildfires often initiate a cascade of hazards such as flash floods and debris flows that usually follow the occurrence of a wildfire, thus magnifying the overall impact in a region. Post-fire debris flows (PFDF) are one such hazard, frequently occurring in the Western United States where wildfires are a common natural disaster. Prediction of PFDF is therefore of high importance in this region, and over recent years a number of efforts from the United States Geological Survey (USGS) and National Weather Service (NWS) have focused on the development of early warning systems to help mitigate PFDF risk. This work proposes a prediction framework based on a nonparametric statistical technique (random forests) that allows predicting the occurrence of PFDF at regional scale with a higher degree of accuracy than the commonly used approaches based on power-law thresholds and logistic regression procedures. The work presented is based on a recently released database from USGS that reports a total of 1500 storms that triggered or did not trigger PFDF in a number of fire-affected catchments in the Western United States. The database includes information on storm characteristics (duration, accumulation, max intensity, etc.) and other auxiliary information on land surface properties (soil erodibility index, local slope, etc.). Results show that the proposed model is able to achieve a satisfactory prediction accuracy (threat score > 0.6), superior to previously published prediction frameworks, highlighting the potential of nonparametric statistical techniques for the development of PFDF prediction systems.
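The threat score used above to assess prediction accuracy is the ratio of hits to the sum of hits, misses, and false alarms. A minimal sketch of the metric on hypothetical storm data (not the USGS database used in the abstract):

```python
# Threat score (critical success index) for binary event predictions.
# Data below are made-up; 1 = debris flow triggered, 0 = not triggered.

def threat_score(predicted, observed):
    """TS = hits / (hits + misses + false alarms)."""
    hits = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 1)
    misses = sum(1 for p, o in zip(predicted, observed) if p == 0 and o == 1)
    false_alarms = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 0)
    return hits / (hits + misses + false_alarms)

observed  = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]   # ten storms
predicted = [1, 1, 0, 0, 1, 0, 1, 0, 1, 0]   # model output
print(round(threat_score(predicted, observed), 2))
```

A score above 0.6, as reported in the abstract, means hits clearly outnumber the combined misses and false alarms.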
Do, Mai P; Kincaid, D Lawrence
2006-01-01
Shabuj Chaya is a weekly television drama broadcast during a 13-week period in Bangladesh in 2000. It used an entertainment-education format to increase health knowledge and to promote visits to health clinics and modern contraceptive use. The purpose of this article is to demonstrate how a relatively new statistical technique, propensity score matching in conjunction with structural equation modeling, can be used to obtain an unbiased estimate of changes in health outcomes that can be attributed to exposure to the drama. The analysis is conducted with data from an after-only, cross-sectional survey of 4,492 men and women from the intended audience. The results from propensity score matching approximate what would be expected from a randomized control group design.
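Propensity score matching, the technique highlighted above, pairs each exposed respondent with the unexposed respondent whose estimated probability of exposure is closest, then averages the outcome differences over matched pairs. A minimal nearest-neighbor sketch on made-up scores and outcomes (the function name, caliper value, and data are illustrative assumptions, not the authors' procedure):

```python
# Nearest-neighbor propensity-score matching on toy data.
# Each unit is (propensity_score, outcome); outcome 1 = desired behavior.

def matched_effect(exposed, unexposed, caliper=0.1):
    """Average treated-minus-matched-control outcome difference."""
    diffs = []
    for ps, outcome in exposed:
        match_ps, match_outcome = min(unexposed, key=lambda u: abs(u[0] - ps))
        if abs(match_ps - ps) <= caliper:        # discard poor matches
            diffs.append(outcome - match_outcome)
    return sum(diffs) / len(diffs)

exposed   = [(0.80, 1), (0.60, 1), (0.30, 0)]
unexposed = [(0.75, 0), (0.55, 1), (0.35, 0), (0.20, 0)]
print(round(matched_effect(exposed, unexposed), 3))
```

Because matching compares units with similar exposure propensity, the difference is less confounded by self-selection than a raw exposed-versus-unexposed comparison.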
Teye, Joseph Kofi
2013-06-01
This study examines the socio-demographic determinants of modern contraceptive use among women in the Asuogyaman district of Ghana. The results reveal that although 97% of the survey respondents knew of at least one modern method of contraception, only 16% of them were using modern contraceptives. Statistical tests show that level of education, place of residence, and work status significantly influence modern contraceptive use among women in the study area. Fear of side effects, desire for more children, and partner's disapproval were the main barriers to modern contraceptive use in the study area. The use of traditional methods of contraception was very high because of the perception that they are safer. Based on these findings, it has been suggested that in addition to making family planning services available and accessible, health workers must address attitudinal factors such as fear of side effects and high fertility preferences.
Classification without labels: learning from mixed samples in high energy physics
NASA Astrophysics Data System (ADS)
Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse
2017-10-01
Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.
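The central claim above, that the optimal mixture-versus-mixture classifier is also optimal for the pure classes, can be illustrated numerically: for two mixtures with different signal fractions, the mixture likelihood ratio is a monotone function of the pure signal-to-background likelihood ratio, so both induce the same ordering of events. A minimal sketch with one-dimensional Gaussian "signal" and "background" densities (toy distributions chosen for illustration, not the quark/gluon samples of the paper):

```python
import math

def gauss(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def pure_ratio(x):                        # p_S(x) / p_B(x)
    return gauss(x, 1.0, 1.0) / gauss(x, -1.0, 1.0)

def mixture_ratio(x, f1=0.8, f2=0.3):     # p_M1(x) / p_M2(x), signal fractions f1 > f2
    m1 = f1 * gauss(x, 1.0, 1.0) + (1 - f1) * gauss(x, -1.0, 1.0)
    m2 = f2 * gauss(x, 1.0, 1.0) + (1 - f2) * gauss(x, -1.0, 1.0)
    return m1 / m2

xs = [i / 10 for i in range(-40, 41)]
pure = [pure_ratio(x) for x in xs]
mixed = [mixture_ratio(x) for x in xs]
# Both ratios increase together, so thresholding the mixture classifier
# is equivalent to thresholding the pure signal/background classifier.
print(pure == sorted(pure), mixed == sorted(mixed))
```

Algebraically, with r = p_S/p_B, the mixture ratio equals (f1*r + 1-f1) / (f2*r + 1-f2), which is strictly increasing in r whenever f1 > f2.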
Classification without labels: learning from mixed samples in high energy physics
Metodiev, Eric M.; Nachman, Benjamin; Thaler, Jesse
2017-10-25
Modern machine learning techniques can be used to construct powerful models for difficult collider physics problems. In many applications, however, these models are trained on imperfect simulations due to a lack of truth-level information in the data, which risks the model learning artifacts of the simulation. In this paper, we introduce the paradigm of classification without labels (CWoLa) in which a classifier is trained to distinguish statistical mixtures of classes, which are common in collider physics. Crucially, neither individual labels nor class proportions are required, yet we prove that the optimal classifier in the CWoLa paradigm is also the optimal classifier in the traditional fully-supervised case where all label information is available. After demonstrating the power of this method in an analytical toy example, we consider a realistic benchmark for collider physics: distinguishing quark- versus gluon-initiated jets using mixed quark/gluon training samples. More generally, CWoLa can be applied to any classification problem where labels or class proportions are unknown or simulations are unreliable, but statistical mixtures of the classes are available.
The Center for Computational Biology: resources, achievements, and challenges
Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2011-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221
The Center for Computational Biology: resources, achievements, and challenges.
Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2012-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators includes the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.
Quantum Theory of Superresolution for Incoherent Optical Imaging
NASA Astrophysics Data System (ADS)
Tsang, Mankei
Rayleigh's criterion for resolving two incoherent point sources has been the most influential measure of optical imaging resolution for over a century. In the context of statistical image processing, violation of the criterion is especially detrimental to the estimation of the separation between the sources, and modern far-field superresolution techniques rely on suppressing the emission of close sources to enhance the localization precision. Using quantum optics, quantum metrology, and statistical analysis, here we show that, even if two close incoherent sources emit simultaneously, measurements with linear optics and photon counting can estimate their separation from the far field almost as precisely as conventional methods do for isolated sources, rendering Rayleigh's criterion irrelevant to the problem. Our results demonstrate that superresolution can be achieved not only for fluorophores but also for stars. Recent progress in generalizing our theory for multiple sources and spectroscopy will also be discussed. This work is supported by the Singapore National Research Foundation under NRF Grant No. NRF-NRFF2011-07 and the Singapore Ministry of Education Academic Research Fund Tier 1 Project R-263-000-C06-112.
NASA Astrophysics Data System (ADS)
Podladchikova, O.
2002-02-01
The high temperature of the solar corona is still a puzzling problem of solar physics. However, recent observations from the SoHO, Yohkoh, and TRACE satellites seem to indicate that the processes responsible for the heating of closed regions are situated in the low corona or in the chromosphere, thus close to the solar surface, and are associated with the dissipation of direct currents. Statistical data analysis suggests that the heating mechanisms result from numerous dissipation events in current layers of small scale and weak energy, at the resolution limit of modern instruments. We propose a statistical lattice model, resulting from an approach more physical than self-organized criticality, consisting of a magnetic energy source at small scales and dissipation mechanisms for the currents, which can be associated either with magnetic reconnection or with anomalous resistivity. The various types of sources and dissipation mechanisms allow us to study their influence on the statistical properties of the system, in particular on the energy dissipated. With the aim of quantifying this behavior and allowing detailed comparisons between models and observations, analysis techniques seldom used in solar physics, such as singular value decomposition, entropies, and the Pearson technique of PDF classification, are introduced and applied to the study of the spatial and temporal properties of the model.
Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
NASA Astrophysics Data System (ADS)
Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr
2017-10-01
Most modern analyses in high energy physics use signal-versus-background classification techniques from machine learning, and neural networks in particular. Deep learning neural networks are the most promising modern technique for separating signal and background and nowadays can be widely and successfully implemented as part of a physics analysis. In this article we compare the application of deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.
A new JPEG-based steganographic algorithm for mobile devices
NASA Astrophysics Data System (ADS)
Agaian, Sos S.; Cherukuri, Ravindranath C.; Schneider, Erik C.; White, Gregory B.
2006-05-01
Currently, cellular phones constitute a significant portion of the global telecommunications market. Modern cellular phones offer sophisticated features such as Internet access, on-board cameras, and expandable memory, which provide these devices with excellent multimedia capabilities. Because of the high volume of cellular traffic, as well as the ability of these devices to transmit nearly all forms of data, the need for an increased level of security in wireless communications is a growing concern. Steganography could provide a solution to this important problem. In this article, we present a new steganographic algorithm for JPEG-compressed images which is applicable to mobile platforms. This algorithm embeds sensitive information into quantized discrete cosine transform (DCT) coefficients obtained from the cover JPEG. These coefficients are rearranged based on certain statistical properties and the inherent processing and memory constraints of mobile devices. Based on the energy variation and block characteristics of the cover image, the sensitive data are hidden using a switching embedding technique proposed in this article. The proposed system offers high capacity while simultaneously withstanding visual and statistical attacks. Based on simulation results, the proposed method demonstrates improved retention of first-order statistics compared to existing JPEG-based steganographic algorithms, while maintaining a capacity comparable to F5 for certain cover images.
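The coefficient-domain embedding can be sketched as follows. This is a bare LSB scheme on quantized DCT coefficients, not the switching technique proposed in the article; the coefficient values are invented, and zero coefficients are skipped to preserve run-length statistics:

```python
# Simplified sketch: hide message bits in the least significant bits of
# nonzero quantized DCT coefficients. Real schemes must also handle
# "shrinkage" (a magnitude-1 coefficient carrying bit 0 would become 0),
# which F5 and the article's switching technique address.

def embed(coeffs, bits):
    out = list(coeffs)
    bit_iter = iter(bits)
    for i, c in enumerate(out):
        if c == 0:                  # zeros are skipped: changing them alters run-lengths
            continue
        try:
            bit = next(bit_iter)
        except StopIteration:
            break                   # message fully embedded
        sign = -1 if c < 0 else 1
        out[i] = sign * ((abs(c) & ~1) | bit)   # overwrite LSB of the magnitude
    return out

def extract(coeffs, n_bits):
    bits = []
    for c in coeffs:
        if c == 0:
            continue
        bits.append(abs(c) & 1)
        if len(bits) == n_bits:
            break
    return bits

cover = [12, 0, -7, 3, 0, 0, 5, -2, 9]      # hypothetical quantized coefficients
message = [1, 0, 1, 1, 0]
stego = embed(cover, message)
print(extract(stego, len(message)))          # recovers the message bits
```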
Resolving the Antarctic contribution to sea-level rise: a hierarchical modelling framework.
Zammit-Mangion, Andrew; Rougier, Jonathan; Bamber, Jonathan; Schön, Nana
2014-06-01
Determining the Antarctic contribution to sea-level rise from observational data is a complex problem. The number of physical processes involved (such as ice dynamics and surface climate) exceeds the number of observables, some of which have very poor spatial definition. This has led, in general, to solutions that utilise strong prior assumptions or physically based deterministic models to simplify the problem. Here, we present a new approach for estimating the Antarctic contribution, which incorporates only descriptive aspects of the physically based models into the analysis, in a statistical manner. By combining physical insights with modern spatial statistical modelling techniques, we are able to provide probability distributions on all processes deemed to play a role in both the observed data and the contribution to sea-level rise. Specifically, we use stochastic partial differential equations and their relation to geostatistical fields to capture our physical understanding and employ a Gaussian Markov random field approach for efficient computation. The method, an instantiation of Bayesian hierarchical modelling, naturally incorporates uncertainty in order to reveal credible intervals on all estimated quantities. The estimated sea-level rise contribution using this approach corroborates those found using a statistically independent method. © 2013 The Authors. Environmetrics Published by John Wiley & Sons, Ltd.
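A minimal Gaussian Markov random field computation, in the spirit of the efficient-computation step described above: a random-walk precision matrix serves as prior for a smooth latent process observed with noise. All dimensions and values are invented and the setup is far simpler than the Antarctic model:

```python
import numpy as np

# Latent process on a 1-D grid with a random-walk (GMRF) prior:
# the precision matrix Q has the familiar tridiagonal [-1, 2, -1] structure,
# so x' Q x penalizes the squared first differences of x.
n = 100
Q = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
Q[0, 0] = Q[-1, -1] = 1.0        # free boundaries
tau = 100.0                       # prior smoothness precision (assumed value)

# Noisy observations of a smooth truth at every grid point.
rng = np.random.default_rng(2)
x_true = np.sin(np.linspace(0, 3, n))
sigma2 = 0.05
y = x_true + rng.normal(scale=sigma2 ** 0.5, size=n)

# Gaussian conjugacy gives the posterior in closed form; in practice Q is
# sparse, so this solve is cheap even for very large grids.
Q_post = tau * Q + np.eye(n) / sigma2
mu_post = np.linalg.solve(Q_post, y / sigma2)

rmse_raw = float(np.sqrt(np.mean((y - x_true) ** 2)))
rmse_smooth = float(np.sqrt(np.mean((mu_post - x_true) ** 2)))
print(rmse_smooth < rmse_raw)    # the GMRF posterior should beat raw observations
```

The sparsity of `Q_post` is exactly what makes the GMRF formulation attractive for large spatial problems.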
Bastistella, Luciane; Rousset, Patrick; Aviz, Antonio; Caldeira-Pires, Armando; Humbert, Gilles; Nogueira, Manoel
2018-02-09
New experimental techniques, as well as modern variants of known methods, have recently been employed to investigate the fundamental reactions underlying the oxidation of biochar. The purpose of this paper was to study, experimentally and statistically, how the relative humidity of air, the sample mass, and the particle size of four biochars influenced the adsorption of water and the increase in temperature. A random factorial design was employed using the statistical software Xlstat. A simple linear regression model and an analysis of variance with pairwise comparisons were performed. The experimental study was carried out on the wood of Quercus pubescens, Cyclobalanopsis glauca, Trigonostemon huangmosun, and Bambusa vulgaris, and involved five relative humidity conditions (22, 43, 75, 84, and 90%), two sample masses (0.1 and 1 g), and two particle sizes (powder and piece). Two response variables, water adsorption and temperature increase, were analyzed and discussed. The temperature did not increase linearly with the adsorption of water. Temperature was modeled by nine explanatory variables, while water adsorption was modeled by eight. Five variables, including factors and their interactions, were found to be common to the two models. Sample mass and relative humidity influenced both response variables, while particle size and biochar type only influenced the temperature.
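The regression step of such a factorial analysis can be sketched with a dummy-coded linear model. The response values below are invented, not the paper's measurements:

```python
import numpy as np

# Hypothetical factorial data: temperature rise vs. relative humidity (numeric)
# and particle size (categorical: powder = 1, piece = 0). Values are invented.
humidity = np.array([22, 43, 75, 84, 90, 22, 43, 75, 84, 90], dtype=float)
is_powder = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0], dtype=float)
temp_rise = np.array([1.0, 2.1, 3.8, 4.4, 4.9, 0.6, 1.4, 2.9, 3.3, 3.7])

# Design matrix: intercept, humidity, particle-size dummy, and interaction.
X = np.column_stack([np.ones_like(humidity), humidity, is_powder,
                     humidity * is_powder])
beta, *_ = np.linalg.lstsq(X, temp_rise, rcond=None)

fitted = X @ beta
r2 = 1 - np.sum((temp_rise - fitted) ** 2) / np.sum(
    (temp_rise - temp_rise.mean()) ** 2)
print(len(beta), round(r2, 2))
```

An ANOVA then partitions the same fitted sums of squares by factor, which is how main effects and interactions are declared significant or not.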
Application of the Statistical ICA Technique in the DANCE Data Analysis
NASA Astrophysics Data System (ADS)
Baramsai, Bayarbadrakh; Jandel, M.; Bredeweg, T. A.; Rusev, G.; Walker, C. L.; Couture, A.; Mosby, S.; Ullmann, J. L.; Dance Collaboration
2015-10-01
The Detector for Advanced Neutron Capture Experiments (DANCE) at the Los Alamos Neutron Science Center is used to improve our understanding of the neutron capture reaction. DANCE is a highly efficient 4π γ-ray detector array consisting of 160 BaF2 crystals, which makes it an ideal tool for neutron capture experiments. The (n, γ) reaction Q-value equals the total energy of all γ-rays emitted in the de-excitation cascades from the excited capture state to the ground state. The total γ-ray energy is used to identify reactions on different isotopes as well as the background. However, it is challenging to identify contributions to the Esum spectra from different isotopes with similar Q-values. Recently we tested the applicability of modern statistical methods, such as Independent Component Analysis (ICA), to identify and separate (n, γ) reaction yields on the different isotopes present in the target material. ICA is a recently developed computational tool for separating multidimensional data into statistically independent additive subcomponents. In this conference talk, we present results of the application of ICA algorithms, and a modification thereof, to DANCE experimental data analysis. This research is supported by the U. S. Department of Energy, Office of Science, Nuclear Physics under the Early Career Award No. LANL20135009.
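A compact FastICA-style sketch (not the collaboration's actual algorithm) shows how ICA unmixes statistically independent components; here two synthetic non-Gaussian sources stand in for yields from different isotopes:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000
t = np.linspace(0, 8, n)
s1 = np.sign(np.sin(3 * t))             # square wave (sub-Gaussian source)
s2 = rng.laplace(size=n)                # spiky noise (super-Gaussian source)
S = np.vstack([s1, s2])
A = np.array([[0.8, 0.3], [0.2, 0.7]])  # "unknown" mixing matrix
X = A @ S                               # observed mixtures

# Center and whiten the observations.
X = X - X.mean(axis=1, keepdims=True)
cov = X @ X.T / n
d, E = np.linalg.eigh(cov)
Xw = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA with the tanh nonlinearity.
W = rng.standard_normal((2, 2))
for _ in range(200):
    g = np.tanh(W @ Xw)
    W_new = g @ Xw.T / n - np.diag((1 - g ** 2).mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)      # symmetric decorrelation:
    W = u @ vt                           # W <- (W W^T)^{-1/2} W_new

Y = W @ Xw                               # recovered components (up to sign/order)
corr = np.corrcoef(np.vstack([S, Y]))[:2, 2:]
match = np.abs(corr).max(axis=1)         # best |correlation| per true source
print(np.all(match > 0.9))
```

ICA recovers sources only up to permutation, sign, and scale, which is why the check above uses the best absolute correlation per source.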
Christou, Nicolas; Dinov, Ivo D
2010-09-01
Many modern technological advances have a direct impact on the format, style and efficacy of delivery and consumption of educational content. For example, various novel communication and information technology tools and resources enable efficient, timely, interactive and graphical demonstrations of diverse scientific concepts. In this manuscript, we report on a meta-study of 3 controlled experiments of using the Statistics Online Computational Resources in probability and statistics courses. Web-accessible SOCR applets, demonstrations, simulations and virtual experiments were used in different courses as treatment and compared to matched control classes utilizing traditional pedagogical approaches. Qualitative and quantitative data collected for all courses included the Felder-Silverman-Soloman index of learning styles, a background assessment, pre- and post-surveys of attitude towards the subject, an end-point satisfaction survey, and a variety of quiz, laboratory and test scores. Our findings indicate that students' learning styles and attitudes towards a discipline may be important confounds of their final quantitative performance. The observed positive effects of integrating information technology with established pedagogical techniques may be valid across disciplines within the broader spectrum of courses in the science education curriculum. The two critical components of improving science education via blended instruction are instructor training, and the development of appropriate activities, simulations and interactive resources.
The Evolution of Random Number Generation in MUVES
2017-01-01
mathematical basis and statistical justification for algorithms used in the code. The working code provided produces results identical to the current... questionable numerical and statistical properties. The development of the modern system is traced through software change requests, resulting in a random number
Locality-Aware CTA Clustering For Modern GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Liu, Weifeng
2017-04-08
In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on the L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.
Modern adjuncts and technologies in microsurgery: an historical and evidence-based review.
Pratt, George F; Rozen, Warren M; Chubb, Daniel; Whitaker, Iain S; Grinsell, Damien; Ashton, Mark W; Acosta, Rafael
2010-11-01
While modern reconstructive surgery was revolutionized with the introduction of microsurgical techniques, microsurgery itself has seen the introduction of a range of technological aids and modern techniques aiming to improve dissection times, anastomotic times, and overall outcomes. These include improved preoperative planning, anastomotic aids, and earlier detection of complications with higher salvage rates. Despite the potential for substantial impact, many of these techniques have been evaluated in a limited fashion, and the evidence for each has not been universally explored. The purpose of this review was to establish and quantify the evidence for each technique. A search of relevant medical databases was performed to identify literature providing evidence for each technology. Levels of evidence were thus accumulated and applied to each technique. There is a relative paucity of evidence for many of the more recent technologies described in the field of microsurgery, with no randomized controlled trials, and most studies in the field comprising case series only. Current evidence-based suggestions include the use of computed tomographic angiography (CTA) for the preoperative planning of perforator flaps, the intraoperative use of a mechanical anastomotic coupling aid (particularly the Unilink® coupler), and postoperative flap monitoring with strict protocols using clinical bedside monitoring and/or the implantable Doppler probe. Despite the breadth of technologies introduced into the field of microsurgery, there is substantial variation in the degree of evidence presented for each, suggesting the role for much future research, particularly on emerging technologies such as robotics and modern simulators. Copyright © 2010 Wiley-Liss, Inc.
A Survey of Architectural Techniques For Improving Cache Power Efficiency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
Modern processors are using increasingly larger sized on-chip caches. Also, with each CMOS technology generation, there has been a significant increase in their leakage energy consumption. For this reason, cache power management has become a crucial research issue in modern processor design. To address this challenge and also meet the goals of sustainable computing, researchers have proposed several techniques for improving the energy efficiency of cache architectures. This paper surveys recent architectural techniques for improving cache power efficiency and also presents a classification of these techniques based on their characteristics. For providing an application perspective, this paper also reviews several real-world processor chips that employ cache energy saving techniques. The aim of this survey is to enable engineers and researchers to get insights into the techniques for improving cache power efficiency and motivate them to invent novel solutions for enabling low-power operation of caches.
Zakhia, Frédéric; de Lajudie, Philippe
2006-03-01
Taxonomy is the science that studies the relationships between organisms. It comprises classification, nomenclature, and identification. Modern bacterial taxonomy is polyphasic: it is based on several molecular techniques, each retrieving information at a different cellular level (proteins, fatty acids, DNA...). The results obtained are combined and analysed to reach a "consensus taxonomy" for a microorganism. Until 1970, only a small number of classification techniques were available to microbiologists, and mainly phenotypic characterization was performed (for example, the ability of a Rhizobium to nodulate a given legume species). With the development of characterization techniques based on the polymerase chain reaction, bacterial taxonomy has undergone great changes. In particular, the classification of the legume-nodulating bacteria has been repeatedly modified over the last 20 years. We present here a review of the molecular techniques currently used in bacterial characterization, with examples of the application of these techniques to the study of legume-nodulating bacteria.
NASA Technical Reports Server (NTRS)
Manford, J. S.; Bennett, G. R.
1985-01-01
The Space Station Program will incorporate analysis of operations constraints and considerations in the early design phases to avoid the need for later modifications to the Space Station for operations. The application of modern tools and administrative techniques to minimize the cost of performing effective orbital operations planning and design analysis in the preliminary design phase of the Space Station Program is discussed. Tools and techniques discussed include: approach for rigorous analysis of operations functions, use of the resources of a large computer network, and providing for efficient research and access to information.
Uehleke, Bernhard; Hopfenmueller, Werner; Stange, Rainer; Saller, Reinhard
2012-01-01
Ancient and medieval herbal books are often believed to describe the same claims still in use today. Medieval herbal books, however, provide long lists of claims for each herb, most of which are not approved today, while the herb's modern use is often missing. The hypothesis thus arises that a medieval author could have randomly hit on 'correct' claims among his many 'wrong' ones. We developed a statistical procedure based on a simple probability model. We applied our procedure to the herbal books of Hildegard von Bingen (1098-1179) as an example of its usefulness. Claim attributions for a given herb were classified as 'correct' if approximately the same as indicated in current monographs. The number of 'correct' claim attributions was significantly higher than could have occurred by pure chance, even though the vast majority of Hildegard von Bingen's claims were not 'correct'. The hypothesis that Hildegard achieved her 'correct' claims purely by chance can therefore be clearly rejected. The finding that medical claims provided by a medieval author are significantly related to modern herbal use supports the importance of traditional medicinal systems as an empirical source. However, since many traditional claims are not in accordance with modern applications, they should be used carefully and analyzed in a systematic, statistics-based manner. Our statistical approach can be used for further systematic comparison of the herbal claims of traditional sources, as well as in the fields of ethnobotany and ethnopharmacology. Copyright © 2012 S. Karger AG, Basel.
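The chance-hit argument reduces to a binomial tail probability. The counts below are invented placeholders, not the study's actual figures; the point is only the shape of the test:

```python
from math import comb

def binomial_tail(n, k, p):
    """P(X >= k) for X ~ Binomial(n, p): chance of at least k 'correct' hits."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical numbers (not Hildegard's actual counts): 30 of 300 claim
# attributions classified 'correct', against a per-claim chance probability
# of 0.05 of hitting an approved modern indication at random.
p_value = binomial_tail(300, 30, 0.05)
print(p_value < 0.01)   # far more hits than chance alone would predict
```

A small tail probability rejects the pure-chance hypothesis even when the overall hit rate (here 10%) looks unimpressive in absolute terms.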
Colon-Berlingeri, Migdalisel; Burrowes, Patricia A.
2011-01-01
Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math–biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology. PMID:21885822
NASA Astrophysics Data System (ADS)
Jogesh Babu, G.
2017-01-01
A year-long research program (Aug 2016 - May 2017) on 'Statistical, Mathematical and Computational Methods for Astronomy (ASTRO)' is well under way at the Statistical and Applied Mathematical Sciences Institute (SAMSI), a National Science Foundation research institute in Research Triangle Park, NC. This program has brought together astronomers, computer scientists, applied mathematicians and statisticians. Its main aims are: to foster cross-disciplinary activities; to accelerate the adoption of modern statistical and mathematical tools into modern astronomy; and to develop new tools needed for important astronomical research problems. The program provides multiple avenues for cross-disciplinary interactions, including several workshops, long-term visitors, and regular teleconferences, so participants can continue collaborations even if they can only spend limited time in residence at SAMSI. The main program is organized around five working groups: i) Uncertainty Quantification and Astrophysical Emulation; ii) Synoptic Time Domain Surveys; iii) Multivariate and Irregularly Sampled Time Series; iv) Astrophysical Populations; v) Statistics, Computation, and Modeling in Cosmology. A brief description of the work under way by each of these groups will be given, and overlaps among the working groups will be highlighted. How the wider astronomy community can both participate in and benefit from these activities will also be briefly mentioned.
Yang, Guang-Fu; Huang, Xiaoqin
2006-01-01
Over forty years have elapsed since Hansch and Fujita published their pioneering work on quantitative structure-activity relationships (QSAR). Following the introduction of Comparative Molecular Field Analysis (CoMFA) by Cramer in 1988, other three-dimensional QSAR methods were developed. Currently, the combination of classical QSAR with other computational techniques at the three-dimensional level is of greatest interest and is widely used in modern drug discovery and design. Over the last several decades, a number of different methodologies, incorporating a range of molecular descriptors and different statistical regression approaches, have been proposed and successfully applied in the development of new drugs. The QSAR method has thus proven indispensable not only for the reliable prediction of specific properties of new compounds, but also for helping to elucidate the possible molecular mechanisms of receptor-ligand interactions. Here, we review recent developments in QSAR and their applications in rational drug design, focusing on the reasonable selection of novel molecular descriptors and the construction of predictive QSAR models with the help of advanced computational techniques.
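At its core, a classical Hansch-type QSAR fit is ordinary least squares on molecular descriptors. The descriptor values and activities below are invented for illustration only:

```python
import numpy as np

# Hypothetical Hansch-type data set (all values invented): descriptors are
# logP (lipophilicity) and a generic steric descriptor; the response is
# activity expressed as log(1/C).
logP = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
steric = np.array([1.2, 1.0, 1.4, 1.1, 1.6, 1.3, 1.7, 1.5])
activity = 0.9 * logP - 0.4 * steric + 2.0 + np.array(
    [0.05, -0.03, 0.02, -0.04, 0.03, -0.02, 0.01, -0.02])  # small "noise"

# Linear model: activity = a*logP + b*steric + c.
X = np.column_stack([logP, steric, np.ones_like(logP)])
coef, *_ = np.linalg.lstsq(X, activity, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((activity - pred) ** 2) / np.sum(
    (activity - activity.mean()) ** 2)
print(round(r2, 3))
```

Three-dimensional methods such as CoMFA keep the same regression machinery but replace these scalar descriptors with thousands of field values sampled on a grid around the molecule, which is why they pair the fit with partial least squares rather than plain least squares.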
Chinawa, Josephat M; Manyike, Pius; Chukwu, B; Eke, C B; Isreal, Odetunde Odutola; Chinawa, A T
2015-01-01
Medical education is always in a state of dynamic equilibrium, with continuous evolution of new techniques in teaching and learning. The objective of this study was to determine medical students' perceptions of and preferences among teaching and learning methods. A total of 207 medical students participated in the study. Most (73.9%) were male, and the modal age group was 23-25 years. The majority (57.5%) of the students belonged to the middle socioeconomic class, and 65.7% resided in the hostel. Most students (48.8%) believed two hours is enough per lecture. Among the five teaching-learning methods investigated, the use of multimedia methods was found to be most effective. A statistically significant association was found only between gender and regular oral examinations (χ2 = 4.5, df = 1, p = 0.03) and between socioeconomic class and dictation of lecture notes (χ2 = 17.9, df = 9, p = 0.03). Today's medical students are more likely to become good clinicians if modern teaching techniques and good communication skills are adopted by lecturers.
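The reported chi-square p-value for the gender association can be checked directly; for one degree of freedom the upper tail has a closed form via the complementary error function:

```python
from math import erfc, sqrt

def chi2_p_df1(x):
    """Upper-tail p-value of a chi-square statistic with 1 degree of freedom."""
    return erfc(sqrt(x / 2))

# The gender vs. regular-oral-examinations association from the abstract:
p = chi2_p_df1(4.5)
print(round(p, 2))   # → 0.03, matching the reported p-value
```

The second association (df = 9) has no such one-liner; it needs the regularized incomplete gamma function, available as `scipy.stats.chi2.sf` when SciPy is at hand.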
Price, Jeffrey H; Goodacre, Angela; Hahn, Klaus; Hodgson, Louis; Hunter, Edward A; Krajewski, Stanislaw; Murphy, Robert F; Rabinovich, Andrew; Reed, John C; Heynen, Susanne
2002-01-01
Cellular behavior is complex. Successfully understanding systems at ever-increasing complexity is fundamental to advances in modern science, and unraveling the functional details of cellular behavior is no exception. We present a collection of perspectives to provide a glimpse of the techniques that will aid in collecting, managing and utilizing information on complex cellular processes via molecular imaging tools. These include: 1) visualizing intracellular protein activity with fluorescent markers, 2) high-throughput (and automated) imaging of multilabeled cells in statistically significant numbers, and 3) machine intelligence to analyze subcellular image localization and pattern. Although not addressed here, the importance of combining cell-image-based information with detailed molecular structure and ligand-receptor binding models cannot be overlooked. Advanced molecular imaging techniques have the potential to impact cellular diagnostics for cancer screening, clinical correlations of tissue molecular patterns for cancer biology, and cellular molecular interactions for accelerating drug discovery. The goal of finally understanding all cellular components and behaviors will be achieved by advances in both instrumentation engineering (software and hardware) and molecular biochemistry. Copyright 2002 Wiley-Liss, Inc.
[Curricular design of health postgraduate programs: the case of Masters in epidemiology].
Bobadilla, J L; Lozano, R; Bobadilla, C
1991-01-01
This paper discusses the need to create specific programs for the training of researchers in epidemiology, a field that has traditionally been ignored by the graduate programs in public health. This is due, in part, to the emphasis that has been placed on the training of professionals in other areas of public health. The paper also includes the results of a consensus exercise developed during the curricular design of the Masters Program in Epidemiology of the School of Medicine of the National Autonomous University of Mexico. The technique used during the consensus exercise was the TKJ, which allows the presentation of ideas and possible solutions for a specific problem. This is probably the first published experience in the use of such a technique for the design of an academic curriculum. Taking as a base the general characteristics of the students, the substantive, disciplinary and methodological subjects were chosen. The results showed a need for a multidisciplinary approach based on modern methodologies of statistics and epidemiology. The usefulness of the results of the curricular design and the superiority of this method to reach consensus is also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Illidge, Tim, E-mail: Tim.Illidge@ics.manchester.ac.uk; Specht, Lena; Yahalom, Joachim
2014-05-01
Radiation therapy (RT) is the most effective single modality for local control of non-Hodgkin lymphoma (NHL) and is an important component of therapy for many patients. Many of the historic concepts of dose and volume have recently been challenged by the advent of modern imaging and RT planning tools. The International Lymphoma Radiation Oncology Group (ILROG) has developed these guidelines after multinational meetings and analysis of available evidence. The guidelines represent an agreed consensus view of the ILROG steering committee on the use of RT in NHL in the modern era. The roles of reduced volume and reduced doses are addressed, integrating modern imaging with 3-dimensional planning and advanced techniques of RT delivery. In the modern era, in which combined-modality treatment with systemic therapy is appropriate, the previously applied extended-field and involved-field RT techniques that targeted nodal regions have now been replaced by limiting the RT to smaller volumes based solely on detectable nodal involvement at presentation. A new concept, involved-site RT, defines the clinical target volume. For indolent NHL, often treated with RT alone, larger fields should be considered. Newer treatment techniques, including intensity modulated RT, breath holding, image guided RT, and 4-dimensional imaging, should be implemented, and their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control.
Modern quantitative schlieren techniques
NASA Astrophysics Data System (ADS)
Hargather, Michael; Settles, Gary
2010-11-01
Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow, once the sole purview of interferometry, without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment that makes this possible, including digital cameras, LED light sources, and computer software, is also discussed.
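The final conversion step described above follows from the Gladstone-Dale relation n - 1 = K * rho. The constant for air below is an approximate literature value for visible light:

```python
# Gladstone-Dale relation: n - 1 = K * rho, so a measured refractive index
# field converts directly to a density field.
K_AIR = 2.26e-4            # Gladstone-Dale constant for air, m^3/kg (approximate)

def density_from_index(n):
    """Density (kg/m^3) from refractive index via the Gladstone-Dale relation."""
    return (n - 1.0) / K_AIR

# Standard sea-level air has n ≈ 1.000277:
rho = density_from_index(1.000277)
print(round(rho, 3))       # close to the standard sea-level density of 1.225 kg/m^3
```

Temperature then follows from an equation of state, e.g. T = p / (rho * R) for an ideal gas at known pressure.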
Application of contrast media in post-mortem imaging (CT and MRI).
Grabherr, Silke; Grimm, Jochen; Baumann, Pia; Mangin, Patrice
2015-09-01
The application of contrast media in post-mortem radiology differs from clinical approaches in living patients. Post-mortem changes in the vascular system and the absence of blood flow lead to specific problems that have to be considered for the performance of post-mortem angiography. In addition, interpreting the images is challenging due to technique-related and post-mortem artefacts that have to be known and that are specific for each applied technique. Although the idea of injecting contrast media is old, classic methods are not simply transferable to modern radiological techniques in forensic medicine, as they are mostly dedicated to single-organ studies or applicable only shortly after death. With the introduction of modern imaging techniques, such as post-mortem computed tomography (PMCT) and post-mortem magnetic resonance (PMMR), to forensic death investigations, intensive research started to explore their advantages and limitations compared to conventional autopsy. PMCT has already become a routine investigation in several centres, and different techniques have been developed to better visualise the vascular system and organ parenchyma in PMCT. In contrast, the use of PMMR is still limited due to practical issues, and research is now starting in the field of PMMR angiography. This article gives an overview of the problems in post-mortem contrast media application, the various classic and modern techniques, and the issues to consider by using different media.
Adaptive designs in clinical trials.
Bowalekar, Suresh
2011-01-01
In addition to the expensive and lengthy process of developing a new medicine, the attrition rate in clinical research was on the rise, resulting in stagnation in the development of new compounds. As a consequence, the US Food and Drug Administration released a critical path initiative document in 2004, highlighting the need for developing innovative trial designs. One of the suggested innovations was the use of adaptive designs for clinical trials. Thus, post critical path initiative, there is a growing interest in using adaptive designs for the development of pharmaceutical products. Adaptive designs have great potential to reduce the number of patients, shorten trial duration, and limit exposure to the new drug. Adaptive designs are not new in the sense that interim analysis (IA)/review of the accumulated data, as used in adaptive designs, existed in the past too. However, such reviews/analyses of accumulated data were not necessarily planned at the trial design stage, and the methods used were not necessarily compliant with the clinical trial process. The Bayesian approach commonly used in adaptive designs was developed by Thomas Bayes in the 18th century, about a hundred years prior to the development of modern statistical methods by the father of modern statistics, Sir Ronald A. Fisher, but the complexity involved in the Bayesian approach prevented its use in real-life practice. Advances in computer and information technology over the last three to four decades have changed the scenario, and Bayesian techniques are now being used in adaptive designs in addition to other sequential methods used in IA. This paper describes the various adaptive designs in clinical trials and stakeholders' views about the feasibility of using them, without going into mathematical complexities.
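As a rough illustration of the Bayesian machinery used in adaptive designs, the sketch below implements a Beta-Binomial interim look with efficacy and futility boundaries. The prior, thresholds, and null response rate are all assumptions for the example, not values from any specific trial.

```python
from math import comb

def beta_tail(a, b, x):
    """P(p > x) for p ~ Beta(a, b) with integer a, b, via the binomial
    identity P(Beta(a, b) <= x) = P(Bin(a + b - 1, x) >= a)."""
    n = a + b - 1
    return sum(comb(n, k) * x**k * (1 - x) ** (n - k) for k in range(a))

def interim_decision(successes, failures, p0=0.3, efficacy=0.95, futility=0.05):
    """Beta(1, 1) prior on the response rate; stop early when the
    posterior probability that the rate exceeds p0 crosses an efficacy
    or futility boundary (hypothetical boundaries for illustration)."""
    post = beta_tail(1 + successes, 1 + failures, p0)
    if post > efficacy:
        return "stop: efficacy"
    if post < futility:
        return "stop: futility"
    return "continue"
```

For example, 20 responders out of 22 patients would cross the efficacy boundary at an interim look, while 1 responder out of 21 would cross the futility boundary; intermediate results let the trial continue.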
The energetic cost of walking: a comparison of predictive methods.
Kramer, Patricia Ann; Sylvester, Adam D
2011-01-01
The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is "best", but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. 
Although we used modern humans as our model organism, these results can be extended to other species.
A revised burial dose estimation procedure for optical dating of young and modern-age sediments
Arnold, L.J.; Roberts, R.G.; Galbraith, R.F.; DeLong, S.B.
2009-01-01
The presence of genuinely zero-age or near-zero-age grains in modern-age and very young samples poses a problem for many existing burial dose estimation procedures used in optical (optically stimulated luminescence, OSL) dating. This difficulty currently necessitates consideration of relatively simplistic and statistically inferior age models. In this study, we investigate the potential for using modified versions of the statistical age models of Galbraith et al. [Galbraith, R.F., Roberts, R.G., Laslett, G.M., Yoshida, H., Olley, J.M., 1999. Optical dating of single and multiple grains of quartz from Jinmium rock shelter, northern Australia: Part I, experimental design and statistical models. Archaeometry 41, 339-364.] to provide reliable equivalent dose (De) estimates for young and modern-age samples that display negative, zero or near-zero De estimates. For this purpose, we have revised the original versions of the central and minimum age models, which are based on log-transformed De values, so that they can be applied to un-logged De estimates and their associated absolute standard errors. The suitability of these 'un-logged' age models is tested using a series of known-age fluvial samples deposited within two arroyo systems from the American Southwest. The un-logged age models provide accurate burial doses and final OSL ages for roughly three-quarters of the total number of samples considered in this study. Sensitivity tests reveal that the un-logged versions of the central and minimum age models are capable of producing accurate burial dose estimates for modern-age and very young (<350 yr) fluvial samples that contain (i) more than 20% of well-bleached grains in their De distributions, or (ii) smaller sub-populations of well-bleached grains for which the De values are known with high precision.
Our results indicate that the original (log-transformed) versions of the central and minimum age models are still preferable for most routine dating applications, since these age models are better suited to the statistical properties of typical single-grain and multi-grain single-aliquot De datasets. However, the unique error properties of modern-age samples, combined with the problems of calculating natural logarithms of negative or zero-Gy De values, mean that the un-logged versions of the central and minimum age models currently offer the most suitable means of deriving accurate burial dose estimates for very young and modern-age samples. © 2009 Elsevier Ltd. All rights reserved.
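An un-logged central age model of the kind discussed above can be sketched as a fixed-point iteration on the profile-likelihood equations: a precision-weighted mean "delta" with an overdispersion term "sigma" added to each absolute error. This is a simplified illustration, not the authors' implementation, and the data in the usage note are invented.

```python
def central_age_unlogged(de, err, tol=1e-8, max_iter=200):
    """Sketch of an un-logged central age model: estimate the burial-dose
    mean 'delta' and overdispersion 'sigma' from equivalent doses 'de'
    with absolute standard errors 'err', by fixed-point iteration on the
    profile-likelihood equations (after Galbraith et al. 1999, adapted
    to un-logged values).  Unlike the logged model, it accepts negative
    or zero-Gy De values."""
    sigma = 0.1 * max(abs(max(de)), abs(min(de)), 1e-9)
    delta = 0.0
    for _ in range(max_iter):
        w = [1.0 / (s * s + sigma * sigma) for s in err]
        delta = sum(wi * di for wi, di in zip(w, de)) / sum(w)
        num = sum(wi * wi * (di - delta) ** 2 for wi, di in zip(w, de))
        sigma_new = sigma * (num / sum(w)) ** 0.5
        if abs(sigma_new - sigma) < tol:
            sigma = sigma_new
            break
        sigma = sigma_new
    return delta, sigma
```

With equal errors the estimate reduces to the sample mean plus a sigma that absorbs the scatter in excess of the measurement errors.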
Liederbach, Marijeanne; Dilgen, Faye E; Rose, Donald J
2008-09-01
Ballet and modern dance are jump-intensive activities, but little is known about the incidence of anterior cruciate ligament (ACL) injuries among dancers. Rigorous jump and balance training has been shown in some prospective studies to significantly reduce ACL injury rates among athletes. Dancers advance to the professional level only after having achieved virtuosic jump and balance technique. Therefore, dancers on the elite level may be at relatively low risk for ACL injury. Descriptive epidemiology study. Dance exposure, injuries, and injury conditions were systematically recorded at 4 dance organizations over 5 years. Select neuromuscular and psychometric variables were compared between and within ACL-injured and noninjured dancers. Of 298 dancers, 12 experienced an ACL injury over the 5-year period. The incidence of ACL injury was 0.009 per 1000 exposures. Landing from a jump onto 1 leg was the mechanism of injury in 92% of cases. Incidence did not differ statistically between genders or dance groups, although women modern dancers had a 3 to 5 times greater relative risk than women ballet dancers and men dancers. No difference between ACL-injured and noninjured dancers emerged with regard to race, oral contraceptive use, or select musculoskeletal measures. Dancers suffer considerably fewer ACL injuries than athletes participating in team ball sports. The training dancers undertake to perfect lower extremity alignment, jump, and balance skills may serve to protect them against ACL injury. Anterior cruciate ligament injuries happened most often late in the day and season, suggesting an effect of fatigue.
Injuries in students of three different dance techniques.
Echegoyen, Soledad; Acuña, Eugenia; Rodríguez, Cristina
2010-06-01
As with any athlete, the dancer has a high risk for injury. Most studies carried out relate to classical and modern dance; however, there is a lack of reports on injuries involving other dance techniques. This study is an attempt to determine the differences in the incidence, the exposure-related rates, and the kinds of injuries in three different dance techniques. A prospective study of dance injuries was carried out between 2004 and 2007 on students of modern, Mexican folkloric, and Spanish dance at the Escuela Nacional de Danza. A total of 1,168 injuries were registered in 444 students; the injury rate was 4 injuries/student for modern dance and 2 injuries/student for Mexican folkloric and Spanish dance. The rate per 1,000 hr of training was 4 injuries for modern, 1.8 for Mexican folkloric, and 1.5 for Spanish dance. The lower extremity was the most frequently injured region (70.47%), and overuse injuries comprised 29% of the total. The most frequent injuries were strains, sprains, back pain, and patellofemoral pain. This study provides consistent medical diagnoses of the injuries and is the first attempt in Mexico to compare the incidence of injuries across different dance techniques. To decrease the frequency of student injury, it is important to incorporate prevention programs into dance program curricula. More studies are necessary to define the causes and mechanisms of injury, along with an analysis of training methodology, to decrease the incidence of the muscle imbalances resulting in injury.
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or the associations between investigated factors. We turn our focus to the modern statistical literature, which addresses the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
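The EPV itself is easy to approximate by simulation. The sketch below computes the EPV of a one-sided Z-test by averaging p-values over test statistics drawn under an assumed alternative; the shift parameter is hypothetical, and for this simple case the result can be checked against the closed form EPV = Phi(-mu / sqrt(2)).

```python
import math
import random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_p_value(mu, n_sim=100_000, seed=1):
    """Monte Carlo EPV of a one-sided Z-test: average the p-value
    1 - Phi(Z) over statistics Z ~ N(mu, 1) drawn under the alternative.
    A smaller EPV indicates a better-powered test; mu is a hypothetical
    effect size chosen for this illustration."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        total += 1.0 - phi(rng.gauss(mu, 1.0))
    return total / n_sim
```

Under the null (mu = 0) the p-value is uniform and the EPV is 0.5; a strong alternative drives the EPV toward 0, which is the sense in which minimizing EPVs yields well-powered tests.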
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process relies heavily on the identification of compounds indicative of clandestine or commercial production. The results of such studies can yield detailed information on the method of manufacture, the sophistication of the synthesis operation, the starting material source, and the final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, the number of overall steps, and demanding reaction conditions. Using gas and liquid chromatography combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
Van Bockstaele, Femke; Janssens, Ann; Piette, Anne; Callewaert, Filip; Pede, Valerie; Offner, Fritz; Verhasselt, Bruno; Philippé, Jan
2006-07-15
ZAP-70 has been proposed as a surrogate marker for immunoglobulin heavy-chain variable region (IgV(H)) mutation status, which is known as a prognostic marker in B-cell chronic lymphocytic leukemia (CLL). The flow cytometric analysis of ZAP-70 suffers from difficulties in standardization and interpretation. We applied the Kolmogorov-Smirnov (KS) statistical test to make analysis more straightforward. We examined ZAP-70 expression by flow cytometry in 53 patients with CLL. Analysis was performed as initially described by Crespo et al. (New England J Med 2003; 348:1764-1775) and alternatively by application of the KS statistical test comparing T cells with B cells. Receiver-operating-characteristics (ROC)-curve analyses were performed to determine the optimal cut-off values for ZAP-70 measured by the two approaches. ZAP-70 protein expression was compared with ZAP-70 mRNA expression measured by a quantitative PCR (qPCR) and with the IgV(H) mutation status. Both flow cytometric analyses correlated well with the molecular technique and proved to be of equal value in predicting the IgV(H) mutation status. Applying the KS test is reproducible, simple, straightforward, and overcomes a number of difficulties encountered in the Crespo-method. The KS statistical test is an essential part of the software delivered with modern routine analytical flow cytometers and is well suited for analysis of ZAP-70 expression in CLL. (c) 2006 International Society for Analytical Cytology.
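The two-sample KS statistic at the heart of this approach is simply the maximum distance between two empirical CDFs. The sketch below computes it from scratch; the samples in the usage note are illustrative stand-ins for B-cell and T-cell fluorescence values, and any cut-off applied to D would have to come from an ROC analysis like the one described in the paper.

```python
def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov D: the maximum absolute distance
    between the empirical CDFs of the two samples.  In the ZAP-70
    setting, sample_a and sample_b would be B-cell and T-cell
    fluorescence intensities from the same tube."""
    a, b = sorted(sample_a), sorted(sample_b)
    d = 0.0
    for v in sorted(set(a) | set(b)):
        cdf_a = sum(1 for x in a if x <= v) / len(a)
        cdf_b = sum(1 for x in b if x <= v) / len(b)
        d = max(d, abs(cdf_a - cdf_b))
    return d
```

D is 0 for identical distributions and 1 for fully separated ones, which is what makes it a convenient, scale-free summary of how closely the B-cell distribution approaches the T-cell (ZAP-70-positive) distribution.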
ERIC Educational Resources Information Center
Ho, Andrew D.; Yu, Carol C.
2015-01-01
Many statistical analyses benefit from the assumption that unconditional or conditional distributions are continuous and normal. More than 50 years ago in this journal, Lord and Cook chronicled departures from normality in educational tests, and Micerri similarly showed that the normality assumption is met rarely in educational and psychological…
Running R Statistical Computing Environment Software on the Peregrine
R is a collaborative project for the development of new statistical methodologies and enjoys a large user base. Modern HPC systems can be better leveraged through parallel programming paradigms; the CRAN task view for High Performance Computing catalogs the relevant packages.
ERIC Educational Resources Information Center
Colon-Berlingeri, Migdalisel; Burrowes, Patricia A.
2011-01-01
Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the…
[Artificial neural networks for decision making in urologic oncology].
Remzi, M; Djavan, B
2007-06-01
This chapter presents a detailed introduction to Artificial Neural Networks (ANNs) and their contribution to modern Urologic Oncology. It includes a description of ANN methodology and points out the differences between Artificial Intelligence and traditional statistical models in terms of usefulness for patients and clinicians, as well as their advantages over current statistical analysis.
The Statistical Interpretation of Entropy: An Activity
ERIC Educational Resources Information Center
Timmberlake, Todd
2010-01-01
The second law of thermodynamics, which states that the entropy of an isolated macroscopic system can increase but will not decrease, is a cornerstone of modern physics. Ludwig Boltzmann argued that the second law arises from the motion of the atoms that compose the system. Boltzmann's statistical mechanics provides deep insight into the…
Forensic aspects of DNA-based human identity testing.
Roper, Stephen M; Tatum, Owatha L
2008-01-01
The forensic applications of DNA-based human identity laboratory testing are often underappreciated. Molecular biology has seen an exponential improvement in the accuracy and statistical power provided by identity testing in the past decade. This technology, dependent upon an individual's unique DNA sequence, has cemented the use of DNA technology in the forensic laboratory. This paper will discuss the state of modern DNA-based identity testing, describe the technology used to perform this testing, and describe its use as it relates to forensic applications. We will also compare individual technologies, including polymerase chain reaction (PCR) and Southern Blotting, that are used to detect the molecular differences that make all individuals unique. An increasing reliance on DNA-based identity testing dictates that healthcare providers develop an understanding of the background, techniques, and guiding principles of this important forensic tool.
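The statistical power mentioned above comes from the product rule: assuming independent loci (linkage equilibrium), the probability that a random person matches a full STR profile is the product of the per-locus genotype frequencies. A minimal sketch, with hypothetical frequencies:

```python
def random_match_probability(genotype_freqs):
    """Product-rule sketch: multiply per-locus genotype frequencies to
    get the probability that a random, unrelated person matches the
    whole profile.  Assumes independent loci; the frequencies passed in
    are hypothetical, not from a real population database."""
    p = 1.0
    for f in genotype_freqs:
        p *= f
    return p

# Even a modest per-locus frequency of 0.1 across a 13-locus panel
# yields a random match probability of about 1e-13.
rmp = random_match_probability([0.1] * 13)
```

This multiplicative behavior is why adding loci drove the exponential improvement in discriminating power described above.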
Hathaway, John C.
1971-01-01
The purpose of the data file presented below is twofold: the first purpose is to make available in printed form the basic data relating to the samples collected as part of the joint U.S. Geological Survey - Woods Hole Oceanographic Institution program of study of the Atlantic continental margin of the United States; the second purpose is to maintain these data in a form that is easily retrievable by modern computer methods. With the data in such form, repeated manual transcription for statistical or similar mathematical treatment becomes unnecessary. Manual plotting of information or derivatives from the information may also be eliminated. Not only is handling of data by the computer considerably faster than manual techniques, but a fruitful source of errors, transcription mistakes, is eliminated.
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlesinger, Adam M.
2013-01-01
We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
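To make the hard-versus-soft contrast concrete, the sketch below simulates BPSK over AWGN: the hard decisions yield only an error count, while simple statistics of the soft outputs also expose the channel quality. The model is an illustrative stand-in, not the SDA's actual processing chain.

```python
import random

def soft_decision_stats(ebno_db, n_sym=50_000, seed=7):
    """BPSK over AWGN (illustrative model).  Hard decisions give only a
    BER count; the soft outputs additionally expose the signal mean and
    noise variance, i.e. an estimate of the operating SNR."""
    rng = random.Random(seed)
    sigma = (10.0 ** (-ebno_db / 10.0) / 2.0) ** 0.5  # noise std for given Eb/N0
    errors = 0
    soft = []
    for _ in range(n_sym):
        bit = rng.choice((-1.0, 1.0))
        y = bit + rng.gauss(0.0, sigma)
        soft.append(abs(y))                 # fold the sign to pool statistics
        if (y > 0.0) != (bit > 0.0):
            errors += 1
    mean = sum(soft) / n_sym
    var = sum((s - mean) ** 2 for s in soft) / n_sym
    return errors / n_sym, mean, var        # hard BER, plus soft-statistic SNR info
```

At 6 dB Eb/N0 the hard BER is only a few errors per thousand symbols, yet the soft statistics already quantify how much margin remains, which is the kind of insight the abstract attributes to soft-decision analysis.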
Distance majorization and its applications.
Chi, Eric C; Zhou, Hua; Lange, Kenneth
2014-08-01
The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton's method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications.
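A minimal sketch of the idea for the special case of projecting a point onto an intersection of two convex sets: each MM step replaces the squared distance to every set by the squared distance to its current projection, which yields a closed-form averaged update, and the penalty constant grows so that feasibility is enforced in the limit. The quasi-Newton acceleration described in the paper is omitted for brevity.

```python
def project_ball(x, center, r):
    """Euclidean projection onto a closed ball."""
    d = sum((xi - ci) ** 2 for xi, ci in zip(x, center)) ** 0.5
    if d <= r:
        return list(x)
    return [ci + r * (xi - ci) / d for xi, ci in zip(x, center)]

def project_halfspace(x, a, b):
    """Euclidean projection onto the halfspace {x : a.x <= b}."""
    ax = sum(ai * xi for ai, xi in zip(a, x))
    if ax <= b:
        return list(x)
    norm2 = sum(ai * ai for ai in a)
    return [xi - (ax - b) * ai / norm2 for ai, xi in zip(a, x)]

def project_intersection(y, projections, mu=1.0, growth=1.5, iters=200):
    """Distance-majorization sketch: minimize ||x - y||^2 / 2 over the
    intersection of sets with easy projections.  Each MM step minimizes
    ||x - y||^2 / 2 + (mu / 2) * sum_i ||x - P_i(x_k)||^2, whose
    closed-form solution is the weighted average below; mu grows so the
    penalty enforces feasibility in the limit."""
    x = list(y)
    for _ in range(iters):
        proj = [p(x) for p in projections]
        m = len(proj)
        x = [(yi + mu * sum(pj[i] for pj in proj)) / (1.0 + mu * m)
             for i, yi in enumerate(y)]
        mu *= growth
    return x
```

For instance, projecting the point (2, 0) onto the intersection of the unit ball and the halfspace x1 >= 0.5 should return approximately (1, 0).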
[The GIPSY-RECPAM model: a versatile approach for integrated evaluation in cardiologic care].
Carinci, F
2009-01-01
Tree-structured methodology applied for the GISSI-PSICOLOGIA project, although performed in the framework of earliest GISSI studies, represents a powerful tool to analyze different aspects of cardiologic care. The GISSI-PSICOLOGIA project has delivered a novel methodology based on the joint application of psychometric tools and sophisticated statistical techniques. Its prospective use could allow building effective epidemiological models relevant to the prognosis of the cardiologic patient. The various features of the RECPAM method allow a versatile use in the framework of modern e-health projects. The study used the Cognitive Behavioral Assessment H Form (CBA-H) psychometrics scales. The potential for its future application in the framework of Italian cardiology is relevant and particularly indicated to assist planning of systems for integrated care and routine evaluation of the cardiologic patient.
Landenburger, L.; Lawrence, R.L.; Podruzny, S.; Schwartz, C.C.
2008-01-01
Moderate resolution satellite imagery traditionally has been thought to be inadequate for mapping vegetation at the species level. This has made comprehensive mapping of regional distributions of sensitive species, such as whitebark pine, either impractical or extremely time consuming. We sought to determine whether using a combination of moderate resolution satellite imagery (Landsat Enhanced Thematic Mapper Plus), extensive stand data collected by land management agencies for other purposes, and modern statistical classification techniques (boosted classification trees) could result in successful mapping of whitebark pine. Overall classification accuracies exceeded 90%, with similar individual class accuracies. Accuracies on a localized basis varied based on elevation. Accuracies also varied among administrative units, although we were not able to determine whether these differences related to inherent spatial variations or differences in the quality of available reference data.
Certifying an Irreducible 1024-Dimensional Photonic State Using Refined Dimension Witnesses.
Aguilar, Edgar A; Farkas, Máté; Martínez, Daniel; Alvarado, Matías; Cariñe, Jaime; Xavier, Guilherme B; Barra, Johanna F; Cañas, Gustavo; Pawłowski, Marcin; Lima, Gustavo
2018-06-08
We report on a new class of dimension witnesses, based on quantum random access codes, which are a function of the recorded statistics and which have different bounds for all possible decompositions of a high-dimensional physical system. Thus, it certifies the dimension of the system and has the new distinct feature of identifying whether the high-dimensional system is decomposable in terms of lower-dimensional subsystems. To demonstrate the practicability of this technique, we used it to experimentally certify the generation of an irreducible 1024-dimensional photonic quantum state, thereby certifying that the state is neither multipartite nor encoded using noncoupled degrees of freedom of a single photon. Our protocol should find applications in a broad class of modern quantum information experiments addressing the generation of high-dimensional quantum systems, where quantum tomography may become intractable.
Modern dust aerosol availability in northwestern China.
Wang, Xunming; Cheng, Hong; Che, Huizheng; Sun, Jimin; Lu, Huayu; Qiang, Mingrui; Hua, Ting; Zhu, Bingqi; Li, Hui; Ma, Wenyong; Lang, Lili; Jiao, Linlin; Li, Danfeng
2017-08-18
The sources of modern dust aerosols and their emission magnitudes are fundamental for linking dust with climate and environment. Using field sample data, wind tunnel experiments, and statistical analysis, we determined the contributions of wadis, gobi (stony desert), lakebeds, riverbeds, and interdunes to modern dust aerosol availability in three important potential dust sources: the Tarim Basin, the Qaidam Basin, and the Ala Shan Plateau of China. The results show that riverbeds are the dominant landscape for modern dust aerosol availability in the Qaidam Basin, while wadis, gobi, and interdunes are the main landscapes over the Ala Shan Plateau and Tarim Basin. The Ala Shan Plateau and Tarim Basin are potential dust sources in northwestern China, whereas the Qaidam Basin is not currently a major source of modern dust aerosols and does not contribute significantly to the Loess Plateau at present. Moreover, most modern dust aerosol emissions from China originate from aeolian processes of low intensity rather than from major dust events.
NASA Astrophysics Data System (ADS)
Nag, S. K.; Kundu, Anindita
2018-03-01
Demand for groundwater resources has increased manifold with population expansion and the advent of modern development. Assessment, planning and management of groundwater resources are becoming crucial and extremely urgent in recent times. The study area belongs to Kashipur block, Purulia district, West Bengal. The area is characterized by a dry climate and hard rock terrain. The objective of this study is to delineate groundwater potential zones for the assessment of groundwater availability using remote sensing, GIS and MCA techniques. Different thematic layers such as hydrogeomorphology, slope and lineament density maps have been transformed to raster data in TNT mips pro2012. To assign weights and ranks to different input factor maps, the multi-influencing factor (MIF) technique has been used. The weights assigned to each factor have been computed statistically. Weighted index overlay modeling was used to develop a groundwater potential zone map from the weighted and scored parameters. Finally, the study area has been categorized into four distinct groundwater potential zones—excellent 1.5% (6.45 sq. km), good 53% (227.9 sq. km), moderate 45% (193.5 sq. km.) and poor 0.5% (2.15 sq. km). The outcome of the present study will help local authorities, researchers, decision makers and planners in formulating proper planning and management of groundwater resources in different hydrogeological situations.
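The weighted index overlay step can be sketched directly: each cell receives the weighted sum of its factor scores, with the MIF-derived weights normalized to sum to one. The factor scores, weights, and class breaks below are hypothetical illustrations, not the values derived for the Kashipur block.

```python
def groundwater_potential(factor_scores, weights):
    """Weighted index overlay sketch: each cell's index is the weighted
    sum of its factor scores (e.g. hydrogeomorphology, slope, lineament
    density), with weights normalized to sum to 1.  The class breaks
    are illustrative, not those computed in the study."""
    total = sum(weights)
    w = [wi / total for wi in weights]
    index = [sum(wi * s for wi, s in zip(w, cell)) for cell in factor_scores]

    def classify(v):
        if v >= 4:
            return "excellent"
        if v >= 3:
            return "good"
        if v >= 2:
            return "moderate"
        return "poor"

    return [classify(v) for v in index]

# Four toy cells scored 1-5 on three factors, with hypothetical MIF
# weights of 40, 35, and 25 percent.
zones = groundwater_potential(
    factor_scores=[(5, 4, 5), (3, 3, 4), (2, 2, 3), (1, 1, 2)],
    weights=(40, 35, 25),
)
```

In the actual workflow the same arithmetic runs per raster cell, and the classified index map is what gets reported as the groundwater potential zonation.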
Modern and Unconventional Approaches to Karst Hydrogeology
NASA Astrophysics Data System (ADS)
Sukop, M. C.
2017-12-01
Karst hydrogeology is frequently approached from a hydrograph/statistical perspective where precipitation/recharge inputs are converted to output hydrographs and the conversion process reflects the hydrology of the system. Karst catchments show hydrological response to short-term meteorological events and to long-term variation of large-scale atmospheric circulation. Modern approaches to analysis of these data include, for example, multiresolution wavelet techniques applied to understand relations between karst discharge and climate fields. Much less effort has been directed towards direct simulation of flow fields and transport phenomena in karst settings. This is primarily due to the lack of information on the detailed physical geometry of most karst systems. New mapping, sampling, and modeling techniques are beginning to enable direct simulation of flow and transport. A Conduit Flow Process (CFP) add-on to the USGS ModFlow model became available in 2007. FEFLOW and similar models are able to represent flows in individual conduits. Lattice Boltzmann models have also been applied to flow modeling in karst systems. Regarding quantitative measurement of karst system geometry, at scales to 0.1 m, X-ray computed tomography enables good detection of detailed (sub-millimeter) pore space in karstic rocks. Three-dimensional printing allows reconstruction of fragile high porosity rocks, and surrogate samples generated this way can then be subjected to laboratory testing. Borehole scales can be accessed with high-resolution (0.001 m) Digital Optical Borehole Imaging technologies and can provide virtual samples more representative of the true nature of karst aquifers than can be obtained from coring. Subsequent extrapolation of such samples can generate three-dimensional models suitable for direct modeling of flow and transport. Finally, new cave mapping techniques are beginning to provide information that can be applied to direct simulation of flow.
Due to flow rates and cave diameter, very high Reynolds number flows may be encountered.
Progress in tropical isotope dendroclimatology
NASA Astrophysics Data System (ADS)
Evans, M. N.; Schrag, D. P.; Poussart, P. F.; Anchukaitis, K. J.
2005-12-01
The terrestrial tropics remain an important gap in the growing high resolution proxy network used to characterize the mean state and variability of the hydrological cycle. Here we review early efforts to develop a new class of proxy paleorainfall/humidity indicators using intraseasonal to interannual-resolution stable isotope data from tropical trees. The approach invokes a recently published model of oxygen isotopic composition of alpha-cellulose, rapid methods for cellulose extraction from raw wood, and continuous flow isotope ratio mass spectrometry to develop proxy chronological, rainfall and growth rate estimates from tropical trees, even those lacking annual rings. Isotopically-derived age models may be confirmed for modern intervals using trees of known age, radiocarbon measurements, direct measurements of tree diameter, and time series replication. Studies are now underway at a number of laboratories on samples from Costa Rica, northwestern coastal Peru, Indonesia, Thailand, New Guinea, Paraguay, Brazil, India, and the South American Altiplano. Improved sample extraction chemistry and online pyrolysis techniques should increase sample throughput, precision, and time series replication. Statistical calibration together with simple forward modeling based on the well-observed modern period can provide for objective interpretation of the data. Ultimately, replicated data series with well-defined uncertainties can be entered into multiproxy efforts to define aspects of tropical hydrological variability associated with ENSO, the meridional overturning circulation, and the monsoon systems.
Loop shaping design for tracking performance in machine axes.
Schinstock, Dale E; Wei, Zhouhong; Yang, Tao
2006-01-01
A modern interpretation of classical loop shaping control design methods is presented in the context of tracking control for linear motor stages. Target applications include noncontacting machines such as laser cutters and markers, water jet cutters, and adhesive applicators. The methods are directly applicable to the common PID controller and are pertinent to many electromechanical servo actuators other than linear motors. In addition to explicit design techniques, a PID tuning algorithm stressing the importance of tracking is described. While the theory behind these techniques is not new, the analysis of their application to modern systems is unique in the research literature. The techniques and results should be important to control practitioners optimizing PID controller designs for tracking and in comparing results from classical designs to modern techniques. The methods stress high-gain controller design and interpret what this means for PID. Nothing in the methods presented precludes the addition of feedforward control methods for added improvements in tracking. Laboratory results from a linear motor stage demonstrate that, with large open-loop gain, very good tracking performance can be achieved. The resultant tracking errors compare very favorably to results from similar motions on similar systems that utilize much more complicated controllers.
Crossroads: Modern Interactive Intersections and Accessible Pedestrian Signals
ERIC Educational Resources Information Center
Barlow, Janet M.; Franck, Lukas
2005-01-01
This article discusses the interactive nature of modern actuated intersections and the effect of that interface on pedestrians who are visually impaired. Information is provided about accessible pedestrian signals (APS), the role of blindness professionals in APS installation decisions, and techniques for crossing streets with APS.
Sample preparation for the analysis of isoflavones from soybeans and soy foods.
Rostagno, M A; Villares, A; Guillamón, E; García-Lafuente, A; Martínez, J A
2009-01-02
This manuscript reviews the current state, the most recent advances, and current trends and future prospects in sample preparation and analysis for the quantification of isoflavones from soybeans and soy foods. Individual steps of the procedures used in sample preparation, including sample conservation, extraction techniques and methods, and post-extraction treatment procedures, are discussed. The most commonly used methods for extraction of isoflavones with both conventional and "modern" techniques are examined in detail. These modern techniques include ultrasound-assisted extraction, pressurized liquid extraction, supercritical fluid extraction and microwave-assisted extraction. Other aspects such as stability during extraction and analysis by high performance liquid chromatography are also covered.
NASA Astrophysics Data System (ADS)
Wang, Juan; Wang, Jian; Li, Lijuan; Zhou, Kun
2014-08-01
To address information fusion, process integration, and collaborative design and manufacturing for ultra-precision optical elements within life-cycle management, this paper presents a digital management platform based on product data and business processes that adopts modern manufacturing, information, and management techniques. The architecture and system integration of the digital management platform are discussed in this paper. The digital management platform can realize information sharing and interaction among the information flow, control flow, and value stream across the life cycle, from user needs to retirement, and it can also enhance process control, collaborative research, and the service capability of ultra-precision optical elements.
NASA Technical Reports Server (NTRS)
Schutz, Bob E.
1993-01-01
Satellite Laser Ranging (SLR) has a rich history of development which began in the 1960s with 10 meter-level first generation systems. These systems evolved through order-of-magnitude improvements to systems that now produce several-millimeter single-shot range precisions. What began, in part, as an interesting application of the new laser technology has become an essential component of modern, precision space geodesy, which in turn enables contributions to a variety of science areas. Modern space geodesy is the beneficiary of technological developments which have enabled precision geodetic measurements. Aside from SLR and its closely related technique, Lunar Laser Ranging (LLR), Very Long Baseline Interferometry (VLBI) has also made prominent science contributions. In recent years, the Global Positioning System (GPS) has grown rapidly in popularity as the result of demonstrated low-cost, high-precision instrumentation. Other modern techniques such as DORIS have demonstrated the ability to make significant science contributions; furthermore, PRARE can be expected to contribute in its own right. An appropriate question is: why should several techniques be financially supported? While there are several answers, I offer the opinion that, in consideration of the broad science areas that are the beneficiaries of space geodesy, no single technique can meet all the requirements and/or expectations of the science areas in which space geodesy contributes or has the potential for contributing. The more well-known science areas include plate tectonics, earthquake processes, Earth rotation/orientation, gravity (static and temporal), ocean circulation, and land and ice topography, to name a few applications. It is unfortunate that the modern space geodesy techniques are often viewed as competitive, but this view is usually encouraged by funding competition, especially in an era of growing needs but diminishing budgets. The techniques are, for the most part, complementary, and the ability to reduce the data to geodetic parameters from several techniques promotes confidence in the geophysical interpretations. In the following sections, the current SLR applications are reviewed in the context of the other techniques. The strengths and limitations of SLR are reviewed, and speculation about future prospects is offered.
Seeing is believing: on the use of image databases for visually exploring plant organelle dynamics.
Mano, Shoji; Miwa, Tomoki; Nishikawa, Shuh-ichi; Mimura, Tetsuro; Nishimura, Mikio
2009-12-01
Organelle dynamics vary dramatically depending on cell type, developmental stage and environmental stimuli, so that various parameters, such as size, number and behavior, are required for the description of the dynamics of each organelle. Imaging techniques are superior to other techniques for describing organelle dynamics because these parameters are visually exhibited. Therefore, as the results can be seen immediately, investigators can more easily grasp organelle dynamics. At present, imaging techniques are emerging as fundamental tools in plant organelle research, and the development of new methodologies to visualize organelles and the improvement of analytical tools and equipment have allowed the large-scale generation of image and movie data. Accordingly, image databases that accumulate information on organelle dynamics are an increasingly indispensable part of modern plant organelle research. In addition, image databases are potentially rich data sources for computational analyses, as image and movie data reposited in the databases contain valuable and significant information, such as size, number, length and velocity. Computational analytical tools support image-based data mining, such as segmentation, quantification and statistical analyses, to extract biologically meaningful information from each database and combine them to construct models. In this review, we outline the image databases that are dedicated to plant organelle research and present their potential as resources for image-based computational analyses.
[Total knee arthroplasty in 2014 : Results, expectations, and complications].
Matziolis, G; Röhner, E
2015-04-01
Aseptic loosening seems to have become a minor problem in total knee arthroplasty. In contrast, new challenges are defined by changing patient expectations. Besides reducing pain and improving mobility, modern implants should not be noticed as such and should not limit sports activities. In this paper, a summary of the development and the current situation of total knee arthroplasty (e.g., implantation numbers, hospital stay, operation time, and infection rates) is provided. The data are compared in an international context. In addition, current trends and developments from recent years are shown and rated according to the literature. The paper is based on a literature search (PubMed) and analyses of published official statistical data and expert recommendations. Implantation numbers have been declining gradually in Germany since 2009. In 2013, 127,077 total knee arthroplasties were implanted. In contrast, the number of revision operations has increased gradually during the last decade. In addition, hospital stay and operation time have declined. The development of implants, instruments, and operation techniques results from changing patient expectations. All innovations must be compared against the results of well-proven techniques. The arthroplasty register may be an instrument to evaluate the results of new techniques and implants in broad clinical application in terms of survival.
Geometric morphometrics in primatology: craniofacial variation in Homo sapiens and Pan troglodytes.
Lynch, J M; Wood, C G; Luboga, S A
1996-01-01
Traditionally, morphometric studies have relied on statistical analysis of distances, angles or ratios to investigate morphometric variation among taxa. Recently, geometric techniques have been developed for the direct analysis of landmark data. In this paper, we offer a summary (with examples) of three of these newer techniques, namely shape coordinate, thin-plate spline and relative warp analyses. Shape coordinate analysis detected significant craniofacial variation between 4 modern human populations, with African and Australian Aboriginal specimens being relatively prognathous compared with their Eurasian counterparts. In addition, the Australian specimens exhibited greater basicranial flexion than all other samples. The observed relationships between size and craniofacial shape were weak. The decomposition of shape variation into affine and non-affine components is illustrated via a thin-plate spline analysis of Homo and Pan cranial landmarks. We note differences between Homo and Pan in the degree of prognathism and basicranial flexion and the position and orientation of the foramen magnum. We compare these results with previous studies of these features in higher primates and discuss the utility of geometric morphometrics as a tool in primatology and physical anthropology. We conclude that many studies of morphological variation, both within and between taxa, would benefit from the graphical nature of these techniques.
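The shape-coordinate analysis described above registers every landmark configuration to a common baseline so that only shape differences remain. As a minimal illustrative sketch (not the authors' code; the landmark values are hypothetical), two-point Bookstein registration maps a chosen baseline pair to (0, 0) and (1, 0):

```python
def bookstein_coords(landmarks, base_i=0, base_j=1):
    """Two-point (Bookstein) shape coordinates.

    landmarks: list of (x, y) tuples. The baseline pair (base_i, base_j)
    is mapped to (0, 0) and (1, 0); the remaining landmarks then carry
    all the shape information, free of position, rotation, and scale.
    """
    (x1, y1), (x2, y2) = landmarks[base_i], landmarks[base_j]
    dx, dy = x2 - x1, y2 - y1
    norm2 = dx * dx + dy * dy
    coords = []
    for x, y in landmarks:
        u, v = x - x1, y - y1
        # Complex division by the baseline vector: rotates and scales in one step.
        coords.append(((u * dx + v * dy) / norm2, (v * dx - u * dy) / norm2))
    return coords

# A right triangle, and the same triangle rotated and scaled:
# both yield identical shape coordinates.
print(bookstein_coords([(0, 0), (2, 0), (0, 2)]))
print(bookstein_coords([(1, 1), (1, 5), (-3, 1)]))
```

Differences between populations can then be tested directly on these coordinates, since registration has removed nuisance variation.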
Improving wave forecasting by integrating ensemble modelling and machine learning
NASA Astrophysics Data System (ADS)
O'Donncha, F.; Zhang, Y.; James, S. C.
2017-12-01
Modern smart-grid networks use technologies to instantly relay information on supply and demand to support effective decision making. Integration of renewable-energy resources with these systems demands accurate forecasting of energy production (and demand) capacities. For wave-energy converters, this requires wave-condition forecasting to enable estimates of energy production. Current operational wave forecasting systems exhibit substantial errors with wave-height RMSEs of 40 to 60 cm being typical, which limits the reliability of energy-generation predictions thereby impeding integration with the distribution grid. In this study, we integrate physics-based models with statistical learning aggregation techniques that combine forecasts from multiple, independent models into a single "best-estimate" prediction of the true state. The Simulating Waves Nearshore physics-based model is used to compute wind- and currents-augmented waves in the Monterey Bay area. Ensembles are developed based on multiple simulations perturbing input data (wave characteristics supplied at the model boundaries and winds) to the model. A learning-aggregation technique uses past observations and past model forecasts to calculate a weight for each model. The aggregated forecasts are compared to observation data to quantify the performance of the model ensemble and aggregation techniques. The appropriately weighted ensemble model outperforms an individual ensemble member with regard to forecasting wave conditions.
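The learning-aggregation step described above combines member forecasts using weights learned from past performance. The following is a minimal sketch; the inverse-RMSE weighting rule and all numbers are assumptions for illustration, not the authors' exact algorithm:

```python
import math

def rmse(errors):
    # Root-mean-square error of a list of forecast errors.
    return math.sqrt(sum(e * e for e in errors) / len(errors))

def aggregate(forecasts, past_forecasts, past_obs):
    """Weight each ensemble member by the inverse of its historical RMSE,
    then form a convex combination of the current forecasts."""
    weights = []
    for member in past_forecasts:
        errs = [f - o for f, o in zip(member, past_obs)]
        weights.append(1.0 / rmse(errs))
    total = sum(weights)
    weights = [w / total for w in weights]
    return sum(w * f for w, f in zip(weights, forecasts))

# Two hypothetical members: one historically accurate, one biased high.
past_obs = [1.0, 1.2, 0.9]                 # observed wave heights (m)
past_forecasts = [[1.05, 1.15, 0.95],      # member 1: small errors
                  [1.5, 1.7, 1.4]]         # member 2: large errors
print(aggregate([1.1, 1.6], past_forecasts, past_obs))
```

The aggregated value lands close to the historically accurate member's forecast, which is the behavior the abstract reports: the weighted ensemble outperforms any individual member.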
Modern approaches to the treatment of human infertility through assisted reproduction.
Fernández Pelegrina, R; Kessler, A G; Rawlins, R G
1991-08-01
Medical statistics from the United States show that approximately 15 percent of all couples of reproductive age are unable to conceive naturally. In recent years, the number of couples with reproductive problems has increased, principally due to changes in life style and delayed childbearing. Only 13 years after the birth of the first "test tube baby", advances in the field of human reproduction have created a wide range of alternatives to help infertile couples conceive a healthy infant. Together, these techniques are called Assisted Reproductive Technology (ART) and include: in vitro fertilization (IVF), intratubal transfer of gametes (GIFT), intratubal transfer of zygotes (ZIFT), tubal transfer of preimplantation embryos (TET), gamete or embryo donation, cryopreservation, and micromanipulation. The application of these techniques is presented here. While much remains to be learned, the ability to fertilize ova in vitro and sustain early embryonic life outside the body is now a reality. Contrary to the idea that these techniques create life in vitro, they simply remove barriers caused by different forms of infertility which impede the creation of life. More than 30,000 infants have now been produced world-wide through ART. In the future, new developments in the field of assisted reproduction promise to bring new hope to the growing numbers of infertile couples around the world.
A Simple Laser Microphone for Classroom Demonstration
ERIC Educational Resources Information Center
Moses, James M.; Trout, K. P.
2006-01-01
Communication through the modulation of electromagnetic radiation has become a foundational technique in modern technology. In this paper we discuss a modern day method of eavesdropping based upon the modulation of laser light reflected from a window pane. A simple and affordable classroom demonstration of a "laser microphone" is…
Björkstén, Karin S; Bjerregaard, Peter
2015-07-04
There is growing evidence that living conditions at birth play a role in medical conditions later in life. Population-based studies from the Northern Hemisphere have shown that persons born in the spring or summer are at greater risk of committing suicide. A statistical correlation with light availability at birth has been observed in past research, but the cause remains unknown. Greenland is one of the most extreme of natural human habitats with regard to seasonal changes in light. The combination of rapid social changes and reliable population statistics offers a unique opportunity to make comparisons between persons born into a Traditional Lifestyle and those born into a Modern Lifestyle. The aim of this work was to assess whether season of birth differed between suicide victims born into an old or into a modern lifestyle. Official population and mortality registers were used. Suicide victims born (1903-1950) into the Traditional Lifestyle were compared with those born into the Modern Lifestyle (1961-1980). Rayleigh's test for circular distributions was used to assess the season of birth in suicide victims. Data regarding season of birth in the general population were collected. Persons born in March-June in the Traditional Lifestyle were much less likely to commit suicide than those born during other periods of the year. This is contrary to the findings of other studies. The seasonal differences had disappeared for those born into the Modern Lifestyle. The suicide rate increased from very low rates to about 140 suicides/100 000 person-years in the 1980s. The reason behind a variation in season of birth in suicide victims born into the old lifestyle is unknown. It is also unknown why the seasonal difference had disappeared with modern lifestyle. Possible influence of artificial light, nutrition, microbiota and seasonal infections are discussed. The underlying causes behind suicides may be different in traditional and modern Greenland.
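Rayleigh's test, used above to assess whether birth dates cluster in a particular season, can be sketched in a few lines. This is a generic illustration with made-up angles, using the common first-order large-sample approximation p ~ exp(-n*R^2) rather than any particular statistical package:

```python
import math

def rayleigh_test(angles):
    """Rayleigh test for circular uniformity.

    angles: dates mapped to radians, e.g. day_of_year / 365 * 2 * pi.
    Returns (R, p): R is the mean resultant length (0 = uniform,
    1 = fully concentrated); p uses the first-order approximation
    p ~ exp(-n * R**2), adequate for moderately large n.
    """
    n = len(angles)
    c = sum(math.cos(a) for a in angles) / n
    s = sum(math.sin(a) for a in angles) / n
    r = math.hypot(c, s)
    z = n * r * r
    p = math.exp(-z)
    return r, p

# Strongly clustered "spring births" (hypothetical): R near 1, tiny p.
clustered = [0.1, 0.2, 0.15, 0.05, 0.25] * 10
r, p = rayleigh_test(clustered)
print(f"R = {r:.3f}, p = {p:.2e}")
```

For dates spread evenly around the year, R falls to roughly zero and the test does not reject uniformity.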
A morphometric analysis of maxillary molar crowns of Middle-Late Pleistocene hominins.
Bailey, Shara E
2004-09-01
This study explores the significance of shape differences in the maxillary first molar crowns of Neandertals and anatomically modern humans. It uses morphometric analysis to quantify these differences and to investigate how the orientation of major cusps, relative cusp base areas and occlusal polygon area influence crown shape. The aims of this study were to 1) quantify these data to test whether the tooth shapes of Neandertals and anatomically modern humans differ significantly and 2) to explore if either of the shapes is derived relative to earlier fossil hominins. Data were collected from digital occlusal photographs using image-processing software. Cusp angles, relative cusp base areas and occlusal polygon areas were measured on Neandertals (n=15), contemporary modern humans (n=62), Upper Paleolithic humans (n=6), early anatomically modern humans (n=3) and Homo erectus (n=3). Univariate and multivariate statistical tests were used to evaluate the differences between contemporary modern humans and Neandertals, while the much sparser data sets from the other fossil samples were included primarily for comparison. Statistically significant differences reflecting overall crown shape and internal placement of the crown apices were found. Neandertals are distinguished from contemporary humans by possessing maxillary first molars that 1) are markedly skewed; 2) possess a narrower distal segment of the occlusal polygon compared to the mesial segment; 3) possess a significantly smaller metacone and a significantly larger hypocone; and 4) possess a significantly smaller relative occlusal polygon area reflecting internally placed cusps. Differences in relative cusp base areas of the hypocone and metacone may contribute to the shape differences observed in Neandertals. However, early anatomically modern humans possessing a pattern of relative cusp base areas similar to Neandertals lack their unusual shape. 
The fact that the morphology observed in non-Neandertal fossil hominins is more like that of anatomically modern humans than that of Neandertals suggests that this distinctive morphology may be derived in Neandertals.
Charles E. Land, Ph.D., acclaimed statistical expert on radiation risk assessment, died January 2018
Charles E. Land, Ph.D., an internationally acclaimed statistical expert on radiation risk assessment, died January 25, 2018. He retired in 2009 from the NCI Division of Cancer Epidemiology and Genetics. Dr. Land performed pioneering work in modern radiation dose-response analysis and modeling of low-dose cancer risk.
ERIC Educational Resources Information Center
Gordon, Sheldon P.; Gordon, Florence S.
2010-01-01
One of the most important applications of the definite integral in a modern calculus course is the mean value of a function. Thus, if a function "f" is defined on an interval ["a", "b"], then the mean, or average value, of "f" is given by (1/(b - a)) times the integral of f from a to b. In this note, we will investigate the meaning of other statistics associated with a function…
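The average-value formula above is easy to check numerically. A minimal sketch using the midpoint rule (the function and interval are arbitrary examples, not taken from the article):

```python
def mean_value(f, a, b, n=100000):
    """Average value of f on [a, b]: (1/(b - a)) * integral of f,
    approximated with the midpoint rule on n subintervals."""
    h = (b - a) / n
    total = sum(f(a + (i + 0.5) * h) for i in range(n))
    return total * h / (b - a)

# Mean of f(x) = x^2 on [0, 3] is (1/3) * (3^3 / 3) = 3.
print(mean_value(lambda x: x * x, 0.0, 3.0))
```

The result agrees with the exact value to many decimal places, since the midpoint rule's error shrinks quadratically with the subinterval width.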
ERIC Educational Resources Information Center
Dinov, Ivo D.; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a recent pedagogical paradigm that is rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and…
ERIC Educational Resources Information Center
Larson-Hall, Jenifer; Herrington, Richard
2010-01-01
In this article we introduce language acquisition researchers to two broad areas of applied statistics that can improve the way data are analyzed. First we argue that visual summaries of information are as vital as numerical ones, and suggest ways to improve them. Specifically, we recommend choosing boxplots over barplots and adding locally…
Quantifying falsifiability of scientific theories
NASA Astrophysics Data System (ADS)
Nemenman, Ilya
I argue that the notion of falsifiability, a key concept in defining a valid scientific theory, can be quantified using Bayesian Model Selection, which is a standard tool in modern statistics. This relates falsifiability to the quantitative version of the statistical Occam's razor, and allows transforming some long-running arguments about validity of scientific theories from philosophical discussions to rigorous mathematical calculations.
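Bayesian Model Selection compares models by their marginal likelihoods, which automatically penalizes models flexible enough to accommodate almost any data: the statistical Occam's razor that the abstract connects to falsifiability. A toy sketch (the coin-flip models are chosen purely for illustration and are not from the source):

```python
from math import comb

def marginal_likelihood_fair(heads, flips):
    # M1: a sharp, easily falsified model (p = 0.5 exactly).
    return 0.5 ** flips

def marginal_likelihood_flexible(heads, flips):
    # M2: p uniform on [0, 1]. Integrating p^k (1-p)^(n-k) over p
    # gives the closed form 1 / ((n + 1) * C(n, k)).
    return 1.0 / ((flips + 1) * comb(flips, heads))

# 52 heads in 100 flips: data compatible with the sharp model.
# The flexible model pays an Occam penalty for spreading its
# probability over outcomes that did not occur.
bf = marginal_likelihood_fair(52, 100) / marginal_likelihood_flexible(52, 100)
print(f"Bayes factor (sharp vs flexible): {bf:.2f}")
```

A Bayes factor above 1 favors the sharp (more falsifiable) model even though the flexible model can fit the data at least as well at its best-fit parameter.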
Modern Observational Techniques for Comets
NASA Technical Reports Server (NTRS)
Brandt, J. C. (Editor); Greenberg, J. M. (Editor); Donn, B. (Editor); Rahe, J. (Editor)
1981-01-01
Techniques are discussed in the following areas: astrometry, photometry, infrared observations, radio observations, spectroscopy, imaging of coma and tail, image processing of observation. The determination of the chemical composition and physical structure of comets is highlighted.
Papaneophytou, Christos P; Kontopidis, George
2014-02-01
The supply of many valuable proteins that have potential clinical or industrial use is often limited by their low natural availability. With modern advances in genomics, proteomics and bioinformatics, the number of proteins being produced using recombinant techniques is increasing exponentially and seems to guarantee an unlimited supply of recombinant proteins. The demand for recombinant proteins has increased as more applications in several fields become a commercial reality. Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, producing soluble proteins in E. coli is still a major bottleneck for structural biology projects. One of the most challenging steps in any structural biology project is predicting which protein or protein fragment will express solubly and purify well for crystallographic studies. The production of soluble and active proteins is influenced by several factors, including expression host, fusion tag, and induction temperature and time. Statistically designed experiments are gaining acceptance in the production of recombinant protein because they provide information on variable interactions that escape the "one-factor-at-a-time" method. Here, we review the most important factors affecting the production of recombinant proteins in a soluble form. Moreover, we provide information about how statistically designed experiments can increase protein yield and purity as well as find conditions for crystal growth. Copyright © 2013 Elsevier Inc. All rights reserved.
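A statistically designed experiment varies all factors together and estimates each factor's effect from the full design, rather than one factor at a time. A minimal sketch of a two-level full factorial design; the factor names and yield numbers are hypothetical, chosen only to illustrate the calculation:

```python
from itertools import product

def full_factorial(n_factors):
    # All combinations of coded low/high (-1/+1) levels: 2^n runs.
    return list(product([-1, 1], repeat=n_factors))

def main_effect(design, responses, factor):
    # Mean response at the high level minus mean response at the low level.
    high = [y for run, y in zip(design, responses) if run[factor] == 1]
    low = [y for run, y in zip(design, responses) if run[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

# Hypothetical soluble-protein yields (mg/L) for a 2^3 design with
# factors: induction temperature, inducer concentration, induction time.
design = full_factorial(3)
yields = [10, 14, 11, 15, 20, 24, 21, 25]
print("temperature effect:", main_effect(design, yields, 0))
print("time effect:", main_effect(design, yields, 2))
```

Because every run contributes to every effect estimate, the design uses each measurement several times over, and interaction effects can be estimated the same way from products of coded levels.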
From experimental imaging techniques to virtual embryology.
Weninger, Wolfgang J; Tassy, Olivier; Darras, Sébastien; Geyer, Stefan H; Thieffry, Denis
2004-01-01
Modern embryology increasingly relies on descriptive and functional three-dimensional (3D) and four-dimensional (4D) analysis of physically, optically, or virtually sectioned specimens. To cope with the technical requirements, new methods for highly detailed in vivo imaging, as well as for the generation of high-resolution digital volume data sets for the accurate visualisation of transgene activity and gene product presence in the context of embryo morphology, have recently been developed and continue to be refined. These methods profoundly change the scientific applicability, appearance and style of modern embryo representations. In this paper, we present an overview of the emerging techniques to create, visualise and administrate embryo representations (databases, digital data sets, 3-4D embryo reconstructions, models, etc.), and discuss the implications of these new methods for the work of modern embryologists, including research, teaching, the selection of specific model organisms, and potential collaborators.
Fazenda, Bruno; Scarre, Chris; Till, Rupert; Pasalodos, Raquel Jiménez; Guerra, Manuel Rojo; Tejedor, Cristina; Peredo, Roberto Ontañón; Watson, Aaron; Wyatt, Simon; Benito, Carlos García; Drinkall, Helen; Foulds, Frederick
2017-09-01
During the 1980s, acoustic studies of Upper Palaeolithic imagery in French caves, using the technology then available, suggested a relationship between acoustic response and the location of visual motifs. This paper presents an investigation, using modern acoustic measurement techniques, into such relationships within the caves of La Garma, Las Chimeneas, La Pasiega, El Castillo, and Tito Bustillo in Northern Spain. It addresses methodological issues concerning acoustic measurement at enclosed archaeological sites and outlines a general framework for the extraction of acoustic features that may be used to support archaeological hypotheses. The analysis explores possible associations between the position of visual motifs (which may be up to 40 000 yrs old) and localized acoustic responses. Results suggest that motifs in general, and lines and dots in particular, are statistically more likely to be found in places where reverberation is moderate and where the low-frequency acoustic response shows evidence of resonant behavior. The work presented suggests that an association of the location of Palaeolithic motifs with acoustic features is a statistically weak but tenable hypothesis, and that an appreciation of sound could have influenced behavior among Palaeolithic societies of this region.
Developments in flow visualization methods for flight research
NASA Technical Reports Server (NTRS)
Holmes, Bruce J.; Obara, Clifford J.; Manuel, Gregory S.; Lee, Cynthia C.
1990-01-01
With the introduction of modern airplanes utilizing laminar flow, flow visualization has become an important diagnostic tool in determining aerodynamic characteristics such as surface flow direction and boundary-layer state. A refinement of the sublimating chemical technique has been developed to define both the boundary-layer transition location and the transition mode. In response to the need for flow visualization at subsonic and transonic speeds and altitudes above 20,000 feet, the liquid crystal technique has been developed. A third flow visualization technique that has been used is infrared imaging, which offers non-intrusive testing over a wide range of test conditions. A review of these flow visualization methods and recent flight results is presented for a variety of modern aircraft and flight conditions.
Elliott, Robert E; Tanweer, Omar; Smith, Michael L; Frempong-Boadu, Anthony
2015-08-01
Structured review of literature and application of meta-analysis statistical techniques. Review published series describing clinical and radiographic outcomes of patients treated with C1 lateral mass screws (C1LMS), specifically analyzing the impact of starting point and bicortical purchase on successful atlantoaxial arthrodesis. Biomechanical studies suggest posterior arch screws and C1LMS with bicortical purchase are stronger than screws placed within the center of the lateral mass or those with unicortical purchase. Online databases were searched for English-language articles between 1994 and 2012 describing posterior atlantal instrumentation with C1LMS. Thirty-four studies describing 1247 patients having posterior atlantoaxial fusion with C1LMS met inclusion criteria. All studies provided class III evidence. Arthrodesis was quite successful regardless of technique (99.0% overall). Meta-analysis and multivariate regression analyses showed that neither posterior arch starting point nor bicortical screw purchase translated into a higher rate of successful arthrodesis. There were no complications from bicortical screw purchase. The Goel-Harms technique is a very safe and successful technique for achieving atlantoaxial fusion, regardless of minor variations in C1LMS technique. Although biomechanical studies suggest markedly increased rigidity of bicortical and posterior arch C1LMS, the significance of these findings may be minimal in the clinical setting of atlantoaxial fixation and fusion with modern techniques. The decision to use either technique must be made after careful review of the preoperative multiplanar computed tomography imaging, assessment of the unique anatomy of each patient, and the demands of the clinical scenario such as bone quality.
Modified Endonasal Tongue-in-Groove Technique.
Kadakia, Sameep; Ovchinsky, Alexander
2016-10-01
Achieving stable and desirable changes in tip rotation (TR) and tip projection (TP) is among the primary goals of modern-day rhinoplasty. The tongue-in-groove (TIG) technique is one technique in rhinoplasty used to improve TR and/or TP. Performing TIG endonasally using a permanent suture can be quite cumbersome, as the suture needs to be buried under the skin. We describe a variation of the TIG technique for endonasal rhinoplasty using a permanent suture buried in small columellar skin incisions. The technique details are described, and the postoperative changes in TR and TP are analyzed for the degree of change and longevity. In a retrospective review, the preoperative and postoperative photographs of 12 patients treated with the endonasal TIG technique were analyzed for changes in TR and TP. Of the 12 patients, there were seven females (58.3%) and five males (41.7%), with ages ranging from 17 to 49 years. The follow-up ranged from 6 months to 53 months, with a mean follow-up of 12.1 months. All patients were treated by the senior author in a major New York City hospital. Postoperative changes in TR and TP were compared by measuring the nasolabial angle as well as the Goode ratio using photo-editing software. Using a t-test and a p-value criterion of 0.05, the difference between the preoperative and postoperative TR (p = 0.0069) and TP (p = 0.026) was found to be statistically significant. None of the study patients developed any complications related to the use of a permanent suture material during the follow-up period. Our modified TIG technique is a quick, reliable, and safe option in the surgical armamentarium for achieving desired changes in TR and/or TP. Level of evidence: 4.
Instrumentation and fusion for congenital spine deformities.
Hedequist, Daniel J
2009-08-01
A retrospective clinical review. To review the use of modern instrumentation of the spine for congenital spinal deformities. Spinal instrumentation has evolved since the advent of the Harrington rod, yet there is a paucity of literature discussing the use of modern spinal instrumentation in congenital spine deformity cases. This review focuses on modern instrumentation techniques for congenital scoliosis and kyphosis. A systematic review of the literature was performed to discuss spinal implant use for congenital deformities. Spinal instrumentation may be safely and effectively used in cases of congenital spinal deformity. Spinal surgeons caring for children with congenital spine deformities need to be trained in all aspects of modern spinal instrumentation.
How Farmers Learn about Environmental Issues: Reflections on a Sociobiographical Approach
ERIC Educational Resources Information Center
Vandenabeele, Joke; Wildemeersch, Danny
2012-01-01
At the time of this research, protests of farmers against new environmental policy measures received much media attention. News reports suggested that farmers' organizations rejected the idea that modern farming techniques cause damage to the environment and even tried to undermine attempts to reconcile the goals of modern agriculture with…
Older Learning Engagement in the Modern City
ERIC Educational Resources Information Center
Lido, Catherine; Osborne, Michael; Livingston, Mark; Thakuriah, Piyushimita; Sila-Nowicka, Katarzyna
2016-01-01
This research employs novel techniques to examine older learners' journeys, educationally and physically, in order to gain a "three-dimensional" picture of lifelong learning in the modern urban context of Glasgow. The data offers preliminary analyses of an ongoing 1,500-household survey by the Urban Big Data Centre (UBDC). A sample of…
Commodification of Ghana's Volta River: An Example of Ellul's Autonomy of Technique
ERIC Educational Resources Information Center
Agbemabiese, Lawrence; Byrne, John
2005-01-01
Jacques Ellul argued that modernity's nearly exclusive reliance on science and technology to design society would threaten human freedom. Of particular concern for Ellul was the prospect of the technical milieu overwhelming culture. The commodification of the Volta River in order to modernize Ghana illustrates the Ellulian dilemma of the autonomy…
Modern Methodology and Techniques Aimed at Developing the Environmentally Responsible Personality
ERIC Educational Resources Information Center
Ponomarenko, Yelena V.; Zholdasbekova, Bibisara A.; Balabekov, Aidarhan T.; Kenzhebekova, Rabiga I.; Yessaliyev, Aidarbek A.; Larchenkova, Liudmila A.
2016-01-01
The article discusses the positive impact of an environmentally responsible individual as the social unit able to live in harmony with the natural world, himself/herself and other people. The purpose of the article is to provide theoretical substantiation of modern teaching methods. The authors considered the experience of philosophy, psychology,…
Pape, G; Raiss, P; Kleinschmidt, K; Schuld, C; Mohr, G; Loew, M; Rickert, M
2010-12-01
Loosening of the glenoid component is one of the major causes of failure in total shoulder arthroplasty. Possible risk factors for loosening of cemented components include eccentric loading, poor bone quality, inadequate cementing technique, and insufficient cement penetration. The application of a modern cementing technique has become an established procedure in total hip arthroplasty. The goal of modern cementing techniques in general is to improve cement penetration into the cancellous bone. Modern cementing techniques include vacuum-mixing of the cement, retrograde filling of the cement under pressurisation, and the use of a pulsatile lavage system. The main purpose of this study was to analyse cement penetration into the glenoid bone using modern cementing techniques and to investigate the relationship between bone mineral density (BMD) and cement penetration. Furthermore, we measured the temperature at the glenoid surface before and after jet-lavage in different patients during total shoulder arthroplasty, since the surrounding temperature of the bone is known to affect the polymerisation of the cement. Data from this experiment provided the temperature setting for the in-vitro study. The glenoid surface temperature was measured in 10 patients with a hand-held non-contact temperature measurement device. Bone mineral density was measured by DEXA. Eight paired cadaver scapulae were allocated (n = 16); each pair comprised two scapulae from one donor (matched-pair design). Two different glenoid components were used, one with pegs and the other with a keel. The glenoids for the in-vitro study were prepared with the bone compaction technique by the same surgeon in all cases. Pulsatile lavage was used to clean the glenoid of blood and bone fragments. Low-viscosity bone cement was applied retrogradely into the glenoid using a syringe. A constant pressure was applied with a modified force-sensor impactor.
Micro-computed tomography scans were used to analyse cement penetration into the cancellous bone. The mean temperature during the in-vivo arthroplasty of the glenoid was 29.4 °C (27.2-31 °C) before and 26.2 °C (25-27.5 °C) after jet-lavage. The overall peak BMD was 0.59 (range 0.33-0.99) g/cm². Mean cement penetration was 107.9 (range 67.6-142.3) mm² in the peg group and 128.3 (range 102.6-170.8) mm² in the keel group. The thickness of the cement layer varied from 0 to 2.1 mm in the pegged group and from 0 to 2.4 mm in the keeled group. A strong negative correlation between BMD and mean cement penetration was found for the peg group (r = -0.834; p < 0.01) and for the keel group (r = -0.727; p < 0.041). Micro-CT showed an inhomogeneous dispersion of the cement within the cancellous bone. Data from the in-vivo temperature measurements indicate that the temperature at the glenohumeral surface during surgery differs from the body core temperature and should be considered in further in-vitro studies with human specimens. Bone mineral density is negatively correlated with cement penetration in the glenoid. The application of a modern cementing technique in the glenoid provides sufficient cement penetration, although the dispersion of the cement is inhomogeneous. The findings of this study should be considered in further discussions of cementing technique and cement penetration into the cancellous bone of the glenoid. © Georg Thieme Verlag KG Stuttgart · New York.
Cache Energy Optimization Techniques For Modern Processors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mittal, Sparsh
2013-01-01
Modern multicore processors are employing large last-level caches, for example Intel's E7-8800 processor uses 24MB L3 cache. Further, with each CMOS technology generation, leakage energy has been dramatically increasing and hence, leakage energy is expected to become a major source of energy dissipation, especially in last-level caches (LLCs). The conventional schemes of cache energy saving either aim at saving dynamic energy or are based on properties specific to first-level caches, and thus these schemes have limited utility for last-level caches. Further, several other techniques require offline profiling or per-application tuning and hence are not suitable for product systems. In this book, we present novel cache leakage energy saving schemes for single-core and multicore systems; desktop, QoS, real-time and server systems. Also, we present cache energy saving techniques for caches designed with both conventional SRAM devices and emerging non-volatile devices such as STT-RAM (spin-torque transfer RAM). We present software-controlled, hardware-assisted techniques which use dynamic cache reconfiguration to configure the cache to the most energy efficient configuration while keeping the performance loss bounded. To profile and test a large number of potential configurations, we utilize low-overhead, micro-architecture components, which can be easily integrated into modern processor chips. We adopt a system-wide approach to save energy to ensure that cache reconfiguration does not increase energy consumption of other components of the processor. We have compared our techniques with state-of-the-art techniques and have found that our techniques outperform them in terms of energy efficiency and other relevant metrics. The techniques presented in this book have important applications in improving energy-efficiency of higher-end embedded, desktop, QoS, real-time, server processors and multitasking systems.
This book is intended to be a valuable guide for both newcomers and veterans in the field of cache power management. It will help graduate students, CAD tool developers and designers in understanding the need for energy efficiency in modern computing systems. Further, it will be useful for researchers in gaining insights into algorithms and techniques for micro-architectural and system-level energy optimization using dynamic cache reconfiguration. We sincerely believe that the "food for thought" presented in this book will inspire the readers to develop even better ideas for designing "green" processors of tomorrow.
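The dynamic cache reconfiguration idea summarized above — pick the most energy-efficient configuration while keeping performance loss bounded — can be sketched as a simple selection over profiled configurations. All numbers below are invented for illustration; real schemes gather such profiles online with low-overhead micro-architectural units:

```python
# Hypothetical per-configuration profiles: active way count, energy, and
# performance relative to the full 16-way cache (made-up values).
configs = [
    {"ways": 16, "energy_mj": 24.0, "perf": 1.00},
    {"ways": 8,  "energy_mj": 14.5, "perf": 0.98},
    {"ways": 4,  "energy_mj": 9.0,  "perf": 0.93},
    {"ways": 2,  "energy_mj": 6.5,  "perf": 0.82},
]

def pick_config(profiles, max_perf_loss=0.05):
    """Lowest-energy cache configuration whose performance loss stays bounded."""
    feasible = [c for c in profiles if 1.0 - c["perf"] <= max_perf_loss]
    return min(feasible, key=lambda c: c["energy_mj"])

best = pick_config(configs)
```

With a 5% loss bound, the 4- and 2-way configurations are ruled out and the 8-way configuration wins on energy.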
Craniofacial morphology in ancient and modern Greeks through 4,000 years.
Papagrigorakis, Manolis J; Kousoulis, Antonis A; Synodinos, Philippos N
2014-01-01
Multiple 20th century studies have speculated on the anthropological similarities of the modern inhabitants of Greece to their ancient predecessors. The present investigation attempts to add to this knowledge by comparing the craniofacial configuration of 141 ancient (dating around 2,000-500 BC) and 240 modern Greek skulls (the largest material among relevant national studies). Skulls were grouped by age at death, sex, era, and geographical categories; lateral cephalograms were taken, and 53 variables were measured and correlated statistically. The craniofacial measurements and measurements of the basic quadrilateral and cranial polygon were compared across groups using basic statistical methods, one-way ANOVA, and assessment of the correlation matrices. Most of the measurements for both sexes combined followed a similar pattern in ancient and modern Greek skulls. Moreover, in sketching and comparing the outline of the skull and upper face, we observed a clockwise movement. The present study confirms that the morphological pattern of Greek skulls, as it changed over thousands of years, kept some characteristics unchanged, while others underwent logical modifications. The analysis of our results leads us to believe that the influence upon the craniofacial complex of the various known factors, including genetic or environmental alterations, is apt to alter its form to adapt to new conditions. Even though 4,000 years seems too narrow a span to provoke evolutionary insights using conventional geometric morphometrics, the full presentation of our results makes up a useful atlas of solid data. Interpreted with caution, the craniofacial morphology in modern and ancient Greeks indicates elements of ethnic group continuation within the unavoidable multicultural mixtures.
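The one-way ANOVA used for the group comparisons above reduces to a ratio of between-group to within-group variance. A self-contained sketch on hypothetical cranial measurements (the values below are illustrative, not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic: between-group mean square over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical cranial lengths (mm) for an "ancient" and a "modern" group.
ancient = [182.0, 185.0, 181.0, 184.0]
modern_ = [183.0, 186.0, 182.0, 185.0]
f = one_way_anova_f([ancient, modern_])
```

The F statistic (df = k-1, n-k) is then compared against the F distribution to decide significance.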
Computer assisted screening, correction, and analysis of historical weather measurements
NASA Astrophysics Data System (ADS)
Burnette, Dorian J.; Stahle, David W.
2013-04-01
A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets and makes them readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. These routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.
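One of the checks described — flagging an undocumented discontinuity in a station record — can be sketched as a search for the split point that maximizes the shift in segment means. HOB Tools itself is more sophisticated; the temperature series below is invented for illustration:

```python
def discontinuity_score(series, split):
    """Absolute difference of segment means around a candidate break point."""
    left, right = series[:split], series[split:]
    return abs(sum(left) / len(left) - sum(right) / len(right))

def find_break(series):
    """Index of the split that maximizes the mean shift (candidate break)."""
    return max(range(1, len(series)), key=lambda s: discontinuity_score(series, s))

# Hypothetical annual mean temperatures with a station move after year 5.
temps = [14.1, 14.0, 14.2, 13.9, 14.1, 15.2, 15.0, 15.3, 15.1, 15.2]
split = find_break(temps)
```

A flagged break would then be checked against station metadata before any adjustment is applied.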
Karakaya, Jale; Karabulut, Erdem; Yucel, Recai M.
2015-01-01
Modern statistical methods for incomplete data have been increasingly applied to a wide variety of substantive problems. Similarly, receiver operating characteristic (ROC) analysis, a method used to evaluate diagnostic tests or biomarkers in medical research, has seen increasing development and application. While missing-data methods have been applied in ROC analysis, the impact of model mis-specification and/or of the assumptions (e.g. missing at random) underlying the missing data has not been thoroughly studied. In this work, we study the performance of multiple imputation (MI) inference in ROC analysis. In particular, we investigate parametric and non-parametric techniques for MI inference under common missingness mechanisms. Depending on the coherency of the imputation model with the underlying data-generation mechanism, our results show that MI generally leads to well-calibrated inferences under ignorable missingness mechanisms. PMID:26379316
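MI inference of the kind studied above pools estimates across completed datasets using Rubin's rules: the pooled estimate is the mean of the per-imputation estimates, and the total variance combines within- and between-imputation variance. A minimal sketch with hypothetical per-imputation AUC estimates and variances (values made up for illustration):

```python
def pool_mi(estimates, variances):
    """Rubin's rules: pool m completed-data estimates and their variances."""
    m = len(estimates)
    qbar = sum(estimates) / m                        # pooled point estimate
    w = sum(variances) / m                           # within-imputation variance
    b = sum((q - qbar) ** 2 for q in estimates) / (m - 1)  # between-imputation
    total = w + (1 + 1 / m) * b                      # total variance
    return qbar, total

# Hypothetical AUC estimates and variances from m = 5 imputed datasets.
qbar, total = pool_mi([0.81, 0.79, 0.83, 0.80, 0.82],
                      [0.0010, 0.0012, 0.0011, 0.0010, 0.0012])
```

Confidence intervals would then use the pooled variance with Rubin's degrees-of-freedom adjustment, which this sketch omits.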
Distance majorization and its applications
Chi, Eric C.; Zhou, Hua; Lange, Kenneth
2014-01-01
The problem of minimizing a continuously differentiable convex function over an intersection of closed convex sets is ubiquitous in applied mathematics. It is particularly interesting when it is easy to project onto each separate set, but nontrivial to project onto their intersection. Algorithms based on Newton’s method such as the interior point method are viable for small to medium-scale problems. However, modern applications in statistics, engineering, and machine learning are posing problems with potentially tens of thousands of parameters or more. We revisit this convex programming problem and propose an algorithm that scales well with dimensionality. Our proposal is an instance of a sequential unconstrained minimization technique and revolves around three ideas: the majorization-minimization principle, the classical penalty method for constrained optimization, and quasi-Newton acceleration of fixed-point algorithms. The performance of our distance majorization algorithms is illustrated in several applications. PMID:25392563
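The core idea — combine easy per-set projections with a penalty term and a majorization-minimization update — can be sketched in a few lines for the simplest objective, projecting a point onto an intersection of convex sets. The two halfspaces and all constants below are illustrative choices, not the paper's examples:

```python
def proj_c1(x):  # projection onto the halfspace x[0] >= 1
    return (max(x[0], 1.0), x[1])

def proj_c2(x):  # projection onto the halfspace x[1] >= 1
    return (x[0], max(x[1], 1.0))

def project_intersection(y, projections, rho=1.0, iters=200):
    """MM/penalty sketch: minimize ||x - y||^2 / 2 over an intersection of
    convex sets using only the easy per-set projections. Each MM step solves
    min_x ||x - y||^2/2 + (rho/2) * sum_i ||x - P_i(x_k)||^2 in closed form."""
    x = y
    m = len(projections)
    for _ in range(iters):
        ps = [p(x) for p in projections]
        x = tuple((y[i] + rho * sum(p[i] for p in ps)) / (1 + rho * m)
                  for i in range(len(y)))
        rho *= 1.05  # slowly increase the penalty to drive x toward feasibility
    return x

x = project_intersection((0.0, 0.0), [proj_c1, proj_c2])
```

Here the true projection of the origin onto the intersection is (1, 1), and the iterates approach it as the penalty grows.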
Immediate-type hypersensitivity reactions and hypnosis: problems in methodology.
Laidlaw, T M; Richardson, D H; Booth, R J; Large, R G
1994-08-01
Hypnosis has been used to ameliorate skin test reactivity in studies dating back to the 1930s. This study, using modern methodology and statistical analyses, set out to test the hypothesis that it is possible to decrease reactions to histamine by hypnotic suggestion. Five subjects, all asthmatic and untrained in hypnosis, were given three hypnotic sessions in which they were asked to control their reactions to histamine administered by the Pepys technique to forearm skin. These sessions were compared with three non-hypnotic sessions. Flare sizes, but not wheal sizes, were found to be significantly reduced after the hypnosis sessions compared to sessions without hypnosis. Skin temperature was correlated with the size of reactions. The day on which the sessions took place contributed a significant amount of the remaining unexplained variance, raising questions about what could cause these day-to-day changes.
Cardiac data mining (CDM); organization and predictive analytics on biomedical (cardiac) data
NASA Astrophysics Data System (ADS)
Bilal, M. Musa; Hussain, Masood; Basharat, Iqra; Fatima, Mamuna
2013-10-01
Data mining and data analytics have been of immense importance to many different fields as we have witnessed the evolution of data science over recent years. Biostatistics and medical informatics have proved to be the foundation of many modern biological theories and analysis techniques. These fields apply data mining practices along with statistical models to discover hidden trends in data from biological experiments or procedures on different entities. The objective of this research study is to develop a system for the efficient extraction, transformation, and loading of such data from cardiologic procedure reports provided by the Armed Forces Institute of Cardiology. It also aims to devise a model for predictive analysis and classification of these data into important classes required by cardiologists around the world. This includes predicting patient impressions and other important features.
Universality and predictability in molecular quantitative genetics.
Nourmohammad, Armita; Held, Torsten; Lässig, Michael
2013-12-01
Molecular traits, such as gene expression levels or protein binding affinities, are increasingly accessible to quantitative measurement by modern high-throughput techniques. Such traits measure molecular functions and, from an evolutionary point of view, are important as targets of natural selection. We review recent developments in evolutionary theory and experiments that are expected to become building blocks of a quantitative genetics of molecular traits. We focus on universal evolutionary characteristics: these are largely independent of a trait's genetic basis, which is often at least partially unknown. We show that universal measurements can be used to infer selection on a quantitative trait, which determines its evolutionary mode of conservation or adaptation. Furthermore, universality is closely linked to predictability of trait evolution across lineages. We argue that universal trait statistics extends over a range of cellular scales and opens new avenues of quantitative evolutionary systems biology. Copyright © 2013. Published by Elsevier Ltd.
Understanding climate: A strategy for climate modeling and predictability research, 1985-1995
NASA Technical Reports Server (NTRS)
Thiele, O. (Editor); Schiffer, R. A. (Editor)
1985-01-01
The emphasis of the NASA strategy for climate modeling and predictability research is on the utilization of space technology to understand the processes which control the Earth's climate system and its sensitivity to natural and man-induced changes, and to assess the possibilities for climate prediction on time scales from about two weeks to several decades. Because the climate is a complex multi-phenomena system, which interacts on a wide range of space and time scales, the diversity of scientific problems addressed requires a hierarchy of models along with the application of modern empirical and statistical techniques which exploit the extensive current and potential future global data sets afforded by space observations. Observing system simulation experiments, exploiting these models and data, will also provide the foundation for the future climate space observing system, e.g., the Earth Observing System (EOS; NASA, 1985) and the Tropical Rainfall Measuring Mission (TRMM; North et al., NASA, 1984).
The effects of modern cementing techniques on the longevity of total hip arthroplasty.
Poss, R; Brick, G W; Wright, R J; Roberts, D W; Sledge, C B
1988-07-01
Modern prosthetic design and cementing techniques have dramatically improved femoral component fixation. Compared to studies reported in the 1970s, the incidence of radiographic loosening for periods up to 5 years postoperatively has been reduced by at least a factor of 10. These results are the benchmark by which alternative forms of femoral component fixation must be measured. With the likelihood of increased longevity of total hip arthroplasty resulting from improved fixation, the problems of wear debris from the bearing surfaces and loss of bone stock with time will become preeminent.
Lecomte, Dominique; Plu, Isabelle; Froment, Alain
2012-06-01
Forensic examination is often requested when skeletal remains are discovered. Detailed visual observation can provide much information, such as the human or animal origin, sex, age, stature, and ancestry, and approximate time since death. New three-dimensional imaging techniques can provide further information (osteometry, facial reconstruction). Bone chemistry, and particularly measurement of stable or unstable carbon and nitrogen isotopes, yields information on diet and time since death, respectively. Genetic analyses of ancient DNA are also developing rapidly. Although seldom used in a judicial context, these modern anthropologic techniques are nevertheless available for the most complex cases.
Comparison of US Antarctic Meteorite Collection to Other Cold and Hot Deserts and Modern Falls
NASA Technical Reports Server (NTRS)
McBride, K. M.; Righter, K.
2010-01-01
The US Antarctic meteorite collection has grown close to 18,000 specimens, over 16,000 of which have been classified. Because of this growth, the parallel growth of Antarctic meteorite collections by Japan and China, and also the hot desert collections (from Africa and Australia), we will update the statistical overview of the US collection (last done in 1990 [1]), and make comparisons to other collections and modern falls.
Ucchesu, Mariano; Orrù, Martino; Grillo, Oscar; Venora, Gianfranco; Paglietti, Giacomo; Ardu, Andrea; Bacchetta, Gianluigi
2016-01-01
The identification of archaeological charred grape seeds is a difficult task due to the alteration of morphological seed shape. In archaeobotanical studies, correct discrimination between Vitis vinifera subsp. sylvestris and Vitis vinifera subsp. vinifera grape seeds is very important for understanding the history and origin of the domesticated grapevine. In this work, different carbonisation experiments were carried out using a hearth to reproduce the burning conditions that occurred in archaeological contexts. In addition, several carbonisation trials on modern wild and cultivated grape seeds were performed using a muffle furnace. For comparison with archaeological materials, modern grape seed samples were carbonised at seven different temperatures ranging between 180 and 340°C for 120 min. By analysing grape seed size and shape with computer vision techniques and applying the stepwise linear discriminant analysis (LDA) method, discrimination of the wild from the cultivated charred grape seeds was possible; an overall correct classification of 93.3% was achieved. Applying the same statistical procedure to compare modern charred with archaeological grape seeds, found in Sardinia and dating back to the Early Bronze Age (2017–1751 2σ cal. BC), allowed 75.0% of the cases to be identified as wild grape. The proposed method proved to be a useful and effective procedure for identifying, with high accuracy, the charred grape seeds found in archaeological sites. Moreover, it may be considered valid support for advances in the knowledge and comprehension of viticulture adoption and the grape domestication process. The same methodology may also be successful when applied to other plant remains, and provide important information about the history of domesticated plants. PMID:26901361
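The LDA classification reported above (shown here without the stepwise variable-selection step the authors used) boils down to a Fisher linear discriminant. A self-contained two-feature sketch; the seed measurements below are fabricated for illustration:

```python
def fisher_lda(class_a, class_b):
    """Two-class Fisher discriminant on 2-D features (e.g. seed length/width).
    Returns a scoring function; positive scores side with class_b."""
    def mean(pts):
        return [sum(p[i] for p in pts) / len(pts) for i in (0, 1)]
    ma, mb = mean(class_a), mean(class_b)
    # Pooled within-class scatter matrix (2x2).
    s = [[0.0, 0.0], [0.0, 0.0]]
    for pts, m in ((class_a, ma), (class_b, mb)):
        for p in pts:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    sinv = [[s[1][1] / det, -s[0][1] / det], [-s[1][0] / det, s[0][0] / det]]
    dm = [mb[0] - ma[0], mb[1] - ma[1]]
    w = [sinv[0][0] * dm[0] + sinv[0][1] * dm[1],
         sinv[1][0] * dm[0] + sinv[1][1] * dm[1]]   # w = S^-1 (mb - ma)
    mid = [(ma[0] + mb[0]) / 2, (ma[1] + mb[1]) / 2]
    return lambda p: w[0] * (p[0] - mid[0]) + w[1] * (p[1] - mid[1])

# Hypothetical (length, width) measurements in mm: wild vs. cultivated seeds.
wild = [(4.1, 2.6), (4.3, 2.7), (4.0, 2.5), (4.2, 2.6)]
cult = [(5.6, 3.1), (5.8, 3.2), (5.5, 3.0), (5.7, 3.1)]
score = fisher_lda(wild, cult)
```

Each charred seed would then be scored and assigned to whichever side of the discriminant it falls on.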
Fogarty, Laurel; Wakano, Joe Yuichiro; Feldman, Marcus W; Aoki, Kenichi
2017-03-01
The forces driving cultural accumulation in human populations, both modern and ancient, are hotly debated. Did genetic, demographic, or cognitive features of behaviorally modern humans (as opposed to, say, early modern humans or Neanderthals) allow culture to accumulate to its current, unprecedented levels of complexity? Theoretical explanations for patterns of accumulation often invoke demographic factors such as population size or density, whereas statistical analyses of variation in cultural complexity often point to the importance of environmental factors such as food stability, in determining cultural complexity. Here we use both an analytical model and an agent-based simulation model to show that a full understanding of the emergence of behavioral modernity, and the cultural evolution that has followed, depends on understanding and untangling the complex relationships among culture, genetically determined cognitive ability, and demographic history. For example, we show that a small but growing population could have a different number of cultural traits from a shrinking population with the same absolute number of individuals in some circumstances.
Courses in Modern Physics for Non-science Majors, Future Science Teachers, and Biology Students
NASA Astrophysics Data System (ADS)
Zollman, Dean
2001-03-01
For the past 15 years, Kansas State University has offered a course in modern physics for students who are not majoring in physics. This course carries a prerequisite of one physics course, so the students have a basic introduction to classical topics. The students' majors range from liberal arts to engineering. Future secondary science teachers whose first area of teaching is not physics can use the course as part of their study of science. The course has evolved from a lecture format to one which is highly interactive and uses a combination of hands-on activities, tutorials, and visualizations, particularly the Visual Quantum Mechanics materials. Another course encourages biology students to continue their physics learning beyond the introductory course. Modern Miracle Medical Machines introduces the basic physics underlying diagnostic techniques such as MRI and PET as well as laser surgical techniques. Additional information is available at http://www.phys.ksu.edu/perg/
[Achievements and enlightenment of modern acupuncture therapy for stroke based on the neuroanatomy].
Chen, Li-Fang; Fang, Jian-Qiao; Chen, Lu-Ni; Wang, Chao
2014-04-01
To date, three main representative achievements in the acupuncture treatment of stroke patients have been made, involving scalp acupuncture intervention, the "Xing Nao Kai Qiao" (restoring consciousness and inducing resuscitation) acupuncture technique, and nape acupuncture therapy. Regarding their neurobiological mechanisms, scalp acupuncture therapy is based on the functional localization of the cerebral cortex, "Xing Nao Kai Qiao" acupuncture therapy is closely related to nerve stem stimulation, and nape acupuncture therapy obtains its therapeutic effects on the basis of the nerve innervation of the regional neck-nape area. In fact, the effects of these three acupuncture interventions are all closely associated with modern neuroanatomy. In the treatment of post-stroke spastic paralysis, cognitive disorder, and depression with acupuncture therapy, modern neuroanatomical knowledge should be one of the key theoretical bases, and new therapeutic techniques should be explored and developed continuously.
Modern developments for ground-based monitoring of fire behavior and effects
Colin C. Hardy; Robert Kremens; Matthew B. Dickinson
2010-01-01
Advances in electronic technology over the last several decades have been staggering. The cost of electronics continues to decrease while system performance increases seemingly without limit. We have applied modern techniques in sensors, electronics and instrumentation to create a suite of ground-based diagnostics that can be used in laboratory (~1 m²), field scale...
ERIC Educational Resources Information Center
Lozano-Parada, Jaime H.; Burnham, Helen; Martinez, Fiderman Machuca
2018-01-01
A classical nonlinear system, the "Brusselator", was used to illustrate the modeling and simulation of oscillating chemical systems using stability analysis techniques with modern software tools such as Comsol Multiphysics, Matlab, and Excel. A systematic approach is proposed in order to establish a regime of parametric conditions that…
A Course in Heterogeneous Catalysis: Principles, Practice, and Modern Experimental Techniques.
ERIC Educational Resources Information Center
Wolf, Eduardo E.
1981-01-01
Outlines a multidisciplinary course which comprises fundamental, practical, and experimental aspects of heterogeneous catalysis. The course structure is a combination of lectures and demonstrations dealing with the use of spectroscopic techniques for surface analysis. (SK)
Conservation and Preservation of Archives.
ERIC Educational Resources Information Center
Kathpalia, Y. P.
1982-01-01
Presents concept of preventive conservation of archival records as a new science resulting from the use of modern techniques and chemicals. Various techniques for storage, proper environment, preventive de-acidification, fire prevention, restoration, and staff considerations are described. References are provided. (EJS)
Arroyo, Pedro; Pardío-López, Jeanette; Loria, Alvar; Fernández-García, Victoria
2010-01-01
The objective of this article is to provide information on cooking techniques used in two rural communities of Yucatán. We used a 24-hour recall method with 275 participants consuming 763 dishes. Dishes were classified according to cooking technique: 205 were lard-fried (27%), 169 oil-fried (22%), and 389 boiled/grilled (51%). The smaller, more secluded community (San Rafael) consumed more fried dishes (54% versus 45%) and used more lard-frying (65% versus 46%) than the larger community (Uci). The more extensive use of lard in the smaller community appears to be due to fewer modernizing influences, such as the availability and use of industrialized vegetable oils. Copyright © Taylor & Francis Group, LLC
Housman, L B; Bonchek, L; Lambert, L; Grunkemeier, G; Starr, A
1977-05-01
The continuing controversy between proponents of open and closed commissurotomy might be clarified by analysis of late follow-up with modern actuarial techniques that provide a true perspective of patient risk. We have used open mitral commissurotomy exclusively for 15 years in 100 patients. There was one operative death from pancreatitis and one late death from cancer; the actuarially projected survival rate (+/- the standard error) at 10 years is 97 per cent (+/- 2). Thirteen patients had preoperative emboli, 6 of whom were in sinus rhythm and 7 in atrial fibrillation. Two patients had postoperative emboli, both in sinus rhythm. The actuarial chance of remaining free of embolism at 10 years is 97 per cent (+/- 2). Sixteen patients required reoperation on the mitral valve for functional deterioration. The remaining survivors were in Class I or II when last seen. The actuarial chance of not requiring a reoperation at 5 years is 91 per cent (+/- 4) and at 10 years, 38 per cent (+/- 16). Results from different centers are difficult to compare for many reasons, but imprecise statistical methods further obscure such comparisons. The use of actuarial techniques may help to define the role of open mitral commissurotomy.
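Actuarial estimates like the survival rates quoted above come from the product-limit (Kaplan-Meier) method, which handles censored follow-up. A minimal sketch on invented follow-up data (not the series' patients):

```python
def kaplan_meier(times, events):
    """Product-limit survival curve; events[i] is 1 for death, 0 for censored."""
    data = sorted(zip(times, events))
    at_risk, surv, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # all subjects at this time
        deaths = sum(tied)
        if deaths:
            surv *= 1 - deaths / at_risk          # multiply survival fractions
            curve.append((t, surv))
        at_risk -= len(tied)
        i += len(tied)
    return curve

# Hypothetical follow-up (years, event) pairs: 1 = death, 0 = censored.
times  = [1, 3, 3, 5, 8, 10, 10, 12]
events = [0, 1, 0, 0, 1, 0, 0, 0]
curve = kaplan_meier(times, events)
```

The curve steps down only at observed deaths, while censored patients still count in the risk set up to their last follow-up.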
Queries over Unstructured Data: Probabilistic Methods to the Rescue
NASA Astrophysics Data System (ADS)
Sarawagi, Sunita
Unstructured data like emails, addresses, invoices, call transcripts, reviews, and press releases are now an integral part of any large enterprise. A challenge of modern business intelligence applications is analyzing and querying data seamlessly across structured and unstructured sources. This requires the development of automated techniques for extracting structured records from text sources and resolving entity mentions in data from various sources. The success of any automated method for extraction and integration depends on how effectively it unifies diverse clues in the unstructured source and in existing structured databases. We argue that statistical learning techniques like Conditional Random Fields (CRFs) provide an accurate, elegant, and principled framework for tackling these tasks. Given the inherent noise in real-world sources, it is important to capture the uncertainty of the above operations via imprecise data models. CRFs provide a sound probability distribution over extractions but are not easy to represent and query in a relational framework. We present methods of approximating this distribution to query-friendly row and column uncertainty models. Finally, we present models for representing the uncertainty of de-duplication and algorithms for various Top-K count queries on imprecise duplicates.
Inference for the physical sciences
Jones, Nick S.; Maccarone, Thomas J.
2013-01-01
There is a disconnect between developments in modern data analysis and some parts of the physical sciences in which they could find ready use. This introduction, and this issue, provides resources to help experimental researchers access modern data analysis tools and exposure for analysts to extant challenges in physical science. We include a table of resources connecting statistical and physical disciplines and point to appropriate books, journals, videos and articles. We conclude by highlighting the relevance of each of the articles in the associated issue. PMID:23277613
Bioaerosol Sampling in Modern Agriculture: A Novel Approach for Emerging Pathogen Surveillance?
Anderson, Benjamin D.; Ma, Mengmeng; Xia, Yao; Wang, Tao; Shu, Bo; Lednicky, John A.; Ma, Mai-Juan; Lu, Jiahai; Gray, Gregory C.
2016-01-01
Background. Modern agricultural practices create environmental conditions conducive to the emergence of novel pathogens. Current surveillance efforts to assess the burden of emerging pathogens in animal production facilities in China are sparse. In Guangdong Province pig farms, we compared bioaerosol surveillance for influenza A virus to surveillance in oral pig secretions and environmental swab specimens. Methods. During the 2014 summer and fall/winter seasons, we used 3 sampling techniques to study 5 swine farms weekly for influenza A virus. Samples were molecularly tested for influenza A virus, and positive specimens were further characterized with culture. Risk factors for influenza A virus positivity for each sample type were assessed. Results. Seventy-one of 354 samples (20.1%) were positive for influenza A virus RNA by real-time reverse-transcription polymerase chain reaction analysis. Influenza A virus positivity in bioaerosol samples was a statistically significant predictor for influenza A virus positivity in pig oral secretion and environmental swab samples. Temperature of <20°C was a significant predictor of influenza A virus positivity in bioaerosol samples. Discussion. Climatic factors and routine animal husbandry practices may increase the risk of human exposure to aerosolized influenza A viruses in swine farms. Data suggest that bioaerosol sampling in pig barns may be a noninvasive and efficient means to conduct surveillance for novel influenza viruses. PMID:27190187
Kuo, Chun-Lin; Fukui, Hiromichi
2007-06-30
Disease diffusion patterns can provide clues for understanding geographical change. Fukushima, a rural prefecture in northeast Japan, was chosen for a case study of the late nineteenth century cholera epidemic that occurred in that country. Two volumes of Cholera Ryu-ko Kiji (Cholera Epidemic Report), published by the prefectural government in 1882 and 1895, provide valuable records for analyzing and modelling diffusion. Text descriptions and numerical evidence culled from the reports were incorporated into a temporal-spatial study framework using geographic information system (GIS) and geo-statistical techniques. Changes in diffusion patterns between 1882 and 1895 reflect improvements in the Fukushima transportation system and growth in socioeconomic networks. The data reveal different diffusion systems in separate regions in which residents of Fukushima and neighboring prefectures interacted. Our model also shows that an area in the prefecture's northern interior was dominated by a mix of diffusion processes (contagious and hierarchical), that the southern coastal region was affected by a contagious process, and that other infected areas experienced relocation diffusion. In addition to enhancing our understanding of epidemics, the spatial-temporal patterns of cholera diffusion offer opportunities for studying regional change in modern Japan. By highlighting the dynamics of regional reorganization, our findings can be used to better understand the formation of an urban hierarchy in late nineteenth century Japan.
Iyoke, Ca; Ezugwu, Fo; Lawani, Ol; Ugwu, Go; Ajah, Lo; Mba, Sg
2014-01-01
To describe the methods preferred for contraception, evaluate preferences and adherence to modern contraceptive methods, and determine the factors associated with contraceptive choices among tertiary students in South East Nigeria. A questionnaire-based cross-sectional study of sexual habits, knowledge of contraceptive methods, and patterns of contraceptive choices was conducted among a pooled sample of unmarried students from the three largest tertiary educational institutions in Enugu city, Nigeria. Statistical analysis involved descriptive and inferential statistics at the 95% level of confidence. A total of 313 unmarried students were studied (194 males; 119 females). Their mean age was 22.5±5.1 years. Over 98% of males and 85% of females made their contraceptive choices based on information from peers. Preferences for contraceptive methods among female students were 49.2% for traditional methods of contraception, 28% for modern methods, 10% for nonpharmacological agents, and 8% for off-label drugs. Adherence to modern contraceptives among female students was 35%. Among male students, the preference for the male condom was 45.2% and adherence to condom use was 21.7%. Multivariate analysis showed that receiving information from health personnel/media/workshops (odds ratio 9.54, 95% confidence interval 3.5-26.3), a health science-related course of study (odds ratio 3.5, 95% confidence interval 1.3-9.6), and previous sexual exposure prior to university admission (odds ratio 3.48, 95% confidence interval 1.5-8.0) all increased the likelihood of adherence to modern contraceptive methods. An overwhelming reliance on peers for contraceptive information, in the context of poor knowledge of modern methods of contraception among young people, may have contributed to the low preference for, and adherence to, modern contraceptive methods among students in tertiary educational institutions.
Programs to reduce risky sexual behavior among these students may need to focus on increasing the content and adequacy of contraceptive information held by people through regular health worker-led, on-campus workshops.
An integrated study of earth resources in the state of California using remote sensing techniques
NASA Technical Reports Server (NTRS)
1973-01-01
University of California investigations to determine the usefulness of modern remote sensing techniques have concentrated on the water resources of the state. The studies consider in detail the supply, demand, and impact relationships.
The iLappSurgery taTME app: a modern adjunct to the teaching of surgical techniques.
Atallah, S; Brady, R R W
2016-09-01
Application-based technology has emerged as a method of modern information communication, and it has been applied to surgical training and education. It gives surgeons portable, instant access to information that is otherwise difficult to deliver. The iLappSurgery Foundation has recently launched the transanal total mesorectal excision educational application (taTME app), which provides a useful adjunct, especially for surgeons interested in mastery of the taTME technique and its principles. The article provides a detailed review of the application, which has achieved a large user base since its debut in June 2016.
Advances in Patellofemoral Arthroplasty.
Strickland, Sabrina M; Bird, Mackenzie L; Christ, Alexander B
2018-06-01
To describe current indications, implants, economic benefits, comparison to TKA, and functional and patient-reported outcomes of patellofemoral arthroplasty. Modern onlay implants and improved patient selection have allowed for recent improvements in short- and long-term outcomes after patellofemoral joint replacement surgery. Patellofemoral arthroplasty has become an increasingly utilized technique for the successful treatment of isolated patellofemoral arthritis. Advances in patient selection, implant design, and surgical technique have resulted in improved performance and longevity of these implants. Although short- and mid-term data for modern patellofemoral arthroplasties appear promising, further long-term clinical studies are needed to evaluate how new designs and technologies will affect patient outcomes and long-term implant performance.
Insecticide ADME for support of early-phase discovery: combining classical and modern techniques.
David, Michael D
2017-04-01
The two factors that determine an insecticide's potency are its binding to a target site (intrinsic activity) and the ability of its active form to reach the target site (bioavailability). Bioavailability is dictated by the compound's stability and transport kinetics, which are determined by both physical and biochemical characteristics. At BASF Global Insecticide Research, we characterize bioavailability in early research with an ADME (Absorption, Distribution, Metabolism and Excretion) approach, combining classical and modern techniques. For biochemical assessment of metabolism, we purify native insect enzymes using classical techniques, and recombinantly express individual insect enzymes that are known to be relevant in insecticide metabolism and resistance. For analytical characterization of an experimental insecticide and its metabolites, we conduct classical radiotracer translocation studies when a radiolabel is available. In discovery, where typically no radiolabel has been synthesized, we utilize modern high-resolution mass spectrometry to probe complex systems for the test compounds and their metabolites. By using these combined approaches, we can rapidly compare the ADME properties of sets of new experimental insecticides and aid in the design of structures with an improved potential to advance in the research pipeline. © 2016 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Pflaumann, Uwe; Duprat, Josette; Pujol, Claude; Labeyrie, Laurent D.
1996-02-01
We present a data set of 738 planktonic foraminiferal species counts from sediment surface samples of the eastern North Atlantic and the South Atlantic between 87°N and 40°S, 35°E and 60°W including published Climate: Long-Range Investigation, Mapping, and Prediction (CLIMAP) data. These species counts are linked to Levitus's [1982] modern water temperature data for the four caloric seasons, four depth ranges (0, 30, 50, and 75 m), and the combined means of those depth ranges. The relation between planktonic foraminiferal assemblages and sea surface temperature (SST) data is estimated using the newly developed SIMMAX technique, which is an acronym for a modern analog technique (MAT) with a similarity index, based on (1) the scalar product of the normalized faunal percentages and (2) a weighting procedure of the modern analog's SSTs according to the inverse geographical distances of the most similar samples. Compared to the classical CLIMAP transfer technique and conventional MAT techniques, SIMMAX provides a more confident reconstruction of paleo-SSTs (correlation coefficient is 0.994 for the caloric winter and 0.993 for caloric summer). The standard deviation of the residuals is 0.90°C for caloric winter and 0.96°C for caloric summer at 0-m water depth. The SST estimates reach optimum stability (standard deviation of the residuals is 0.88°C) at the average 0- to 75-m water depth. Our extensive database provides SST estimates over a range of -1.4 to 27.2°C for caloric winter and 0.4 to 28.6°C for caloric summer, allowing SST estimates which are especially valuable for the high-latitude Atlantic during glacial times. An electronic supplement of this material may be obtained on a diskette or via Anonymous FTP from KOSMOS.AGU.ORG. (LOGIN to AGU's FTP account using ANONYMOUS as the username and GUEST as the password. Go to the right directory by typing CD APPEND. Type LS to see what files are available. Type GET and the name of the file to get it.
Finally type EXIT to leave the system.) (Paper 95PA01743, SIMMAX: A modern analog technique to deduce Atlantic sea surface temperatures from planktonic foraminifera in deep-sea sediments, Uwe Pflaumann, Josette Duprat, Claude Pujol, and Laurent D. Labeyrie). Diskette may be ordered from American Geophysical Union, 2000 Florida Avenue, N.W., Washington, DC 20009; payment must accompany order.
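The abstract names the two ingredients of SIMMAX: a scalar-product similarity over normalized faunal percentage vectors, and inverse-geographic-distance weighting of the most similar modern analogs' SSTs. A minimal sketch under those assumptions only; the function names, the choice of k, and all sample values are hypothetical illustrations, not the published calibration:

```python
import math

def similarity(a, b):
    """Scalar product of the two faunal percentage vectors,
    normalized to unit length (i.e., cosine similarity)."""
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def simmax_sst(fossil, modern_samples, k=3):
    """Estimate SST as the inverse-distance-weighted mean of the SSTs
    of the k modern analogs most similar to the fossil assemblage.

    modern_samples: list of (percent_vector, geographic_distance_km, sst)
    """
    ranked = sorted(modern_samples,
                    key=lambda s: similarity(fossil, s[0]),
                    reverse=True)[:k]
    # Weight each analog's SST by the inverse of its geographic distance.
    w = [1.0 / max(d, 1e-9) for _, d, _ in ranked]
    return sum(wi * sst for wi, (_, _, sst) in zip(w, ranked)) / sum(w)
```

With hypothetical two-species assemblages, the estimate is pulled toward the SSTs of nearby, faunally similar core-top samples.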
Chemistry Is Dead. Long Live Chemistry!
Lavis, Luke D
2017-10-03
Chemistry, once king of fluorescence microscopy, was usurped by the field of fluorescent proteins. The increased demands of modern microscopy techniques on the "photon budget" require better and brighter fluorophores, causing a renewed interest in synthetic dyes. Here, we review the recent advances in biochemistry, protein engineering, and organic synthesis that have allowed a triumphant return of chemical fluorophores to modern biological imaging.
[Watsu: a modern method in physiotherapy, body regeneration, and sports].
Weber-Nowakowska, Katarzyna; Gebska, Magdalena; Zyzniewska-Banaszak, Ewelina
2013-01-01
Progress in existing methods of physiotherapy and body regeneration, together with the introduction of new methods, has made it possible to precisely select techniques according to patient needs. The modern therapist is capable of improving both the physical and mental condition of the patient. Watsu helps the therapist eliminate symptoms from the locomotor system while at the same time reaching the psychological sphere.
Graphic Poetry: How to Help Students Get the Most out of Pictures
ERIC Educational Resources Information Center
Chiang, River Ya-ling
2013-01-01
This paper attempts to give an account of some innovative work in paintings and modern poetry and to show how modern poets, such as Jane Flanders and Anne Sexton, the two American poets in particular, express and develop radically new conventions for their respective arts. Also elaborated are how such changes in artistic techniques are related to…
Analytics and Action in Afghanistan
2010-09-01
rests on rational technology, and ultimately on scientific knowledge. No country could be modern without being economically advanced or...backwardness to enlightened modernity. Underdeveloped countries had failed to progress to what Max Weber called rational legalism because of the grip...Douglas Pike, Viet Cong: The Organization and Techniques of the National Liberation Front of South Vietnam (Boston: Massachusetts Institute of Technology
Europe Report, Science and Technology
1986-09-30
to certain basic products of the food industry such as beer, vinegar, spirits, starches, etc. It is also assumed that modern biotechnologies...Czechoslovak food production. This is also the objective of innovative and modernizing programs in the fermented food sectors. The program for the...cattle and improves fodder utilization, assuming balanced doses of fodder. The development of fermentation techniques of production will occur within
Performance points. The reform club.
Edwards, Nick
2004-03-18
The Improvement Partnership for Hospitals programme is the vanguard of Modernisation Agency work. It is based on statistical process control to eliminate variations in performance, especially in elective services. All starred trusts will join IPH by next April.
ERIC Educational Resources Information Center
Ferguson, Albert S.
Experiences with various modern management techniques and practices in selected small, private church-related colleges were studied. For comparative purposes, practices in public colleges and universities were also assessed. Management techniques used in small companies were identified through review of the literature and the management seminars…
Practical Problems in the Cement Industry Solved by Modern Research Techniques
ERIC Educational Resources Information Center
Daugherty, Kenneth E.; Robertson, Les D.
1972-01-01
Practical chemical problems in the cement industry are being solved by such techniques as infrared spectroscopy, gas chromatography-mass spectrometry, X-ray diffraction, atomic absorption and arc spectroscopy, thermally evolved gas analysis, Mossbauer spectroscopy, transmission and scanning electron microscopy. (CP)
Welford, Mark R; Bossak, Brian H
2009-12-22
Recent studies have noted myriad qualitative and quantitative inconsistencies between the medieval Black Death (and subsequent "plagues") and modern empirical Y. pestis plague data, most of which is derived from the Indian and Chinese plague outbreaks of A.D. 1900+/-15 years. Previous works have noted apparent differences in seasonal mortality peaks during Black Death outbreaks versus peaks of bubonic and pneumonic plagues attributed to Y. pestis infection, but have not provided spatiotemporal statistical support. Our objective here was to validate individual observations of this seasonal discrepancy in peak mortality between historical epidemics and modern empirical data. We compiled and aggregated multiple daily, weekly and monthly datasets of both Y. pestis plague epidemics and suspected Black Death epidemics to compare seasonal differences in mortality peaks at a monthly resolution. Statistical and time series analyses of the epidemic data indicate that a seasonal inversion in peak mortality does exist between known Y. pestis plague and suspected Black Death epidemics. We provide possible explanations for this seasonal inversion. These results add further evidence of inconsistency between historical plagues, including the Black Death, and our current understanding of Y. pestis-variant disease. We expect that the line of inquiry into the disputed cause of the greatest recorded epidemic will continue to intensify. Given the rapid pace of environmental change in the modern world, it is crucial that we understand past lethal outbreaks as fully as possible in order to prepare for future deadly pandemics.
Oliver, Kelly; Manton, David John
2015-01-01
Effective behavior management guides children through the complex social context of dentistry utilizing techniques based on a current understanding of the social, emotional, and cognitive development of children. Behavior management techniques facilitate effective communication and establish social and behavioral guidelines for the dental environment. Contemporary parenting styles, expectations, and attitudes of modern parents and society have influenced the use of behavior management techniques with a prevailing emphasis on communicative techniques and pharmacological management over aversive techniques.
[Discussion on the cultural loss and return of modern acupuncture].
Liu, Bing; Zhao, Jing-sheng; Gao, Shu-zhong
2009-08-01
The philosophical ontology analysis was used in this study to explore the intrinsic factors related to the cultural loss of modern acupuncture, and to establish theoretical constructs and a clinical model for its cultural return. The most important factors related to the cultural loss of modern acupuncture are the separation of technical characteristics from cultural connotations and the diversion of modern techniques away from classical acupuncture. An effective way to achieve the cultural return is to build a harmonious theoretical and clinical model for developing acupuncture. Grounded in acupuncture's own cultural roots, the traditional sense and cultural values should be enhanced to facilitate the cultural return of acupuncture in theory and clinical practice.
Survey of statistical techniques used in validation studies of air pollution prediction models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bornstein, R D; Anderson, S F
1979-03-01
Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
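The survey's "summary statistics" group typically covers measures such as mean bias, root-mean-square error, and the correlation between predicted and observed concentrations. A minimal sketch of such a validation summary; the function name and any data passed to it are hypothetical, and this is an illustration of the general idea rather than the survey's specific recommendations:

```python
import math

def validation_summary(predicted, observed):
    """Summary statistics often used to validate air quality model
    predictions against observations:
    mean bias, root-mean-square error, and Pearson correlation."""
    n = len(predicted)
    bias = sum(p - o for p, o in zip(predicted, observed)) / n
    rmse = math.sqrt(sum((p - o) ** 2
                         for p, o in zip(predicted, observed)) / n)
    mp = sum(predicted) / n
    mo = sum(observed) / n
    cov = sum((p - mp) * (o - mo) for p, o in zip(predicted, observed))
    sp = math.sqrt(sum((p - mp) ** 2 for p in predicted))
    so = math.sqrt(sum((o - mo) ** 2 for o in observed))
    r = cov / (sp * so)
    return bias, rmse, r
```

A positive bias indicates systematic over-prediction; RMSE captures overall error magnitude; correlation captures how well the model tracks observed variability.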
The Energetic Cost of Walking: A Comparison of Predictive Methods
Kramer, Patricia Ann; Sylvester, Adam D.
2011-01-01
Background The energy that animals devote to locomotion has been of intense interest to biologists for decades and two basic methodologies have emerged to predict locomotor energy expenditure: those based on metabolic and those based on mechanical energy. Metabolic energy approaches share the perspective that prediction of locomotor energy expenditure should be based on statistically significant proxies of metabolic function, while mechanical energy approaches, which derive from many different perspectives, focus on quantifying the energy of movement. Some controversy exists as to which mechanical perspective is “best”, but from first principles all mechanical methods should be equivalent if the inputs to the simulation are of similar quality. Our goals in this paper are 1) to establish the degree to which the various methods of calculating mechanical energy are correlated, and 2) to investigate to what degree the prediction methods explain the variation in energy expenditure. Methodology/Principal Findings We use modern humans as the model organism in this experiment because their data are readily attainable, but the methodology is appropriate for use in other species. Volumetric oxygen consumption and kinematic and kinetic data were collected on 8 adults while walking at their self-selected slow, normal and fast velocities. Using hierarchical statistical modeling via ordinary least squares and maximum likelihood techniques, the predictive ability of several metabolic and mechanical approaches were assessed. We found that all approaches are correlated and that the mechanical approaches explain similar amounts of the variation in metabolic energy expenditure. Most methods predict the variation within an individual well, but are poor at accounting for variation between individuals. Conclusion Our results indicate that the choice of predictive method is dependent on the question(s) of interest and the data available for use as inputs. 
Although we used modern humans as our model organism, these results can be extended to other species. PMID:21731693
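The metabolic-energy approaches described above predict expenditure from statistical proxies, for example by regressing oxygen consumption on walking velocity. A minimal sketch of the closed-form simple OLS fit such approaches rest on; the numbers are hypothetical and perfectly linear for illustration, not data from the study:

```python
def ols_fit(x, y):
    """Closed-form simple ordinary least squares: y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# Hypothetical VO2 (ml/kg/min) at self-selected walking velocities (m/s)
velocity = [0.8, 1.2, 1.6]
vo2 = [8.0, 11.0, 14.0]
a, b = ols_fit(velocity, vo2)  # b: marginal metabolic cost per unit velocity
```

The slope b is the statistical proxy relationship; hierarchical extensions of this idea let the intercept and slope vary per individual, which is where between-individual variation enters.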
Genome data from a sixteenth century pig illuminate modern breed relationships
Ramírez, O; Burgos-Paz, W; Casas, E; Ballester, M; Bianco, E; Olalde, I; Santpere, G; Novella, V; Gut, M; Lalueza-Fox, C; Saña, M; Pérez-Enciso, M
2015-01-01
Ancient DNA (aDNA) provides direct evidence of historical events that have modeled the genome of modern individuals. In livestock, resolving the differences between the effects of initial domestication and of subsequent modern breeding is not straightforward without aDNA data. Here, we have obtained shotgun genome sequence data from a sixteenth century pig from Northeastern Spain (Montsoriu castle); the ancient pig came from an extremely well-preserved and diverse assemblage. In addition, we provide the sequence of three new modern genomes from an Iberian pig, Spanish wild boar and a Guatemalan Creole pig. Comparison with both mitochondrial and autosomal genome data shows that the ancient pig is closely related to extant Iberian pigs and to European wild boar. Although the ancient sample was clearly domestic, admixture with wild boar also occurred, according to the D-statistics. The close relationship between Iberian pigs, European wild boar and the ancient pig confirms that Asian introgression in modern Iberian pigs has been absent or negligible. In contrast, the Guatemalan Creole pig clusters apart from the Iberian pig genome, likely due to introgression from international breeds. PMID:25204303
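The D-statistics used to detect admixture are ABBA-BABA tests: across biallelic sites in four aligned genomes, an excess of one discordant allele-sharing pattern over the other signals gene flow. A minimal sketch, assuming allele states are already polarized by an outgroup; the function name and toy sequences are hypothetical:

```python
def d_statistic(p1, p2, p3, outgroup):
    """ABBA-BABA D-statistic over biallelic sites.

    Each argument is a sequence of alleles per site; the outgroup
    defines the ancestral state. D near 0 is consistent with no gene
    flow; D far from 0 suggests admixture between p3 and p1 or p2.
    """
    abba = baba = 0
    for a1, a2, a3, o in zip(p1, p2, p3, outgroup):
        if a1 == o and a2 == a3 and a2 != o:
            abba += 1  # pattern ABBA: p2 and p3 share the derived allele
        elif a2 == o and a1 == a3 and a1 != o:
            baba += 1  # pattern BABA: p1 and p3 share the derived allele
    if abba + baba == 0:
        return 0.0
    return (abba - baba) / (abba + baba)
```

In practice significance is assessed with a block jackknife over the genome; this sketch shows only the core count ratio.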
Estimating past precipitation and temperature from fossil ostracodes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, A.J.; Forester, R.M.
1994-12-31
The fossil records of certain aquatic organisms provide a way of obtaining meaningful estimates of past temperature and precipitation. These estimates of past environmental conditions are derived from multivariate statistical methods that are in turn based on the modern biogeographic distributions and environmental tolerances of the biota of interest. These estimates are helpful in conducting climate studies as part of the Yucca Mountain site characterization. Ostracodes are microscopic crustaceans that produce bivalved calcite shells which are easily fossilized in the sediments of the lakes and wetlands in which the animals lived. The modern biogeographic distribution and environmental conditions of living ostracodes are the basis for the interpretation of the past environmental conditions of the fossil ostracodes. The major assumption in this method of interpretation is that the environmental tolerances of ostracodes have not changed substantially over thousands of years. Two methods using these modern analogs to determine past environmental conditions are the modern analog method and the range method. The range method also considers the information provided by fossil ostracode assemblages that have no modern analog in today's world.
Flexible multibody simulation of automotive systems with non-modal model reduction techniques
NASA Astrophysics Data System (ADS)
Shiiba, Taichi; Fehr, Jörg; Eberhard, Peter
2012-12-01
The stiffness of the body structure of an automobile has a strong relationship with its noise, vibration, and harshness (NVH) characteristics. In this paper, the effect of the stiffness of the body structure upon ride quality is discussed with flexible multibody dynamics. In flexible multibody simulation, the local elastic deformation of the vehicle has been described traditionally with modal shape functions. Recently, linear model reduction techniques from system dynamics and mathematics came into focus to find more sophisticated elastic shape functions. In this work, the NVH-relevant states of a racing kart are simulated, whereas the elastic shape functions are calculated with modern model reduction techniques like moment matching by projection on Krylov subspaces, singular value decomposition-based reduction techniques, and combinations of those. The whole elastic multibody vehicle model consisting of tyres, steering, axle, etc. is considered, and an excitation with vibration characteristics over a wide frequency range is evaluated in this paper. The accuracy and the calculation performance of these modern model reduction techniques are investigated, including a comparison with the modal reduction approach.
REINVENTING PERSONAL EXPOSURE TO PARTICULATE MATTER
Recent epidemiologic studies of modern air pollution show statistically significant relationships between fluctuations of daily non-trauma mortality and fluctuations of daily ambient particulate matter (PM) levels at low concentrations. A review of historic smoke-fog (smog) episo...
Addressing Angular Single-Event Effects in the Estimation of On-Orbit Error Rates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, David S.; Swift, Gary M.; Wirthlin, Michael J.
2015-12-01
Our study describes complications introduced by angular direct ionization events in space error rate predictions. In particular, prevalence of multiple-cell upsets and a breakdown in the application of effective linear energy transfer in modern-scale devices can skew error rates approximated from currently available estimation models. Moreover, this paper highlights the importance of angular testing and proposes a methodology to extend existing error estimation tools to properly consider angular strikes in modern-scale devices. Finally, these techniques are illustrated with test data provided from a modern 28 nm SRAM-based device.
Merchandising Techniques and Libraries.
ERIC Educational Resources Information Center
Green, Sylvie A.
1981-01-01
Proposes that libraries employ modern booksellers' merchandising techniques to improve circulation of library materials. Using displays in various ways, the methods and reasons for weeding out books, replacing worn book jackets, and selecting new books are discussed. Suggestions for learning how to market and 11 references are provided. (RBF)
Dance Critique as Signature Pedagogy
ERIC Educational Resources Information Center
Kearns, Lauren
2017-01-01
The curriculum of preprofessional university degree programs in dance typically comprise four components: theory and history, dance technique, creative process, and performance. This article focuses on critique in the modern dance technique and choreography components of the dance curriculum. Bachelor of Fine Arts programs utilize critique as a…
Quantitative proteomics in the field of microbiology.
Otto, Andreas; Becher, Dörte; Schmidt, Frank
2014-03-01
Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Evaluation of virtual environment as a form of interactive resuscitation exam
NASA Astrophysics Data System (ADS)
Leszczyński, Piotr; Charuta, Anna; Kołodziejczak, Barbara; Roszak, Magdalena
2017-10-01
There is scientific evidence confirming the effectiveness of e-learning in resuscitation training; however, there is little research on modern examination techniques in this field. The aim of this pilot study is to compare exam results in the field of Advanced Life Support in a traditional (paper) and an interactive (computer) form, as well as to evaluate the satisfaction of the participants. A survey was conducted to evaluate the satisfaction of exam participants. Statistical analysis of the collected data was conducted at a significance level of α = 0.05 using STATISTICA v. 12. Final results of the traditional exam (67.5% ± 15.8%) differed significantly (p < 0.001) from the results of the interactive exam (53.3% ± 13.7%). However, comparing the number of students who did not pass the exam (passing point at 51%), no significant differences (p = 0.13) were observed between the two types of exams. The accuracy of feedback and the presence of well-prepared interactive questions may have influenced participants' satisfaction with the electronic test. Significant differences between the results of a traditional test and one supported by a Computer Based Learning system show that interactive solutions may allow a more detailed verification of competence in the field of resuscitation.
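The pass/fail comparison (p = 0.13) is the kind of result a two-proportion test yields. A minimal sketch using a pooled two-proportion z-test with the normal CDF built from math.erf; the fail counts below are hypothetical, since the abstract does not report raw counts:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided pooled two-proportion z-test.

    x1/n1 and x2/n2 are the observed proportions (e.g., fail rates).
    Returns (z statistic, two-sided p-value via the normal CDF).
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Normal CDF via the error function: Phi(t) = (1 + erf(t/sqrt(2)))/2
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical fail counts: 12/40 failed the paper exam, 18/40 the computer exam
z, p = two_proportion_z(12, 40, 18, 40)
```

With these illustrative counts the difference in fail rates is not significant at α = 0.05, mirroring the kind of null result the abstract reports for pass/fail.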
Replica analysis of overfitting in regression models for time-to-event data
NASA Astrophysics Data System (ADS)
Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.
2017-09-01
Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.
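The core problem the paper formalizes can be sketched in a few lines: for nested least-squares models, in-sample error shrinks monotonically as parameters are added, whatever the true model, so in-sample fit statistics alone cannot flag overfitting. All numbers below are illustrative, not from the paper.

```python
import numpy as np

# Data generated from a simple linear relation plus noise.
rng = np.random.default_rng(0)
n = 20
x = np.linspace(0.0, 1.0, n)
y = x + 0.1 * rng.standard_normal(n)   # true relation is linear

def in_sample_rss(degree):
    """Residual sum of squares of a degree-`degree` polynomial fit."""
    coeffs = np.polyfit(x, y, degree)
    return float(np.sum((y - np.polyval(coeffs, x)) ** 2))
```

A degree-10 polynomial (11 parameters on 20 points) always achieves a lower in-sample RSS than the correct linear model, even though it generalizes worse; that is exactly the regime in which p-values and z-scores computed from the fitted model are misleading.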
ERIC Educational Resources Information Center
Kamalova, Lera A.; Koletvinova, Natal'ya D.
2016-01-01
This article aims to study the problems of reading and to improve the reading culture of students (bachelors of elementary education) in modern higher-education institutions, and to develop the most effective methods and techniques for improving students' reading culture in the study of Humanities disciplines. The leading method to the study of this…
Hicks, Michael B; Regalado, Erik L; Tan, Feng; Gong, Xiaoyi; Welch, Christopher J
2016-01-05
Supercritical fluid chromatography (SFC) has long been a preferred method for enantiopurity analysis in support of pharmaceutical discovery and development, but implementation of the technique in regulated GMP laboratories has been somewhat slow, owing to limitations in instrument sensitivity, reproducibility, accuracy and robustness. In recent years, commercialization of next generation analytical SFC instrumentation has addressed previous shortcomings, making the technique better suited for GMP analysis. In this study we investigate the use of modern SFC for enantiopurity analysis of several pharmaceutical intermediates and compare the results with the conventional HPLC approaches historically used for analysis in a GMP setting. The findings clearly illustrate that modern SFC now exhibits improved precision, reproducibility, accuracy and robustness; also providing superior resolution and peak capacity compared to HPLC. Based on these findings, the use of modern chiral SFC is recommended for GMP studies of stereochemistry in pharmaceutical development and manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
Travis, F; Olson, T; Egenes, T; Gupta, H K
2001-07-01
This study tested the prediction that reading Vedic Sanskrit texts, without knowledge of their meaning, produces a distinct physiological state. We measured EEG, breath rate, heart rate, and skin conductance during: (1) 15-min Transcendental Meditation (TM) practice; (2) 15-min reading verses of the Bhagavad Gita in Sanskrit; and (3) 15-min reading the same verses translated into German, Spanish, or French. The two reading conditions were randomly counterbalanced, and subjects filled out experience forms between each block to reduce carryover effects. Skin conductance levels significantly decreased during both reading Sanskrit and TM practice, and increased slightly during reading a modern language. Alpha power and coherence were significantly higher when reading Sanskrit and during TM practice, compared to reading modern languages. The similar physiological patterns when reading Sanskrit and during practice of the TM technique suggest that the state gained during TM practice may be integrated with active mental processes by reading Sanskrit.
[Construction of multiple drug release system based on components of traditional Chinese medicine].
Liu, Dan; Jia, Xiaobin; Yu, Danhong; Zhang, Zhenhai; Sun, E
2012-08-01
With the development of the modernization drive of traditional Chinese medicine (TCM) preparations, research on new types of TCM dosage forms has become a hot spot in the field. Because of the complexity of TCM components as well as the uncertainty of their material base, there is still no scientific system for modern TCM dosage forms. Modern TCM preparations must inevitably take into account the multi-component nature of TCM and its general functional characteristics of multiple links and multiple targets. The authors suggest building a multiple drug release system for TCM, using diverse preparation techniques and drug release methods at multiple levels, on the basis of the nature and functional characteristics of TCM components. This essay elaborates the ideas behind building such a multiple TCM drug release system, its theoretical basis, preparation techniques and assessment system, and current problems and solutions, with a view to enhancing the bioavailability of TCM components and providing a new form for TCM preparations.
Biometric Analysis – A Reliable Indicator for Diagnosing Taurodontism using Panoramic Radiographs
Hegde, Veda; Anegundi, Rajesh Trayambhak; Pravinchandra, K.R.
2013-01-01
Background: Taurodontism is a clinical entity with a morpho-anatomical change in the shape of the tooth, which was thought to be absent in modern man. Taurodontism is mostly observed as an isolated trait or as a component of a syndrome. Various techniques have been devised to diagnose taurodontism. Aim: The aim of this study was to analyze whether a biometric analysis was useful in diagnosing taurodontism in radiographs which appeared to be normal on cursory observation. Setting and Design: This study was carried out in our institution by using radiographs which were taken for routine procedures. Material and Methods: In this retrospective study, panoramic radiographs were obtained from dental records of children aged between 9 and 14 years who did not have any abnormality on cursory observation. Biometric analyses were carried out on permanent mandibular first molar(s) by using a novel biometric method. The values were tabulated and analysed. Statistics: The Fisher exact probability test, Chi square test and Chi-square test with Yates correction were used for statistical analysis of the data. Results: Cursory observation did not yield any case of taurodontism. In contrast, the biometric analysis yielded a statistically significant number of cases of taurodontism. However, there was no statistically significant difference in the number of cases with taurodontism between the genders or across the age group considered. Conclusion: Thus, taurodontism was diagnosed on a biometric analysis which was otherwise missed on cursory observation. It is therefore necessary, from the clinical point of view, to diagnose even the mildest form of taurodontism by using metric analysis rather than just relying on a visual radiographic assessment, as its occurrence has many clinical implications and a diagnostic importance. PMID:24086912
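The gender comparison described above rests on Fisher's exact test for a 2x2 contingency table, which is appropriate for the small counts such a study yields. A minimal sketch, with counts that are purely illustrative (the study's actual tallies are not given in the abstract):

```python
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = boys / girls,
# columns = taurodontism present / absent on biometric analysis.
table = np.array([[12, 38],
                  [ 9, 41]])

# Fisher's exact test: exact p-value for independence in a 2x2 table.
odds_ratio, p = stats.fisher_exact(table)
```

With proportions this close (12/50 vs 9/50), the test does not reject independence, mirroring the study's finding of no significant gender difference.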
Discovery of Newer Therapeutic Leads for Prostate Cancer
2009-06-01
promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this… Large scale plant collections were conducted for 14 of the top 20… material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of
Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph
Milker, Yvonne; Weinkauf, Manuel F G; Titschack, Jürgen; Freiwald, Andre; Krüger, Stefan; Jorissen, Frans J; Schmiedl, Gerhard
2017-01-01
We present paleo-water depth reconstructions for the Pefka E section deposited on the island of Rhodes (Greece) during the early Pleistocene. For these reconstructions, a transfer function (TF) using modern benthic foraminifera surface samples from the Adriatic and Western Mediterranean Seas has been developed. The TF model gives an overall predictive accuracy of ~50 m over a water depth range of ~1200 m. Two separate TF models for shallower and deeper water depth ranges indicate a good predictive accuracy of 9 m for shallower water depths (0-200 m) but far less accuracy of 130 m for deeper water depths (200-1200 m) due to uneven sampling along the water depth gradient. To test the robustness of the TF, we randomly selected modern samples to develop random TFs, showing that the model is robust for water depths between 20 and 850 m while greater water depths are underestimated. We applied the TF to the Pefka E fossil data set. The goodness-of-fit statistics showed that most fossil samples have a poor to extremely poor fit to water depth. We interpret this as a consequence of a lack of modern analogues for the fossil samples and removed all samples with extremely poor fit. To test the robustness and significance of the reconstructions, we compared them to reconstructions from an alternative TF model based on the modern analogue technique and applied the randomization TF test. We found our estimates to be robust and significant at the 95% confidence level, but we also observed that our estimates are strongly overprinted by orbital, precession-driven changes in paleo-productivity and corrected our estimates by filtering out the precession-related component. We compared our corrected record to reconstructions based on a modified plankton/benthos (P/B) ratio, excluding infaunal species, and to stable oxygen isotope data from the same section, as well as to paleo-water depth estimates for the Lindos Bay Formation of other sediment sections of Rhodes. 
These comparisons indicate that our orbital-corrected reconstructions are reasonable and reflect major tectonic movements of Rhodes during the early Pleistocene.
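The modern analogue technique mentioned above as the alternative TF model can be sketched as follows: each fossil assemblage is compared with every modern surface sample using a dissimilarity measure (here squared chord distance, a common choice for relative-abundance data), and the water depths of the k most similar modern samples are averaged. All taxa, abundances and depths below are hypothetical.

```python
import numpy as np

# Modern training set: rows = surface samples, cols = taxon relative abundances.
modern_taxa = np.array([
    [0.7, 0.2, 0.1],
    [0.5, 0.3, 0.2],
    [0.2, 0.3, 0.5],
    [0.1, 0.2, 0.7],
])
modern_depth = np.array([20.0, 80.0, 400.0, 900.0])  # water depths (m)

def squared_chord(a, b):
    """Squared chord distance between two relative-abundance vectors."""
    return float(np.sum((np.sqrt(a) - np.sqrt(b)) ** 2))

def mat_estimate(fossil, k=2):
    """Average the depths of the k closest modern analogues."""
    d = np.array([squared_chord(fossil, m) for m in modern_taxa])
    nearest = np.argsort(d)[:k]
    return float(modern_depth[nearest].mean())

# A fossil assemblage dominated by the shallow-water taxa:
estimate = mat_estimate(np.array([0.6, 0.25, 0.15]), k=2)  # → 50.0 m
```

The goodness-of-fit screening described in the text corresponds to inspecting the nearest-analogue distances themselves: fossil samples whose minimum distance exceeds what is seen within the modern training set lack modern analogues and are dropped.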
[Combined burn trauma in the array of modern civilian and combat burns].
Ivchenko, E V; Borisov, D N; Golota, A S; Krassiĭ, A B; Rusev, I T
2015-02-01
The current article positions combined burn and non-burn injuries within the general array of civilian and combat burns. For that purpose, official state statistics and scientific medical publications, domestic as well as foreign, have been analyzed. It has been shown that in peacetime combined burn/trauma injuries are infrequent. However, this type of injury becomes routine, especially among the civilian population, under the conditions of the modern so-called "hybrid war", and the medical service should be prepared for it.
Cicero, Raúl; Criales, José Luis; Cardoso, Manuel
2009-01-01
The impressive development of computed tomography (CT) techniques, such as three-dimensional helical CT, produces a spatial image of the bony thorax. At the beginning of the 16th century Leonardo da Vinci drew the thorax oseum with great precision. These drawings show an outstanding similarity with the images obtained by three-dimensional helical CT. The cumbersome task of the Renaissance genius is a prime example of the careful study of human anatomy. Modern imaging techniques require perfect anatomic knowledge of the human body in order to generate exact interpretations of images. Leonardo's example is alive for anybody devoted to modern imaging studies.
Diffraction scattering computed tomography: a window into the structures of complex nanomaterials
Birkbak, M. E.; Leemreize, H.; Frølich, S.; Stock, S. R.
2015-01-01
Modern functional nanomaterials and devices are increasingly composed of multiple phases arranged in three dimensions over several length scales. Therefore there is a pressing demand for improved methods for structural characterization of such complex materials. An excellent emerging technique that addresses this problem is diffraction/scattering computed tomography (DSCT). DSCT combines the merits of diffraction and/or small angle scattering with computed tomography to allow imaging the interior of materials based on the diffraction or small angle scattering signals. This allows, e.g., one to distinguish the distributions of polymorphs in complex mixtures. Here we review this technique and give examples of how it can shed light on modern nanoscale materials. PMID:26505175
Evaluation of maxillary growth: is there any difference using relief incision during palatoplasty?
Maluf, Ivan; Doro, Ubiratan; Fuchs, Taíse; dos Santos, Diego Esteves; dos Santos Sacomam, Franserg; da Silva Freitas, Renato; Roca, Guilherme Berto
2014-05-01
Scar retraction due to exposed bone in palatoplasty is the leading cause of constricted maxilla. Modern techniques have focused on minimizing the effects of scarring by reducing the exposure of the bone area. The objective of the study was to compare the palatal mucoperiosteal detachment with minimal lateral incision, followed by their synthesis, with the maintenance of lateral areas for relaxation (similar to the von Langenbeck technique) and evaluate the transversal development of the maxilla. A prospective, randomized study was conducted, in which the molding of the dental arch of 14 pigs in 2 stages (at 1 month and 5 months) was performed. The pigs were divided into 3 groups: group 1 underwent lateral incision of the palate for mucoperiosteal detachment and maintenance of bone exposure; group 2 underwent mucoperiosteal palatal detachment with lateral access and no bone exposure; and group 3, the control animals, did not undergo any surgical procedures. Measurements of the dental arches were compared between the groups to assess differences in the development of the maxillary transverse diameter. There were no animals lost during the study. Group 1 showed greater growth restriction of the transverse diameter of the maxilla (36%) when compared with groups 2 (56%) and 3 (59%). Groups 2 and 3 showed similar transverse maxillary development, with no statistical difference. The technique of mucoperiosteal detachment without lateral relief incision has the advantage of reducing future morbidity of a constricted maxilla. This study demonstrated that the technique described can reduce rates of maxillary underdevelopment, a significant complication inherent in the procedure for palatoplasty. The lateral incisions reduce maxillary growth by approximately 20% as compared with this technique. Level II of evidence.
Digital pre-compensation techniques enabling high-capacity bandwidth variable transponders
NASA Astrophysics Data System (ADS)
Napoli, Antonio; Berenguer, Pablo Wilke; Rahman, Talha; Khanna, Ginni; Mezghanni, Mahdi M.; Gardian, Lennart; Riccardi, Emilio; Piat, Anna Chiadò; Calabrò, Stefano; Dris, Stefanos; Richter, André; Fischer, Johannes Karl; Sommerkorn-Krombholz, Bernd; Spinnler, Bernhard
2018-02-01
Digital pre-compensation techniques are among the enablers for cost-efficient high-capacity transponders. In this paper we describe various methods to mitigate the impairments introduced by state-of-the-art components within modern optical transceivers. Numerical and experimental results validate their performance and benefits.
Iontophoresis and Flame Photometry: A Hybrid Interdisciplinary Experiment
ERIC Educational Resources Information Center
Sharp, Duncan; Cottam, Linzi; Bradley, Sarah; Brannigan, Jeanie; Davis, James
2010-01-01
The combination of reverse iontophoresis and flame photometry provides an engaging analytical experiment that gives first-year undergraduate students a flavor of modern drug delivery and analyte extraction techniques while reinforcing core analytical concepts. The experiment provides a highly visual demonstration of the iontophoresis technique and…
Three Contributions of a Spiritual Perspective to Counseling, Psychotherapy, and Behavior Change.
ERIC Educational Resources Information Center
Bergin, Allen E.
1988-01-01
Describes ways in which a spiritual approach can contribute to the modern applied science of behavior change. Divides approach into three areas: conception of human nature, moral frame of reference, and set of techniques. Discusses and demonstrates the transitional person technique. (Author/BH)
ARE MALES MORE SUSCEPTIBLE TO AMBIENT PM THAN FEMALES?
Recent epidemiologic studies of modern air pollution show statistically significant relationships between fluctuations of daily non-trauma mortality and fluctuations of daily ambient particulate matter (PM) levels at low concentrations. A review of historic smoke-fog (smog) episo...
Youth Alienation: Implications for Administrators.
ERIC Educational Resources Information Center
Wynne, Edward A.
1989-01-01
Charts modern phenomena (technology, urbanization, affluence, large institutions, mass media, and others) that affect human interactions and teach certain attitudes. Provides supporting statistics to show increases in youth suicide, illegitimate births, delinquency, substance abuse, and homicide. Outlines desirable school changes producing modest…
From middens to modern estuaries, oyster shells sequester source-specific nitrogen
NASA Astrophysics Data System (ADS)
Darrow, Elizabeth S.; Carmichael, Ruth H.; Andrus, C. Fred T.; Jackson, H. Edwin
2017-04-01
Oysters (Crassostrea virginica) were an important food resource for native peoples of the northern Gulf of Mexico, who deposited waste shells in middens. Nitrogen (N) stable isotopes (δ15N) in bivalve shells have been used as modern proxies for estuarine N sources because they approximate δ15N in suspended particulate matter. We tested the use of midden shell δ15N as a proxy for ancient estuarine N sources. We hypothesized that isotopic signatures in ancient shells from coastal Mississippi would differ from modern shells due to increased anthropogenic N sources, such as wastewater, through time. We decalcified shells using an acidification technique previously developed for modern bivalves, but modified to determine δ15N, δ13C, %N, and % organic C of these low-N, high-C specimens. The modified method resulted in the greatest percentage of usable data from midden shells. Our results showed that oyster shell δ15N did not significantly differ between ancient (500-2100 years old) and modern oysters from the same locations where the sites had undergone relatively little land-use change. δ15N values in modern shells, however, were positively correlated with water column nitrate concentrations associated with urbanization. When N content and total shell mass were combined, we estimated that middens sequestered 410-39,000 kg of relic N, buried at a rate of up to 5 kg N m-2 yr-1. This study provides a relatively simple technique to assess baseline conditions in ecosystems over long time scales by demonstrating that midden shells can be an indicator of pre-historic N source to estuaries and are a potentially significant but previously uncharacterized estuarine N sink.
Knight, Andrew; Watson, Katherine D.
2017-01-01
Simple Summary The identity of Jack the Ripper remains one of the greatest unsolved crime mysteries in history. Jack was notorious both for the brutality of his murders and also for his habit of stealing organs from his victims. His speed and skill in doing so, in conditions of poor light and haste, fueled theories he was a surgeon. However, re-examination of a mortuary sketch from one of his victims has revealed several key aspects that strongly suggest he had no professional surgical training. Instead, the technique used was more consistent with that of a slaughterhouse worker. There were many small-scale slaughterhouses in East London in the 1880s, within which conditions were harsh for animals and workers alike. The brutalizing effects of such work only add to concerns highlighted by modern research that those who commit violence on animals are more likely to target people. Modern slaughterhouses are more humane in some ways but more desensitizing in others, and sociological research has indicated that communities with slaughterhouses are more likely to experience the most violent of crimes. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. Abstract Hundreds of theories exist concerning the identity of “Jack the Ripper”. His propensity for anatomical dissection with a knife—and in particular the rapid location and removal of specific organs—led some to speculate that he must have been surgically trained. However, re-examination of a mortuary sketch of one of his victims has revealed several aspects of incisional technique highly inconsistent with professional surgical training. Related discrepancies are also apparent in the language used within the only letter from Jack considered to be probably authentic. The techniques he used to dispatch his victims and retrieve their organs were, however, highly consistent with techniques used within the slaughterhouses of the day. 
East London in the 1880s had a large number of small-scale slaughterhouses, within which conditions for both animals and workers were exceedingly harsh. Modern sociological research has highlighted the clear links between the infliction of violence on animals and that inflicted on humans, as well as increased risks of violent crimes in communities surrounding slaughterhouses. Conditions within modern slaughterhouses are more humane in some ways but more desensitising in others. The implications for modern animal slaughtering, and our social reliance on slaughterhouses, are explored. PMID:28394281
Modern radiosurgical and endovascular classification schemes for brain arteriovenous malformations.
Tayebi Meybodi, Ali; Lawton, Michael T
2018-05-04
Stereotactic radiosurgery (SRS) and endovascular techniques are commonly used for treating brain arteriovenous malformations (bAVMs). They are usually used as ancillary techniques to microsurgery but may also be used as solitary treatment options. Careful patient selection requires a clear estimate of the treatment efficacy and complication rates for the individual patient. As such, classification schemes are an essential part of patient selection paradigm for each treatment modality. While the Spetzler-Martin grading system and its subsequent modifications are commonly used for microsurgical outcome prediction for bAVMs, the same system(s) may not be easily applicable to SRS and endovascular therapy. Several radiosurgical- and endovascular-based grading scales have been proposed for bAVMs. However, a comprehensive review of these systems including a discussion on their relative advantages and disadvantages is missing. This paper is dedicated to modern classification schemes designed for SRS and endovascular techniques.
Advanced grazing-incidence techniques for modern soft-matter materials analysis
Hexemer, Alexander; Müller-Buschbaum, Peter
2015-01-01
The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.
Ding, Hai-quan; Lu, Qi-peng
2012-01-01
"Digital agriculture" or "precision agriculture" is an important direction of modern agriculture technique. It is the combination of the modern information technique and traditional agriculture and becomes a hotspot field in international agriculture research in recent years. As a nondestructive, real-time, effective and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospect in agrology and gradually gained the recognition. The present paper intends to review the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should based on portable NIR spectrograph in order to acquire qualitative or quantitative information from real-time measuring in field. In addition, NIRS could be combined with space remote sensing to macroscopically control the way crop is growing and the nutrition crops need, to change the current state of our country's agriculture radically.
Surface texture measurement for dental wear applications
NASA Astrophysics Data System (ADS)
Austin, R. S.; Mullen, F.; Bartlett, D. W.
2015-06-01
The application of surface topography measurement and characterization within dental materials science is highly active and rapidly developing, in line with many modern industries. Surface measurement and structuring is used extensively within oral and dental science to optimize the optical, tribological and biological performance of natural and biomimetic dental materials. Although there has historically been little standardization in the use and reporting of surface metrology instrumentation and software, the dental industry is beginning to adopt modern areal measurement and characterization techniques, especially as the dental industry is increasingly adopting digital impressioning techniques in order to leverage CAD/CAM technologies for the design and construction of dental restorations. As dental treatment becomes increasingly digitized and reliant on advanced technologies such as dental implants, wider adoption of standardized surface topography and characterization techniques will become evermore essential. The dental research community welcomes the advances that are being made in surface topography measurement science towards realizing this ultimate goal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pharhizgar, K.D.; Lunce, S.E.
1994-12-31
Development of knowledge-based technological acquisition techniques and customers' information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access, through processing, to both deep and broad domains of information in modern societies. Through these systems, organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques which produce new information that informants can use without the help of the original knowledge sources, owing to the existence of highly sophisticated computerized networks. This paper analyzes the dangers and side effects of the misuse of information through illegal, unethical and immoral access to the database in an integrated and assimilative information system as described above. Cognitivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can be easily misused by businesses when researching the firm's customers.
Antweiler, Ronald C.; Taylor, Howard E.
2008-01-01
The main classes of statistical treatment of below-detection limit (left-censored) environmental data for the determination of basic statistics that have been used in the literature are substitution methods, maximum likelihood, regression on order statistics (ROS), and nonparametric techniques. These treatments, along with using all instrument-generated data (even those below detection), were evaluated by examining data sets in which the true values of the censored data were known. It was found that for data sets with less than 70% censored data, the best technique overall for determination of summary statistics was the nonparametric Kaplan-Meier technique. ROS and the two substitution methods of assigning one-half the detection limit value to censored data or assigning a random number between zero and the detection limit to censored data were adequate alternatives. The use of these two substitution methods, however, requires a thorough understanding of how the laboratory censored the data. The technique of employing all instrument-generated data - including numbers below the detection limit - was found to be less adequate than the above techniques. At high degrees of censoring (greater than 70% censored data), no technique provided good estimates of summary statistics. Maximum likelihood techniques were found to be far inferior to all other treatments except substituting zero or the detection limit value to censored data.
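The two substitution treatments evaluated above can be illustrated directly: substituting 0 and the detection limit brackets the attainable mean, with DL/2 in between. The data below are hypothetical, and the study's preferred Kaplan-Meier treatment is not shown.

```python
import numpy as np

# Hypothetical concentrations with a detection limit of 1.0; values below
# it would be reported by the lab only as "<DL" (left-censored).
true_values = np.array([3.2, 0.5, 2.1, 0.9, 4.0, 0.2, 1.5])
dl = 1.0
censored = true_values < dl   # what the analyst sees flagged as "<DL"

def substituted_mean(fill):
    """Sample mean after replacing censored values with a fixed fill-in."""
    return float(np.where(censored, fill, true_values).mean())

mean_zero    = substituted_mean(0.0)      # lower bound: substitute 0
mean_half_dl = substituted_mean(dl / 2)   # common choice: substitute DL/2
mean_dl      = substituted_mean(dl)       # upper bound: substitute DL
```

As the degree of censoring grows, this bracket widens, which is one way to see why no simple treatment performs well above roughly 70% censored data.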
Analytical description of the modern steam automobile
NASA Technical Reports Server (NTRS)
Peoples, J. A.
1974-01-01
The sensitivity of the performance of the modern steam automobile to operating conditions is discussed. The word modern has been used in the title to indicate that the emphasis is on miles per gallon rather than theoretical thermal efficiency. This has been accomplished by combining classical power analysis with the ideal pressure-volume diagram. Several parameters are derived which characterize the performance capability of the modern steam car. The report illustrates that performance is dictated by the characteristics of the working medium and the supply temperature, and is nearly independent of pressures above 800 psia. Analysis techniques were developed specifically for reciprocating steam engines suitable for automotive application. Specific performance charts have been constructed on the basis of water as the working medium. The conclusions and data interpretation are therefore limited to this scope.
Recent developments in minimal processing: a tool to retain nutritional quality of food.
Pasha, Imran; Saeed, Farhan; Sultan, M Tauseef; Khan, Moazzam Rafiq; Rohi, Madiha
2014-01-01
The modernization of the last century resulted in urbanization coupled with modifications in lifestyles and dietary habits. In the same era, industrial developments made it easier to meet the requirements for processed foods. However, consumers are now interested in minimally processed foods, owing to increased awareness and a preference for fruits and vegetables with superior quality and natural integrity, and with fewer additives. Food products deteriorate as a consequence of physiological aging, biochemical changes, high respiration rate, and high ethylene production. These factors contribute substantially to discoloration, loss of firmness, development of off-flavors, acidification, and microbial spoilage. Simultaneously, food processors are using emerging approaches to process perishable commodities while enhancing nutritional and sensorial quality. The present review article surveys modern approaches that minimize processing and deterioration. The techniques discussed in this paper include chlorination, ozonation, irradiation, photosensitization, edible coatings, natural preservatives, high-pressure processing, microwave heating, ohmic heating, and hurdle technology. The consequences of these techniques for shelf-life stability, microbial safety, preservation of organoleptic and nutritional quality, and residue avoidance are the focus of the paper. Moreover, the feasibility and operability of these techniques in modern-day processing are discussed.
A fast and objective multidimensional kernel density estimation method: fastKDE
O'Brien, Travis A.; Kashinath, Karthik; Cavanaugh, Nicholas R.; ...
2016-03-07
Numerous facets of scientific research implicitly or explicitly call for the estimation of probability densities. Histograms and kernel density estimates (KDEs) are two commonly used techniques for estimating such information, with the KDE generally providing a higher fidelity representation of the probability density function (PDF). Both methods require specification of either a bin width or a kernel bandwidth. While techniques exist for choosing the kernel bandwidth optimally and objectively, they are computationally intensive, since they require repeated calculation of the KDE. A solution for objectively and optimally choosing both the kernel shape and width has recently been developed by Bernacchia and Pigolotti (2011). While this solution theoretically applies to multidimensional KDEs, it has not been clear how to practically do so. A method for practically extending the Bernacchia-Pigolotti KDE to multidimensions is introduced. This multidimensional extension is combined with a recently developed computational improvement to their method that makes it computationally efficient: a 2D KDE on 10^5 samples takes only 1 s on a modern workstation. This fast and objective KDE method, called the fastKDE method, retains the excellent statistical convergence properties that have been demonstrated for univariate samples. The fastKDE method exhibits statistical accuracy that is comparable to state-of-the-science KDE methods publicly available in R, and it produces kernel density estimates several orders of magnitude faster. The fastKDE method does an excellent job of encoding covariance information for bivariate samples. This property allows for direct calculation of conditional PDFs with fastKDE. It is demonstrated how this capability might be leveraged for detecting non-trivial relationships between quantities in physical systems, such as transitional behavior.
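As a point of contrast with the objective bandwidth selection discussed above, a fixed-bandwidth Gaussian KDE with Silverman's rule of thumb can be written in a few lines of Python. This is a generic sketch of a conventional KDE, not the Bernacchia-Pigolotti or fastKDE method, and the data are synthetic:

```python
import numpy as np

def silverman_bandwidth(x):
    # Rule-of-thumb bandwidth: 0.9 * min(std, IQR/1.34) * n^(-1/5)
    n = x.size
    iqr = np.subtract(*np.percentile(x, [75, 25]))
    sigma = min(x.std(ddof=1), iqr / 1.34)
    return 0.9 * sigma * n ** (-1 / 5)

def gaussian_kde(x, grid, h):
    # One Gaussian kernel per sample, summed and normalized on the grid
    z = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (x.size * h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)
grid = np.linspace(-4.0, 4.0, 401)
pdf = gaussian_kde(x, grid, silverman_bandwidth(x))

# The estimate should integrate to ~1 and peak near the true mode at 0
print(pdf.sum() * (grid[1] - grid[0]), grid[np.argmax(pdf)])
```

The quadratic cost in sample count (one kernel evaluation per sample per grid point) is exactly what FFT-based methods such as fastKDE avoid.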
Head-and-face shape variations of U.S. civilian workers
Zhuang, Ziqing; Shu, Chang; Xi, Pengcheng; Bergman, Michael; Joseph, Michael
2016-01-01
The objective of this study was to quantify head-and-face shape variations of U.S. civilian workers using modern methods of shape analysis. The purpose of this study was based on previously highlighted changes in U.S. civilian worker head-and-face shape over the last few decades – touting the need for new and better fitting respirators – as well as the study's usefulness in designing more effective personal protective equipment (PPE) – specifically in the field of respirator design. The raw scan three-dimensional (3D) data for 1169 subjects were parameterized using geometry processing techniques. This process allowed the individual scans to be put in correspondence with each other in such a way that statistical shape analysis could be performed on a dense set of 3D points. This process also cleaned up the original scan data such that the noise was reduced and holes were filled in. The next step, statistical analysis of the variability of the head-and-face shape in the 3D database, was conducted using Principal Component Analysis (PCA) techniques. Through these analyses, it was shown that the space of the head-and-face shape was spanned by a small number of basis vectors. Less than 50 components explained more than 90% of the variability. Furthermore, the main mode of variations could be visualized through animating the shape changes along the PCA axes with computer software in executable form for Windows XP. The results from this study in turn could feed back into respirator design to achieve safer, more efficient product style and sizing. Future study is needed to determine the overall utility of the point cloud-based approach for the quantification of facial morphology variation and its relationship to respirator performance. PMID:23399025
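The PCA step described above (a small number of basis vectors explaining most shape variability) can be sketched with a singular value decomposition; the data here are a synthetic stand-in for the registered scans, and the dimensions and latent factor structure are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for registered head scans: n subjects x p flattened
# coordinates, generated from k latent "shape" factors plus small noise
n, p, k = 300, 90, 10
latent = rng.normal(size=(n, k))
loadings = rng.normal(size=(k, p)) * np.linspace(5, 1, k)[:, None]
X = latent @ loadings + 0.1 * rng.normal(size=(n, p))

Xc = X - X.mean(axis=0)                    # center each coordinate
_, s, _ = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)            # variance fraction per component
cum = np.cumsum(explained)
n90 = int(np.searchsorted(cum, 0.90)) + 1  # components for >=90% variance
print(f"{n90} components explain >= 90% of the variance")
```

With a genuinely low-dimensional shape space, as the study reports for real head-and-face scans, `n90` comes out far smaller than the ambient dimension.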
Adaptation of Chain Event Graphs for use with Case-Control Studies in Epidemiology.
Keeble, Claire; Thwaites, Peter Adam; Barber, Stuart; Law, Graham Richard; Baxter, Paul David
2017-09-26
Case-control studies are used in epidemiology to try to uncover the causes of diseases, but they are a retrospective study design known to suffer from non-participation and recall bias, which may explain their decreased popularity in recent years. Traditional analyses usually report only the odds ratio for given exposures and the binary disease status. Chain event graphs are a graphical representation of a statistical model derived from event trees; they were developed in artificial intelligence and statistics, and have only recently been introduced to the epidemiology literature. They are a modern Bayesian technique that enables prior knowledge to be incorporated into the data analysis via the agglomerative hierarchical clustering algorithm, which is used to form a suitable chain event graph. Additionally, they can account for missing data and be used to explore missingness mechanisms. Here we adapt the chain event graph framework to suit scenarios often encountered in case-control studies, to strengthen this study design, which is efficient in both time and cost. We demonstrate eight adaptations to the graphs: two suitable for full case-control study analysis, four which can be used in interim analyses to explore biases, and two which aim to improve the ease and accuracy of analyses. The adaptations are illustrated with complete, reproducible, fully interpreted examples, including the event tree and chain event graph. Chain event graphs are used here for the first time to summarise non-participation, data collection techniques, data reliability, and disease severity in case-control studies. We demonstrate how these features of a case-control study can be incorporated into the analysis to provide further insight, which can help to identify potential biases and lead to more accurate study results.
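The "traditional analysis" the abstract contrasts itself against, an odds ratio for a binary exposure and disease status, takes only a few lines of Python; the 2x2 counts below are made up for illustration:

```python
import math

# Hypothetical case-control 2x2 table:
#             exposed  unexposed
cases    = [80, 120]   # a, b
controls = [60, 240]   # c, d
a, b = cases
c, d = controls

# Odds ratio and Wald 95% CI on the log-odds scale
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

A chain event graph analysis would instead model the full tree of events (participation, exposure, recall, disease), which is exactly the extra structure this single summary number discards.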
Bronner, Shaw; Bauer, Naomi G
2018-05-01
To examine risk factors for injury in pre-professional modern dancers. With prospectively designed screening and injury surveillance, we evaluated four risk factors as categorical predictors of injury: i) hypermobility; ii) dance technique motor control; iii) muscle tightness; iv) previous injury. Screening and injury data of 180 students enrolled in a university modern dance program were reviewed over 4 years of training. Dancers were divided into three groups based on predictor scores. Dance exposure was based on hours of technique classes per week. Negative binomial log-linear analyses were conducted with the four predictors, p < 0.05. Dancers with low and high Beighton scores were 1.43 and 1.22 times more likely to sustain injury than dancers with mid-range scores (p ≤ 0.03). Dancers with better technique (low or medium scores) were 0.86 and 0.63 times as likely to sustain injury (p = 0.013 and p < 0.001) compared with those with poor technique. Dancers with one or 2-4 tight muscles were 2.7 and 4.0 times more likely to sustain injury (p ≤ 0.046). Dancers who sustained 2-4 injuries in the previous year were 1.38 times more likely to sustain subsequent injury (p < 0.001). These results contribute new information on the value of preseason screening. Dancers with these risk factors may benefit from prevention programs. Copyright © 2018 Elsevier Ltd. All rights reserved.
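The exposure-adjusted comparison underlying results like those above can be illustrated with crude incidence rate ratios; the counts and exposure hours below are hypothetical, and this is descriptive arithmetic, not the negative binomial regression the study actually fitted:

```python
# Hypothetical counts in the spirit of the screening study:
# injuries and technique-class exposure hours by muscle-tightness group
groups = {
    "no tight muscles":  {"injuries": 12, "hours": 9000},
    "one tight muscle":  {"injuries": 30, "hours": 8400},
    "2-4 tight muscles": {"injuries": 44, "hours": 7700},
}

ref = groups["no tight muscles"]
ref_rate = 1000 * ref["injuries"] / ref["hours"]   # injuries per 1000 h

rate_ratio = {}
for name, g in groups.items():
    rate = 1000 * g["injuries"] / g["hours"]
    rate_ratio[name] = rate / ref_rate
    print(f"{name}: {rate:.2f} injuries/1000 h, "
          f"rate ratio {rate_ratio[name]:.2f}")
```

A regression model adds what this table cannot: adjustment for the other predictors and proper uncertainty for over-dispersed counts.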
Statistics for Radiology Research.
Obuchowski, Nancy A; Subhas, Naveen; Polster, Joshua
2017-02-01
Biostatistics is an essential component in most original research studies in imaging. In this article we discuss five key statistical concepts for study design and analyses in modern imaging research: statistical hypothesis testing, particularly focusing on noninferiority studies; imaging outcomes especially when there is no reference standard; dealing with the multiplicity problem without spending all your study power; relevance of confidence intervals in reporting and interpreting study results; and finally tools for assessing quantitative imaging biomarkers. These concepts are presented first as examples of conversations between investigator and biostatistician, and then more detailed discussions of the statistical concepts follow. Three skeletal radiology examples are used to illustrate the concepts.
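One of the concepts listed above, noninferiority testing via a confidence bound, can be illustrated with a short Python sketch; the proportions, sample sizes, and margin are hypothetical:

```python
import math

# Hypothetical reader-study sensitivities: new protocol vs standard
p_new, n_new = 0.87, 200
p_std, n_std = 0.90, 200
margin = -0.10          # prespecified noninferiority margin on the difference

# Noninferiority is declared if the one-sided 95% lower confidence bound
# for (new - standard) lies entirely above the margin
diff = p_new - p_std
se = math.sqrt(p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std)
lower = diff - 1.645 * se
noninferior = lower > margin
print(f"diff {diff:.3f}, lower bound {lower:.3f}, noninferior: {noninferior}")
```

The design choice the article emphasizes is that the margin must be fixed before the data are seen; the arithmetic itself is trivial.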
Pressure-Assisted Chelating Extraction as a Teaching Tool in Instrumental Analysis
ERIC Educational Resources Information Center
Sadik, Omowunmi A.; Wanekaya, Adam K.; Yevgeny, Gelfand
2004-01-01
A novel instrumental-digestion technique using pressure-assisted chelating extraction (PACE) for the undergraduate laboratory is reported. This procedure is used for exposing students to safe sample-preparation techniques, for correlating wet-chemical methods with modern instrumental analysis, and for comparing the performance of PACE with conventional…
Pastoral Techniques in the Modern Danish Educational System
ERIC Educational Resources Information Center
Nielsen, Klaus; Dalgaard, Susanne; Madsen, Sarah
2011-01-01
In recent years, therapeutic techniques have played an increasingly significant role in Danish educational thinking. One way in which this therapeutic thinking discloses itself is in the ever-growing use of educational-therapeutic games as part of the educational practice. Inspired by Foucault, we argue that educational-therapeutic games can be…
Estimating Solar PV Output Using Modern Space/Time Geostatistics (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, S. J.; George, R.; Bush, B.
2009-04-29
This presentation describes a project that uses mapping techniques to predict solar output at subhourly resolution at any spatial point, develop a methodology that is applicable to natural resources in general, and demonstrate capability of geostatistical techniques to predict the output of a potential solar plant.
Making an Old Measurement Experiment Modern and Exciting!
ERIC Educational Resources Information Center
Schulze, Paul D.
1996-01-01
Presents a new approach for the determination of the temperature coefficient of resistance of a resistor and a thermistor. Advantages include teaching students how to linearize data in order to utilize least-squares techniques, continuously taking data over desired temperature range, using up-to-date data-acquisition techniques, teaching the use…
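The linearization exercise described above is conventionally done with the exponential thermistor law R = R0·exp(B/T), which becomes a straight line in (1/T, ln R). A hedged Python sketch with synthetic data (the device constants B and R0 here are hypothetical, not from the article):

```python
import numpy as np

# Synthetic thermistor data following R = R0 * exp(B / T)
B_true, R0_true = 3500.0, 0.01            # kelvin, ohms (hypothetical device)
T = np.linspace(280.0, 360.0, 17)         # temperatures in kelvin
rng = np.random.default_rng(3)
R = R0_true * np.exp(B_true / T) * (1 + 0.005 * rng.normal(size=T.size))

# Linearize: ln R = ln R0 + B * (1/T), then ordinary least squares
x, y = 1.0 / T, np.log(R)
slope, intercept = np.polyfit(x, y, 1)
B_fit, R0_fit = slope, np.exp(intercept)
print(f"B = {B_fit:.0f} K, R0 = {R0_fit:.4f} ohm")
```

This is precisely the "linearize, then least-squares" skill the experiment is designed to teach; a computerized acquisition system just supplies the (T, R) pairs continuously.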
Manual Solid-Phase Peptide Synthesis of Metallocene-Peptide Bioconjugates
ERIC Educational Resources Information Center
Kirin, Srecko I.; Noor, Fozia; Metzler-Nolte, Nils; Mier, Walter
2007-01-01
A simple and relatively inexpensive procedure for preparing a biologically active peptide using solid phase peptide synthesis (SPPS) is described. Fourth-year undergraduate students have gained firsthand experience from the solid-phase synthesis techniques and they have become familiar with modern analytical techniques based on the particular…
A New Dialogue in Ballet Pedagogy: Improving Learner Self-Sufficiency through Reflective Methodology
ERIC Educational Resources Information Center
Weidmann, Chelsea
2018-01-01
Current research into reflective pedagogy in dance almost exclusively discusses the tertiary education population. Additionally, the research is primarily focused on concert modern dance and creative dance pedagogies, techniques, and choreography. Ballet technique programs in precollegiate populations have, so far, been left out of the…
The impact of Cenozoic cooling on assemblage diversity in planktonic foraminifera
Pearson, Paul N.; Dunkley Jones, Tom; Farnsworth, Alexander; Lunt, Daniel J.; Markwick, Paul; Purvis, Andy
2016-01-01
The Cenozoic planktonic foraminifera (PF) (calcareous zooplankton) have arguably the most detailed fossil record of any group. The quality of this record allows models of environmental controls on macroecology, developed for Recent assemblages, to be tested on intervals with profoundly different climatic conditions. These analyses shed light on the role of long-term global cooling in establishing the modern latitudinal diversity gradient (LDG)—one of the most powerful generalizations in biogeography and macroecology. Here, we test the transferability of environment-diversity models developed for modern PF assemblages to the Eocene epoch (approx. 56–34 Ma), a time of pronounced global warmth. Environmental variables from global climate models are combined with Recent environment–diversity models to predict Eocene richness gradients, which are then compared with observed patterns. The results indicate the modern LDG—lower richness towards the poles—developed through the Eocene. Three possible causes are suggested for the mismatch between statistical model predictions and data in the Early Eocene: the environmental estimates are inaccurate, the statistical model misses a relevant variable, or the intercorrelations among facets of diversity—e.g. richness, evenness, functional diversity—have changed over geological time. By the Late Eocene, environment–diversity relationships were much more similar to those found today. PMID:26977064
A volumetric technique for fossil body mass estimation applied to Australopithecus afarensis.
Brassey, Charlotte A; O'Mahoney, Thomas G; Chamberlain, Andrew T; Sellers, William I
2018-02-01
Fossil body mass estimation is a well established practice within the field of physical anthropology. Previous studies have relied upon traditional allometric approaches, in which the relationship between one/several skeletal dimensions and body mass in a range of modern taxa is used in a predictive capacity. The lack of relatively complete skeletons has thus far limited the potential application of alternative mass estimation techniques, such as volumetric reconstruction, to fossil hominins. Yet across vertebrate paleontology more broadly, novel volumetric approaches are resulting in predicted values for fossil body mass very different to those estimated by traditional allometry. Here we present a new digital reconstruction of Australopithecus afarensis (A.L. 288-1; 'Lucy') and a convex hull-based volumetric estimate of body mass. The technique relies upon identifying a predictable relationship between the 'shrink-wrapped' volume of the skeleton and known body mass in a range of modern taxa, and subsequent application to an articulated model of the fossil taxon of interest. Our calibration dataset comprises whole body computed tomography (CT) scans of 15 species of modern primate. The resulting predictive model is characterized by a high correlation coefficient (r^2 = 0.988) and a percentage standard error of 20%, and performs well when applied to modern individuals of known body mass. Application of the convex hull technique to A. afarensis results in a relatively low body mass estimate of 20.4 kg (95% prediction interval 13.5-30.9 kg). A sensitivity analysis on the articulation of the chest region highlights the sensitivity of our approach to the reconstruction of the trunk, and the incomplete nature of the preserved ribcage may explain the low values for predicted body mass here.
We suggest that the heaviest of previous estimates would require the thorax to be expanded to an unlikely extent, yet this can only be properly tested when more complete fossils are available. Copyright © 2017 Elsevier Ltd. All rights reserved.
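The convex hull calibration idea above amounts to a power-law fit in log-log space. A minimal Python sketch; the primate calibration values and the hull volume for the fossil are illustrative placeholders, not the published CT dataset:

```python
import numpy as np

# Hypothetical calibration set: convex-hull ("shrink-wrapped") volume (m^3)
# vs known body mass (kg) for modern primates -- illustrative values only
hull_vol = np.array([0.002, 0.004, 0.009, 0.02, 0.05, 0.11, 0.24, 0.5])
mass     = np.array([1.9,   3.7,   8.2,  17.0, 41.0, 88.0, 190.0, 410.0])

# Power law mass = a * volume^b, fitted as a straight line in log-log space
b, log_a = np.polyfit(np.log(hull_vol), np.log(mass), 1)
a = np.exp(log_a)

def predict_mass(volume):
    return a * volume ** b

fossil_hull = 0.024   # hypothetical hull volume for an articulated fossil
print(f"exponent b = {b:.2f}, predicted mass = {predict_mass(fossil_hull):.1f} kg")
```

The sensitivity the authors flag falls out naturally here: any error in reconstructing the trunk changes `fossil_hull`, and the prediction scales almost linearly with it.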
CAPSAS: Computer Assisted Program for the Selection of Appropriate Statistics.
ERIC Educational Resources Information Center
Shermis, Mark D.; Albert, Susan L.
A computer-assisted program has been developed for the selection of statistics or statistical techniques by both students and researchers. Based on Andrews, Klem, Davidson, O'Malley and Rodgers "A Guide for Selecting Statistical Techniques for Analyzing Social Science Data," this FORTRAN-compiled interactive computer program was…
Welford, Mark R.; Bossak, Brian H.
2009-01-01
Background Recent studies have noted myriad qualitative and quantitative inconsistencies between the medieval Black Death (and subsequent “plagues”) and modern empirical Y. pestis plague data, most of which is derived from the Indian and Chinese plague outbreaks of A.D. 1900±15 years. Previous works have noted apparent differences in seasonal mortality peaks during Black Death outbreaks versus peaks of bubonic and pneumonic plagues attributed to Y. pestis infection, but have not provided spatiotemporal statistical support. Our objective here was to validate individual observations of this seasonal discrepancy in peak mortality between historical epidemics and modern empirical data. Methodology/Principal Findings We compiled and aggregated multiple daily, weekly and monthly datasets of both Y. pestis plague epidemics and suspected Black Death epidemics to compare seasonal differences in mortality peaks at a monthly resolution. Statistical and time series analyses of the epidemic data indicate that a seasonal inversion in peak mortality does exist between known Y. pestis plague and suspected Black Death epidemics. We provide possible explanations for this seasonal inversion. Conclusions/Significance These results add further evidence of inconsistency between historical plagues, including the Black Death, and our current understanding of Y. pestis-variant disease. We expect that the line of inquiry into the disputed cause of the greatest recorded epidemic will continue to intensify. Given the rapid pace of environmental change in the modern world, it is crucial that we understand past lethal outbreaks as fully as possible in order to prepare for future deadly pandemics. PMID:20027294
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
Rainfall erosivity: An historical review
USDA-ARS?s Scientific Manuscript database
Rainfall erosivity is the capability of rainfall to cause soil loss from hillslopes by water. Modern definitions of rainfall erosivity began with the development of the Universal Soil Loss Equation (USLE), where rainfall characteristics were statistically related to soil loss from thousands of plot...
Thermal background noise limitations
NASA Technical Reports Server (NTRS)
Gulkis, S.
1982-01-01
Modern detection systems are increasingly limited in sensitivity by the background thermal photons which enter the receiving system. Expressions for the fluctuations of detected thermal radiation are derived. Incoherent and heterodyne detection processes are considered. References to the subject of photon detection statistics are given.
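The fluctuation expressions referred to above follow from Bose-Einstein photon statistics; this is the standard textbook form, stated here for orientation rather than taken from the report itself:

```latex
% Mean occupancy of a single radiation mode at temperature T:
\bar{n} = \frac{1}{e^{h\nu/kT} - 1}
% Variance of the detected photon number (Bose-Einstein statistics):
\langle (\Delta n)^2 \rangle = \bar{n}\,(1 + \bar{n})
% Limiting cases:
%   h\nu \gg kT : \langle (\Delta n)^2 \rangle \approx \bar{n}
%                 (Poisson shot noise, optical/infrared regime)
%   h\nu \ll kT : \langle (\Delta n)^2 \rangle \approx \bar{n}^2
%                 (wave noise, Rayleigh-Jeans/radio regime)
```

The crossover between the two limits is what distinguishes the sensitivity analyses for incoherent versus heterodyne detection.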
Dinov, Ivo D; Sanchez, Juana; Christou, Nicolas
2008-01-01
Technology-based instruction represents a recent pedagogical paradigm rooted in the realization that new generations are much more comfortable with, and excited about, new technologies. The rapid technological advancement over the past decade has fueled an enormous demand for the integration of modern networking, informational and computational tools with classical pedagogical instruments. Consequently, teaching with technology typically involves utilizing a variety of IT and multimedia resources for online learning, course management, electronic course materials, and novel tools of communication, engagement, experimentation, critical thinking and assessment. The NSF-funded Statistics Online Computational Resource (SOCR) provides a number of interactive tools for enhancing instruction in various undergraduate and graduate courses in probability and statistics. These resources include online instructional materials, statistical calculators, interactive graphical user interfaces, computational and simulation applets, and tools for data analysis and visualization. The tools provided as part of SOCR include conceptual simulations and statistical computing interfaces, which are designed to bridge between the introductory and the more advanced computational and applied probability and statistics courses. In this manuscript, we describe our designs for utilizing SOCR technology in instruction in a recent study. In addition, we present results on the effectiveness of using SOCR tools at two different course intensity levels on three outcome measures: exam scores, student satisfaction, and choice of technology to complete assignments. Learning styles assessment was completed at baseline. We used three very different designs for three different undergraduate classes. Each course included a treatment group, using the SOCR resources, and a control group, using classical instruction techniques.
Our findings include marginal effects of the SOCR treatment within individual classes; however, pooling the results across all courses and sections, the SOCR effects on the treatment groups were exceptionally robust and significant. Coupling these findings with a clear decrease in the variance of the quantitative examination measures in the treatment groups indicates that employing technology like SOCR in a sound pedagogical and scientific manner enhances the students' overall understanding and suggests better long-term knowledge retention.
The global record of local iron geochemical data from Proterozoic through Paleozoic basins
NASA Astrophysics Data System (ADS)
Sperling, E. A.; Wolock, C.; Johnston, D. T.; Knoll, A. H.
2013-12-01
Iron-based redox proxies represent one of the most mature tools available to sedimentary geochemists. These techniques, which benefit from decades of refinement, are based on the fact that rocks deposited under anoxic conditions tend to be enriched in highly-reactive iron. However, there are myriad local controls on the development of anoxia, and no local section is an exemplar for the global ocean. The global signal must thus be determined using techniques like those developed to solve an analogous problem in paleobiology: the inference of global diversity patterns through time from faunas seen in local stratigraphic sections. Here we analyze a dataset of over 4000 iron speciation measurements (including over 600 de novo analyses) to better understand redox changes from the Proterozoic through the Paleozoic Era. Preliminary database analyses yield interesting observations. We find that although anoxic water columns in the middle Proterozoic were dominantly ferruginous, there was a statistical tendency towards euxinia not seen in early Neoproterozoic or Ediacaran data. Also, we find that in the Neoproterozoic oceans, oxic depositional environments, the likely home for early animals, have exceptionally low pyrite contents, and by inference low levels of porewater sulfide. This runs contrary to notions of sulfide stress on early metazoans. Finally, the current database of iron speciation data does not support an Ediacaran or Cambrian oxygenation event. This conclusion is of course only as sharp as the ability of the Fe-proxy database to track dissolved oxygen and does not rule out the possibility of a small-magnitude change in oxygen. It does suggest, however, that if changing pO2 facilitated animal diversification it did so by a limited rise past critical ecological thresholds, such as seen in the modern Oxygen Minimum Zone benthos.
Oxygen increase to modern levels thus becomes a Paleozoic problem, and one in need of better sampling if a database approach is to be employed.
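The classification step underlying an iron speciation database can be sketched as follows. The thresholds are the commonly cited literature values (FeHR/FeT ≥ 0.38 indicating anoxic deposition; FePy/FeHR ≥ 0.7 distinguishing euxinic from ferruginous); the function name and exact cutoffs are illustrative, not the authors' code.

```python
# Hedged sketch: classify sedimentary redox state from iron speciation
# ratios, using commonly cited literature thresholds.

def classify_redox(fe_hr, fe_t, fe_py):
    """Redox class from highly reactive (FeHR), total (FeT), and
    pyrite-bound (FePy) iron concentrations."""
    if fe_t <= 0 or fe_hr <= 0:
        raise ValueError("iron concentrations must be positive")
    if fe_hr / fe_t < 0.38:      # low enrichment -> oxic water column
        return "oxic"
    if fe_py / fe_hr >= 0.7:     # most reactive Fe is pyritized
        return "euxinic"         # anoxic and sulfidic
    return "ferruginous"         # anoxic, Fe(II)-rich

print(classify_redox(fe_hr=0.5, fe_t=1.0, fe_py=0.4))   # -> euxinic
print(classify_redox(fe_hr=0.5, fe_t=1.0, fe_py=0.1))   # -> ferruginous
print(classify_redox(fe_hr=0.2, fe_t=1.0, fe_py=0.05))  # -> oxic
```

A database analysis like the one described would apply such a classification to each sample before aggregating by age bin.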
A review of post-modern management techniques as currently applied to Turkish forestry.
Dölarslan, Emre Sahin
2009-01-01
This paper reviews the effects of six post-modern management concepts as applied to Turkish forestry. Up to now, Turkish forestry has been constrained, both in terms of its operations and internal organization, by a highly bureaucratic system. The application of new thinking in forestry management, however, has recently resulted in new organizational and production concepts that promise to address problems specific to this Turkish industry and bring about positive changes. This paper will elucidate these specific issues and demonstrate how post-modern management thinking is influencing the administration and operational capacity of Turkish forestry within its current structure.
Improving the Reliability of Technological Subsystems Equipment for Steam Turbine Unit in Operation
NASA Astrophysics Data System (ADS)
Brodov, Yu. M.; Murmansky, B. E.; Aronson, R. T.
2017-11-01
An integrated approach to improving the reliability of steam turbine units (STU) is presented by the authors, together with examples of its implementation for the various STU technological subsystems. Based on statistical analysis of damage to individual turbine parts and components, on the development and application of modern repair methods and technologies, and on operational monitoring techniques, the critical components and elements of the equipment are identified and priorities are proposed for improving the reliability of STU equipment in operation. Results are presented from the analysis of malfunctions in the equipment of various STU technological subsystems operating as part of power units and at cross-linked thermal power plants that result in turbine unit shutdown (failure). Proposals are formulated and justified for adjusting the maintenance and repair of turbine components and parts, condenser unit equipment, the regeneration subsystem, and the oil supply system; these would increase operational reliability, reduce the cost of STU maintenance and repair, and optimize the timing and scope of repairs.
Lehto, Laura; Laaksonen, Laura; Vilkman, Erkki; Alku, Paavo
2008-03-01
The aim of this study was to investigate how different acoustic parameters, extracted both from speech pressure waveforms and glottal flows, can be used in measuring vocal loading in modern working environments and how these parameters reflect the possible changes in the vocal function during a working day. In addition, correlations between objective acoustic parameters and subjective voice symptoms were addressed. The subjects were 24 female and 8 male customer-service advisors, who mainly use telephone during their working hours. Speech samples were recorded from continuous speech four times during a working day and voice symptom questionnaires were completed simultaneously. Among the various objective parameters, only F0 resulted in a statistically significant increase for both genders. No correlations between the changes in objective and subjective parameters appeared. However, the results encourage researchers within the field of occupational voice use to apply versatile measurement techniques in studying occupational voice loading.
Developing a bubble number-density paleoclimatic indicator for glacier ice
Spencer, M.K.; Alley, R.B.; Fitzpatrick, J.J.
2006-01-01
Past accumulation rate can be estimated from the measured number-density of bubbles in an ice core and the reconstructed paleotemperature, using a new technique. Density increase and grain growth in polar firn are both controlled by temperature and accumulation rate, and the integrated effects are recorded in the number-density of bubbles as the firn changes to ice. An empirical model of these processes, optimized to fit published data on recently formed bubbles, reconstructs accumulation rates using recent temperatures with an uncertainty of 41% (P < 0.05). For modern sites considered here, no statistically significant trend exists between mean annual temperature and the ratio of bubble number-density to grain number-density at the time of pore close-off; optimum modeled accumulation-rate estimates require an eventual ~2.02 ± 0.08 (P < 0.05) bubbles per close-off grain. Bubble number-density in the GRIP (Greenland) ice core is qualitatively consistent with independent estimates for a combined temperature decrease and accumulation-rate increase there during the last 5 kyr.
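The inversion step described above can be sketched generically: given a calibrated forward model N = f(T, A) predicting bubble number-density from temperature T and accumulation rate A, recover A from a measured N and a reconstructed T by root-finding. The power-law form and all coefficients below are hypothetical placeholders, not the paper's calibrated model.

```python
# Illustrative inversion of a (hypothetical) bubble number-density model
# by bisection; only the structure of the calculation is meaningful.

def forward_model(temp_k, accum_m_per_yr):
    # hypothetical monotonic dependence, NOT the published calibration
    return 5.0e8 * accum_m_per_yr**0.5 * (260.0 / temp_k)**4

def invert_accumulation(n_measured, temp_k, lo=1e-3, hi=2.0, tol=1e-10):
    """Bisection: find A such that forward_model(T, A) == n_measured."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if forward_model(temp_k, mid) < n_measured:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

a = invert_accumulation(n_measured=4.0e8, temp_k=245.0)
# round-trip check: the recovered A reproduces the input number-density
assert abs(forward_model(245.0, a) - 4.0e8) < 1.0
```

The real technique additionally propagates the calibration uncertainty quoted above (41%, P < 0.05) through this inversion.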
Saber, Aly
2010-01-01
Intra-abdominal adhesion formation and reformation after surgery remain unavoidable events in spite of modern surgical techniques and are a cause of significant morbidity, resulting in infertility, pain and intestinal obstruction. To investigate the effect of honey on adhesion prevention and colonic anastomotic healing in rats, 75 male Sprague-Dawley rats were used and divided into 3 groups of 25 rats each: the Intergel, honey and control groups. After the scheduled two-week post-operative period, all surviving rats were reopened for second-look laparotomy to assess the following parameters: (a) adhesion, (b) manometric study, (c) histopathological study. The author found that the total adhesion score, the manometric values and the histopathological findings showed statistically significant differences among the three groups, in favor of the honey-treated rats. Honey surpassed Intergel in healing power and adhesion prevention. Copyright 2009 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
Greenwood, Taylor J; Lopez-Costa, Rodrigo I; Rhoades, Patrick D; Ramírez-Giraldo, Juan C; Starr, Matthew; Street, Mandie; Duncan, James; McKinstry, Robert C
2015-01-01
The marked increase in radiation exposure from medical imaging, especially in children, has caused considerable alarm and spurred efforts to preserve the benefits but reduce the risks of imaging. Applying the principles of the Image Gently campaign, data-driven process and quality improvement techniques such as process mapping and flowcharting, cause-and-effect diagrams, Pareto analysis, statistical process control (control charts), failure mode and effects analysis, "lean" or Six Sigma methodology, and closed feedback loops led to a multiyear program that has reduced overall computed tomographic (CT) examination volume by more than fourfold and concurrently decreased radiation exposure per CT study without compromising diagnostic utility. This systematic approach involving education, streamlining access to magnetic resonance imaging and ultrasonography, auditing with comparison with benchmarks, applying modern CT technology, and revising CT protocols has led to a more than twofold reduction in CT radiation exposure between 2005 and 2012 for patients at the authors' institution while maintaining diagnostic utility. (©)RSNA, 2015.
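One of the quality-improvement tools named above, the statistical process control chart, can be sketched in a few lines: a monthly metric (say, mean dose per CT study) is flagged for special-cause variation when it falls outside the baseline mean plus or minus three standard deviations. The dose numbers below are invented for illustration.

```python
# Minimal Shewhart control-chart sketch with made-up baseline data.
from statistics import mean, stdev

baseline = [5.1, 4.9, 5.3, 5.0, 4.8, 5.2, 5.1, 4.9]  # baseline mSv/study
center = mean(baseline)
sigma = stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma      # control limits

def out_of_control(value):
    """Flag a new monthly value outside the 3-sigma control limits."""
    return value > ucl or value < lcl

print(out_of_control(5.2))   # within limits -> False
print(out_of_control(8.0))   # far above the upper limit -> True
```

In practice a full chart also applies run rules (e.g. consecutive points on one side of the center line), but the limit test above is the core of the method.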
Inertial Confinement fusion targets
NASA Technical Reports Server (NTRS)
Hendricks, C. D.
1982-01-01
Inertial confinement fusion (ICF) targets are made as simple flat discs, as hollow shells or as complicated multilayer structures. Many techniques have been devised for producing the targets. Glass and metal shells are made by using drop and bubble techniques. Solid hydrogen shells are also produced by adapting old methods to the solution of modern problems. Some of these techniques, problems, and solutions are discussed. In addition, the applications of many of these techniques to the fabrication of ICF targets are presented.
Safety analysis in test facility design
NASA Astrophysics Data System (ADS)
Valk, A.; Jonker, R. J.
1990-09-01
The application of safety analysis techniques as developed in, for example, the nuclear and petrochemical industries can be very beneficial in coping with the increasing complexity of modern test facility installations and their operations. To illustrate the various techniques available and their phasing in a project, an overview of the most commonly used techniques is presented. Two case studies are described: the hazard and operability study technique, and safety zoning in relation to the possible presence of asphyxiating atmospheres.
Paulsson, Anna K.; Holmes, Jordan A.; Peiffer, Ann M.; Miller, Lance D.; Liu, Wennuan; Xu, Jianfeng; Hinson, William H.; Lesser, Glenn J.; Laxton, Adrian W.; Tatter, Stephen B.; Debinski, Waldemar
2014-01-01
We investigate the differences in molecular signature and clinical outcomes between multiple lesion glioblastoma (GBM) and single focus GBM in the modern treatment era. Between August 2000 and May 2010, 161 patients with GBM were treated with modern radiotherapy techniques. Of this group, 33 were considered to have multiple lesion GBM (25 multifocal and 8 multicentric). Patterns of failure, time to progression and overall survival were compared based on whether the tumor was considered a single focus or multiple lesion GBM. Genomic groupings and methylation status were also investigated as possible predictors of multifocality in a cohort of 41 patients with available tissue for analysis. There was no statistically significant difference in overall survival (p < 0.3) between the multiple lesion tumors (8.2 months) and single focus GBM (11 months). Progression free survival was superior in the single focus tumors (7.1 months) as compared to multifocal (5.6 months, p = 0.02). For patients with single focus, multifocal and multicentric GBM, 81, 76 and 88 % of treatment failures occurred in the 60 Gy volume (p < 0.5), while 54, 72, and 38 % of treatment failures occurred in the 46 Gy volume (p < 0.4). Out-of-field failures were rare in both single focus and multiple foci GBM (7 vs 3 %). Genomic groupings and methylation status were not found to predict for multifocality. Patterns of failure, survival and genomic signatures for multiple lesion GBM do not appreciably differ when compared to single focus tumors. PMID:24990827
CIEL*a*b* color space predictive models for colorimetry devices--analysis of perfume quality.
Korifi, Rabia; Le Dréau, Yveline; Antinelli, Jean-François; Valls, Robert; Dupuy, Nathalie
2013-01-30
Color perception plays a major role in the consumer evaluation of perfume quality. Consumers need first to be entirely satisfied with the sensory properties of products before other quality dimensions become relevant. The evaluation of the color of complex mixtures presents a challenge even for modern analytical techniques. A variety of instruments are available for color measurement; they can be classified as tristimulus colorimeters and spectrophotometers. Obsolescence of the electronics of old tristimulus colorimeters arises from the difficulty of finding repair parts and leads to their replacement by more modern instruments. High quality levels in color measurement, i.e., accuracy and reliability in color control, are the major advantages of the new generation of color instrumentation, the integrating-sphere spectrophotometer. Two models of spectrophotometer were tested in transmittance mode, employing the d/0° geometry. The CIE L*a*b* color space parameters were measured with each instrument for 380 samples of raw materials and bases used in perfume compositions. The results were compared graphically between the colorimeter and the spectrophotometer devices. All color space parameters obtained with the colorimeter were used as dependent variables to generate regression equations with values obtained from the spectrophotometers. The data were statistically analyzed to create a predictive model between the reference and the target instruments through two methods: the first uses linear regression analysis, and the second consists of partial least squares (PLS) regression on each component. Copyright © 2012 Elsevier B.V. All rights reserved.
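The first method mentioned (linear regression between instruments) can be sketched as a per-channel least-squares line mapping one device's readings onto the other's. The paired L* readings below are invented examples, not data from the study; the PLS variant would require a multivariate fit.

```python
# Closed-form simple linear regression for instrument cross-calibration.

def fit_line(x, y):
    """Return (intercept, slope) of the ordinary least-squares line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    return my - b * mx, b

# hypothetical paired L* readings: colorimeter (reference) vs spectrophotometer
colorimeter_L = [20.0, 35.0, 50.0, 65.0, 80.0]
spectro_L     = [21.5, 36.2, 50.9, 65.8, 80.4]
a, b = fit_line(colorimeter_L, spectro_L)

predicted = a + b * 50.0   # predicted target-instrument reading at L* = 50
```

A study like the one above would fit such an equation for each of the L*, a*, and b* channels and compare the residuals between the two candidate methods.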
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and for how common pitfalls can be avoided.
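The core metamodeling idea can be shown in a few lines: sample an "expensive" analysis code at a handful of design points, fit a cheap surrogate (here a quadratic response surface), and query the surrogate instead of the code. The expensive_analysis function is a stand-in; a real study would use a design-of-experiments sampling plan and richer model forms such as kriging.

```python
# Minimal response-surface metamodel sketch with a stand-in analysis code.
import numpy as np

def expensive_analysis(x):
    return (x - 2.0) ** 2 + 1.0   # pretend this takes hours per run

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])            # design points
ys = np.array([expensive_analysis(x) for x in xs])  # expensive samples

coeffs = np.polyfit(xs, ys, deg=2)   # least-squares quadratic fit
surrogate = np.poly1d(coeffs)        # cheap approximation of the code

print(round(float(surrogate(2.5)), 3))   # cheap prediction -> 1.25
```

The paper's warning applies even to this toy: the analysis code is deterministic, so the usual random-error assumptions behind regression diagnostics do not hold.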
Python for Information Theoretic Analysis of Neural Data
Ince, Robin A. A.; Petersen, Rasmus S.; Swan, Daniel C.; Panzeri, Stefano
2008-01-01
Information theory, the mathematical theory of communication in the presence of noise, is playing an increasingly important role in modern quantitative neuroscience. It makes it possible to treat neural systems as stochastic communication channels and gain valuable, quantitative insights into their sensory coding function. These techniques provide results on how neurons encode stimuli in a way which is independent of any specific assumptions on which part of the neuronal response is signal and which is noise, and they can be usefully applied even to highly non-linear systems where traditional techniques fail. In this article, we describe our work and experiences using Python for information theoretic analysis. We outline some of the algorithmic, statistical and numerical challenges in the computation of information theoretic quantities from neural data. In particular, we consider the problems arising from limited sampling bias and from calculation of maximum entropy distributions in the presence of constraints representing the effects of different orders of interaction in the system. We explain how and why using Python has allowed us to significantly improve the speed and domain of applicability of the information theoretic algorithms, allowing analysis of data sets characterized by larger numbers of variables. We also discuss how our use of Python is facilitating integration with collaborative databases and centralised computational resources. PMID:19242557
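One of the limited-sampling issues mentioned above can be illustrated concretely: the plug-in entropy estimator is biased downward with few trials, and the classic Miller-Madow correction adds (m - 1) / (2 N ln 2) bits, where m is the number of observed response bins and N the number of trials. The function names and toy data are ours, not from the authors' toolkit.

```python
# Plug-in entropy estimate and Miller-Madow bias correction (in bits).
from collections import Counter
from math import log2, log

def plugin_entropy_bits(responses):
    n = len(responses)
    counts = Counter(responses)
    return -sum((c / n) * log2(c / n) for c in counts.values())

def miller_madow_entropy_bits(responses):
    n = len(responses)
    m = len(set(responses))   # number of observed response bins
    return plugin_entropy_bits(responses) + (m - 1) / (2 * n * log(2))

spikes = [0, 1, 0, 2, 1, 0, 0, 1, 2, 0]   # toy spike-count responses
print(plugin_entropy_bits(spikes))        # biased-down estimate
print(miller_madow_entropy_bits(spikes))  # bias-corrected estimate
```

More sophisticated estimators (shuffling, extrapolation, maximum-entropy bounds) build on the same pattern of correcting the naive plug-in quantity.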
Marami Milani, Mohammad Reza; Hense, Andreas; Rahmani, Elham; Ploeger, Angelika
2015-01-01
This study analyzes the linear relationship between climate variables and milk components in Iran, applying bootstrapping to include and assess the uncertainty. The climate parameters, the Temperature Humidity Index (THI) and the Equivalent Temperature Index (ETI), are computed from the NASA Modern Era Retrospective-Analysis for Research and Applications (NASA-MERRA) reanalysis (2002–2010). Milk data for fat, protein (measured on a fresh-matter basis), and milk yield are taken from 936,227 milk records for the same period, using cows fed on natural pasture from April to September. Confidence intervals for the regression model are calculated using the bootstrap technique, applied to the original time series to generate statistically equivalent surrogate samples. As a result, despite the short time series and the related uncertainties, an interesting behavior of the relationships between milk components and the climate parameters is visible. During spring only a weak dependency of milk yield on climate variations is evident, while fat and protein concentrations show reasonable correlations. In summer, milk yield shows a similar level of relationship with ETI, but not with temperature or THI. We suggest this methodology for studies of the impacts of climate change on agriculture, as well as on environment and food, when only short-term data are available. PMID:28231215
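The resampling step described above can be sketched with a percentile bootstrap: resample (climate, milk) pairs with replacement, refit the regression slope each time, and read a confidence interval off the empirical distribution of slopes. The THI and fat values below are invented, not the study's data.

```python
# Percentile-bootstrap confidence interval for a regression slope.
import random

def slope(pairs):
    """Ordinary least-squares slope of y on x."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    sxy = sum((x - mx) * (y - my) for x, y in pairs)
    sxx = sum((x - mx) ** 2 for x, _ in pairs)
    return sxy / sxx

def boot_slope(data):
    while True:  # guard: the resample needs at least two distinct x values
        sample = [random.choice(data) for _ in data]
        if len({x for x, _ in sample}) > 1:
            return slope(sample)

random.seed(0)
# hypothetical (THI, milk fat %) pairs
data = [(60, 3.9), (65, 3.8), (70, 3.7), (72, 3.6), (75, 3.5), (80, 3.3)]
boot = sorted(boot_slope(data) for _ in range(2000))
lo, hi = boot[50], boot[1949]   # 2.5th and 97.5th percentile slopes
```

Here (lo, hi) is the 95% percentile-bootstrap interval for the slope; with real short time series, block-bootstrap variants are often preferred to respect serial correlation.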
A Ratiometric Method for Johnson Noise Thermometry Using a Quantized Voltage Noise Source
NASA Astrophysics Data System (ADS)
Nam, S. W.; Benz, S. P.; Martinis, J. M.; Dresselhaus, P.; Tew, W. L.; White, D. R.
2003-09-01
Johnson Noise Thermometry (JNT) involves the measurement of the statistical variance of a fluctuating voltage across a resistor in thermal equilibrium. Modern digital techniques make it now possible to perform many functions required for JNT in highly efficient and predictable ways. We describe the operational characteristics of a prototype JNT system which uses digital signal processing for filtering, real-time spectral cross-correlation for noise power measurement, and a digitally synthesized Quantized Voltage Noise Source (QVNS) as an AC voltage reference. The QVNS emulates noise with a constant spectral density that is stable, programmable, and calculable in terms of known parameters using digital synthesis techniques. Changes in analog gain are accounted for by alternating the inputs between the Johnson noise sensor and the QVNS. The Johnson noise power at a known temperature is first balanced with a synthesized noise power from the QVNS. The process is then repeated by balancing the noise power from the same resistor at an unknown temperature. When the two noise power ratios are combined, a thermodynamic temperature is derived using the ratio of the two QVNS spectral densities. We present preliminary results where the ratio between the gallium triple point and the water triple point is used to demonstrate the accuracy of the measurement system with a standard uncertainty of 0.04 %.
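The ratiometric arithmetic at the heart of the scheme is simple: Johnson noise power density is proportional to temperature (S_V = 4kTR), so once the sensor noise has been balanced against synthesized QVNS densities S_ref at a known temperature and S_x at the unknown one, the unknown temperature follows from their ratio. The values below are illustrative only, and this sketch ignores the resistor's own temperature dependence and all gain-tracking details.

```python
# Arithmetic sketch of ratiometric Johnson noise thermometry.
K = 1.380649e-23   # Boltzmann constant, J/K

def qvns_balance_density(temp_k, resistance_ohm):
    """Synthesized density balancing the 4kTR sensor noise (V^2/Hz)."""
    return 4.0 * K * temp_k * resistance_ohm

t_ref = 273.16                                   # water triple point, K
s_ref = qvns_balance_density(t_ref, 100.0)
s_x = qvns_balance_density(302.9146, 100.0)      # gallium triple point

t_x = t_ref * (s_x / s_ref)   # ratiometric temperature from the two densities
print(round(t_x, 4))          # recovers 302.9146 K
```

The experimental work is in making the balances and the analog gain identical between the two configurations; the temperature itself then drops out of a pure ratio, as above.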
Emancipation through interaction--how eugenics and statistics converged and diverged.
Louçã, Francisco
2009-01-01
The paper discusses the scope and influence of eugenics in defining the scientific programme of statistics and the impact of the evolution of biology on social scientists. It argues that eugenics was instrumental in providing a bridge between sciences, and therefore created both the impulse and the institutions necessary for the birth of modern statistics in its applications first to biology and then to the social sciences. Looking at the question from the point of view of the history of statistics and the social sciences, and mostly concentrating on evidence from the British debates, the paper discusses how these disciplines became emancipated from eugenics precisely because of the inspiration of biology. It also relates how social scientists were fascinated and perplexed by the innovations taking place in statistical theory and practice.
The Taguchi Method Application to Improve the Quality of a Sustainable Process
NASA Astrophysics Data System (ADS)
Titu, A. M.; Sandu, A. V.; Pop, A. B.; Titu, S.; Ciungu, T. C.
2018-06-01
The Taguchi method has long been used to improve the quality of analyzed processes and products. This research addresses an unusual situation: modeling the technical parameters of a process that is intended to be sustainable, improving process quality and assuring quality through an experimental research method. Modern experimental techniques can be applied in any field, and this study reflects the benefits of combining the principles of agricultural sustainability with the application of the Taguchi method. The experimental method used in this practical study combines engineering techniques with statistical experimental modeling to achieve rapid improvement in quality costs, in effect seeking optimization of existing processes and of the main technical parameters. The paper is a purely technical study that promotes a technical experiment using the Taguchi method, considered effective because it allows 70 to 90% of the desired optimization of the technical parameters to be achieved rapidly. The missing 10 to 30% can be obtained with one or two complementary experiments, limited to the 2 to 4 technical parameters considered most influential. Applying the Taguchi method allowed the simultaneous study, within the same experiment, of the influence factors considered most important in different combinations, while at the same time determining each factor's contribution.
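The analysis step of a Taguchi experiment can be sketched as follows: compute the larger-the-better signal-to-noise ratio, SN = -10 log10(mean(1/y^2)), for each run of an L4(2^3) orthogonal array, then average SN by factor level to rank factor influence. The array pairing and yield values below are invented for illustration.

```python
# Taguchi S/N ratios and main effects over an L4 orthogonal array.
from math import log10

def sn_larger_is_better(values):
    return -10.0 * log10(sum(1.0 / v**2 for v in values) / len(values))

# L4(2^3): each row is ((A, B, C) levels, replicate yields) - invented data
l4 = [((1, 1, 1), [42.0, 44.0]),
      ((1, 2, 2), [50.0, 51.0]),
      ((2, 1, 2), [46.0, 45.0]),
      ((2, 2, 1), [55.0, 57.0])]

sn = [sn_larger_is_better(ys) for _, ys in l4]

def main_effect(factor_idx):
    """Mean SN at level 2 minus mean SN at level 1 for one factor."""
    lvl1 = [s for (levels, _), s in zip(l4, sn) if levels[factor_idx] == 1]
    lvl2 = [s for (levels, _), s in zip(l4, sn) if levels[factor_idx] == 2]
    return sum(lvl2) / len(lvl2) - sum(lvl1) / len(lvl1)

effects = [main_effect(i) for i in range(3)]   # factor B dominates here
```

Ranking the absolute effects identifies the 2 to 4 most influential parameters that, as the abstract notes, would be carried into the complementary experiments.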
Modern control techniques in active flutter suppression using a control moment gyro
NASA Technical Reports Server (NTRS)
Buchek, P. M.
1974-01-01
The development of organized synthesis techniques using concepts of modern control theory was studied for the design of active flutter suppression systems for two- and three-dimensional lifting surfaces, utilizing a control moment gyro (CMG) to generate the required control torques. Incompressible flow theory is assumed, with the unsteady aerodynamic forces and moments for arbitrary airfoil motion obtained by using the convolution integral based on Wagner's indicial lift function. Linear optimal control theory is applied to find particular optimal sets of gain values which minimize a quadratic performance function. The closed-loop system's response to impulsive gust disturbances and the resulting control power requirements are investigated, and the system eigenvalues necessary to minimize the maximum value of control power are determined.
Wang, Poguang; Giese, Roger W.
2017-01-01
Matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) has been used for quantitative analysis of small molecules for many years. It is usually preceded by an LC separation step when complex samples are tested. With the development several years ago of "modern MALDI" (automation, high-repetition lasers, high-resolution peaks), the ease of use and performance of MALDI as a quantitative technique greatly increased. This review focuses on practical aspects of modern MALDI for quantitation of small molecules conducted in an ordinary way (no special reagents, devices or techniques for the spotting step of MALDI), and includes our ordinary, preferred methods. The review is organized as 18 recommendations with accompanying explanations, criticisms and exceptions. PMID:28118972
The Modern Design of Experiments: A Technical and Marketing Framework
NASA Technical Reports Server (NTRS)
DeLoach, R.
2000-01-01
A new wind tunnel testing process under development at NASA Langley Research Center, called Modern Design of Experiments (MDOE), differs from conventional wind tunnel testing techniques on a number of levels. Chief among these is that MDOE focuses on the generation of adequate prediction models rather than high-volume data collection. Some cultural issues attached to this and other distinctions between MDOE and conventional wind tunnel testing are addressed in this paper.
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
Anderson, Katherine H.; Bartlein, Patrick J.; Strickland, Laura E.; Pelltier, Richard T.; Thompson, Robert S.; Shafer, Sarah L.
2012-01-01
The mutual climatic range (MCR) technique is perhaps the most widely used method for estimating past climatic parameters from fossil assemblages, largely because it can be conducted on a simple list of the taxa present in an assemblage. When applied to plant macrofossil data, this unweighted approach (MCRun) will frequently identify a large range for a given climatic parameter where the species in an assemblage can theoretically live together. To narrow this range, we devised a new weighted approach (MCRwt) that employs information from the modern relations between climatic parameters and plant distributions to lessen the influence of the "tails" of the distributions of the climatic data associated with the taxa in an assemblage. To assess the performance of the MCR approaches, we applied them to a set of modern climatic data and plant distributions on a 25-km grid for North America, and compared observed and estimated climatic values for each grid point. In general, MCRwt was superior to MCRun in providing smaller anomalies, less bias, and better correlations between observed and estimated values. However, by the same measures, the results of Modern Analog Technique (MAT) approaches were superior to MCRwt. Although this might be reason to favor MAT approaches, they are based on assumptions that may not be valid for paleoclimatic reconstructions, including that: 1) the absence of a taxon from a fossil sample is meaningful, 2) plant associations were largely unaffected by past changes in either levels of atmospheric carbon dioxide or in the seasonal distributions of solar radiation, and 3) plant associations of the past are adequately represented on the modern landscape. To illustrate the application of these MCR and MAT approaches to paleoclimatic reconstructions, we applied them to a Pleistocene paleobotanical assemblage from the western United States. 
From our examinations of the estimates of modern and past climates from vegetation assemblages, we conclude that the MCRun technique provides reliable and unbiased estimates of the ranges of possible climatic conditions that can reasonably be associated with these assemblages. The application of MCRwt and MAT approaches can further constrain these estimates and may provide a systematic way to assess uncertainty. The data sets required for MCR analyses in North America are provided in a parallel publication.
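The unweighted MCR idea (MCRun) reduces to intersecting the modern climatic tolerance ranges of the taxa present in an assemblage to bound a past parameter. The ranges below (mean annual temperature, deg C) are invented for illustration, not values from the North American calibration grid; the weighted variant (MCRwt) would additionally downweight the tails of each taxon's climatic distribution.

```python
# Unweighted mutual climatic range: intersect taxon tolerance ranges.

def mutual_climatic_range(taxa_ranges):
    """Intersect (lo, hi) tolerance ranges; None if there is no overlap."""
    lo = max(r[0] for r in taxa_ranges)
    hi = min(r[1] for r in taxa_ranges)
    return (lo, hi) if lo <= hi else None

# hypothetical mean-annual-temperature tolerances for three fossil taxa
assemblage = {"taxon_a": (-2.0, 18.0),
              "taxon_b": (4.0, 25.0),
              "taxon_c": (1.0, 12.0)}

print(mutual_climatic_range(assemblage.values()))  # -> (4.0, 12.0)
```

As the abstract notes, this unweighted intersection tends to be wide; the weighting and the MAT comparison are attempts to narrow it without over-committing.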
ASSESSING THE IMPACTS OF ANTHROPOGENIC STRESSORS ON MACROINVERTEBRATE INDICATORS IN OHIO
In the past few years, there has been increasing interest in using biological community data to provide information about specific anthropogenic factors impacting streams. Previous studies have used statistical approaches that are variants of classical and modern multiple regres...
Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes
Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...
Aust, H; Veltum, B; Wächtershäuser, T; Wulf, H; Eberhart, L
2014-02-01
Many anesthesia departments operate a pre-anesthesia assessment clinic (PAAC). Data regarding the organization, equipment and structure of such clinics are not yet available. Information about modern anesthesiology techniques and procedures contributes to a reduction in the emotional stress of patients, but such modern techniques often require additional technical hardware and costs and are not equally available everywhere. This survey examined the current structures of PAACs in the state of Hessen, documented current concepts and related these to the performance and the portfolio of procedures in these departments. An online survey was carried out, compiling data on structure, equipment, organization and available methods. In addition, anesthesia department personnel were asked to give their individual subjective attitudes toward premedication work. Of the anesthesia departments in Hessen, 84% participated in the survey, of which 91% operated a PAAC. A preoperative contact with the anesthesiologist who would perform the anesthesia existed in only 19% of the departments. Multimedia concepts for informed consent in a PAAC setting were generally rare. Many modern procedures and anesthesia techniques were broadly established independent of hospital size. Regarding the individual and subjective attitudes of anesthetists toward the work, the psychological and medical importance of the premedication visit was considered very high. PAACs are now well established. This may make economic sense but is accompanied by an anonymization of anesthesiology care. The high quality, safety and availability of modern anesthesiology procedures and monitoring concepts should be communicated to patients all the more as an expression of trust and high patient safety. This can be facilitated in particular by multimedia tools, which have so far only been sparsely implemented in PAACs.
NASA Astrophysics Data System (ADS)
Baart, F.; van Gils, A.; Hagenaars, G.; Donchyts, G.; Eisemann, E.; van Velzen, J. W.
2016-12-01
A compelling visualization is captivating, beautiful and narrative. Here we show how melding the skills of computer graphics, art, statistics, and environmental modeling can be used to generate innovative, attractive and very informative visualizations. We focus on visualizing forecasts and measurements of water (water level, waves, currents, density, and salinity). For computer graphics and the arts, water is an important topic because it occurs in many natural scenes. For environmental modeling and statistics, water is an important topic because it is essential for transport, a healthy environment, fruitful agriculture, and a safe environment. The different disciplines take different approaches to visualizing water. In computer graphics, the focus is on creating water that looks as realistic as possible. This focus on realistic perception (versus the focus on physical balance pursued by environmental scientists) has resulted in fascinating renderings, as seen in recent games and movies. Visualization techniques for statistical results have benefited from advances in design and journalism, resulting in enthralling infographics. The field of environmental modeling has absorbed advances in contemporary cartography, as seen in the latest interactive data-driven maps. We systematically review the emerging types of water visualization designs. The examples we analyze range from dynamically animated forecasts, interactive paintings, and infographics to modern cartography and web-based photorealistic rendering. By characterizing the intended audience, the design choices, the scales (e.g. time, space), and the explorability, we provide a set of guidelines and genres. The unique contributions of the different fields show how innovations in the current state of the art of water visualization have benefited from interdisciplinary collaborations.
Population-wide distributions of neural activity during perceptual decision-making
Machens, Christian
2018-01-01
Cortical activity involves large populations of neurons, even when it is limited to functionally coherent areas. Electrophysiological recordings, on the other hand, involve comparatively small neural ensembles, even when modern-day techniques are used. Here we review results which have started to fill the gap between these two scales of inquiry, by shedding light on the statistical distributions of activity in large populations of cells. We put our main focus on data recorded in awake animals that perform simple decision-making tasks and consider statistical distributions of activity throughout cortex, across sensory, associative, and motor areas. We transversally review the complexity of these distributions, from distributions of firing rates and metrics of spike-train structure, through distributions of tuning to stimuli or actions and of choice signals, and finally the dynamical evolution of neural population activity and the distributions of (pairwise) neural interactions. This approach reveals shared patterns of statistical organization across cortex, including: (i) long-tailed distributions of activity, where quasi-silence seems to be the rule for a majority of neurons, and which are barely distinguishable between spontaneous and active states; (ii) distributions of tuning parameters for sensory (and motor) variables, which show an extensive extrapolation and fragmentation of their representations in the periphery; and (iii) population-wide dynamics that reveal rotations of internal representations over time, whose traces can be found both in stimulus-driven and internally generated activity. We discuss how these insights are leading us away from the notion of discrete classes of cells, and are acting as powerful constraints on theories and models of cortical organization and population coding. PMID:23123501
NASA Astrophysics Data System (ADS)
Brandt, Douglas; Hiller, John R.; Moloney, Michael J.
1995-10-01
The Consortium for Upper Level Physics Software (CUPS) has developed a comprehensive series of nine book/software packages that Wiley will publish in FY '95 and '96. CUPS is an international group of 27 physicists, all with extensive backgrounds in research, teaching, and the development of instructional software. The project is supported by the National Science Foundation (PHY-9014548), and it has received additional support from the IBM Corp., Apple Computer Corp., and George Mason University. The simulations being developed cover: Astrophysics, Classical Mechanics, Electricity & Magnetism, Modern Physics, Nuclear and Particle Physics, Quantum Mechanics, Solid State, Thermal and Statistical Physics, and Waves and Optics.
A Case of Problematic Diffusion: The Use of Sex Determination Techniques in India.
ERIC Educational Resources Information Center
Luthra, Rashmi
1994-01-01
Discussion of model shifts in diffusion research focuses on the growth in the use of sex determination techniques in India and their consequences relating to gender and power. Topics addressed include development, underdevelopment, and modernization; the adoption of innovations; and meanings of innovations within particular social systems.…
PSYOP and Persuasion: Applying Social Psychology and Becoming an Informed Citizen
ERIC Educational Resources Information Center
King, Sara B.
2004-01-01
This project teaches students about persuasion techniques, especially as governments use them. Most project examples came from the work of the U.S. military's modern Psychological Operations division. Social psychology students (a) reviewed influence techniques; (b) examined posters, leaflets, and other persuasion tools used in World War II, the…
Modern Language Classroom Techniques. A Handbook.
ERIC Educational Resources Information Center
Allen, Edward David; Valette, Rebecca M.
The aim of this handbook is to show the teacher ways of implementing and supplementing existing materials. The suggested teaching procedures may be used in classes of varying sizes and levels, and with any method. Although the emphasis is on teacher-made materials, many of the techniques suggested may be implemented with commercial programs,…
Embedding Mixed-Reality Laboratories into E-Learning Systems for Engineering Education
ERIC Educational Resources Information Center
Al-Tikriti, Munther N.; Al-Aubidy, Kasim M.
2013-01-01
E-learning, virtual learning and mixed reality techniques are now an integral part of academic and educational systems worldwide. They provide easier access to educational opportunities for a very wide spectrum of individuals pursuing their educational and qualification objectives. These modern techniques have the potential to improve the quality…
NASA Astrophysics Data System (ADS)
Bae, Albert; Westendorf, Christian; Erlenkamper, Christoph; Galland, Edouard; Franck, Carl; Bodenschatz, Eberhard; Beta, Carsten
2010-03-01
Eukaryotic cell flattening is valuable for improving microscopic observations, ranging from bright field to total internal reflection fluorescence microscopy. In this talk, we will discuss traditional overlay techniques and more modern, microfluidic-based flattening, which provides a greater level of control. We demonstrate these techniques on the social amoeba Dictyostelium discoideum, comparing the advantages and disadvantages of each method.
Rapid purification of fluorescent enzymes by ultrafiltration
NASA Technical Reports Server (NTRS)
Benjaminson, M. A.; Satyanarayana, T.
1983-01-01
In order to expedite the preparation of fluorescently tagged enzymes for histocytochemistry, a previously developed method employing gel column purification was compared with a more rapid modern technique using the Millipore Immersible CX ultrafilter. Microscopic evaluation of the resulting conjugates showed comparable products. Much time and effort is saved using the new technique.
The Throws: Contemporary Theory, Technique and Training.
ERIC Educational Resources Information Center
Wilt, Fred, Ed.
This compilation of articles covers the subject of four throwing events--shot put, discus throw, hammer throw, and javelin throw. The history of the art and science of throwing is traced from ancient to modern times in the introduction. Theories on training and techniques of throwing are presented in essays contributed by coaches from the United…
Technologies of Student Testing for Learning Quality Evaluation in the System of Higher Education
ERIC Educational Resources Information Center
Bayukova, Nadezhda Olegovna; Kareva, Ludmila Alexandrovna; Rudometova, Liliya Tarasovna; Shlangman, Marina Konstantinovna; Yarantseva, Natalia Vladislavovna
2015-01-01
The paper deals with technologies for evaluating students' achievement in educational activities, along with the methods, techniques, forms and conditions of monitoring knowledge quality in accordance with the requirements of the modernization of the Russian higher education system. The authors propose methodological techniques for preparing students for testing based on innovative…
Bridging the Gap between Basic and Clinical Sciences: A Description of a Radiological Anatomy Course
ERIC Educational Resources Information Center
Torres, Anna; Staskiewicz, Grzegorz J.; Lisiecka, Justyna; Pietrzyk, Lukasz; Czekajlo, Michael; Arancibia, Carlos U.; Maciejewski, Ryszard; Torres, Kamil
2016-01-01
A wide variety of medical imaging techniques pervade modern medicine, and the changing portability and performance of tools like ultrasound imaging have brought these medical imaging techniques into the everyday practice of many specialties outside of radiology. However, proper interpretation of ultrasonographic and computed tomographic images…
Optics for Processes, Products and Metrology
NASA Astrophysics Data System (ADS)
Mather, George
1999-04-01
Optical physics has a variety of applications in industry, including process inspection, coatings development, vision instrumentation, spectroscopy, and many others. Optics has been used extensively in the design of solar energy collection systems and coatings, for example. Also, with the availability of good CCD cameras and fast computers, it has become possible to develop real-time inspection and metrology devices that can accommodate the high throughputs encountered in modern production processes. More recently, developments in moiré interferometry show great promise for applications in the basic metals and electronics industries. The talk will illustrate applications of optics by discussing process inspection techniques for defect detection, part dimensioning, birefringence measurement, and the analysis of optical coatings in the automotive, glass, and optical disc industries. In particular, examples of optical techniques for the quality control of CD-R, MO, and CD-RW discs will be presented. In addition, the application of optical concepts to solar energy collector design and to metrology by moiré techniques will be discussed. Finally, some of the modern techniques and instruments used for qualitative and quantitative material analysis will be presented.
Seismic Source Identification Techniques
various fields of endeavor in theoretical and experimental seismology and the establishment of a modern geophysical observatory near Eilat, Israel, which includes strainmeters, tiltmeters and high-gain displacement-meters.
Brenke, Christopher; Lassel, Elke A; Terris, Darcey; Kurt, Aysel; Schmieder, Kirsten; Schoenberg, Stefan O; Weisser, Gerald
2014-05-01
A significant proportion of acute care neurosurgical patients present to hospital outside regular working hours. The objective of our study was to evaluate the structure of neurosurgical on-call services in Germany, the use of modern communication devices and teleradiology services, and the personal acceptance of modern technologies by neurosurgeons. A nationwide survey of all 141 neurosurgical departments in Germany was performed. The questionnaire consisted of two parts: one for neurosurgical departments and one for individual neurosurgeons. The questionnaire, available online and mailed in paper form, included 21 questions about on-call service structure; the availability and use of communication devices, teleradiology services, and other information services; and neurosurgeons' personal acceptance of modern technologies. The questionnaire return rate from departments was 63.1% (89/141), whereas 187 individual neurosurgeons responded. For 57.3% of departments, teleradiology services were available and were frequently used by 62.2% of neurosurgeons. A further 23.6% of departments described using smartphone screenshots of computed tomography (CT) images transmitted by multimedia messaging service (MMS), and 8.6% of images were described as sent by unencrypted email. Although 47.0% of neurosurgeons reported owning a smartphone, only 1.1% used their phone for on-call image communication. Teleradiology services were observed to be widely used by on-call neurosurgeons in Germany. Nevertheless, a significant number of departments appear to use outdated techniques or techniques that leave patient data unprotected. On-call neurosurgeons in Germany report a willingness to adopt more modern approaches, utilizing readily available smartphones or tablet technology. Georg Thieme Verlag KG Stuttgart · New York.
Czech, Hendryk; Miersch, Toni; Orasche, Jürgen; Abbaszade, Gülcin; Sippula, Olli; Tissari, Jarkko; Michalke, Bernhard; Schnelle-Kreis, Jürgen; Streibel, Thorsten; Jokiniemi, Jorma; Zimmermann, Ralf
2018-01-15
Combustion technologies for small-scale wood combustion appliances are continuously being developed to decrease emissions of various pollutants and to increase energy conversion efficiency. One strategy to reduce emissions is the implementation of air staging in the secondary air supply, which has become an established technique for modern wood combustion appliances. On that account, emissions from a modern masonry heater fuelled with three types of common logwood (beech, birch and spruce) and from a modern pellet boiler fuelled with commercial softwood pellets were investigated; these represent typical combustion appliances in northern Europe. In particular, emphasis was put on the organic constituents of PM2.5, including polycyclic aromatic hydrocarbons (PAHs), oxygenated PAHs (OPAHs) and phenolic species, by targeted and non-targeted mass spectrometric analysis techniques. Compared to conventional wood stoves and pellet boilers, organic emissions from the modern appliances were reduced by at least one order of magnitude, but to a different extent for individual species. Hence, characteristic ratios of emission constituents and emission profiles used for wood combustion identification and speciation do not hold for this type of advanced combustion technology. Additionally, an overall substantial reduction of typical wood combustion markers, such as phenolic species and anhydrous sugars, was observed. Finally, it was found that slow ignition of logwood changes the distribution of characteristic resin acids and phytosterols as well as their thermal alteration products, which are used as markers for specific wood types. Our results should be considered for wood combustion identification in positive matrix factorisation or chemical mass balance studies in northern Europe. Copyright © 2017 Elsevier B.V. All rights reserved.
Facial ontogeny in Neanderthals and modern humans
Bastir, Markus; O'Higgins, Paul; Rosas, Antonio
2007-01-01
One hundred and fifty years after the discovery of Neanderthals, it is held that this morphologically and genetically distinct human species does not differ from modern Homo sapiens in its craniofacial ontogenetic trajectory after the early post-natal period. This is striking given the evident morphological differences between these species, since it implies that all of the major differences are established by the early post-natal period and carried into adulthood through identical trajectories, despite the extent to which mechanical and spatial factors are thought to influence craniofacial ontogeny. Here, we present statistical and morphological analyses demonstrating that the spatio-temporal processes responsible for craniofacial ontogenetic transformations differ. The findings emphasize that pre-natal as well as post-natal ontogeny are both important in establishing the cranial morphological differences between adult Neanderthals and modern humans. PMID:17311777
NASA Astrophysics Data System (ADS)
Schüler, L.; Hemp, A.; Behling, H.
2014-01-01
The relationship between modern pollen-rain taxa and measured climate variables was explored along the elevational gradient of the southern slope of Mt. Kilimanjaro, Tanzania. Pollen assemblages in 28 pollen traps positioned on 14 montane forest vegetation plots were identified, and their relationship with climate variables was examined using multivariate statistical methods. Canonical correspondence analysis revealed that mean annual temperature, mean annual precipitation and minimum temperature each account for significant fractions of the variation in pollen taxa. A training set of 107 modern pollen taxa was used to derive temperature and precipitation transfer functions based on pollen subsets using weighted-averaging partial least squares (WA-PLS) techniques. The transfer functions were then applied to a fossil pollen record from the montane forest of Mt. Kilimanjaro, and climate parameter estimates for the Late Glacial and the Holocene on Mt. Kilimanjaro were inferred. Our results present the first quantitatively reconstructed temperature and precipitation estimates for Mt. Kilimanjaro and give valuable insights into the past 45,000 yr of climate dynamics in tropical East Africa. The climate reconstructions are consistent with the interpretation of pollen data in terms of the vegetation and climate history of afro-montane forest in East Africa. Minimum temperatures above the frostline as well as increased precipitation turn out to be crucial for the development and expansion of montane forest during the Holocene. In contrast, consistently low minimum temperatures and about 25% drier climate conditions prevailed before the LGM, which kept the montane vegetation composition in a stable state.
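The transfer-function idea can be sketched with plain weighted averaging (WA), a simpler relative of the WA-PLS method the authors used; the site, taxon, and temperature values below are invented for illustration, and the deshrinking step of a real calibration is omitted.

```python
import numpy as np

# Hypothetical training set: rows = modern pollen-trap sites, columns =
# taxa (relative abundances), with observed mean annual temperature
# (deg C) at each site. All numbers are invented for illustration.
abundances = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
])
temperature = np.array([18.0, 14.0, 9.0])

# Step 1: each taxon's temperature optimum is the abundance-weighted
# mean of temperature over the training sites.
optima = abundances.T @ temperature / abundances.sum(axis=0)

# Step 2: a fossil sample's temperature is reconstructed as the
# abundance-weighted mean of the optima of the taxa it contains.
fossil = np.array([0.5, 0.4, 0.1])
reconstruction = float(fossil @ optima / fossil.sum())
print(round(reconstruction, 2))  # a value between the site temperatures
```

The reconstruction necessarily lies within the range of training-site temperatures, which is one reason real calibrations add a deshrinking regression.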
Accuracy and coverage of the modernized Polish Maritime differential GPS system
NASA Astrophysics Data System (ADS)
Specht, Cezary
2011-01-01
The DGPS navigation service augments the NAVSTAR Global Positioning System by providing localized pseudorange correction factors and ancillary information, which are broadcast over selected marine reference stations. The DGPS service position and integrity information satisfy the requirements of coastal navigation and hydrographic surveys. The Polish Maritime DGPS system was established in 1994 and modernized (in 2009) to meet the requirements set out in the IMO resolution for a future GNSS, while preserving backward signal compatibility of user equipment. Once installation of the new L1/L2 reference equipment was finalized, performance tests were carried out. The paper presents results of coverage modeling and an accuracy measurement campaign based on long-term signal analyses of the DGPS reference station Rozewie, performed over 26 days in July 2009. The final results made it possible to verify the coverage area of the differential signal from the reference station and to calculate the repeatable and absolute accuracy of the system after the technical modernization. The obtained field-strength coverage and position statistics (215,000 fixes) were compared to past measurements performed in 2002 (coverage) and 2005 (accuracy), when the previous system infrastructure was in operation. So far, no campaigns have been performed on differential Galileo. However, its signals, signal processing, and receiver techniques are comparable to those known from DGPS, and because all satellite differential GNSS systems use the same transmission standard (RTCM), maritime DGPS radiobeacons are standardized in all radio communication aspects (frequency, binary rate, modulation); the accuracy of differential Galileo can therefore be expected to be similar to that of DGPS. Coverage of the reference station was calculated using unique software that computes the signal strength level from transmitter parameters or from a field signal-strength measurement campaign carried out at representative points. The software is based on a Baltic Sea vector map, ground electrical parameters, and models of the atmospheric noise level in the transmission band.
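Position accuracy from a large batch of fixes, like the 215,000 reported here, is commonly summarized with DRMS/2DRMS statistics; the sketch below uses synthetic position errors, not the Rozewie campaign data.

```python
import numpy as np

# Synthetic horizontal position errors for a batch of GNSS fixes
# (metres, zero-mean Gaussian per axis); values are illustrative only.
rng = np.random.default_rng(2)
east = rng.normal(0.0, 0.5, 5000)
north = rng.normal(0.0, 0.5, 5000)

# DRMS: root-mean-square horizontal error. 2DRMS (twice DRMS) is the
# bound conventionally quoted at ~95% for roughly circular error clouds.
drms = float(np.sqrt(np.mean(east**2 + north**2)))
two_drms = 2.0 * drms
```

With equal 0.5 m standard deviations per axis, DRMS comes out near 0.71 m, i.e. sqrt(0.5² + 0.5²).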
A comparative study of modern and fossil cone scales and seeds of conifers: A geochemical approach
Artur, Stankiewicz B.; Mastalerz, Maria; Kruge, M.A.; Van Bergen, P. F.; Sadowska, A.
1997-01-01
Modern cone scales and seeds of Pinus strobus and Sequoia sempervirens, and their fossil (Upper Miocene, c. 6 Ma) counterparts Pinus leitzii and Sequoia langsdorfi have been studied using pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS), electron microprobe analysis and scanning electron microscopy. Microscopic observations revealed only minor microbial activity and high-quality structural preservation of the fossil material. The pyrolysates of both modern genera showed the presence of ligno-cellulose characteristic of conifers. However, the abundance of (alkylated)phenols and 1,2-benzenediols in modern S. sempervirens suggests the presence of non-hydrolysable tannins or abundant polyphenolic moieties not previously reported in modern conifers. The marked differences between the pyrolysis products of the two modern genera are suggested to be of chemosystematic significance. The fossil samples also contained ligno-cellulose which exhibited only partial degradation, primarily of the carbohydrate constituents. Comparison between the fossil cone scale and seed pyrolysates indicated that the ligno-cellulose complex present in the seeds is chemically more resistant than that in the cone scales. Principal component analysis (PCA) of the pyrolysis data allowed for the determination of the discriminant functions used to assess the extent of degradation and the chemosystematic differences between both genera and between cone scales and seeds. Elemental composition (C, O, S), obtained using the electron microprobe, corroborated the pyrolysis results. Overall, the combination of chemical, microscopic and statistical methods allowed for a detailed characterization and chemosystematic interpretations of modern and fossil conifer cone scales and seeds.
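The PCA step can be sketched in a few lines; the sample-by-product matrix below is invented for illustration and stands in for the relative abundances of pyrolysis products.

```python
import numpy as np

# Hypothetical data: rows = pyrolysates (samples), columns = relative
# yields of four pyrolysis products; two compositional groups by design.
X = np.array([
    [0.40, 0.30, 0.20, 0.10],
    [0.35, 0.35, 0.20, 0.10],
    [0.10, 0.15, 0.45, 0.30],
    [0.12, 0.13, 0.40, 0.35],
])

# PCA via singular value decomposition of the mean-centered matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                      # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)     # variance fraction per component

# The first component separates the two groups of samples.
same_group = scores[0, 0] * scores[1, 0] > 0
cross_group = scores[0, 0] * scores[2, 0] < 0
```

Because the two groups dominate the variance, the first component carries nearly all of it and the PC1 scores split the samples accordingly.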
NASA Astrophysics Data System (ADS)
Mezzino, D.; Pei, W.; Santana Quintero, M.; Reyes Rodriguez, R.
2015-08-01
This contribution describes the results of an International workshop on documentation of historic and cultural heritage developed jointly between Universidad de Guadalajara's Centro Universitario de Arte, Arquitectura y Diseño (CUAAD) and Carleton University's Architectural Conservation and Sustainability Program. The objective of the workshop was to create a learning environment for emerging heritage professionals through the use of advanced recording techniques for the documentation of modern architectural heritage in Guadalajara, Mexico. The selected site was Casa Cristo, one of the several architectural projects by Luis Barragán in Guadalajara. The house was built between 1927 and 1929 for Gustavo R. Cristo, mayor of the city. The style of the building reflects the European influences derived from the architect's travel experience, as well as the close connection with local craftsmanship. All of these make the house an outstanding example of modern regional architecture. A systematic documentation strategy was developed for the site, using different survey equipment and techniques to capture the shape, colour, spatial configuration, and current conditions of Casa Cristo for its eventual rehabilitation and conservation.
Wronkiewicz, Mark; Larson, Eric; Lee, Adrian Kc
2016-10-01
Brain-computer interface (BCI) technology allows users to generate actions based solely on their brain signals. However, current non-invasive BCIs generally classify brain activity recorded from surface electroencephalography (EEG) electrodes, which can hinder the application of findings from modern neuroscience research. In this study, we use source imaging, a neuroimaging technique that projects EEG signals onto the surface of the brain, in a BCI classification framework. This allowed us to incorporate prior research from functional neuroimaging to target activity from a cortical region involved in auditory attention. Classifiers trained to detect attention switches performed better with source imaging projections than with EEG sensor signals. Within source imaging, including subject-specific anatomical MRI information (instead of using a generic head model) further improved classification performance. This source-based strategy also reduced accuracy variability across three dimensionality reduction techniques, a major design choice in most BCIs. Our work shows that source imaging provides clear quantitative and qualitative advantages to BCIs and highlights the value of incorporating modern neuroscience knowledge and methods into BCI systems.
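Source imaging in this sense can be sketched as a regularized linear inverse of a forward model; the minimum-norm formulation below is one standard choice, and the forward matrix and dimensions are invented for illustration (real pipelines derive the forward model from a head model).

```python
import numpy as np

# Minimal minimum-norm source projection. L is a hypothetical forward
# model mapping source activity to EEG sensors.
rng = np.random.default_rng(3)
n_sensors, n_sources, n_times = 8, 20, 100
L = rng.standard_normal((n_sensors, n_sources))
eeg = rng.standard_normal((n_sensors, n_times))   # sensor-space data

# A regularized pseudoinverse maps sensor data to source-space
# features that a BCI classifier could then be trained on.
lam = 0.1
inverse_op = L.T @ np.linalg.inv(L @ L.T + lam * np.eye(n_sensors))
sources = inverse_op @ eeg
```

The classifier then operates on rows of `sources` (or a region-of-interest subset) instead of raw sensor channels.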
Statistical Thermodynamics and Microscale Thermophysics
NASA Astrophysics Data System (ADS)
Carey, Van P.
1999-08-01
Many exciting new developments in microscale engineering are based on the application of traditional principles of statistical thermodynamics. In this text Van Carey offers a modern view of thermodynamics, interweaving classical and statistical thermodynamic principles and applying them to current engineering systems. He begins with coverage of microscale energy storage mechanisms from a quantum mechanics perspective and then develops the fundamental elements of classical and statistical thermodynamics. Subsequent chapters discuss applications of equilibrium statistical thermodynamics to solid, liquid, and gas phase systems. The remainder of the book is devoted to nonequilibrium thermodynamics of transport phenomena and to nonequilibrium effects and noncontinuum behavior at the microscale. Although the text emphasizes mathematical development, Carey includes many examples and exercises to illustrate how the theoretical concepts are applied to systems of scientific and engineering interest. In the process he offers a fresh view of statistical thermodynamics for advanced undergraduate and graduate students, as well as practitioners, in mechanical, chemical, and materials engineering.
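The microscale-to-macroscale link the text describes runs through the partition function; a minimal sketch for a two-level system (units with k_B = 1 and an illustrative energy gap, not an example from the book) shows how occupation probabilities follow from it.

```python
import math

# Canonical two-level system with energy gap eps (units with k_B = 1):
# partition function Z = 1 + exp(-eps/T); the excited-state occupation
# is its Boltzmann factor divided by Z.
def excited_fraction(eps: float, T: float) -> float:
    boltzmann = math.exp(-eps / T)
    return boltzmann / (1.0 + boltzmann)

hot = excited_fraction(eps=1.0, T=100.0)   # near 1/2: levels equalize
cold = excited_fraction(eps=1.0, T=0.1)    # near 0: excited state empties
```

The two limits capture the expected equilibrium behavior: equal occupation at high temperature, ground-state condensation at low temperature.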
Secondary School Mathematics Curriculum Improvement Study Information Bulletin 7.
ERIC Educational Resources Information Center
Secondary School Mathematics Curriculum Improvement Study, New York, NY.
The background, objectives, and design of Secondary School Mathematics Curriculum Improvement Study (SSMCIS) are summarized. Details are given of the content of the text series, "Unified Modern Mathematics," in the areas of algebra, geometry, linear algebra, probability and statistics, analysis (calculus), logic, and computer…
Development of a Relay Performance Web Tool for the Mars Network
NASA Technical Reports Server (NTRS)
Allard, Daniel A.; Edwards, Charles D.
2009-01-01
Modern Mars surface missions rely upon orbiting spacecraft to relay communications to and from Earth systems. An important component of this multi-mission relay process is the collection of relay performance statistics supporting strategic trend analysis and tactical anomaly identification and tracking.
1987-09-09
Legal Department, Marek Wieczorkiewicz, said that the speed with which antisocial behavior and public discipline is being corrected depends on the...Wojtas. Piotr Lulka: "The statement that there are 700 centers in the region is only a statistic. The centers must be modernized. Those who want to
Beyond description. Comment on "Approaching human language with complex networks" by Cong and Liu
NASA Astrophysics Data System (ADS)
Ferrer-i-Cancho, R.
2014-12-01
In their historical overview, Cong & Liu highlight Saussure as the father of modern linguistics [1]. They apparently miss G.K. Zipf as a pioneer of the view of language as a complex system. His idea of a balance between unification and diversification forces in the organization of natural systems, e.g., vocabularies [2], can be seen as a precursor of the view of complexity as a balance between order (unification) and disorder (diversification) near the edge of chaos [3]. Although not mentioned by Cong & Liu elsewhere, trade-offs between hearer and speaker needs are very important in Zipf's view, which has inspired research on optimal networks mapping words into meanings [4-6]. Quantitative linguists regard G.K. Zipf as the founder of modern quantitative linguistics [7], a discipline where statistics plays a central role, as in network science. Interestingly, that centrality of statistics is missing from Saussure's work and that of many of his successors.
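Zipf's empirical starting point, the rank-frequency distribution of words, amounts to counting and sorting; the toy corpus below illustrates only the counting step, since estimating the Zipfian exponent (frequency of the r-th ranked word roughly proportional to 1/r) requires large corpora.

```python
from collections import Counter

# Rank-frequency counting on a toy corpus.
text = ("the cat sat on the mat and the dog sat on the rug "
        "the cat and the dog").split()
counts = Counter(text)
ranked = [freq for _, freq in counts.most_common()]
print(ranked)  # frequencies in descending rank order
```

Even in this tiny sample the characteristic skew appears: one very frequent word, a middle band of moderately frequent words, and a tail of rare ones.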
Comparative Analysis Between Computed and Conventional Inferior Alveolar Nerve Block Techniques.
Araújo, Gabriela Madeira; Barbalho, Jimmy Charles Melo; Dias, Tasiana Guedes de Souza; Santos, Thiago de Santana; Vasconcellos, Ricardo José de Holanda; de Morais, Hécio Henrique Araújo
2015-11-01
The aim of this randomized, double-blind, controlled trial was to compare the computed and conventional inferior alveolar nerve block techniques in symmetrically positioned inferior third molars. Both computed and conventional anesthetic techniques were performed in 29 healthy patients (58 surgeries) aged between 18 and 40 years. The anesthetic of choice was 2% lidocaine with 1:200,000 epinephrine. A Visual Analogue Scale assessed pain after anesthetic infiltration, and patient satisfaction was evaluated using a Likert scale. Heart and respiratory rates, mean time to perform the technique, and the need for additional anesthesia were also evaluated. Mean pain scores were higher for the conventional technique than for the computed technique, 3.45 ± 2.73 and 2.86 ± 1.96, respectively, but the difference was not statistically significant (P > 0.05). Patient satisfaction showed no statistically significant differences. The mean times to perform the computed and conventional techniques were 3.85 and 1.61 minutes, respectively, a statistically significant difference (P < 0.001). The computed anesthetic technique showed lower mean pain perception, but the difference was not statistically significant when contrasted with the conventional technique.
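The reported pain comparison can be checked approximately from the summary statistics alone; the sketch below computes a Welch t statistic from the means and SDs in the abstract, assuming (this is an assumption) 29 surgeries per technique treated as unpaired groups, whereas the actual trial used a within-patient design.

```python
import math

# Welch's two-sample t statistic from summary statistics. Means and SDs
# are from the abstract; n = 29 per technique is an assumption (the 58
# surgeries split evenly between the two techniques).
def welch_t(m1, s1, n1, m2, s2, n2):
    return (m1 - m2) / math.sqrt(s1**2 / n1 + s2**2 / n2)

t = welch_t(3.45, 2.73, 29, 2.86, 1.96, 29)
print(round(t, 2))  # well under ~2, consistent with P > 0.05
```

A t statistic this small (under 1) is consistent with the abstract's nonsignificant pain-score comparison.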
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, exact analytical forms of derivative pricing distributions remain challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain exact analytical solutions of the statistical distribution for bond and bond option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option price, characterized by the distribution tail, and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
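For context, the Vasicek model already admits a textbook closed-form zero-coupon bond price; the sketch below implements that standard formula (not the paper's path-integral distributions), with illustrative parameter values.

```python
import math

# Textbook closed-form price of a zero-coupon bond under the Vasicek
# short-rate model dr = a*(b - r) dt + sigma dW. Parameter values are
# illustrative, not taken from the paper.
def vasicek_bond_price(r0, a, b, sigma, T):
    B = (1.0 - math.exp(-a * T)) / a
    A = math.exp((b - sigma**2 / (2.0 * a**2)) * (B - T)
                 - sigma**2 * B**2 / (4.0 * a))
    return A * math.exp(-B * r0)

price = vasicek_bond_price(r0=0.03, a=0.5, b=0.04, sigma=0.01, T=5.0)
print(round(price, 4))  # a discount factor between 0 and 1
```

With the short rate between 3% and its 4% long-run mean, the 5-year discount factor lands a little above exp(-0.04 * 5), as expected.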
The role of empirical Bayes methodology as a leading principle in modern medical statistics.
van Houwelingen, Hans C
2014-11-01
This paper reviews and discusses the role of Empirical Bayes methodology in medical statistics in the last 50 years. It gives some background on the origin of the empirical Bayes approach and its link with the famous Stein estimator. The paper describes the application in four important areas in medical statistics: disease mapping, health care monitoring, meta-analysis, and multiple testing. It ends with a warning that the application of the outcome of an empirical Bayes analysis to the individual "subjects" is a delicate matter that should be handled with prudence and care. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
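The link to the Stein estimator can be made concrete with the positive-part James-Stein rule, which shrinks a vector of noisy means toward zero; the data here are simulated and purely illustrative.

```python
import numpy as np

# Positive-part James-Stein estimator: for p >= 3 unit-variance normal
# means, shrinking the raw observations toward zero lowers total squared
# error. True means are set to zero for illustration.
def james_stein(x):
    p = len(x)
    shrink = max(0.0, 1.0 - (p - 2) / float(np.sum(x**2)))
    return shrink * x

rng = np.random.default_rng(0)
theta = np.zeros(8)                       # true means
x = theta + rng.standard_normal(8)        # one noisy observation each
estimate = james_stein(x)
```

This is the "borrowing strength" idea behind empirical Bayes: each coordinate's estimate is improved by information pooled across all coordinates.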
The health of the American slave examined by means of Union Army medical statistics.
Freemon, F R
1985-01-01
The health status of the American slave in the 19th century remains unclear despite extensive historical research. Better knowledge of slave health would provide a clearer picture of the life of the slave, a better understanding of 19th-century medicine, and possibly even clues to the health problems of modern blacks. This article hopes to contribute to the literature by examining another source of data. Slaves entering the Union Army joined an organization with standardized medical care that generated extensive statistical information. A review of these statistics answers questions about the health of young male blacks at the time American slavery ended.
ERIC Educational Resources Information Center
Department of Education and Science, London (England).
A study investigated techniques and practices for teaching second languages (French, German, Spanish) in 25 urban schools in different areas of England. It was found that the overall quality of work in modern languages was very good in 1 school, good in 5, satisfactory in 7, less than satisfactory in 10, and poor in 2. Three of 10 lessons seen…
Mukherjee, Sudeshna Basu; Ray, Anjali
2009-01-01
Background: The present study aimed, first, to examine the nature of stressful life events arising from the innovative challenges in modernized organizations and, second, to identify the relationship between the innovative work behavior of managers and the levels of stress arising from stressful events in modernized organizations (public and private) in West Bengal. Materials and Methods: Data were collected from a sample of 200 managers using 3 tools (General Information Schedule, Life Event Inventory and Innovative Work Behavior Scale) through face-to-face interviews. Responses were subjected to both quantitative and qualitative analyses, and the data were analyzed statistically using t-tests and ANOVA. Results: The data highlighted that the qualitative profile of stressful events in the lives of managers was specific to organizational type (public- and private-sector modernized organizations), and levels of stress from stressful life events were significantly higher among private-sector managers than among public-sector managers. The prevalence of innovative work behavior was moderately higher among managers of private-sector modernized organizations than among their public-sector counterparts. The trends in managers' innovative work behavior showed considerable variability due to the interaction of their perceived stressful challenges for innovation with the global forces of change that have created dynamic, systemic and heightened expectations of them. PMID:21180486
A New Approach to Business Writing.
ERIC Educational Resources Information Center
Egan, Michael
1998-01-01
Explains how business writing can be taught using examples from modern literature and the analytical tools of literary criticism. Uses Michener, Hemingway, Faulkner, and Steinbeck to illustrate techniques. (SK)
Behaviour change techniques and contraceptive use in low and middle income countries: a review.
Phiri, Mwelwa; King, R; Newell, J N
2015-10-30
We aimed to identify effective behaviour change techniques for increasing modern contraceptive use in low and middle income countries (LMICs). Literature was identified in Global Health, Web of Science, MEDLINE, PsycINFO and Popline, as well as peer-reviewed journals. Articles were included if they were written in English; had an outcome evaluation of contraceptive use, modern contraceptive use, contraceptive initiation/uptake, contraceptive adherence or continuation of contraception; were a systematic review or randomised controlled trial; and were conducted in a low or middle income country. We assessed the behaviour change techniques used in each intervention and included a new category of male partner involvement. We identified six studies meeting the inclusion criteria. The most effective interventions were those that involved the male partner in the decision to initiate contraceptive use. The findings also suggest that providing access to contraceptives in the community promotes their use. The interventions that had positive effects on contraceptive use employed a combination of behaviour change techniques. Performance techniques were not used in any of the interventions, and social support techniques, which are meant to improve wider social acceptability, appeared in only two of them. Our findings suggest that contraceptive use improves when both information and contraceptives are provided. We recommend that reports of behaviour change studies include more details of the interventions and techniques employed. Further research is also needed to understand which techniques are especially effective.
Determining Gender by Raman Spectroscopy of a Bloodstain.
Sikirzhytskaya, Aliaksandra; Sikirzhytski, Vitali; Lednev, Igor K
2017-02-07
The development of novel methods for forensic science is a constantly growing area of modern analytical chemistry. Raman spectroscopy is one of the few analytical techniques capable of nondestructive and nearly instantaneous analysis of a wide variety of forensic evidence, including body fluid stains, at the scene of a crime. In this proof-of-concept study, Raman microspectroscopy was utilized for gender identification based on dry bloodstains. Raman spectra were acquired in mapping mode from multiple spots on each bloodstain to account for intrinsic sample heterogeneity. The obtained Raman spectroscopic data showed highly similar spectroscopic features for female and male blood samples. Nevertheless, support vector machine (SVM) and artificial neural network (ANN) statistical methods applied to the spectroscopic data allowed for differentiating between male and female bloodstains with high confidence. More specifically, a statistical approach based on a genetic algorithm (GA) coupled with ANN classification showed approximately 98% gender differentiation accuracy for individual bloodstains. These results demonstrate the great potential of the developed method for forensic applications, although more work is needed for method validation. When this method is fully developed, a portable Raman instrument could be used for the in-field identification of traces of body fluids and to obtain phenotypic information about the donor, including gender and race, as well as for the analysis of a variety of other types of forensic evidence.
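The classification step described here, supervised learning on Raman spectral features, can be sketched in miniature. The toy "spectra", labels, and training settings below are illustrative assumptions, not the study's data or its GA-ANN pipeline; a linear SVM is trained with the Pegasos sub-gradient method and evaluated on its training set.

```python
import math
import random

random.seed(0)

def make_spectrum(cls, n_bins=50):
    """Synthetic 'Raman spectrum': the two classes differ slightly in one band."""
    base = [math.exp(-((i - 25) ** 2) / 50.0) for i in range(n_bins)]
    shift = 0.3 if cls == 1 else 0.0
    return [b + shift * math.exp(-((i - 10) ** 2) / 20.0)
            + random.gauss(0, 0.05) for i, b in enumerate(base)]

# Training set: +1 and -1 are the two donor classes (labels are illustrative)
X = [make_spectrum(1) for _ in range(40)] + [make_spectrum(-1) for _ in range(40)]
y = [1] * 40 + [-1] * 40

# Linear SVM trained with the Pegasos sub-gradient method
w = [0.0] * 50
lam, epochs, t = 0.01, 200, 0
for _ in range(epochs):
    for i in random.sample(range(len(X)), len(X)):  # shuffled pass over data
        t += 1
        eta = 1.0 / (lam * t)                       # decaying step size
        margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
        w = [(1 - eta * lam) * wj for wj in w]      # regularization shrinkage
        if margin < 1:                              # hinge-loss sub-gradient step
            w = [wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]

correct = sum(1 for xi, yi in zip(X, y)
              if (1 if sum(wj * xj for wj, xj in zip(w, xi)) >= 0 else -1) == yi)
accuracy = correct / len(X)
```

In practice a real pipeline would use a held-out test set and feature selection (the role the genetic algorithm plays in the study) rather than training-set accuracy.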
NASA Astrophysics Data System (ADS)
Fan, X.; Chen, L.; Ma, Z.
2010-12-01
Climate downscaling has been an active area of research and application over the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather prediction and regional climate models have emerged. Using numerical models ensures that a full set of climate variables is generated in the downscaling process, and these variables are dynamically consistent because they are constrained by physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. Studies have demonstrated the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged to, and the conclusions are therefore controversial. In a companion work we developed approaches for the quantitative assessment of downscaled climate; in this study, the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments makes the comparison objective. Three types of downscaling experiments were performed for one chosen month. The first type serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and with nudging to different variables in grid analysis nudging; in spectral nudging, we focus on testing the nudging coefficients and different wave numbers on different model levels.
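The nudging described above is, at its core, Newtonian relaxation: a term proportional to the difference between the model state and the large-scale analysis is added to the model tendency. A minimal one-variable sketch, with purely illustrative numbers (not WRF coefficients):

```python
# Toy illustration of analysis (grid) nudging: the model state is relaxed
# toward a large-scale reference value with strength G, the nudging
# coefficient. All numbers here are illustrative, not WRF settings.
dt = 0.1        # time step
G = 0.5         # nudging coefficient [1/time]
x_ref = 10.0    # large-scale "analysis" value
x = 0.0         # model state, initially far from the reference

for _ in range(200):
    model_tendency = 0.0                          # model physics/dynamics (omitted here)
    x += dt * (model_tendency + G * (x_ref - x))  # Newtonian relaxation term
```

Larger G pulls the state toward the analysis faster but suppresses more of the model's own variability, which is why the choice of coefficient is so sensitive; spectral nudging applies the same idea only to selected large-scale wave numbers.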
[Multifocal intraocular lenses. A review].
Auffarth, G U; Dick, H B
2001-02-01
Modern cataract surgery has developed tremendously during the past 10-15 years. Improved surgical techniques, as well as improved implant materials and designs, have enlarged patient profiles and indications for cataract surgery. This has also created much higher expectations on the patients' side. The loss of accommodation means a loss of quality of life for presbyopic and especially young pseudophakic patients. Cataract surgery with multifocal IOL implantation is therefore not only of academic interest but reflects the demands and expectations of our patients. Multifocal IOLs have been implanted since 1986, starting with 2-3-zone refractive and diffractive designs. Owing to the surgical techniques of that time, MIOL decentration and surgically induced astigmatism were possible complications. In addition, reduced contrast sensitivity and increased glare were common problems of MIOLs because of their optical principles. New developments in this field in recent years, such as the multizonal progressive refractive MIOL, in combination with improved surgical techniques, have overcome those initial problems. Modern multifocal IOLs can therefore be considered not only for the correction of aphakia but also for refractive purposes.
The Political Persuaders; The Techniques of Modern Election Campaigns.
ERIC Educational Resources Information Center
Nimmo, Dan
Over the last 20 years, a successful election campaign has come to depend in large part on successful use of the broadcast media. As a result, media experts are part of most politicians' teams, and their strategies help determine the results of the election. Usually, themes or "images" are more important than issues. The techniques of mass…
[MLPA technique--principles and use in practice].
Rusu, Cristina; Sireteanu, Adriana; Puiu, Maria; Skrypnyk, Cristina; Tomescu, E; Csep, Katalin; Creţ, Victoria; Barbarii, Ligia
2007-01-01
MLPA (Multiplex Ligation-dependent Probe Amplification) is a recently introduced method, based on the PCR principle, that is useful for detecting various genetic abnormalities (aneuploidies, gene deletions/duplications, subtelomeric rearrangements, methylation status, etc.). The technique is simple, reliable and cheap. We present this method, discuss its importance for a modern genetic service, and underline its multiple advantages.
Benefits of advanced software techniques for mission planning systems
NASA Technical Reports Server (NTRS)
Gasquet, A.; Parrod, Y.; Desaintvincent, A.
1994-01-01
The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.
Benefits of advanced software techniques for mission planning systems
NASA Astrophysics Data System (ADS)
Gasquet, A.; Parrod, Y.; Desaintvincent, A.
1994-10-01
The increasing complexity of modern spacecraft, and the stringent requirement for maximizing their mission return, call for a new generation of Mission Planning Systems (MPS). In this paper, we discuss the requirements for the Space Mission Planning and the benefits which can be expected from Artificial Intelligence techniques through examples of applications developed by Matra Marconi Space.
Classroom Techniques: Foreign Languages and English as Second Language.
ERIC Educational Resources Information Center
Allen, Edward David; Valette, Rebecca M.
The aim of the handbook, which is a revised and expanded edition of "Modern Language Classroom Techniques" (1972), is to show the teacher ways of implementing and supplementing existing materials. The suggested teaching procedures may be used with classes of varying sizes and levels, and with any method. Part One of this handbook presents an…
Uzunovic, Slavoljub; Kostic, Radmila; Zivkovic, Dobrica
2010-09-01
This study aimed to determine the effects of two different programs of modern sports dancing on coordination, strength, and speed in 60 beginner-level female dancers, aged 13 and 14 years. The subjects were divided into two experimental groups (E1 and E2) of 30 subjects each, drawn from local dance clubs. Fifteen measurements were used to assess motor coordination, strength, and speed, and the groups were tested before and after the experimental programs. Both programs lasted 18 weeks, with training sessions twice a week for 60 minutes. The E1 group trained according to a new experimental disco dance (DD) program of modern sports dance, and the E2 group trained according to the classic DD program of the same kind for beginner selections. The results were assessed statistically with a paired-samples t-test and MANCOVA/ANCOVA. Following the experimental programs, both groups showed statistically significant improvement in the evaluated skills, but the changes in the E1 group were more pronounced. The basic assumption of this research was confirmed: the new experimental DD program has a significant influence on coordination, strength, and speed. In light of these changes, the new DD program is recommended for beginner dancers.
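The paired-samples t-test used for such pre/post comparisons reduces to a simple statistic on the paired differences. A minimal sketch with invented scores (not the study's data):

```python
import math

# Pre/post scores for one motor test (illustrative numbers, not the study's data)
pre  = [12.1, 11.5, 13.0, 12.4, 11.8, 12.9, 13.3, 12.0]
post = [13.4, 12.2, 13.9, 13.5, 12.1, 13.8, 14.0, 12.6]

d = [b - a for a, b in zip(pre, post)]        # paired differences
n = len(d)
mean_d = sum(d) / n                           # mean difference
sd_d = math.sqrt(sum((x - mean_d) ** 2 for x in d) / (n - 1))  # sample SD
t_stat = mean_d / (sd_d / math.sqrt(n))       # paired-samples t, df = n - 1
```

The resulting t is compared against the t distribution with n - 1 degrees of freedom; in practice a library routine (e.g. SciPy's `ttest_rel`) also returns the p-value directly.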
Complementary approaches to diagnosing marine diseases: a union of the modern and the classic
Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman; House, Marcia; Mydlarz, Laura D.; Prager, Katherine C.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca
2016-01-01
Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease. PMID:26880839
Complementary approaches to diagnosing marine diseases: a union of the modern and the classic
Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca
2016-01-01
Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.
NASA Technical Reports Server (NTRS)
Yeomans, D. K. (Editor); West, R. M. (Editor); Harrington, R. S. (Editor); Marsden, B. G. (Editor)
1984-01-01
Modern techniques for making cometary astrometric observations, reducing these observations, using accurate reference star catalogs, and computing precise orbits and ephemerides are discussed in detail and recommendations and suggestions are given in each area.
The modern role of transoesophageal echocardiography in the assessment of valvular pathologies
Bull, Sacha; Newton, James
2017-01-01
Despite significant advancements in the field of cardiovascular imaging, transoesophageal echocardiography remains the key imaging modality in the management of valvular pathologies. This paper provides echocardiographers with an overview of the modern role of TOE in the diagnosis and management of valvular disease. We describe how the introduction of 3D techniques has changed the detection and grading of valvular pathologies and concentrate on its role as a monitoring tool in interventional cardiology. In addition, we focus on the echocardiographic and Doppler techniques used in the assessment of prosthetic valves and provide guidance for the evaluation of prosthetic valves. Finally, we summarise quantitative methods used for the assessment of valvular stenosis and regurgitation and highlight the key areas where echocardiography remains superior over other novel imaging modalities. PMID:28096184
The modern role of transoesophageal echocardiography in the assessment of valvular pathologies.
Wamil, Malgorzata; Bull, Sacha; Newton, James
2017-01-17
Despite significant advancements in the field of cardiovascular imaging, transoesophageal echocardiography remains the key imaging modality in the management of valvular pathologies. This paper provides echocardiographers with an overview of the modern role of TOE in the diagnosis and management of valvular disease. We describe how the introduction of 3D techniques has changed detection and grading of valvular pathologies and concentrate on its role as a monitoring tool in interventional cardiology. In addition, we focus on the echocardiographic and Doppler techniques used in the assessment of prosthetic valves, and provide guidance for evaluation of prosthetic valves. Finally, we summarise quantitative methods used for the assessment of valvular stenosis and regurgitation and highlight the key areas where echocardiography remains superior over other novel imaging modalities. © 2017 The authors.
The use of modern measurement techniques for designing pro ecological constructions
NASA Astrophysics Data System (ADS)
Wieczorowski, Michał; Gapiński, Bartosz; Szymański, Maciej; Rękas, Artur
2017-10-01
This paper presents possibilities for applying modern length- and angle-metrology techniques to the design of environmentally friendly structures. It is based on a project in which a lighter bus and train car seat was developed. Different measurement options were considered, including static and dynamic photogrammetry, computed tomography, and thermography. Research on the dynamic behaviour of the designed structures provided input for determining the deformation of a seat, and of passengers sitting on it, during traffic accidents. Work on the strength of structural elements made it possible to optimize their dimensions while maintaining adequate durability. Metrological measurements of the production machines and equipment gave better insight into the phenomena occurring during the manufacturing process and allowed its parameters to be corrected, which in turn also helped lighten the structure.
PET/CT in Radiation Therapy Planning.
Specht, Lena; Berthelsen, Anne Kiil
2018-01-01
Radiation therapy (RT) is an important component of the management of lymphoma patients. Most lymphomas are metabolically active and accumulate 18F-fluorodeoxyglucose (FDG). Positron emission tomography with computed tomography (PET/CT) imaging using FDG is used routinely in staging and treatment evaluation. FDG-PET/CT imaging is now also used routinely for contouring the target for RT and has been shown to change the irradiated volume significantly compared with CT imaging alone. Modern advanced imaging techniques with image fusion and motion management, in combination with modern highly conformal RT techniques, have increased the precision of RT and have made it possible to dramatically reduce the risks of long-term side effects of treatment while maintaining the high cure rates for these diseases. Copyright © 2017 Elsevier Inc. All rights reserved.
Yates, Piers J; Quraishi, Nasir A; Kop, Allan; Howie, Donald W; Marx, Clare; Swarts, Eric
2008-02-01
We present 14 cases of fracture of modern, high-nitrogen, stainless steel stems. Our clinical and radiological data suggest that heavy patients with small stems and poor proximal support are at risk for fracturing their implants. "Champagne-glass" canals can lead to the use of smaller stems often placed in varus, which can lead to cantilever bending and fatigue failure in the distal half of the stem. Metallurgical assessment of the retrieved high-nitrogen, stainless steel stems reveals microstructural inconsistencies that may contribute to their failure. Based on our findings, careful consideration and attention to technique is required when using stainless steel stems in patients with high body mass index or high weight. Technique is particularly important in femurs with champagne-glass canals.
Descriptive Statistical Techniques for Librarians. 2nd Edition.
ERIC Educational Resources Information Center
Hafner, Arthur W.
A thorough understanding of the uses and applications of statistical techniques is integral in gaining support for library funding or new initiatives. This resource is designed to help practitioners develop and manipulate descriptive statistical information in evaluating library services, tracking and controlling limited resources, and analyzing…
ERIC Educational Resources Information Center
Williams, Immanuel James; Williams, Kelley Kim
2016-01-01
Understanding summary statistics and graphical techniques is a building block to comprehending concepts beyond basic statistics. It is known that motivated students perform better in school, and using examples that students find engaging allows them to understand the concepts at a deeper level.
WE-A-201-01: Memorial Introduction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, C.
Chris Marshall: Memorial Introduction Donald Edmonds Herbert Jr., or Don to his colleagues and friends, exemplified the “big tent” vision of medical physics, specializing in Applied Statistics and Dynamical Systems theory. He saw, more clearly than most, that “Making models is the difference between doing science and just fooling around [ref Woodworth, 2004]”. Don developed an interest in chemistry at school by “reading a book” - a recurring theme in his story. He was awarded a Westinghouse Science scholarship and attended the Carnegie Institute of Technology (later Carnegie Mellon University) where his interest turned to physics and led to a BS in Physics after transfer to Northwestern University. After (voluntary) service in the Navy he earned his MS in Physics from the University of Oklahoma, which led him to Johns Hopkins University in Baltimore to pursue a PhD. The early death of his wife led him to take a salaried position in the Physics Department of Colorado College in Colorado Springs so as to better care for their young daughter. There, a chance invitation from Dr. Juan del Regato to teach physics to residents at the Penrose Cancer Hospital introduced him to Medical Physics, and he decided to enter the field. He received his PhD from the University of London (UK) under Prof. Joseph Rotblat, where I first met him, and where he taught himself statistics. He returned to Penrose as a clinical medical physicist, also largely self-taught. In 1975 he formalized an evolving interest in statistical analysis as Professor of Radiology and Head of the Division of Physics and Statistics at the College of Medicine of the University of South Alabama in Mobile, AL where he remained for the rest of his career. He also served as the first Director of their Bio-Statistics and Epidemiology Core Unit working in part on sickle-cell disease. After retirement he remained active as Professor Emeritus.
Don served for several years as a consultant to the Nuclear Regulatory Commission and may be remembered for his critique of the National Academy of Sciences BEIR III report (stating that their methodology “imposes a Delphic quality to the .. risk estimates”.) This led to his appointment as a member of the BEIR V committee. Don presented refresher courses at the AAPM, ASTRO and RSNA meetings and was active in the AAPM as a member or chair of several committees. He was the principal author of AAPM Report 43, which is essentially a critique of established clinical studies prior to 1992. He was co-editor of the Proceedings of many symposia on Time, Dose and Fractionation held in Madison, Wisconsin. He received the AAPM lifetime Achievement award in 2004. Don’s second wife of 46 years, Ann, predeceased him and he is survived by daughters Hillary and Emily, son John and two grandsons. Don was a true gentleman with a unique and erudite writing style illuminated by pithy quotations. If he had a fault it was, perhaps, that he did not realize how much smarter he was than the rest of us. This presentation draws heavily on a biography and video interview in the History and Heritage section of the AAPM website. The quote is his own. Andrzej Niemierko: Statistical modeling plays an essential role in modern medicine for quantitative evaluation of the effect of treatment. This session will feature an overview of statistical modeling techniques used for analyzing the many types of research data and an exploration of recent advances in new statistical modeling methodologies. Learning Objectives: To learn the basics of statistical modeling methodology. To discuss statistical models that are frequently used in radiation oncology. To discuss advanced modern statistical modeling methods and applications.
Cognitive Correlates of Performance in Advanced Mathematics
ERIC Educational Resources Information Center
Wei, Wei; Yuan, Hongbo; Chen, Chuansheng; Zhou, Xinlin
2012-01-01
Background: Much research has been devoted to understanding cognitive correlates of elementary mathematics performance, but little such research has been done for advanced mathematics (e.g., modern algebra, statistics, and mathematical logic). Aims: To promote mathematical knowledge among college students, it is necessary to understand what factors…
Medical subject heading (MeSH) annotations illuminate maize genetics and evolution
USDA-ARS?s Scientific Manuscript database
In the modern era, high-density marker panels and/or whole-genome sequencing,coupled with advanced phenotyping pipelines and sophisticated statistical methods, have dramatically increased our ability to generate lists of candidate genes or regions that are putatively associated with phenotypes or pr...
Modern Prediction Methods for Turbomachine Performance
1976-01-01
easier on the basis of C factor correlation. Generally, correlation has to be carefully established; statistical methods may be of good help when a large… FLOW TURBOMACHINE BLADES. Walker, G. J. Instn of Engrs, Australia, 3rd Australasian Conference on Hydraulics & Fluid Mechanics, Proc., Nov 25-29, 1968
Tekelab, Tesfalidet; Melka, Alemu Sufa; Wirtu, Desalegn
2015-07-17
In Ethiopia, the prevalence of modern contraceptive use is very low (27%) and the percentage of women with unmet needs for family planning is 25%. The current study identified factors associated with the utilization of modern contraceptive methods among married women in Western Ethiopia. A community-based, cross-sectional study was conducted from April 10 to April 25, 2014, among married women of reproductive age in Nekemte Town. A multi-stage sampling procedure was used to select 1003 study participants. A pretested structured questionnaire was used to collect data, and data collectors who had completed high school were involved in the data collection process. A bivariate, multivariable logistic regression model was fit, and statistical significance was determined at the 95% confidence level. The overall utilization rate of modern contraceptives in this study was 71.9%. The most commonly used modern contraceptive was the injectable (60.3%). Age (AOR = 2.00, 95% CI = 1.35-2.98), women's educational level (AOR = 2.50, 95% CI = 1.62-3.84), monthly income (AOR = 2.26, 95% CI = 1.24-4.10), respondent's fertility (AOR = 2.60, 95% CI = 1.48-4.56), fertility-related decision making (AOR = 3.70, 95% CI = 2.45-5.58), and having a radio (AOR = 1.93, 95% CI = 1.37-2.71) showed significant positive associations with the utilization of modern contraceptive methods. The findings showed that women's empowerment, fertility-related discussions among couples, and the availability of media were important factors influencing the use of modern contraceptives. Policymakers and implementers should therefore address these factors to increase the utilization of modern contraceptive methods.
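The adjusted odds ratios (AORs) above come from a multivariable logistic regression; the underlying building block, an unadjusted odds ratio with a Wald 95% confidence interval from a 2×2 table, can be computed directly. The counts below are hypothetical, not the survey's data:

```python
import math

# Hypothetical 2x2 table: rows = exposure (e.g. owns a radio: yes/no),
# columns = outcome (uses modern contraception: yes/no). Illustrative counts.
a, b = 320, 130   # exposed: users, non-users
c, d = 250, 200   # unexposed: users, non-users

or_ = (a * d) / (b * c)                        # odds ratio = (a/b) / (c/d)
se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)      # SE of ln(OR), Wald approximation
lo = math.exp(math.log(or_) - 1.96 * se_log)   # 95% CI lower bound
hi = math.exp(math.log(or_) + 1.96 * se_log)   # 95% CI upper bound
```

An interval whose lower bound exceeds 1.0 corresponds to a statistically significant positive association at the 95% level; a fitted logistic regression additionally adjusts each ratio for the other covariates.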
Gorgolewski, Krzysztof J; Varoquaux, Gael; Rivera, Gabriel; Schwartz, Yannick; Sochat, Vanessa V; Ghosh, Satrajit S; Maumet, Camille; Nichols, Thomas E; Poline, Jean-Baptiste; Yarkoni, Tal; Margulies, Daniel S; Poldrack, Russell A
2016-01-01
NeuroVault.org is dedicated to storing outputs of analyses in the form of statistical maps, parcellations and atlases, a unique strategy that contrasts with most neuroimaging repositories that store raw acquisition data or stereotaxic coordinates. Such maps are indispensable for performing meta-analyses, validating novel methodology, and deciding on precise outlines for regions of interest (ROIs). NeuroVault is open to maps derived from both healthy and clinical populations, as well as from various imaging modalities (sMRI, fMRI, EEG, MEG, PET, etc.). The repository uses modern web technologies such as interactive web-based visualization, cognitive decoding, and comparison with other maps to provide researchers with efficient, intuitive tools to improve the understanding of their results. Each dataset and map is assigned a permanent Universal Resource Locator (URL), and all of the data is accessible through a REST Application Programming Interface (API). Additionally, the repository supports the NIDM-Results standard and has the ability to parse outputs from popular FSL and SPM software packages to automatically extract relevant metadata. This ease of use, modern web-integration, and pioneering functionality holds promise to improve the workflow for making inferences about and sharing whole-brain statistical maps. Copyright © 2015 Elsevier Inc. All rights reserved.
Clear, Hold, Build: Modern Political Techniques in COIN
2008-01-01
coordinating with groups that are not typically associated with military activities. Key to this coordination is leveraging assets that allow commanders to understand the social relationships in their AO.
Kamal, S M Mostafa
2015-03-01
This article explores the socioeconomic factors affecting contraceptive use and method choice among women of urban slums using the nationally representative 2006 Bangladesh Urban Health Survey. Both bivariate and multivariate statistical analyses were applied to examine the relationship between a set of sociodemographic factors and the dependent variables. Overall, the contraceptive prevalence rate was 58.1%, of which 53.2% involved modern methods. Women's age, access to TV, number of unions, nongovernmental organization membership, women's working status, number of living children, child mortality, and wealth index were important determinants of contraceptive use and method preference. The sex composition of surviving children and women's education were the most important determinants of contraceptive use and method choice. Programs should be strengthened to provide nonclinical modern methods free of cost to slum dwellers. Doorstep delivery of modern contraceptive methods may raise the contraceptive prevalence rate among slum dwellers in Bangladesh. © 2011 APJPH.
Invasive and noninvasive dental analgesia techniques.
Estafan, D J
1998-01-01
Although needle-administered local anesthesia has been an essential tool of modern dentistry, it has also been responsible for many patients' fears of dental visits. Several new techniques have recently evolved that may offer viable alternatives. Two of these operate via electronic mechanisms that interfere with pain signals, two others involve transmucosal modes of administration, and a fifth technique involves an intraosseous pathway for anesthesia administration. Each of these techniques has different indications for dental procedures, but none is intended to replace needle administration in dentistry. This overview highlights the salient features of these alternative dental anesthesia techniques.
Gillis, Richard B; Rowe, Arthur J; Adams, Gary G; Harding, Stephen E
2014-10-01
This short review considers the range of modern techniques for the hydrodynamic characterisation of macromolecules, particularly the large glycosylated systems used in the food, biopharma and healthcare industries. The range, or polydispersity, of molecular weights and conformations presents special challenges compared to proteins. Without going into great theoretical or methodological depth, the review aims to help the Industrial Biotechnologist choose the appropriate methodology, or combination of methodologies, for providing the detail he/she needs for particular applications.
Understanding Radiation Thermometry. Part II
NASA Technical Reports Server (NTRS)
Risch, Timothy K.
2015-01-01
This document is a two-part course on the theory and practice of radiation thermometry. Radiation thermometry is the technique for determining the temperature of a surface or a volume by measuring the electromagnetic radiation it emits. This course covers the theory and practice of radiative thermometry and emphasizes the modern application of the field using commercially available electronic detectors and optical components. The course covers the historical development of the field, the fundamental physics of radiative surfaces, along with modern measurement methods and equipment.
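The central calculation in radiation thermometry, inverting Planck's law to recover a brightness temperature from a measured spectral radiance, can be sketched as follows. The wavelength and temperature are illustrative choices, not values from the course:

```python
import math

# Physical constants (SI)
h = 6.62607015e-34   # Planck constant [J s]
c = 2.99792458e8     # speed of light [m/s]
k = 1.380649e-23     # Boltzmann constant [J/K]

def planck_radiance(lam, T):
    """Blackbody spectral radiance [W sr^-1 m^-3] at wavelength lam [m]."""
    return (2 * h * c**2 / lam**5) / math.expm1(h * c / (lam * k * T))

def brightness_temperature(lam, L):
    """Invert Planck's law: the temperature that reproduces radiance L at lam."""
    return h * c / (lam * k * math.log1p(2 * h * c**2 / (lam**5 * L)))

lam = 0.9e-6                      # 0.9 um, a typical near-IR pyrometer band
T_true = 1500.0                   # surface temperature [K]
L = planck_radiance(lam, T_true)  # "measured" radiance (perfect blackbody assumed)
T_recovered = brightness_temperature(lam, L)
```

A real surface emits only a fraction (the emissivity) of blackbody radiance, so the brightness temperature of a real target underestimates the true temperature unless the emissivity is known and corrected for, which is a central theme of the course.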
Understanding Radiation Thermometry. Part I
NASA Technical Reports Server (NTRS)
Risch, Timothy K.
2015-01-01
This document is a two-part course on the theory and practice of radiation thermometry. Radiation thermometry is the technique for determining the temperature of a surface or a volume by measuring the electromagnetic radiation it emits. This course covers the theory and practice of radiative thermometry and emphasizes the modern application of the field using commercially available electronic detectors and optical components. The course covers the historical development of the field, the fundamental physics of radiative surfaces, along with modern measurement methods and equipment.
Introduction to Modern Methods in Light Microscopy.
Ryan, Joel; Gerhold, Abby R; Boudreau, Vincent; Smith, Lydia; Maddox, Paul S
2017-01-01
For centuries, light microscopy has been a key method in biological research, from the early work of Robert Hooke describing biological organisms as cells, to the latest in live-cell and single-molecule systems. Here, we introduce some of the key concepts related to the development and implementation of modern microscopy techniques. We briefly discuss the basics of optics in the microscope, super-resolution imaging, quantitative image analysis, live-cell imaging, and provide an outlook on active research areas pertaining to light microscopy.
London, Douglas S; Stoll, Andrew L; Manning, Bruce B
2006-01-01
Modernization of agricultural systems to increase output changes the nutritional content of the food entire populations consume. Human nutritional needs differ from those of their "food"; thus, producing healthy agricultural products is not equivalent to providing agricultural products that are healthy for humans. Including the food production system as a factor in the increase of neuropsychiatric disorders and other chronic diseases helps explain negative trends in modern chronic diseases that remain unchecked despite stunning advances in modern medicine. Diseases in which our own technology plays a significant role include obesity and resulting disorders, such as diabetes, heart disease, hypertension, stroke and arthritis. Modernization's lure leads to the importation of modern agricultural practices into a nutritionally vulnerable, malnourished and sometimes starving developing world. Wealthier nations hedge their food portfolio by having access to a wider variety of foods. The developing world's reliance on staple foods means even a minor widespread nutritional modification of one key food can have profound effects. New agricultural techniques may improve or exacerbate neuropsychiatric disorders through nutritional modification in regions where populations walk a nutritional tightrope with little margin for error. In most of the developing world, western psychiatric interventions have failed to make inroads. Consumption of fish has a demonstrated beneficial effect on mental health, and the omega-3 fatty acid content is a significant factor. Epidemiological, biological and agricultural studies implicate a lack of dietary omega-3s as a factor in certain mental disorders. Replenishing omega-3s has improved mental illnesses in controlled clinical trials. 
This article's detailed tilapia fish-farming model demonstrates how aquaculture/agriculture techniques can function as a public health intervention by increasing dietary omega-3s through creation of sustainable, economical and culturally appropriate food sources for the developing world.
Karatasakis, Aris; Brilakis, Emmanouil S
2017-11-01
Antegrade and retrograde dissection/re-entry techniques are frequently utilized in contemporary chronic total occlusion percutaneous coronary intervention (CTO PCI), especially for complex lesions. One-year outcomes with modern dissection/re-entry techniques appear favorable and comparable with those achieved after intraplaque crossing, supporting their increased use. Randomized data on the procedural safety, efficiency, and long-term outcomes of subadventitial CTO PCI techniques are needed. © 2017 Wiley Periodicals, Inc.
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
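The low-count bias the authors describe can be reproduced with a small simulation (a hypothetical sketch, not the paper's code: a constant-rate model is fit to Poisson bins by grid search, using a Pearson-style chi-squared with the model variance in the denominator versus Cash's C statistic):

```python
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(lam=1.0, size=2000)   # low-count regime: ~1 count per bin

mu_grid = np.linspace(0.05, 5.0, 4000)     # candidate constant-rate models

# Pearson-style chi-squared with the model variance in the denominator
chi2 = np.array([np.sum((counts - mu) ** 2 / mu) for mu in mu_grid])

# Cash's C statistic for Poisson-distributed counts (mu-independent terms dropped)
cstat = np.array([2.0 * np.sum(mu - counts * np.log(mu)) for mu in mu_grid])

mu_chi2 = mu_grid[chi2.argmin()]
mu_cash = mu_grid[cstat.argmin()]

print(f"chi-squared estimate: {mu_chi2:.3f}")  # biased high, near sqrt(mu + mu^2)
print(f"C-statistic estimate: {mu_cash:.3f}")  # near the sample mean
```

Minimizing the C statistic recovers the sample mean, while this form of chi-squared overestimates the rate when counts per bin are small.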
The Hierarchical Personality Structure of Aspiring Creative Writers
ERIC Educational Resources Information Center
Maslej, Marta M.; Rain, Marina; Fong, Katrina; Oatley, Keith; Mar, Raymond A.
2014-01-01
Empirical studies of personality traits in creative writers have demonstrated mixed findings, perhaps due to issues of sampling, measurement, and the reporting of statistical information. The goal of this study is to quantify the personality structure of aspiring creative writers according to a modern hierarchical model of trait personality. A…
Information Transparency in Education: Three Sides of a Two-Sided Process
ERIC Educational Resources Information Center
Mertsalova, T. A.
2015-01-01
Information transparency is the result of informational globalization and the avalanche of information and communication technologies; these processes are natural for the whole of modern society. Statistics show that during the past several years transparency not just in education but in society as a whole has expanded…
A Comparison of the Effects of Non-Normal Distributions on Tests of Equivalence
ERIC Educational Resources Information Center
Ellington, Linda F.
2011-01-01
Statistical theory and its application provide the foundation for modern systematic inquiry in the behavioral, physical and social sciences disciplines (Fisher, 1958; Wilcox, 1996). They provide the tools for scholars and researchers to operationalize constructs, describe populations, and measure and interpret the relations between populations and…
Effect-Size Measures and Meta-Analytic Thinking in Counseling Psychology Research
ERIC Educational Resources Information Center
Henson, Robin K.
2006-01-01
Effect sizes are critical to result interpretation and synthesis across studies. Although statistical significance testing has historically dominated the determination of result importance, modern views emphasize the role of effect sizes and confidence intervals. This article accessibly discusses how to calculate and interpret the effect sizes…
Bridging Ends to Means: Achieving a Viable Peace in Afghanistan
2010-04-01
the Global War on Terror. No price should be too high to guarantee national security; yet, as the statistics suggest, the fiscal situation has...Studies AY10 Coursebook, edited by Sharon McBride, 351-356. Maxwell AFB, AL: Air University Press, October 2009. Saikal, Amin. Modern Afghanistan
Exclusion, Civic Invisibility and Impunity as Explanations for Youth Murders in Brazil.
ERIC Educational Resources Information Center
Huggins, Martha K.; DeCastro, Myriam Mesquita P.
1996-01-01
Examines youth murders in Brazil, including victim-generating sociostructural situations and creation of victims. Hypothesizes that modern Brazilian social structures shape poor Brazilian youth's vulnerability to murder by strangers. Presents statistics dealing with the gender distributions, age, skin color, and mode of death, identifying…
SMART-DS: Synthetic Models for Advanced, Realistic Testing: Distribution Systems and Scenarios
Grid Modernization | NREL. Statistical summary of the U.S. distribution systems; world-class, high spatial/temporal resolution of solar data.
Influences of environment and disturbance on forest patterns in coastal Oregon watersheds.
Michael C. Wimberly; Thomas A. Spies
2001-01-01
Modern ecology often emphasizes the distinction between traditional theories of stable, environmentally structured communities and a new paradigm of disturbance driven, nonequilibrium dynamics. However, multiple hypotheses for observed vegetation patterns have seldom been explicitly tested. We used multivariate statistics and variation partitioning methods to assess...
Re-Conceptualizing the Past: Historical Data in Vocational Interest Research
ERIC Educational Resources Information Center
Armstrong, Patrick Ian; Rounds, James; Hubert, Lawrence
2008-01-01
Noteworthy progress has been made in the development of statistical models for evaluating the structure of vocational interests over the past three decades. It is proposed that historically significant interest datasets, when combined with modern structural methods of data analysis, provide an opportunity to re-examine the underlying assumptions…
Statistics and Politics in a "Knowledge Society"
ERIC Educational Resources Information Center
Giovannini, Enrico
2008-01-01
The importance of information in economic and political processes is widely recognised by modern theories. This information, coupled with the advancements in Information and Communication Technologies (ICT) has changed the way in which markets and societies work. The availability of the Internet and other advanced forms of media have made…
Comparing the eyes depicted in Japanese portraits of beautiful women: the Meiji and modern periods.
Lee, James J; Thomas, Ewart
2012-06-01
The women portrayed in the bijin-ga of the past, particularly those from the Meiji Period (1868–1912), tended to show little resemblance to the women portrayed in the more modern bijin-ga (from after World War II), an observation suggesting that Japanese standards of beauty may have changed over the two eras. To examine whether the apparent discrepancy can be interpreted as an actual change in the standards, a study was designed with the aim of assigning numeric values to several aspects of the eyes and testing for the presence of a statistically significant difference in each of the aspects between the Meiji bijin-ga and the modern bijin-ga. For this study, 29 Meiji bijin-ga and 36 modern bijin-ga were selected. The eye was chosen as the subject of comparison, and five aspects were categorized and measured: (1) presence or absence of a double fold, (2) eye width, (3) eye height, (4) eyebrow-to-upper lid distance, and (5) corneal diameter. The eye width, the eye height, and the eyebrow-to-upper lid distance were divided by the corneal diameter to derive standardized grounds for comparison. The difference in double-fold frequencies between the Meiji bijin-ga (24%) and the modern bijin-ga (36%) was not found to be statistically significant (p=0.298). There was no difference in the eye width-to-corneal diameter ratio between the Meiji bijin-ga (mean 2.57±0.6) and the modern bijin-ga (mean 2.61±0.85) (p=0.86). The eye height-to-corneal diameter ratio derived from the Meiji bijin-ga (mean 0.62±0.15) was significantly smaller than that derived from the modern bijin-ga (mean 0.82±0.18) (p<0.001). The eyebrow to upper lid distance-to-corneal diameter ratio derived from the Meiji bijin-ga (mean 2.21±0.83) was significantly greater than that derived from the modern bijin-ga (mean 1.36±0.78) (p<0.001). 
The results of the study support the notion that Westernization contributed to bringing about changes in the Japanese standards of beautiful eyes in the context of bijin-ga. However, the fact that the changeover has not occurred in all the categories in question does not indicate that the Occidental characteristics came to be emulated in their entirety. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at http://www.springer.com/00266.
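As a quick arithmetic check of the eye-height result, a two-sample Welch t statistic can be computed from the group summaries reported above (the formula is standard; the abstract does not state which test the authors used):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Two-sample Welch t statistic from group means, SDs, and sample sizes."""
    return (m2 - m1) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

# Eye height-to-corneal diameter ratio: Meiji (0.62 +/- 0.15, n=29)
# versus modern (0.82 +/- 0.18, n=36), as reported in the abstract.
t = welch_t(0.62, 0.15, 29, 0.82, 0.18, 36)
print(f"t = {t:.2f}")  # ~4.9, a highly significant difference at these sample sizes
```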
The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques
ERIC Educational Resources Information Center
Menil, Violeta C.
2005-01-01
In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…
Pain and Surgery in England, circa 1620–circa 1740
Walker, Katherine A.
2015-01-01
The scholarship on the discussion and role of pain in early modern English surgery is limited. Scholars have given little consideration to how surgeons described and comprehended pain in their patients’ bodies in early modern England, including how these understandings connected to notions of the humours, nerves and sex difference. This article focuses on the attention that surgeons paid to pain in their published and manuscript casebooks and manuals available in English, circa 1620–circa 1740. Pain was an important component of surgery in early modern England, influencing diagnosis, treatment and technique. Surgeons portrayed a complex and multi-dimensional understanding of their patients’ bodies in pain, which was further connected to their portrayals of their professional ability. PMID:25766543
NASA Astrophysics Data System (ADS)
Marrugo, Andrés G.; Millán, María S.; Cristóbal, Gabriel; Gabarda, Salvador; Sorel, Michal; Sroubek, Filip
2012-06-01
Medical digital imaging has become a key element of modern health care procedures. It provides visual documentation and a permanent record for the patient and, most importantly, the ability to extract information about many diseases. Modern ophthalmology thrives and develops on the advances in digital imaging and computing power. In this work we present an overview of recent image processing techniques proposed by the authors in the area of digital eye fundus photography. Our applications range from retinal image quality assessment to image restoration via blind deconvolution and visualization of structural changes in time between patient visits. All are proposed within a framework for improving and assisting medical practice and the forthcoming scenario of the information chain in telemedicine.
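The image-restoration step mentioned above can be illustrated by a much simpler, non-blind relative of blind deconvolution: Richardson-Lucy deconvolution of a one-dimensional signal with a known point-spread function (an illustrative sketch only; the authors' method also estimates the unknown blur):

```python
import numpy as np

# Ground truth: two sharp features, blurred by a known Gaussian PSF.
x = np.zeros(64)
x[20], x[44] = 1.0, 0.7
t = np.arange(-4, 5)
psf = np.exp(-t ** 2 / 2.0)
psf /= psf.sum()

blurred = np.convolve(x, psf, mode="same")

# Richardson-Lucy iterations (non-blind: the PSF is assumed known).
est = np.full_like(blurred, blurred.mean())
for _ in range(100):
    conv = np.convolve(est, psf, mode="same")
    ratio = blurred / np.maximum(conv, 1e-12)
    est *= np.convolve(ratio, psf[::-1], mode="same")

print(f"L1 error, blurred : {np.abs(blurred - x).sum():.3f}")
print(f"L1 error, restored: {np.abs(est - x).sum():.3f}")
```

On this noiseless toy signal the iterations sharpen the blurred peaks back toward the original spikes; real retinal images add noise and an unknown, spatially varying PSF.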
ERIC Educational Resources Information Center
Rosado, Dale A., Jr.; Masterson, Tina S.; Masterson, Douglas S.
2011-01-01
Mass spectrometry (MS) has been gaining in popularity in recent years owing in large part to the development of soft-ionization techniques such as matrix-assisted laser desorption ionization (MALDI) and electrospray ionization (ESI). These soft-ionization techniques have opened up the field of MS analysis to biomolecules, polymers, and other high…
Designing Modern Dance Classes for the Mature Mover: Physiological and Psychological Considerations
ERIC Educational Resources Information Center
Brodie, Julie A.; Lobel, Elin E.
2016-01-01
Dancers are continuing to dance longer due to changes in technique and increased awareness of the body and safe movement practices. Even after performing careers have ended, it is healthy both physically and emotionally for dancers to continue to take technique class, particularly if they are teaching dance classes. It can be a challenge, however,…
ERIC Educational Resources Information Center
Whited, Matthew T.; Hofmeister, Gretchen E.
2014-01-01
Experiments are described for the reliable small-scale glovebox preparation of CpMo(CO)[subscript 3](CH[subscript 3]) and acetyl derivatives thereof through phosphine-induced migratory insertion. The robust syntheses introduce students to a variety of organometallic reaction mechanisms and glovebox techniques, and they are easily carried out…
Modern Display Technologies for Airborne Applications.
1983-04-01
the case of LED head-down direct view displays, this requires that special attention be paid to the optical filtering, the electrical drive/address...effectively attenuates the LED specular reflectance component, the colour and neutral density filtering attenuate the diffuse component and the... filter techniques are planned for use with video, multi-colour and advanced versions of numeric, alphanumeric and graphic displays; this technique
InterPlay: A Tool for Cultivating Expression in Technique Class
ERIC Educational Resources Information Center
Carlson, Sarah
2013-01-01
In her experience teaching modern dance at a range of institutions, the author has noticed that even as many students exhibit superior physical skill in technique class, most are lacking when it comes to expression. From large university BFA to smaller liberal arts programs, she finds that her students often fall into a land of physical imitation,…
New methods and materials for molding and casting ice formations
NASA Technical Reports Server (NTRS)
Reehorst, Andrew L.; Richter, G. Paul
1987-01-01
This study was designed to find improved materials and techniques for molding and casting natural or simulated ice shapes that could replace the wax and plaster method. By utilizing modern molding and casting materials and techniques, a new methodology was developed that provides excellent reproduction, low-temperature capability, and reasonable turnaround time. The resulting casts are accurate and tough.
Vehicle registration compliance in Wisconsin : [summary].
DOT National Transportation Integrated Search
2015-03-01
The Wisconsin Department of Transportation (WisDOT) conducted an investigation to improve its passenger vehicle registration processes, with the goals to modernize techniques, reduce costs, enhance security and maximize compliance. WisDOT's D...
Should I Pack My Umbrella? Clinical versus Statistical Prediction of Mental Health Decisions
ERIC Educational Resources Information Center
Aegisdottir, Stefania; Spengler, Paul M.; White, Michael J.
2006-01-01
In this rejoinder, the authors respond to the insightful commentary of Strohmer and Arm, Chwalisz, and Hilton, Harris, and Rice about the meta-analysis on statistical versus clinical prediction techniques for mental health judgments. The authors address issues including the availability of statistical prediction techniques for real-life psychology…
Change Detection in Rough Time Series
2014-09-01
Business Statistics: An Inferential Approach, Dellen: San Francisco. [18] Winston, W. (1997) Operations Research Applications and Algorithms, Duxbury...distribution that can present significant challenges to conventional statistical tracking techniques. To address this problem the proposed method...applies hybrid fuzzy statistical techniques to series granules instead of to individual measures. Three examples demonstrated the robust nature of the
Enhancing Students' Ability to Use Statistical Reasoning with Everyday Problems
ERIC Educational Resources Information Center
Lawson, Timothy J.; Schwiers, Michael; Doellman, Maureen; Grady, Greg; Kelnhofer, Robert
2003-01-01
We discuss a technique for teaching students everyday applications of statistical concepts. We used this technique with students (n = 50) enrolled in several sections of an introductory statistics course; students (n = 45) in other sections served as a comparison group. A class of introductory psychology students (n = 24) served as a second…
Foodomics: MS-based strategies in modern food science and nutrition.
Herrero, Miguel; Simó, Carolina; García-Cañas, Virginia; Ibáñez, Elena; Cifuentes, Alejandro
2012-01-01
Modern research in food science and nutrition is moving from classical methodologies to advanced analytical strategies in which MS-based techniques play a crucial role. In this context, Foodomics has been recently defined as a new discipline that studies food and nutrition domains through the application of advanced omics technologies in which MS techniques are considered indispensable. Applications of Foodomics include the genomic, transcriptomic, proteomic, and/or metabolomic study of foods for compound profiling, authenticity, and/or biomarker detection related to food quality or safety; the development of new transgenic foods, food contaminants, and whole toxicity studies; and new investigations on food bioactivity, food effects on human health, etc. This review does not intend to provide an exhaustive review of the many works published so far on food analysis using MS techniques. The aim of the present work is to provide an overview of the different MS-based strategies that have been (or can be) applied in the new field of Foodomics, discussing their advantages and drawbacks. In addition, some ideas about the foreseen development and applications of MS techniques in this new discipline are also provided. Copyright © 2011 Wiley Periodicals, Inc.
The Health of the American Slave Examined by Means of Union Army Medical Statistics
Freemon, Frank R.
1985-01-01
The health status of the American slave in the 19th century remains unclear despite extensive historical research. Better knowledge of slave health would provide a clearer picture of the life of the slave, a better understanding of the 19th-century medicine, and possibly even clues to the health problems of modern blacks. This article hopes to contribute to the literature by examining another source of data. Slaves entering the Union Army joined an organization with standardized medical care that generated extensive statistical information. Review of these statistics answers questions about the health of young male blacks at the time American slavery ended. PMID:3881595
The limits of protein sequence comparison?
Pearson, William R; Sierk, Michael L
2010-01-01
Modern sequence alignment algorithms are used routinely to identify homologous proteins, proteins that share a common ancestor. Homologous proteins always share similar structures and often have similar functions. Over the past 20 years, sequence comparison has become both more sensitive, largely because of profile-based methods, and more reliable, because of more accurate statistical estimates. As sequence and structure databases become larger, and comparison methods become more powerful, reliable statistical estimates will become even more important for distinguishing similarities that are due to homology from those that are due to analogy (convergence). The newest sequence alignment methods are more sensitive than older methods, but more accurate statistical estimates are needed for their full power to be realized. PMID:15919194
Rasteiro, Rita; Chikhi, Lounès
2013-01-01
The arrival of agriculture into Europe during the Neolithic transition brought a significant shift in human lifestyle and subsistence. However, the conditions under which the spread of the new culture and technologies occurred are still debated. Similarly, the roles played by women and men during the Neolithic transition are not well understood, probably due to the fact that mitochondrial DNA (mtDNA) and Y chromosome (NRY) data are usually studied independently rather than within the same statistical framework. Here, we applied an integrative approach, using different model-based inferential techniques, to analyse published datasets from contemporary and ancient European populations. By integrating mtDNA and NRY data into the same admixture approach, we show that both males and females underwent the same admixture history and both support the demic diffusion model of Ammerman and Cavalli-Sforza. Similarly, the patterns of genetic diversity found in extant and ancient populations demonstrate that both modern and ancient mtDNA support the demic diffusion model. They also show that population structure and differential growth between farmers and hunter-gatherers are necessary to explain both types of data. However, we also found some differences between male and female markers, suggesting that the female effective population size was larger than that of the males, probably due to different demographic histories. We argue that these differences are most probably related to the various shifts in cultural practices and lifestyles that followed the Neolithic Transition, such as sedentism, the shift from polygyny to monogamy or the increase of patrilocality. PMID:23613761
Specht, Lena; Yahalom, Joachim; Illidge, Tim; Berthelsen, Anne Kiil; Constine, Louis S; Eich, Hans Theodor; Girinsky, Theodore; Hoppe, Richard T; Mauch, Peter; Mikhaeel, N George; Ng, Andrea
2014-07-15
Radiation therapy (RT) is the most effective single modality for local control of Hodgkin lymphoma (HL) and an important component of therapy for many patients. These guidelines have been developed to address the use of RT in HL in the modern era of combined modality treatment. The role of reduced volumes and doses is addressed, integrating modern imaging with 3-dimensional (3D) planning and advanced techniques of treatment delivery. The previously applied extended field (EF) and original involved field (IF) techniques, which treated larger volumes based on nodal stations, have now been replaced by the use of limited volumes, based solely on detectable nodal (and extranodal extension) involvement at presentation, using contrast-enhanced computed tomography, positron emission tomography/computed tomography, magnetic resonance imaging, or a combination of these techniques. The International Commission on Radiation Units and Measurements concepts of gross tumor volume, clinical target volume, internal target volume, and planning target volume are used for defining the targeted volumes. Newer treatment techniques, including intensity modulated radiation therapy, breath-hold, image guided radiation therapy, and 4-dimensional imaging, should be implemented when their use is expected to decrease significantly the risk for normal tissue damage while still achieving the primary goal of local tumor control. The highly conformal involved node radiation therapy (INRT), recently introduced for patients for whom optimal imaging is available, is explained. A new concept, involved site radiation therapy (ISRT), is introduced as the standard conformal therapy for the scenario, commonly encountered, wherein optimal imaging is not available. There is increasing evidence that RT doses used in the past are higher than necessary for disease control in this era of combined modality therapy. The use of INRT and of lower doses in early-stage HL is supported by available data. 
Although the use of ISRT has not yet been validated in a formal study, it is more conservative than INRT, accounting for suboptimal information and appropriately designed for safe local disease control. The goal of modern smaller field radiation therapy is to reduce both treatment volume and treatment dose while maintaining efficacy and minimizing acute and late sequelae. This review is a consensus of the International Lymphoma Radiation Oncology Group (ILROG) Steering Committee regarding the modern approach to RT in the treatment of HL, outlining a new concept of ISRT in which reduced treatment volumes are planned for the effective control of involved sites of HL. Nodal and extranodal non-Hodgkin lymphomas (NHL) are covered separately by ILROG guidelines. Copyright © 2014 Elsevier Inc. All rights reserved.
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how the dependent variables will be measured and their underlying level of measurement. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement, in increasing complexity, are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data additionally have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques, and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
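The decision rule the abstract describes (nominal/ordinal data take nonparametric tests, interval/ratio data take parametric ones) can be sketched as a small lookup. This is an illustrative sketch only; the specific tests named are common textbook pairings, not recommendations taken from the article.

```python
# Map a dependent variable's level of measurement (and number of groups
# being compared) to a commonly suggested statistical technique.
def suggest_technique(level: str, groups: int = 2) -> str:
    """Return an illustrative test choice for a dependent variable
    measured at the given level, compared across `groups` groups."""
    level = level.lower()
    if level == "nominal":
        return "chi-square test"
    if level == "ordinal":
        # Nonparametric rank-based tests for ordinal data.
        return "Mann-Whitney U" if groups == 2 else "Kruskal-Wallis"
    if level in ("interval", "ratio"):
        # Parametric tests for interval/ratio data.
        return "t-test" if groups == 2 else "one-way ANOVA"
    raise ValueError(f"unknown level of measurement: {level}")

print(suggest_technique("ordinal", groups=3))   # Kruskal-Wallis
print(suggest_technique("ratio", groups=2))     # t-test
```

In practice the choice also depends on whether the parametric test's assumptions (normality, equal variances) hold, which the lookup above deliberately ignores.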
Putting engineering back into protein engineering: bioinformatic approaches to catalyst design.
Gustafsson, Claes; Govindarajan, Sridhar; Minshull, Jeremy
2003-08-01
Complex multivariate engineering problems are commonplace and not unique to protein engineering. Mathematical and data-mining tools developed in other fields of engineering have now been applied to analyze sequence-activity relationships of peptides and proteins and to assist in the design of proteins and peptides with specified properties. Decreasing costs of DNA sequencing in conjunction with methods to quickly synthesize statistically representative sets of proteins allow modern heuristic statistics to be applied to protein engineering. This provides an alternative approach to expensive assays or unreliable high-throughput surrogate screens.
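A minimal toy of the sequence-activity modeling idea: score each residue at each position by the mean activity of the variants carrying it. All sequences and activity values below are invented for illustration; the paper's data-mining tools are far richer than this crude per-position average.

```python
# Toy sequence-activity analysis: for a set of protein variants with
# measured activities, compute the mean activity associated with each
# (position, residue) pair as a rough indicator of beneficial residues.
variants = {
    "AKL": 1.2, "AKV": 1.5, "GKL": 0.4,
    "GRV": 0.9, "ARL": 1.8, "GKV": 0.7,
}

def residue_scores(variants):
    """Mean activity for each (position, residue) pair."""
    totals, counts = {}, {}
    for seq, act in variants.items():
        for pos, aa in enumerate(seq):
            totals[(pos, aa)] = totals.get((pos, aa), 0.0) + act
            counts[(pos, aa)] = counts.get((pos, aa), 0) + 1
    return {k: totals[k] / counts[k] for k in totals}

scores = residue_scores(variants)
# Position 0: is A or G associated with higher activity in this toy set?
print(scores[(0, "A")] > scores[(0, "G")])  # True
```

A regression on one-hot-encoded sequences, as used in actual sequence-activity modeling, generalizes this by fitting all positions jointly rather than averaging each independently.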
Dark energy models through nonextensive Tsallis' statistics
NASA Astrophysics Data System (ADS)
Barboza, Edésio M.; Nunes, Rafael da C.; Abreu, Everton M. C.; Ananias Neto, Jorge
2015-10-01
The accelerated expansion of the Universe is one of the greatest challenges of modern physics. One candidate to explain this phenomenon is a new field called dark energy. In this work we have used the Tsallis nonextensive statistical formulation of the Friedmann equation to explore the Barboza-Alcaniz and Chevallier-Polarski-Linder parametric dark energy models and the Wang-Meng and Dalal vacuum decay models. After that, we have discussed the observational tests and the constraints concerning the Tsallis nonextensive parameter. Finally, we have described the dark energy physics through the role of the q-parameter.
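For reference, the two parametric equations of state named above have standard forms in the literature (these are the conventional definitions, not equations reproduced from this paper):

```latex
% Chevallier-Polarski-Linder (CPL) parametrization of the dark energy
% equation of state, in redshift z and scale factor a = 1/(1+z):
w_{\mathrm{CPL}}(z) = w_0 + w_a \,\frac{z}{1+z} = w_0 + w_a\,(1 - a)

% Barboza-Alcaniz parametrization, which remains bounded over the
% whole cosmic history:
w_{\mathrm{BA}}(z) = w_0 + w_1 \,\frac{z\,(1+z)}{1+z^2}
```

In both forms, $w_0$ is the present-day equation-of-state value and the second parameter controls its evolution with redshift.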
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lima, F.W.; Pagano, C.; Schneiderman, B.
1959-07-01
Boron can be determined quantitatively by absorption spectrophotometry of solutions of the red compound formed by the reaction of boric acid with curcumin. This reaction is affected by various factors, some of which can be detected easily in the data interpretation. Others, however, are more difficult to detect. The application of modern statistical methods to the study of the influence of these factors on the quantitative determination of boron is presented. These methods provide objective ways of establishing significant effects of the factors involved. (auth)
Simultaneous binary hash and features learning for image retrieval
NASA Astrophysics Data System (ADS)
Frantc, V. A.; Makov, S. V.; Voronin, V. V.; Marchuk, V. I.; Semenishchev, E. A.; Egiazarian, K. O.; Agaian, S.
2016-05-01
Content-based image retrieval systems have many applications in the modern world. The most important is image search by query image or by semantic description. Approaches to this problem are employed in personal photo-collection management systems, web-scale image search engines, medical systems, etc. Automatic analysis of large unlabeled image datasets is virtually impossible without a satisfactory image-retrieval technique, which is the main reason this kind of automatic image processing has attracted so much attention in recent years. Despite considerable progress in the field, semantically meaningful image retrieval remains a challenging task. The main issue is the demand to provide reliable results in a short amount of time. This paper addresses the problem with a novel technique for simultaneous learning of global image features and binary hash codes. Our approach maps a pixel-based image representation to a hash-value space while preserving as much of the semantic image content as possible. We use deep learning methodology to generate image descriptions with the properties of similarity preservation and statistical independence. The main advantage of our approach over existing methods is the ability to fine-tune the retrieval procedure for a very specific application, which allows us to provide better results than general techniques. The framework for data-dependent image hashing presented in the paper is based on the use of two different kinds of neural networks: convolutional neural networks for image description and an autoencoder for feature-to-hash-space mapping. Experimental results confirmed that our approach shows promising results in comparison with other state-of-the-art methods.
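The core idea of binary hashing for retrieval, mapping real-valued feature vectors to short binary codes that roughly preserve similarity, can be illustrated without any neural network. The sketch below uses random-hyperplane (sign-of-projection) hashing as a stand-in; the paper's learned approach would effectively replace these random projections with trained autoencoder weights.

```python
import random

# Random-hyperplane hashing: each bit is the sign of the dot product of
# the feature vector with a fixed random direction. Similar vectors
# (small angle) tend to agree on most bits (small Hamming distance).
def make_hasher(dim: int, bits: int, seed: int = 0):
    rng = random.Random(seed)
    planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(bits)]

    def hash_vec(v):
        # One bit per hyperplane: 1 if the vector lies on its positive side.
        return tuple(1 if sum(p * x for p, x in zip(plane, v)) >= 0 else 0
                     for plane in planes)

    return hash_vec

hasher = make_hasher(dim=4, bits=8)
a = hasher([1.0, 0.2, -0.3, 0.5])
b = hasher([1.1, 0.1, -0.2, 0.6])          # a nearby vector
hamming = sum(x != y for x, y in zip(a, b))
print(a, hamming)                           # small distance for similar inputs
```

Compact codes like these allow sub-linear lookup in large image collections, which is exactly the "reliable results in a short amount of time" constraint the abstract mentions.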
Postpartum modern contraceptive use in northern Ethiopia: prevalence and associated factors
Teferra, Alemayehu Shimeka; Gelagay, Abebaw Addis
2017-01-01
OBJECTIVES The postpartum period is a critical period for addressing widespread unmet needs in family planning and for reducing the risks of closely spaced pregnancies. However, contraception during the extended postpartum period has been underemphasized in Ethiopia. Therefore, this study aimed to assess postpartum modern contraceptive use among women in northern Ethiopia and to identify factors associated with modern contraceptive use in the postpartum period. METHODS A community based cross-sectional study was conducted from March to April, 2015. Data were entered using Epi Info version 7 and then exported into Stata version 12 for analysis. Bivariate and multivariate logistic regression models were fitted to identify the determinants of postpartum modern contraceptive use. Adjusted odds ratios (aORs) with 95% confidence intervals (CIs) were calculated, and p-values <0.05 were considered to indicate statistical significance. RESULTS Nearly half (48.0%) of women used modern contraceptives during the extended postpartum period. Postpartum modern contraceptive use was significantly associated with secondary and tertiary education levels (aOR, 4.25; 95% CI, 1.29 to 14.00; aOR, 5.36; 95% CI, 1.14 to 25.45, respectively), family planning counseling during prenatal and postnatal care (aOR, 5.72; 95% CI, 2.67 to 12.28), having postnatal care (aOR, 2.36; 95% CI, 1.15 to 4.87), resuming sexual activity (aOR, 9.53; 95% CI, 3.74 to 24.27), and menses returning after birth (aOR, 6.35; 95% CI, 3.14 to 13.39). In addition, experiencing problems with previous contraceptive use was negatively associated with modern contraceptive use (aOR, 0.34; 95% CI, 0.16 to 0.72). CONCLUSIONS A low rate of postpartum modern contraceptive use was found in the study area. 
Therefore, strengthening family planning counseling during antenatal and postnatal care visits, improving the utilization of postnatal care services, and improving women’s educational status are crucial steps to enhance modern contraceptive use among postpartum women. PMID:28330336
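The adjusted odds ratios above come from multivariate logistic regression, but the underlying quantity is easy to compute for a single 2×2 table. The sketch below uses invented counts, not the study's data, to show the standard crude odds ratio and its Wald 95% confidence interval.

```python
import math

# Crude odds ratio and 95% CI for a 2x2 exposure/outcome table, via the
# standard log-odds-ratio standard error sqrt(1/a + 1/b + 1/c + 1/d).
def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: contraceptive use among women with vs without
# postnatal care (illustration only).
or_, lo, hi = odds_ratio_ci(60, 40, 30, 70)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 3.5 and its CI
```

An adjusted OR, as reported in the study, additionally controls for the other covariates in the regression model, so it generally differs from this crude value.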
Family Life and Developmental Idealism in Yazd, Iran
Abbasi-Shavazi, Mohammad Jalal; Askari-Nodoushan, Abbas
2012-01-01
BACKGROUND This paper is motivated by the theory that developmental idealism has been disseminated globally and has become an international force for family and demographic change. Developmental idealism is a set of cultural beliefs and values about development and how development relates to family and demographic behavior. It holds that modern societies are causal forces producing modern families, that modern families help to produce modern societies, and that modern family change is to be expected. OBJECTIVE We examine the extent to which developmental idealism has been disseminated in Iran. We also investigate predictors of the dissemination of developmental idealism. METHODS We use survey data collected in 2007 from a sample of women in Yazd, a city in Iran. We examine the distribution of developmental idealism in the sample and the multivariate predictors of developmental idealism. RESULTS We find considerable support for the expectation that many elements of developmental idealism have been widely disseminated. Statistically significant majorities associate development with particular family attributes, believe that development causes change in families, believe that fertility reductions and age-at-marriage increases help foster development, and perceive family trends in Iran headed toward modernity. As predicted, parental education, respondent education, and income affect adherence to developmental idealism. CONCLUSIONS Developmental idealism has been widely disseminated in Yazd, Iran and is related to social and demographic factors in predicted ways. COMMENTS Although our data come from only one city, we expect that developmental idealism has been widely distributed in Iran, with important implications for family and demographic behavior. PMID:22942772
Biometric Analysis - A Reliable Indicator for Diagnosing Taurodontism using Panoramic Radiographs.
Hegde, Veda; Anegundi, Rajesh Trayambhak; Pravinchandra, K R
2013-08-01
Taurodontism is a clinical entity with a morpho-anatomical change in the shape of the tooth, which was once thought to be absent in modern man. Taurodontism is mostly observed as an isolated trait or a component of a syndrome. Various techniques have been devised to diagnose taurodontism. The aim of this study was to analyze whether a biometric analysis was useful in diagnosing taurodontism in radiographs which appeared to be normal on cursory observation. This study was carried out in our institution by using radiographs which were taken for routine procedures. In this retrospective study, panoramic radiographs were obtained from dental records of children aged between 9-14 years who did not have any abnormality on cursory observation. Biometric analyses were carried out on permanent mandibular first molar(s) by using a novel biometric method. The values were tabulated and analysed. The Fisher exact probability test, chi-square test and chi-square test with Yates correction were used for statistical analysis of the data. Cursory observation did not yield any case of taurodontism. In contrast, the biometric analysis yielded a statistically significant number of cases of taurodontism. However, there was no statistically significant difference in the number of taurodontism cases between the genders or across the age group considered. Thus, taurodontism was diagnosed on a biometric analysis which was otherwise missed on cursory observation. It is therefore necessary, from the clinical point of view, to diagnose even the mildest form of taurodontism by using metric analysis rather than just relying on a visual radiographic assessment, as its occurrence has many clinical implications and a diagnostic importance.
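One of the tests the study names, the chi-square test with Yates' continuity correction for a 2×2 table, is compact enough to show directly. The counts below are invented for illustration and are not the study's data.

```python
# Chi-square statistic with Yates' continuity correction for a 2x2 table
# [[a, b], [c, d]]: chi2 = n * (max(0, |ad - bc| - n/2))^2 /
#                          ((a+b)(c+d)(a+c)(b+d)).
# The max(0, ...) clamp prevents the correction from overshooting when
# the observed association is already weaker than n/2.
def chi_square_yates(a, b, c, d):
    n = a + b + c + d
    diff = max(0.0, abs(a * d - b * c) - n / 2)
    num = n * diff ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical counts: taurodontism detected vs not, by gender.
print(round(chi_square_yates(12, 38, 9, 41), 3))
```

The resulting statistic is compared against the chi-square distribution with 1 degree of freedom; for very small expected counts, Fisher's exact test (also used in the study) is preferred.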
NASA Technical Reports Server (NTRS)
Colwell, R. N.
1973-01-01
Since May 1970, personnel on several campuses of the University of California have been conducting investigations which seek to determine the usefulness of modern remote sensing techniques for studying various components of California's earth resources complex. Emphasis has been given to California's water resources as exemplified by the Feather River project and other aspects of the California Water Plan. This study is designed to consider in detail the supply, demand, and impact relationships. The specific geographic areas studied are the Feather River drainage in northern California, the Chino-Riverside Basin and Imperial Valley areas in southern California, and selected portions of the west side of San Joaquin Valley in central California. An analysis is also given on how an effective benefit-cost study of remote sensing in relation to California's water resources might best be made.
A solution for future designs using techniques from vernacular architecture in southern Iran
NASA Astrophysics Data System (ADS)
Mirahmadi, Fatima; Altan, Hasim
2018-02-01
Nowadays, in modern life, every technology and technique for a comfortable life is available. Even people with low income, in other words with low economic power, can have facilities to stay warm in winter and cool in summer. Many years back, when there were no advanced systems for human needs, passive strategies played a big role in people's lives. This paper concentrates on a small city in Iran that used special strategies to solve people's environmental issues. The city is called Evaz, located in the Fars region of Iran, around 20 km from Gerash city and 370 km southeast of Shiraz. Evaz receives minimal rainfall, which is why water is limited in this area; therefore, cisterns (water storage) had been used for many years, and they are studied in more detail in this paper. In summer the climate is hot and dry, with external temperatures sometimes reaching around 46 °C during the day. Although the winters are typically cold and likewise dry, a moderate climate prevails in Evaz during autumn and spring. This study identifies some of the past strategies and describes them in detail, with analysis of their transformation and connections with modern and traditional fundamentals. Furthermore, the study develops some solutions utilizing a combination of both modern and traditional techniques in design to suggest better and more effective ways to save energy while remaining sustainable for the future.
Statistics for Learning Genetics
NASA Astrophysics Data System (ADS)
Charles, Abigail Sheena
This study investigated the knowledge and skills that biology students may need to help them understand statistics/mathematics as it applies to genetics. The data are based on analyses of current representative genetics texts, practicing genetics professors' perspectives, and, more directly, students' perceptions of, and performance in, doing statistically-based genetics problems. This issue is at the emerging edge of modern college-level genetics instruction, and this study attempts to identify key theoretical components for creating a specialized biological statistics curriculum. The goal of this curriculum will be to prepare biology students with the skills for assimilating quantitatively-based genetic processes, increasingly at the forefront of modern genetics. To this end, two college-level classes at two universities were surveyed, one located in the northeastern US and the other in the West Indies. The sample comprised 42 students, 9 of whom were given a supplementary interview. Interviews were also administered to professors in the field in order to gain insight into the teaching of statistics in genetics. Key findings indicated that most students (55%) had very little to no background in statistics. Although students did perform well on exams, with 60% of the population receiving an A or B grade, 77% of them did not offer good explanations on a probability question associated with the normal distribution provided in the survey. The scope and presentation of the applicable statistics/mathematics in some of the most used textbooks in genetics teaching, as well as the genetics syllabi used by instructors, do not help the issue. It was found that the textbooks often either did not give effective explanations for students or completely left out certain topics. The omission of certain statistically/mathematically oriented topics was also seen in the genetics syllabi reviewed for this study. 
Nonetheless, although the necessity for infusing these quantitative subjects with genetics and, overall, the biological sciences is growing (topics including synthetic biology, molecular systems biology and phylogenetics) there remains little time in the semester to be dedicated to the consolidation of learning and understanding.
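The kind of probability reasoning the study argues genetics students need can be illustrated with a classic example: the binomial probability of phenotype counts from a Mendelian monohybrid cross. This worked example is mine, not taken from the study's survey instrument.

```python
from math import comb

# For a monohybrid cross with a 3:1 dominant:recessive ratio, each
# offspring shows the recessive phenotype with probability p = 1/4.
# The chance of exactly k recessive offspring out of n is binomial:
# P(k) = C(n, k) * p^k * (1-p)^(n-k).
def mendel_binomial(n: int, k: int, p: float = 0.25) -> float:
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Probability that exactly 1 of 4 offspring shows the recessive trait:
print(round(mendel_binomial(4, 1), 4))  # 0.4219
```

Students often expect exactly one recessive offspring in four because of the 3:1 ratio; the calculation shows that outcome occurs well under half the time, which is precisely the gap between ratio intuition and probability that such a curriculum would target.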
Forecasting space weather: Can new econometric methods improve accuracy?
NASA Astrophysics Data System (ADS)
Reikard, Gordon
2011-06-01
Space weather forecasts are currently used in areas ranging from navigation and communication to electric power system operations. The relevant forecast horizons can range from as little as 24 h to several days. This paper analyzes the predictability of two major space weather measures using new time series methods, many of them derived from econometrics. The data sets are the Ap geomagnetic index and the solar radio flux at 10.7 cm. The methods tested include nonlinear regressions, neural networks, frequency domain algorithms, GARCH models (which utilize the residual variance), state transition models, and models that combine elements of several techniques. While combined models are complex, they can be programmed using modern statistical software. The data frequency is daily, and forecasting experiments are run over horizons ranging from 1 to 7 days. Two major conclusions stand out. First, the frequency domain method forecasts the Ap index more accurately than any time domain model, including both regressions and neural networks. This finding is very robust, and holds for all forecast horizons. Combining the frequency domain method with other techniques yields a further small improvement in accuracy. Second, the neural network forecasts the solar flux more accurately than any other method, although at short horizons (2 days or less) the regression and net yield similar results. The neural net does best when it includes measures of the long-term component in the data.
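The basic workflow of the forecasting experiments described above (fit a model on a training window, evaluate out-of-sample error against a naive benchmark) can be sketched with the simplest time-series model. This toy uses a synthetic AR(1) series and a least-squares AR(1) fit; it is not one of the paper's models, and the data are simulated.

```python
import random

# Fit x[t] = b0 + b1 * x[t-1] by ordinary least squares, then compare
# its 1-step-ahead mean absolute error to a naive persistence forecast
# (tomorrow = today) on a held-out test segment.
def ar1_fit(series):
    """Least-squares intercept and slope of x[t] regressed on x[t-1]."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    slope = cov / var
    return my - slope * mx, slope

rng = random.Random(1)
series = [0.0]
for _ in range(499):                      # simulate AR(1) with phi = 0.3
    series.append(0.3 * series[-1] + rng.gauss(0, 1))

train, test = series[:400], series[400:]
b0, b1 = ar1_fit(train)
n = len(test) - 1
mae_model = sum(abs(b0 + b1 * test[t - 1] - test[t]) for t in range(1, len(test))) / n
mae_naive = sum(abs(test[t - 1] - test[t]) for t in range(1, len(test))) / n
print(mae_model, mae_naive)  # the fitted model should show the lower error
```

The paper's comparisons follow the same logic with far richer models (GARCH, neural networks, frequency-domain methods) and real Ap and F10.7 data in place of the simulated series.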
Searching cause of death through different autopsy methods: A new initiative
Das, Abhishek; Chowdhury, Ranadip
2017-01-01
A lawful disposal of a human dead body is only possible after establishment of a proper and valid cause of death. If the cause is obscure, autopsy is the only means of investigation. Inadequacy and unavailability of health care facilities often make this situation more complicated in developing countries, where many deaths remain unexplained and proper mortality statistics are missing, especially for infants and children. Tissue sampling by needle autopsy and the use of various imaging techniques in virtopsy have been tried globally to find an easier alternative. An exclusive and unique initiative, limited autopsy through tissue biopsy and body fluid analysis, has been taken to meet this dire need in African and some Asian developing countries, where worldwide accepted institutional data are missing or conflicting at times. Traditional autopsy has changed little in the last century, consisting of external examination and evisceration, dissection of organs with identification of macroscopic pathologies and injuries, followed by histopathology. As some population groups have religious objections to autopsy, demand for minimally invasive alternatives has increased of late. But assessment of the cause of death is most important for medico-legal, epidemiological and research purposes; thus minimally invasive techniques are of high importance in primary care settings too. In this article, we have made a journey through different autopsy methods and their relevance and applicability from a modern-day perspective, drawing on scientific research articles, textbooks and interviews. PMID:29302514
Interactive Sound Propagation using Precomputation and Statistical Approximations
NASA Astrophysics Data System (ADS)
Antani, Lakulish
Acoustic phenomena such as early reflections, diffraction, and reverberation have been shown to improve the user experience in interactive virtual environments and video games. These effects arise due to repeated interactions between sound waves and objects in the environment. In interactive applications, these effects must be simulated within a prescribed time budget. We present two complementary approaches for computing such acoustic effects in real time, with plausible variation in the sound field throughout the scene. The first approach, Precomputed Acoustic Radiance Transfer, precomputes a matrix that accounts for multiple acoustic interactions between all scene objects. The matrix is used at run time to provide sound propagation effects that vary smoothly as sources and listeners move. The second approach couples two techniques---Ambient Reverberance, and Aural Proxies---to provide approximate sound propagation effects in real time, based on only the portion of the environment immediately visible to the listener. These approaches lie at different ends of a space of interactive sound propagation techniques for modeling sound propagation effects in interactive applications. The first approach emphasizes accuracy by modeling acoustic interactions between all parts of the scene; the second approach emphasizes efficiency by only taking the local environment of the listener into account. These methods have been used to efficiently generate acoustic walkthroughs of architectural models. They have also been integrated into a modern game engine, and can enable realistic, interactive sound propagation on commodity desktop PCs.
Are Assumptions of Well-Known Statistical Techniques Checked, and Why (Not)?
Hoekstra, Rink; Kiers, Henk A. L.; Johnson, Addie
2012-01-01
A valid interpretation of most statistical techniques requires that one or more assumptions be met. In published articles, however, little information tends to be reported on whether the data satisfy the assumptions underlying the statistical techniques used. This could be due to self-selection: Only manuscripts with data fulfilling the assumptions are submitted. Another explanation could be that violations of assumptions are rarely checked for in the first place. We studied whether and how 30 researchers checked fictitious data for violations of assumptions in their own working environment. Participants were asked to analyze the data as they would their own data, for which often used and well-known techniques such as the t-procedure, ANOVA and regression (or non-parametric alternatives) were required. It was found that the assumptions of the techniques were rarely checked, and that if they were, it was regularly by means of a statistical test. Interviews afterward revealed a general lack of knowledge about assumptions, the robustness of the techniques with regards to the assumptions, and how (or whether) assumptions should be checked. These data suggest that checking for violations of assumptions is not a well-considered choice, and that the use of statistics can be described as opportunistic. PMID:22593746
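The practice the study examines, explicitly checking an assumption before choosing a technique, can be made concrete with one small example: screening for grossly unequal group variances before trusting a pooled t-procedure. The factor-of-4 variance-ratio threshold below is a common informal heuristic, not a prescription from the article, and the data are invented.

```python
import statistics

# Rough homogeneity-of-variance screen: compare the larger sample
# variance to the smaller. A ratio far above ~4 is a common informal
# warning sign against the pooled (equal-variance) t-test.
def variances_roughly_equal(x, y, max_ratio=4.0):
    vx, vy = statistics.variance(x), statistics.variance(y)
    return max(vx, vy) / min(vx, vy) <= max_ratio

g1 = [5.1, 4.8, 5.6, 5.0, 4.9, 5.3]
g2 = [4.0, 7.9, 2.2, 9.1, 1.5, 8.4]   # far more spread out
if variances_roughly_equal(g1, g2):
    print("pooled t-test is defensible")
else:
    print("consider Welch's t-test or a nonparametric alternative")
```

Formal alternatives exist (Levene's test, for instance), though the study notes that testing assumptions with further significance tests is itself a debatable habit; graphical checks and robust defaults such as Welch's t-test are often recommended instead.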
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-01-01
Aims and Objectives: The objective of the present study was to compare the effectiveness of three different processing techniques and to assess their accuracy through the number of occlusal interferences and the increase in vertical dimension after denture processing. Materials and Methods: A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, divided into three subgroups. Three processing techniques, compression molding and injection molding using prepolymerized resin and unpolymerized resin, were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed with one-way ANOVA using SPSS software version 19.0 (IBM). Results: Data obtained from the three groups were subjected to a one-way ANOVA test; results with significant variations were then subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was reported to be higher in both centric and eccentric positions compared with the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was reported to be greater with the compression molding technique compared with the injection molding techniques, which was statistically significant (P < 0.001). Conclusions: Within the limitations of this study, the injection molding techniques exhibited fewer processing errors than the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors between the two injection molding systems. PMID:28713763
Evaluation of Three Different Processing Techniques in the Fabrication of Complete Dentures.
Chintalacheruvu, Vamsi Krishna; Balraj, Rajasekaran Uttukuli; Putchala, Lavanya Sireesha; Pachalla, Sreelekha
2017-06-01
The objective of the present study was to compare the effectiveness of three different processing techniques and to assess their accuracy through the number of occlusal interferences and the increase in vertical dimension after denture processing. A cross-sectional study was conducted on a sample of 18 patients indicated for complete denture fabrication, divided into three subgroups. Three processing techniques, compression molding and injection molding using prepolymerized resin and unpolymerized resin, were used to fabricate dentures for each of the groups. After processing, laboratory-remounted dentures were evaluated for the number of occlusal interferences in centric and eccentric relations and the change in vertical dimension through vertical pin rise in the articulator. Data were analyzed with one-way ANOVA using SPSS software version 19.0 (IBM). Data obtained from the three groups were subjected to a one-way ANOVA test; results with significant variations were then subjected to a post hoc test. The number of occlusal interferences with the compression molding technique was reported to be higher in both centric and eccentric positions compared with the two injection molding techniques, with statistical significance in centric, protrusive, right lateral nonworking, and left lateral working positions (P < 0.05). Mean vertical pin rise (0.52 mm) was reported to be greater with the compression molding technique compared with the injection molding techniques, which was statistically significant (P < 0.001). Within the limitations of this study, the injection molding techniques exhibited fewer processing errors than the compression molding technique, with statistical significance. There was no statistically significant difference in processing errors between the two injection molding systems.
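The analysis design described above, a one-way ANOVA across three processing groups, reduces to a short F-statistic computation. The measurements below are invented for illustration; they are not the study's data.

```python
# One-way ANOVA F statistic: ratio of between-group mean square to
# within-group mean square, with k-1 and n-k degrees of freedom.
def one_way_anova_f(groups):
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical vertical pin rise (mm) per processing technique:
compression = [0.55, 0.50, 0.52, 0.51, 0.53, 0.54]
injection_a = [0.21, 0.19, 0.22, 0.20, 0.18, 0.20]
injection_b = [0.20, 0.22, 0.19, 0.21, 0.20, 0.18]
f = one_way_anova_f([compression, injection_a, injection_b])
print(round(f, 1))  # a large F indicates the group means differ
```

A significant F says only that at least one group mean differs; the post hoc test mentioned in the study is what localizes the difference to specific pairs (here, compression vs each injection group).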
Photovoltaic power system reliability considerations
NASA Technical Reports Server (NTRS)
Lalli, V. R.
1980-01-01
An example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems is presented. This particular application is for a solar cell power system demonstration project designed to provide electric power requirements for remote villages. The techniques utilized involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of fail-safe and planned spare parts engineering philosophy.
Stochastic Feedforward Control Technique
NASA Technical Reports Server (NTRS)
Halyo, Nesim
1990-01-01
Class of commanded trajectories modeled as stochastic process. Advanced Transport Operating Systems (ATOPS) research and development program conducted by NASA Langley Research Center aimed at developing capabilities for increases in capacities of airports, safe and accurate flight in adverse weather conditions including shear, winds, avoidance of wake vortexes, and reduced consumption of fuel. Advances in techniques for design of modern controls and increased capabilities of digital flight computers coupled with accurate guidance information from Microwave Landing System (MLS). Stochastic feedforward control technique developed within context of ATOPS program.
Photovoltaic power system reliability considerations
NASA Technical Reports Server (NTRS)
Lalli, V. R.
1980-01-01
This paper describes an example of how modern engineering and safety techniques can be used to assure the reliable and safe operation of photovoltaic power systems. This particular application was for a solar cell power system demonstration project in Tangaye, Upper Volta, Africa. The techniques involve a definition of the power system natural and operating environment, use of design criteria and analysis techniques, an awareness of potential problems via the inherent reliability and FMEA methods, and use of a fail-safe and planned spare parts engineering philosophy.
In analyses supporting the development of numeric nutrient criteria, multiple statistical techniques can be used to extract critical values from stressor response relationships. However there is little guidance for choosing among techniques, and the extent to which log-transfor...
Incorporating Nonparametric Statistics into Delphi Studies in Library and Information Science
ERIC Educational Resources Information Center
Ju, Boryung; Jin, Tao
2013-01-01
Introduction: The Delphi technique is widely used in library and information science research. However, many researchers in the field fail to employ standard statistical tests when using this technique. This makes the technique vulnerable to criticisms of its reliability and validity. The general goal of this article is to explore how…
ERIC Educational Resources Information Center
Karadag, Engin
2010-01-01
To assess research methods and analysis of statistical techniques employed by educational researchers, this study surveyed unpublished doctoral dissertation from 2003 to 2007. Frequently used research methods consisted of experimental research; a survey; a correlational study; and a case study. Descriptive statistics, t-test, ANOVA, factor…
Statistics in the Workplace: A Survey of Use by Recent Graduates with Higher Degrees
ERIC Educational Resources Information Center
Harraway, John A.; Barker, Richard J.
2005-01-01
A postal survey was conducted regarding statistical techniques, research methods and software used in the workplace by 913 graduates with PhD and Masters degrees in the biological sciences, psychology, business, economics, and statistics. The study identified gaps between topics and techniques learned at university and those used in the workplace,…
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed through a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biological sufficiency criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique against 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, i.e. BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. 
This study provides a practical guide for choosing statistical techniques to select informative genes from high-dimensional expression data for breeding and systems biology studies. Published by Elsevier B.V.
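The composite maximum-relevance, minimum-redundancy criterion underlying Boot-MRMR can be sketched as a greedy selection. This is a minimal illustration only, assuming absolute Pearson correlation for both the relevance and redundancy terms; the package's actual statistics and its bootstrap layer are not reproduced here:

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy maximum-relevance, minimum-redundancy (MRMR) gene selection.

    Illustrative sketch: relevance is the absolute Pearson correlation of a
    gene with the class label; redundancy is the mean absolute correlation
    with already-selected genes; the additive score is relevance - redundancy.
    """
    n_genes = X.shape[1]
    relevance = np.array(
        [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(n_genes)]
    )
    selected = [int(np.argmax(relevance))]  # seed with the most relevant gene
    while len(selected) < k:
        best_j, best_score = -1, -np.inf
        for j in range(n_genes):
            if j in selected:
                continue
            redundancy = np.mean(
                [abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected]
            )
            score = relevance[j] - redundancy  # composite MRMR criterion
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

The point of the composite score is that a gene which nearly duplicates an already-selected gene scores poorly despite high relevance, which distinguishes this criterion from ranking genes by relevance alone.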
Muhs, Daniel R.; Ager, Thomas A.; Been, Josh M.; Rosenbaum, Joseph G.; Reynolds, Richard J.
2000-01-01
The presence of buried soils in Alaskan loess is controversial, and therefore criteria for identifying buried soils in these deposits need to be evaluated. In this paper, morphologic and chemical criteria for identifying buried soils are evaluated by studying modern soils developed mostly in Holocene loess under tundra, boreal forest, and transitional coastal-boreal forest vegetation in different parts of Alaska. Data from modern Alaskan soils that developed under vegetation similar to that of the present indicate that soil morphology, organic-matter concentrations, and P concentrations can be useful diagnostic tools for identifying buried soils. Soil morphologic criteria, particularly horizon colors and horizon sequences, are essential for identifying buried soils, but some minimally developed soils may resemble organic-rich alluvial, colluvial, or lacustrine deposits. Organic matter and total P contents and distributions can aid in such studies because in well-drained soils these constituents show rapid declines with depth. However, neither of these techniques may work if the upper genetic horizons of buried soils are eroded. If buried soils are present in Alaskan loess, it would also be desirable to have techniques for determining the dominant vegetation under which the soils formed. Such techniques could then be used to reconstruct former vegetation types and paleoclimates in Alaska. A previous study suggested that tundra and boreal forest vegetation have distinctive carbon isotopic compositions, although both are dominated by C3 plants. If this is the case, then the carbon isotopic composition of organic matter in buried soils could be used to reconstruct former vegetation types. A larger suite of modern soils from Alaskan tundra and forest was analyzed to test this hypothesis.
Results indicate that modern soil O horizons in these two biomes have the same range of δ13C values, and therefore carbon isotope compositions cannot be used to reconstruct former tundra or boreal forest.
Dependency graph for code analysis on emerging architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shashkov, Mikhail Jurievich; Lipnikov, Konstantin
A directed acyclic graph (DAG) of dependencies is becoming the standard representation for modern multi-physics codes. The ideal DAG is the true block scheme of a multi-physics code. Therefore, it is a convenient object for in situ analysis of the cost of computations and of algorithmic bottlenecks related to statistically frequent data motion and the dynamical machine state.
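As a sketch of how a dependency DAG supports this kind of cost analysis, the following toy example finds the most expensive dependency chain (the critical path). The phase names and costs are hypothetical, not taken from any particular multi-physics code:

```python
def critical_path(edges, cost):
    """Return the cost of the most expensive dependency chain in a DAG.

    edges: dict mapping each phase to its downstream phases (consumers of
           its output); cost: dict mapping each phase to its runtime.
    Assumes the graph is acyclic; no cycle detection is performed.
    """
    order, seen = [], set()

    def visit(u):
        if u in seen:
            return
        seen.add(u)
        for v in edges.get(u, ()):
            visit(v)
        order.append(u)          # postorder: downstream phases appear first

    for u in cost:
        visit(u)

    total = {}
    for u in order:              # downstream totals are already accumulated
        total[u] = cost[u] + max(
            (total[v] for v in edges.get(u, ())), default=0.0
        )
    return max(total.values())

# Hypothetical phase graph: "mesh" feeds "hydro" and "transport",
# both of which feed "output".
phases = {"mesh": ["hydro", "transport"],
          "hydro": ["output"], "transport": ["output"]}
runtime = {"mesh": 1.0, "hydro": 5.0, "transport": 3.0, "output": 2.0}
print(critical_path(phases, runtime))  # -> 8.0 (mesh -> hydro -> output)
```

With real per-node timings attached to the DAG, the same traversal identifies which chain of phases bounds the total runtime and hence where a bottleneck sits.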
Measuring Boltzmann's Constant with Carbon Dioxide
ERIC Educational Resources Information Center
Ivanov, Dragia; Nikolov, Stefan
2013-01-01
In this paper we present two experiments to measure Boltzmann's constant--one of the fundamental constants of modern-day physics, which lies at the base of statistical mechanics and thermodynamics. The experiments use very basic theory, simple equipment and cheap and safe materials yet provide very precise results. They are very easy and…
Adult Literacy Issues, Programs, and Options. Updated.
ERIC Educational Resources Information Center
Irwin, Paul M.
Media reports suggest widespread illiteracy among adults who may not be able to read, write, speak, or otherwise communicate competently enough to meet the demands of modern society. There is no consensus on the definition of illiteracy or supporting statistics. According to the U.S. Department of Education, the adult illiteracy rate is 13…