Statistical Tests of Reliability of NDE
NASA Technical Reports Server (NTRS)
Baaklini, George Y.; Klima, Stanley J.; Roth, Don J.; Kiser, James D.
1987-01-01
Capabilities of advanced material-testing techniques analyzed. Collection of four reports illustrates statistical method for characterizing flaw-detecting capabilities of sophisticated nondestructive evaluation (NDE). Method used to determine reliability of several state-of-the-art NDE techniques for detecting failure-causing flaws in advanced ceramic materials considered for use in automobiles, airplanes, and space vehicles.
ERIC Educational Resources Information Center
Hester, Yvette
Least squares methods are sophisticated mathematical curve fitting procedures used in all classical parametric methods. The linear least squares approximation is most often associated with finding the "line of best fit" or the regression line. Since all statistical analyses are correlational and all classical parametric methods are least…
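As a minimal sketch of the idea in this abstract, the hypothetical example below fits the "line of best fit" by ordinary least squares and then recovers the same slope from the correlation coefficient, which is the sense in which classical parametric fits are correlational.

```python
import numpy as np

# Toy data: x (predictor) and y (response); values are illustrative only.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# np.polyfit with degree 1 solves the least squares problem for the
# "line of best fit": minimize sum((y - (slope*x + intercept))**2).
slope, intercept = np.polyfit(x, y, 1)

# The same slope expressed through the correlation coefficient, illustrating
# that the classical parametric fit is correlational at heart.
r = np.corrcoef(x, y)[0, 1]
slope_from_r = r * y.std(ddof=1) / x.std(ddof=1)

print(slope, intercept, slope_from_r)  # slope and slope_from_r agree
```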
ERIC Educational Resources Information Center
Braun, W. John
2012-01-01
The Analysis of Variance is often taught in introductory statistics courses, but it is not clear that students really understand the method. This is because the derivation of the test statistic and p-value requires a relatively sophisticated mathematical background which may not be well-remembered or understood. Thus, the essential concept behind…
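To make the essential concept behind the ANOVA test statistic concrete, here is a hedged sketch on toy data: the F statistic computed by scipy and rebuilt from its definition as between-group variability relative to within-group variability.

```python
import numpy as np
from scipy import stats

# Three illustrative groups (toy data).
groups = [np.array([4.1, 5.0, 5.9, 4.7]),
          np.array([6.2, 7.1, 6.8, 7.4]),
          np.array([5.5, 5.9, 6.1, 5.2])]

# Library computation of the one-way ANOVA F statistic and p-value.
f_lib, p_lib = stats.f_oneway(*groups)

# The same F statistic built from its definition: between-group variance
# divided by within-group variance.
all_obs = np.concatenate(groups)
grand_mean = all_obs.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
df_between = len(groups) - 1
df_within = len(all_obs) - len(groups)
f_manual = (ss_between / df_between) / (ss_within / df_within)

print(f_lib, f_manual)  # the two F values agree
```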
Profile Of 'Original Articles' Published In 2016 By The Journal Of Ayub Medical College, Pakistan.
Shaikh, Masood Ali
2018-01-01
Journal of Ayub Medical College (JAMC) is the only Medline-indexed biomedical journal of Pakistan that is edited and published by a medical college. Assessing the trends in study designs employed, statistical methods used, and statistical analysis software used in the articles of medical journals helps in understanding the sophistication of the research published. The objective of this descriptive study was to assess all original articles published by JAMC in the year 2016. JAMC published 147 original articles in 2016. The most commonly used study design was the cross-sectional study, with 64 (43.5%) articles reporting its use. Statistical tests involving bivariate analysis were most common, reported by 73 (49.6%) articles. Use of SPSS software was reported by 109 (74.1%) of the articles. Most of the original articles, 138 (93.9%), were based on studies conducted in Pakistan. The number and sophistication of the analyses reported in JAMC increased from 2014 to 2016.
Medical subject heading (MeSH) annotations illuminate maize genetics and evolution
USDA-ARS?s Scientific Manuscript database
In the modern era, high-density marker panels and/or whole-genome sequencing,coupled with advanced phenotyping pipelines and sophisticated statistical methods, have dramatically increased our ability to generate lists of candidate genes or regions that are putatively associated with phenotypes or pr...
The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership
ERIC Educational Resources Information Center
Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.
2011-01-01
The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…
Test Design with Cognition in Mind
ERIC Educational Resources Information Center
Gorin, Joanna S.
2006-01-01
One of the primary themes of the National Research Council's 2001 book "Knowing What Students Know" was the importance of cognition as a component of assessment design and measurement theory (NRC, 2001). One reaction to the book has been an increased use of sophisticated statistical methods to model cognitive information available in test data.…
Have the Focus and Sophistication of Research in Health Education Changed?
ERIC Educational Resources Information Center
Merrill, Ray M.; Lindsay, Christopher A.; Shields, Eric C.; Stoddard, Julianne
2007-01-01
This study assessed the types of research and the statistical methods used in three representative health education journals from 1994 through 2003. Editorials, commentaries, program/practice notes, and perspectives represent 17.6% of the journals' content. The most common types of articles are cross-sectional studies (27.5%), reviews (23.2%), and…
Statistical Approaches Used to Assess the Equity of Access to Food Outlets: A Systematic Review
Lamb, Karen E.; Thornton, Lukar E.; Cerin, Ester; Ball, Kylie
2015-01-01
Background Inequalities in eating behaviours are often linked to the types of food retailers accessible in neighbourhood environments. Numerous studies have aimed to identify if access to healthy and unhealthy food retailers is socioeconomically patterned across neighbourhoods, and thus a potential risk factor for dietary inequalities. Existing reviews have examined differences between methodologies, particularly focussing on neighbourhood and food outlet access measure definitions. However, no review has informatively discussed the suitability of the statistical methodologies employed; a key issue determining the validity of study findings. Our aim was to examine the suitability of statistical approaches adopted in these analyses. Methods Searches were conducted for articles published from 2000–2014. Eligible studies included objective measures of the neighbourhood food environment and neighbourhood-level socio-economic status, with a statistical analysis of the association between food outlet access and socio-economic status. Results Fifty-four papers were included. Outlet accessibility was typically defined as the distance to the nearest outlet from the neighbourhood centroid, or as the number of food outlets within a neighbourhood (or buffer). To assess if these measures were linked to neighbourhood disadvantage, common statistical methods included ANOVA, correlation, and Poisson or negative binomial regression. Although all studies involved spatial data, few considered spatial analysis techniques or spatial autocorrelation. Conclusions With advances in GIS software, sophisticated measures of neighbourhood outlet accessibility can be considered. However, approaches to statistical analysis often appear less sophisticated. Care should be taken to consider assumptions underlying the analysis and the possibility of spatially correlated residuals which could affect the results. PMID:29546115
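As an illustration of one analysis pattern the review describes, the following hedged sketch (with entirely hypothetical neighbourhood data and variable names) fits a negative binomial regression of outlet counts on a deprivation score using statsmodels; as the review notes, such a model ignores spatial autocorrelation unless it is extended.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical neighbourhood-level data: outlet counts and a deprivation score.
rng = np.random.default_rng(0)
n = 200
deprivation = rng.normal(size=n)
outlet_count = rng.poisson(lam=np.exp(1.0 + 0.3 * deprivation))

df = pd.DataFrame({"outlets": outlet_count, "deprivation": deprivation})
X = sm.add_constant(df[["deprivation"]])

# Negative binomial regression of outlet counts on neighbourhood deprivation,
# one of the approaches commonly reported in the reviewed studies
# (the dispersion parameter alpha is fixed here for simplicity).
model = sm.GLM(df["outlets"], X, family=sm.families.NegativeBinomial(alpha=1.0))
result = model.fit()
print(result.summary())
# Note: this ignores spatial correlation between neighbourhoods, the very
# limitation the review highlights; spatial error or CAR models would address it.
```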
Practice-based evidence study design for comparative effectiveness research.
Horn, Susan D; Gassaway, Julie
2007-10-01
To describe a new, rigorous, comprehensive practice-based evidence for clinical practice improvement (PBE-CPI) study methodology, and compare its features, advantages, and disadvantages to those of randomized controlled trials and sophisticated statistical methods for comparative effectiveness research. PBE-CPI incorporates natural variation within data from routine clinical practice to determine what works, for whom, when, and at what cost. It uses the knowledge of front-line caregivers, who develop study questions and define variables as part of a transdisciplinary team. Its comprehensive measurement framework provides a basis for analyses of significant bivariate and multivariate associations between treatments and outcomes, controlling for patient differences, such as severity of illness. PBE-CPI studies can uncover better practices more quickly than randomized controlled trials or sophisticated statistical methods, while achieving many of the same advantages. We present examples of actionable findings from PBE-CPI studies in postacute care settings related to comparative effectiveness of medications, nutritional support approaches, incontinence products, physical therapy activities, and other services. Outcomes improved when practices associated with better outcomes in PBE-CPI analyses were adopted in practice.
Statistical Mechanics of Combinatorial Auctions
NASA Astrophysics Data System (ADS)
Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo
2006-09-01
Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.
Data mining: sophisticated forms of managed care modeling through artificial intelligence.
Borok, L S
1997-01-01
Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.
Statistics for wildlifers: how much and what kind?
Johnson, D.H.; Shaffer, T.L.; Newton, W.E.
2001-01-01
Quantitative methods are playing increasingly important roles in wildlife ecology and, ultimately, management. This change poses a challenge for wildlife practitioners and students who are not well-educated in mathematics and statistics. Here we give our opinions on what wildlife biologists should know about statistics, while recognizing that not everyone is inclined mathematically. For those who are, we recommend that they take mathematics coursework at least through calculus and linear algebra. They should take statistics courses that are focused conceptually, stressing the Why rather than the How of doing statistics. For less mathematically oriented wildlifers, introductory classes in statistical techniques will furnish some useful background in basic methods but may provide little appreciation of when the methods are appropriate. These wildlifers will have to rely much more on advice from statisticians. Far more important than knowing how to analyze data is an understanding of how to obtain and recognize good data. Regardless of the statistical education they receive, all wildlife biologists should appreciate the importance of controls, replication, and randomization in studies they conduct. Understanding these concepts requires little mathematical sophistication, but is critical to advancing the science of wildlife ecology.
ERIC Educational Resources Information Center
Sadd, James; Morello-Frosch, Rachel; Pastor, Manuel; Matsuoka, Martha; Prichard, Michele; Carter, Vanessa
2014-01-01
Environmental justice advocates often argue that environmental hazards and their health effects vary by neighborhood, income, and race. To assess these patterns and advance preventive policy, their colleagues in the research world often use complex and methodologically sophisticated statistical and geospatial techniques. One way to bridge the gap…
Healthy Worker Effect Phenomenon: Revisited with Emphasis on Statistical Methods – A Review
Chowdhury, Ritam; Shah, Divyang; Payal, Abhishek R.
2017-01-01
Known since 1885 but studied systematically only in the past four decades, the healthy worker effect (HWE) is a special form of selection bias common to occupational cohort studies. The phenomenon has been under debate for many years with respect to its impact, conceptual approach (confounding, selection bias, or both), and ways to resolve or account for its effect. The effect is not uniform across age groups, gender, race, and types of occupations, nor is it constant over time. Hence, assessing HWE and accounting for it in statistical analyses is complicated and requires sophisticated methods. Here, we review the HWE, factors affecting it, and methods developed so far to deal with it. PMID:29391741
Medical history and epidemiology: their contribution to the development of public health nursing.
Earl, Catherine E
2009-01-01
The nursing profession historically has been involved in data collection for research efforts, notably from the time of the Framingham Tuberculosis Project (1914-1923). Over the past century, nurses have become more sophisticated in their abilities to design and conduct studies and to analyze data. This article discusses the contributions of medicine and epidemiology to the development of public health nursing and the use of statistical methods by nurses in the United States in the 19th and 20th centuries. Knowledge acquired from this article will inform educators and researchers about the importance of using quantitative analysis, evidence-based knowledge, and statistical methods when teaching students in all health professions.
Statistical learning and selective inference.
Taylor, Jonathan; Tibshirani, Robert J
2015-06-23
We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.
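A hedged sketch of the "cherry-picking" step the authors describe, with hypothetical data: the lasso selects the apparently strongest predictors from pure noise, and naive p-values computed afterwards on the selected set would be overly optimistic. The selective-inference adjustment itself is not implemented here.

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Hypothetical data: 100 observations, 50 noise predictors, no true signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 50))
y = rng.normal(size=100)

# Step 1: mine the data -- let the lasso pick the "strongest" predictors.
lasso = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(lasso.coef_ != 0)
print("selected predictors:", selected)

# Step 2 (the pitfall): refitting ordinary least squares on only the selected
# columns and reading off textbook p-values ignores the selection step, so the
# p-values are too small. Selective inference corrects the reference
# distribution for the fact that these columns were chosen by the data.
```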
Lee, Juneyoung; Kim, Kyung Won; Choi, Sang Hyun; Huh, Jimi
2015-01-01
Meta-analysis of diagnostic test accuracy studies differs from the usual meta-analysis of therapeutic/interventional studies in that it requires simultaneous analysis of a pair of outcome measures, such as sensitivity and specificity, rather than a single outcome. Since sensitivity and specificity are generally inversely correlated and could be affected by a threshold effect, more sophisticated statistical methods are required for the meta-analysis of diagnostic test accuracy. Hierarchical models, including the bivariate model and the hierarchical summary receiver operating characteristic model, are increasingly being accepted as standard methods for meta-analysis of diagnostic test accuracy studies. We provide a conceptual review of statistical methods currently used and recommended for meta-analysis of diagnostic test accuracy studies. This article could serve as a methodological reference for those who perform systematic review and meta-analysis of diagnostic test accuracy studies. PMID:26576107
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
SimHap GUI: an intuitive graphical user interface for genetic association analysis.
Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J
2008-12-25
Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.
Two-Year College Mathematics Instructors' Conceptions of Variation
ERIC Educational Resources Information Center
Dabos, Monica Graciela Gandhini
2011-01-01
Statistics education researchers are urging teachers of statistics to help students develop a more sophisticated understanding of variation, since variation is the core of statistics. However, little research has been done into the conceptions of variation held by instructors of statistics. This is of particular importance at the community college…
ERIC Educational Resources Information Center
DeMark, Sarah F.; Behrens, John T.
2004-01-01
Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…
The kappa statistic in rehabilitation research: an examination.
Tooth, Leigh R; Ottenbacher, Kenneth J
2004-08-01
The number and sophistication of statistical procedures reported in medical rehabilitation research are increasing. Application of the principles and methods associated with evidence-based practice has contributed to the need for rehabilitation practitioners to understand quantitative methods in published articles. Outcomes measurement and determination of reliability are areas that have experienced rapid change during the past decade. In this study, distinctions between reliability and agreement are examined. Information is presented on analytical approaches for addressing reliability and agreement, with a focus on the application of the kappa statistic. The following assumptions are discussed: (1) kappa should be used with data measured on a categorical scale, (2) the patients or objects categorized should be independent, and (3) the observers or raters must make their measurement decisions and judgments independently. Several issues related to using kappa in measurement studies are described, including use of weighted kappa, methods of reporting kappa, the effect of bias and prevalence on kappa, and sample size and power requirements for kappa. The kappa statistic is useful for assessing agreement among raters, and it is being used more frequently in rehabilitation research. Correct interpretation of the kappa statistic depends on meeting the required assumptions and accurate reporting.
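A minimal example of the kappa computation discussed above, using scikit-learn and illustrative ratings from two raters; the weighted variant is shown because the abstract highlights it as one of the issues to consider.

```python
from sklearn.metrics import cohen_kappa_score

# Categorical ratings of the same 10 patients by two independent raters
# (illustrative values on a 3-level ordinal scale).
rater_a = [0, 1, 2, 1, 0, 2, 1, 1, 0, 2]
rater_b = [0, 1, 2, 0, 0, 2, 1, 2, 0, 2]

# Unweighted kappa treats all disagreements equally.
kappa = cohen_kappa_score(rater_a, rater_b)

# Weighted kappa penalises disagreements by their distance on the ordinal
# scale, which is often more appropriate for graded rehabilitation measures.
kappa_weighted = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

print(kappa, kappa_weighted)
```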
Statistical Inferences from Formaldehyde Dna-Protein Cross-Link Data
Physiologically-based pharmacokinetic (PBPK) modeling has reached considerable sophistication in its application in the pharmacological and environmental health areas. Yet, mature methodologies for making statistical inferences have not been routinely incorporated in these applic...
Teaching Statistics--Despite Its Applications
ERIC Educational Resources Information Center
Ridgway, Jim; Nicholson, James; McCusker, Sean
2007-01-01
Evidence-based policy requires sophisticated modelling and reasoning about complex social data. The current UK statistics curricula do not equip tomorrow's citizens to understand such reasoning. We advocate radical curriculum reform, designed to require students to reason from complex data.
Automatic classification of bottles in crates
NASA Astrophysics Data System (ADS)
Aas, Kjersti; Eikvil, Line; Bremnes, Dag; Norbryhn, Andreas
1995-03-01
This paper presents a statistical method for classification of bottles in crates for use in automatic return bottle machines. For the machines to reimburse the correct deposit, reliable recognition is important. The images are acquired by a laser range scanner coregistering the distance to the object and the strength of the reflected signal. The objective is to identify the crate and the bottles from a library with a number of legal types. Bottles with significantly different sizes are separated using quite simple methods, while a more sophisticated recognizer is required to distinguish the more similar bottle types. Good results have been obtained when testing the method developed on bottle types which are difficult to distinguish using simple methods.
[Hungarian health resource allocation from the viewpoint of the English methodology].
Fadgyas-Freyler, Petra
2018-02-01
This paper describes both the English health resource allocation and the attempt at its Hungarian adaptation. We describe calculations for a Hungarian regression model using the English 'weighted capitation formula'. The model has proven statistically sound. New independent variables and homogeneous regional units have to be found for Hungary. The English method can be used with adequate variables. Hungarian patient-level health data can support a much more sophisticated model. Further research activities are needed. Orv Hetil. 2018; 159(5): 183-191.
ERIC Educational Resources Information Center
Watson, Jane
2007-01-01
Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…
ERIC Educational Resources Information Center
Burks, Robert E.; Jaye, Michael J.
2012-01-01
The "Price Is Right" ("TPIR") provides a wealth of material for studying statistics at various levels of mathematical sophistication. The authors have used elements of this show to motivate students from undergraduate probability and statistics courses to graduate level executive management courses. The material consistently generates a high…
Missing data imputation: focusing on single imputation.
Zhang, Zhongheng
2016-01-01
Complete case analysis is widely used for handling missing data, and it is the default method in many statistical packages. However, this method may introduce bias, and some useful information will be omitted from the analysis. Therefore, many imputation methods have been developed to fill this gap. The present article focuses on single imputation. Imputation with the mean, median or mode is simple but, like complete case analysis, can bias the mean and the deviation. Furthermore, these approaches ignore the relationships with other variables. Regression imputation can preserve the relationship between missing values and other variables. Many sophisticated methods exist to handle missing values in longitudinal data. This article focuses primarily on how to implement R code to perform single imputation, while avoiding complex mathematical calculations.
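The article itself presents R code; below is a rough Python analogue (with hypothetical data and column names) of two of the single-imputation strategies it discusses: mean imputation and regression imputation.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical data frame with missing values in 'bmi'.
df = pd.DataFrame({
    "age": [25, 37, 46, 52, 63, 71],
    "bmi": [22.1, np.nan, 27.4, np.nan, 30.2, 28.8],
})

# Mean imputation: simple, but shrinks the variance of 'bmi' and ignores
# its relationship with 'age'.
df["bmi_mean_imp"] = df["bmi"].fillna(df["bmi"].mean())

# Regression imputation: predict the missing 'bmi' values from 'age',
# preserving the relationship between the two variables.
observed = df[df["bmi"].notna()]
missing = df[df["bmi"].isna()]
reg = LinearRegression().fit(observed[["age"]], observed["bmi"])
df.loc[df["bmi"].isna(), "bmi_reg_imp"] = reg.predict(missing[["age"]])
df["bmi_reg_imp"] = df["bmi_reg_imp"].fillna(df["bmi"])

print(df)
```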
Experimental design and quantitative analysis of microbial community multiomics.
Mallick, Himel; Ma, Siyuan; Franzosa, Eric A; Vatanen, Tommi; Morgan, Xochitl C; Huttenhower, Curtis
2017-11-30
Studies of the microbiome have become increasingly sophisticated, and multiple sequence-based, molecular methods as well as culture-based methods exist for population-scale microbiome profiles. To link the resulting host and microbial data types to human health, several experimental design considerations, data analysis challenges, and statistical epidemiological approaches must be addressed. Here, we survey current best practices for experimental design in microbiome molecular epidemiology, including technologies for generating, analyzing, and integrating microbiome multiomics data. We highlight studies that have identified molecular bioactives that influence human health, and we suggest steps for scaling translational microbiome research to high-throughput target discovery across large populations.
ERIC Educational Resources Information Center
Kaplan, David
This paper offers recommendations to the National Center for Education Statistics (NCES) on the development of the background questionnaire for the National Assessment of Adult Literacy (NAAL). The recommendations are from the viewpoint of a researcher interested in applying sophisticated statistical models to address important issues in adult…
NASA Astrophysics Data System (ADS)
Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha
2015-01-01
Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or differences in fiber structure, between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm, or SAFIRA, which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors. This software will be freely disseminated to the neuroimaging research community.
Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren
2014-01-01
Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, B. P.; Valdez, C. A.; DeHope, A. J.
Critical to many modern forensic investigations is the chemical attribution of the origin of an illegal drug. This process relies greatly on the identification of compounds indicative of its clandestine or commercial production. The results of these studies can yield detailed information on the method of manufacture, the sophistication of the synthesis operation, the starting material source, and the final product. In the present work, chemical attribution signatures (CAS) associated with the synthesis of the analgesic 3-methylfentanyl, N-(3-methyl-1-phenethylpiperidin-4-yl)-N-phenylpropanamide, were investigated. Six synthesis methods were studied in an effort to identify and classify route-specific signatures. These methods were chosen to minimize the use of scheduled precursors, complicated laboratory equipment, the number of overall steps, and demanding reaction conditions. Using gas and liquid chromatographies combined with mass spectrometric methods (GC-QTOF and LC-QTOF) in conjunction with inductively coupled plasma mass spectrometry (ICP-MS), over 240 distinct compounds and elements were monitored. As seen in our previous work with CAS of fentanyl synthesis, the complexity of the resultant data matrix necessitated the use of multivariate statistical analysis. Using partial least squares discriminant analysis (PLS-DA), 62 statistically significant, route-specific CAS were identified. Statistical classification models using a variety of machine learning techniques were then developed with the ability to predict the method of 3-methylfentanyl synthesis from three blind crude samples generated by synthetic chemists without prior experience with these methods.
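A hedged, hypothetical sketch of the PLS-DA step named in the abstract, using scikit-learn. scikit-learn has no dedicated PLS-DA estimator, so the common workaround of regressing one-hot encoded class labels with PLSRegression is used; the feature matrix and labels below are synthetic stand-ins, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import LabelBinarizer

# Hypothetical feature matrix: 60 crude samples x 240 monitored signatures,
# produced by 3 synthesis routes (labels 0, 1, 2).
rng = np.random.default_rng(2)
X = rng.normal(size=(60, 240))
routes = rng.integers(0, 3, size=60)

# PLS-DA is commonly run as PLS regression against one-hot encoded classes.
Y = LabelBinarizer().fit_transform(routes)
pls = PLSRegression(n_components=5).fit(X, Y)

# Predict the route of a sample by taking the class with the largest
# predicted response.
scores = pls.predict(X[:3])
predicted_routes = scores.argmax(axis=1)
print(predicted_routes)
```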
The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.
Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L
2017-06-01
To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey from January 2014 to July 2015 to NHs in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) among 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimensions and domains. Least square means and Tukey's method were used for multiple comparisons. These methods yielded 815/1,799 completed surveys (45% response rate). In every health care domain (resident care, clinical support, and administrative activities), statistically significant differences in facility ITS occurred between more populated (metropolitan or micropolitan) and less populated (small town or rural) areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, results are encouraging as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.
Advances in borehole geophysics for hydrology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, P.H.
1982-01-01
Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.
The Sophistical Attitude and the Invention of Rhetoric
ERIC Educational Resources Information Center
Crick, Nathan
2010-01-01
Traditionally, the Older Sophists were conceived as philosophical skeptics who rejected speculative inquiry to focus on rhetorical methods of being successful in practical life. More recently, this view has been complicated by studies revealing the Sophists to be a diverse group of intellectuals who practiced their art prior to the categorization…
User-customized brain computer interfaces using Bayesian optimization
NASA Astrophysics Data System (ADS)
Bashashati, Hossein; Ward, Rabab K.; Bashashati, Ali
2016-04-01
Objective. The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject’s brain characteristics. Approach. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. Main Results. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Significance. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
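A hedged sketch of the general idea, assuming the third-party scikit-optimize package is available: Gaussian-process Bayesian optimisation over a few BCI hyper-parameters (frequency band and time window). The objective function is a synthetic stand-in for the cross-validated classification accuracy the authors optimise, not their pipeline.

```python
from skopt import gp_minimize
from skopt.space import Integer, Real

# Stand-in objective: in a real BCI pipeline this would extract features from
# the chosen frequency band and time window, train a classifier, and return
# the negative cross-validated accuracy. Here it is a synthetic surrogate
# that pretends accuracy peaks for an 8-30 Hz band and a 0.5-2.5 s window.
def objective(params):
    low_hz, high_hz, t_start, t_end = params
    if high_hz <= low_hz or t_end <= t_start:
        return 1.0  # invalid configuration -> worst possible score
    return (abs(low_hz - 8) + abs(high_hz - 30)) / 50.0 + abs(t_start - 0.5) + abs(t_end - 2.5)

space = [
    Integer(4, 20, name="low_hz"),
    Integer(21, 45, name="high_hz"),
    Real(0.0, 1.5, name="t_start"),
    Real(1.5, 4.0, name="t_end"),
]

# Gaussian-process Bayesian optimisation over the hyper-parameter space.
result = gp_minimize(objective, space, n_calls=30, random_state=0)
print("best hyper-parameters:", result.x, "objective:", result.fun)
```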
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
Naive vs. Sophisticated Methods of Forecasting Public Library Circulations.
ERIC Educational Resources Information Center
Brooks, Terrence A.
1984-01-01
Two sophisticated--autoregressive integrated moving average (ARIMA), straight-line regression--and two naive--simple average, monthly average--forecasting techniques were used to forecast monthly circulation totals of 34 public libraries. Comparisons of forecasts and actual totals revealed that ARIMA and monthly average methods had smallest mean…
A close examination of double filtering with fold change and t test in microarray analysis
2009-01-01
Background Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
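A minimal sketch of the double filtering procedure under discussion, on hypothetical expression data: genes are flagged only if they pass both a fold-change cutoff and a t-test cutoff.

```python
import numpy as np
from scipy import stats

# Hypothetical log2 expression matrix: 1000 genes x (5 control + 5 treated).
rng = np.random.default_rng(3)
control = rng.normal(loc=0.0, size=(1000, 5))
treated = rng.normal(loc=0.0, size=(1000, 5))
treated[:50] += 1.5  # 50 genes with a true shift

# Double filtering: require both a large fold change and a small t-test p-value.
log_fc = treated.mean(axis=1) - control.mean(axis=1)   # log2 fold change
t_stat, p_val = stats.ttest_ind(treated, control, axis=1)
flagged = (np.abs(log_fc) > 1.0) & (p_val < 0.05)
print("genes passing the double filter:", int(flagged.sum()))

# The paper's point: fold change implicitly assumes a common variance across
# genes while the t test assumes gene-specific variances, so combining them is
# ad hoc; shrinkage statistics (SAM, moderated t) reconcile the two assumptions.
```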
Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach.
Hofmans, Joeri
2017-01-01
A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories (in the form of the dynamic model of the psychological contract) and research methods (in the form of daily diary research and experience sampling research) are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models, the Zero-Inflated model and the Hurdle model, that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue.
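A hedged, hypothetical sketch of the zero-inflated variant described above, using statsmodels: a logit component for whether violation occurred and a Poisson component for the intensity of the feelings. The hurdle variant would replace the count part with a zero-truncated distribution.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedPoisson

# Hypothetical daily-diary data: intensity of violation feelings (many zeros)
# and a covariate such as perceived breach severity.
rng = np.random.default_rng(4)
n = 300
severity = rng.normal(size=n)
is_violated = rng.random(n) < 0.3                       # binary "event" component
intensity = np.where(is_violated, rng.poisson(np.exp(0.5 + 0.6 * severity)), 0)

exog = sm.add_constant(severity)

# Zero-inflated Poisson: a logit part models whether violation occurred at all,
# a Poisson part models how intense the feelings are when it did.
zip_model = ZeroInflatedPoisson(intensity, exog, exog_infl=exog, inflation="logit")
zip_fit = zip_model.fit(disp=False)
print(zip_fit.summary())
```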
Reproducible research in vadose zone sciences
USDA-ARS?s Scientific Manuscript database
A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...
DESIGN OF EXPOSURE MEASUREMENTS FOR EPIDEMIOLOGIC STUDIES
This presentation will describe the following items: (1) London daily air pollution and deaths that demonstrate how time series epidemiology can indicate that air pollution caused death; (2) Sophisticated statistical models required to establish this relationship for lower pollut...
Cancer concepts and principles: primer for the interventional oncologist-part I.
Hickey, Ryan; Vouche, Michael; Sze, Daniel Y; Hohlastos, Elias; Collins, Jeremy; Schirmang, Todd; Memon, Khairuddin; Ryu, Robert K; Sato, Kent; Chen, Richard; Gupta, Ramona; Resnick, Scott; Carr, James; Chrisman, Howard B; Nemcek, Albert A; Vogelzang, Robert L; Lewandowski, Robert J; Salem, Riad
2013-08-01
A sophisticated understanding of the rapidly changing field of oncology, including a broad knowledge of oncologic disease and the therapies available to treat them, is fundamental to the interventional radiologist providing oncologic therapies, and is necessary to affirm interventional oncology as one of the four pillars of cancer care alongside medical, surgical, and radiation oncology. The first part of this review intends to provide a concise overview of the fundamentals of oncologic clinical trials, including trial design, methods to assess therapeutic response, common statistical analyses, and the levels of evidence provided by clinical trials. Copyright © 2013 SIR. Published by Elsevier Inc. All rights reserved.
An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.
Undrill, P E; Frazer, S C
1979-01-01
A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
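A minimal sketch of the cumulative sum (cusum) idea mentioned in the abstract, with hypothetical control-serum results: deviations from the assigned target value are accumulated, and drift is flagged when the cusum crosses a decision limit.

```python
import numpy as np

# Hypothetical daily control-serum results for one analyte (target = 5.0 mmol/L).
results = np.array([5.1, 4.9, 5.0, 5.2, 5.3, 5.4, 5.4, 5.5, 5.6, 5.5])
target, sd = 5.0, 0.15           # assigned value and known analytical SD
decision_limit = 4 * sd          # illustrative decision interval

# Classical cusum: running sum of deviations from the target value.
cusum = np.cumsum(results - target)

# Flag the first day on which the cusum crosses the decision limit,
# signalling systematic drift rather than random analytical error.
drift_days = np.flatnonzero(np.abs(cusum) > decision_limit)
print("cusum:", np.round(cusum, 2))
print("drift first signalled on day:", drift_days[0] + 1 if drift_days.size else None)
```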
A Tutorial on Adaptive Design Optimization
Myung, Jay I.; Cavagnaro, Daniel R.; Pitt, Mark A.
2013-01-01
Experimentation is ubiquitous in the field of psychology and fundamental to the advancement of its science, and one of the biggest challenges for researchers is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. The recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial paper introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists to conduct “smart” experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond. PMID:23997275
Trutschel, Diana; Palm, Rebecca; Holle, Bernhard; Simon, Michael
2017-11-01
Because not every scientific question on effectiveness can be answered with randomised controlled trials, research methods that minimise bias in observational studies are required. Two major concerns influence the internal validity of effect estimates: selection bias and clustering. Hence, to reduce the bias of the effect estimates, more sophisticated statistical methods are needed. The aim is to introduce statistical approaches such as propensity score matching and mixed models into representative real-world analyses; the implementation in the statistical software R is presented so that the results can be reproduced. We perform a two-level analytic strategy to address the problems of bias and clustering: (i) generalised models with different abilities to adjust for dependencies are used to analyse binary data and (ii) the genetic matching and covariate adjustment methods are used to adjust for selection bias. Hence, we analyse the data from two population samples, the sample produced by the matching method and the full sample. The different analysis methods in this article present different results but still point in the same direction. In our example, the estimate of the probability of receiving a case conference is higher in the treatment group than in the control group. Both strategies, genetic matching and covariate adjustment, have their limitations but complement each other to provide the whole picture. The statistical approaches were feasible for reducing bias but were nevertheless limited by the sample used. For each study and obtained sample, the pros and cons of the different methods have to be weighed. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
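The study implements genetic matching and covariate adjustment in R; as a rough Python analogue of the simpler propensity-score step, the hypothetical sketch below estimates propensity scores by logistic regression and performs 1:1 nearest-neighbour matching (with replacement, without a caliper, for brevity).

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical observational data: treatment indicator and two confounders.
rng = np.random.default_rng(5)
n = 500
age = rng.normal(75, 8, n)
severity = rng.normal(0, 1, n)
treated = rng.random(n) < 1 / (1 + np.exp(-(0.03 * (age - 75) + 0.8 * severity)))
df = pd.DataFrame({"age": age, "severity": severity, "treated": treated})

# Step 1: estimate propensity scores with logistic regression.
ps_model = LogisticRegression().fit(df[["age", "severity"]], df["treated"])
df["ps"] = ps_model.predict_proba(df[["age", "severity"]])[:, 1]

# Step 2: match each treated unit to the control unit with the closest
# propensity score.
treated_df = df[df["treated"]]
control_df = df[~df["treated"]]
nn = NearestNeighbors(n_neighbors=1).fit(control_df[["ps"]])
_, idx = nn.kneighbors(treated_df[["ps"]])
matched_controls = control_df.iloc[idx.ravel()]

# The matched sample (treated_df + matched_controls) is then analysed with an
# outcome model, e.g. a mixed model to respect clustering within facilities.
print(len(treated_df), "treated matched to", len(matched_controls), "controls")
```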
de Sá, Joceline Cássia Ferezini; Marini, Gabriela; Gelaleti, Rafael Bottaro; da Silva, João Batista; de Azevedo, George Gantas; Rudge, Marilza Vieira Cunha
2013-11-01
To evaluate the methodological and statistical design evolution of the publications in the Brazilian Journal of Gynecology and Obstetrics (RBGO) since resolution 196/96. A review of 133 articles published in 1999 (65) and 2009 (68) was performed by two independent reviewers with training in clinical epidemiology and methodology of scientific research. We included all original clinical articles and case and series reports, and excluded editorials, letters to the editor, systematic reviews, experimental studies, opinion articles, and abstracts of theses and dissertations. Characteristics related to the methodological quality of the studies were analyzed in each article using a checklist that evaluated two criteria: methodological aspects and statistical procedures. We used descriptive statistics and the χ2 test for comparison of the two years. There was a difference between 1999 and 2009 regarding study and statistical design, with greater accuracy in the procedures and the use of more robust tests in 2009. In RBGO, we observed an evolution in the methods of published articles and a more in-depth use of statistical analyses, with more sophisticated tests such as regression and multilevel analyses, which are essential techniques for the knowledge and planning of health interventions, leading to fewer interpretation errors.
NASA Technical Reports Server (NTRS)
Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.
2013-01-01
Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post-classification steps. Within this chapter, each of the four approaches is described in terms of scale and accuracy in classifying urban land use and urban land cover, and for its range of urban applications. We present an overview of the four main classification groups in Figure 1, while Table 1 details the approaches with respect to classification requirements and procedures (e.g., reflectance conversion, steps before training sample selection, training samples, spatial approaches commonly used, classifiers, primary inputs for classification, output structures, number of output layers, and accuracy assessment). The chapter concludes with a brief summary of the methods reviewed and the challenges that remain in developing new classification methods for improving the efficiency and accuracy of mapping urban areas.
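As a hedged illustration of the first group reviewed here, the sketch below runs a per-pixel workflow on a synthetic stand-in for a four-band image: a simple index (NDVI) and a supervised classifier applied to each pixel's spectral values alone; the training labels are hypothetical placeholders for reference data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a 4-band image (blue, green, red, NIR), 100 x 100 pixels.
rng = np.random.default_rng(6)
image = rng.random((100, 100, 4))

# A simple per-pixel index: NDVI = (NIR - red) / (NIR + red).
red, nir = image[..., 2], image[..., 3]
ndvi = (nir - red) / (nir + red + 1e-9)

# Per-pixel supervised classification: each pixel is a sample described only by
# its own spectral values (no neighbourhood or object information).
pixels = image.reshape(-1, 4)
train_idx = rng.choice(pixels.shape[0], size=200, replace=False)
labels = rng.integers(0, 3, size=200)   # hypothetical training labels from reference data
clf = RandomForestClassifier(n_estimators=100).fit(pixels[train_idx], labels)
land_cover_map = clf.predict(pixels).reshape(100, 100)

print(ndvi.mean(), np.bincount(land_cover_map.ravel()))
```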
Protein Structure Classification and Loop Modeling Using Multiple Ramachandran Distributions.
Najibi, Seyed Morteza; Maadooliat, Mehdi; Zhou, Lan; Huang, Jianhua Z; Gao, Xin
2017-01-01
Recently, the study of protein structures using angular representations has attracted much attention among structural biologists. The main challenge is how to efficiently model the continuous conformational space of the protein structures based on the differences and similarities between different Ramachandran plots. Despite the presence of statistical methods for modeling angular data of proteins, there is still a substantial need for more sophisticated and faster statistical tools to model the large-scale circular datasets. To address this need, we have developed a nonparametric method for collective estimation of multiple bivariate density functions for a collection of populations of protein backbone angles. The proposed method takes into account the circular nature of the angular data using trigonometric splines, which are more efficient than existing methods. This collective density estimation approach is widely applicable when there is a need to estimate multiple density functions from different populations with common features. Moreover, the coefficients of the adaptive basis expansion for the fitted densities provide a low-dimensional representation that is useful for visualization, clustering, and classification of the densities. The proposed method provides a novel and unique perspective on two important and challenging problems in protein structure research: structure-based protein classification and angular-sampling-based protein loop structure prediction.
STRengthening Analytical Thinking for Observational Studies: the STRATOS initiative
Sauerbrei, Willi; Abrahamowicz, Michal; Altman, Douglas G; le Cessie, Saskia; Carpenter, James
2014-01-01
The validity and practical utility of observational medical research depends critically on good study design, excellent data quality, appropriate statistical methods and accurate interpretation of results. Statistical methodology has seen substantial development in recent times. Unfortunately, many of these methodological developments are ignored in practice. Consequently, design and analysis of observational studies often exhibit serious weaknesses. The lack of guidance on vital practical issues discourages many applied researchers from using more sophisticated and possibly more appropriate methods when analyzing observational studies. Furthermore, many analyses are conducted by researchers with a relatively weak statistical background and limited experience in using statistical methodology and software. Consequently, even ‘standard’ analyses reported in the medical literature are often flawed, casting doubt on their results and conclusions. An efficient way to help researchers to keep up with recent methodological developments is to develop guidance documents that are spread to the research community at large. These observations led to the initiation of the strengthening analytical thinking for observational studies (STRATOS) initiative, a large collaboration of experts in many different areas of biostatistical research. The objective of STRATOS is to provide accessible and accurate guidance in the design and analysis of observational studies. The guidance is intended for applied statisticians and other data analysts with varying levels of statistical education, experience and interests. In this article, we introduce the STRATOS initiative and its main aims, present the need for guidance documents and outline the planned approach and progress so far. We encourage other biostatisticians to become involved. PMID:25074480
WebArray: an online platform for microarray data analysis
Xia, Xiaoqin; McClelland, Michael; Wang, Yipeng
2005-01-01
Background Many cutting-edge microarray analysis tools and algorithms, including the commonly used limma and affy packages in Bioconductor, need sophisticated knowledge of mathematics, statistics and computer skills for implementation. Commercially available software can provide a user-friendly interface at considerable cost. To facilitate the use of these tools for microarray data analysis on an open platform, we developed an online microarray data analysis platform, WebArray, for bench biologists to utilize these tools to explore data from single/dual color microarray experiments. Results The currently implemented functions are based on the limma and affy packages from Bioconductor, the spacings LOESS histogram (SPLOSH) method, a PCA-assisted normalization method and a genome mapping method. WebArray incorporates these packages and provides a user-friendly interface for accessing a wide range of key functions of limma and others, such as spot quality weighting, background correction, graphical plotting, normalization, linear modeling, empirical Bayes statistical analysis, false discovery rate (FDR) estimation, and chromosomal mapping for genome comparison. Conclusion WebArray offers a convenient platform for bench biologists to access several cutting-edge microarray data analysis tools. The website is freely available at . It runs on a Linux server with Apache and MySQL. PMID:16371165
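WebArray itself wraps R/Bioconductor tools (limma, affy). Purely as an illustration of the FDR-estimation step mentioned above, here is a minimal, generic Benjamini-Hochberg sketch; it is not WebArray's own code, and the per-gene p-values are hypothetical.

```python
# Minimal Benjamini-Hochberg FDR sketch (generic; not WebArray's implementation).
import numpy as np

def bh_adjust(pvalues):
    """Return Benjamini-Hochberg adjusted p-values (q-values)."""
    p = np.asarray(pvalues, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)
    # Enforce monotonicity from the largest p-value downwards, then cap at 1.
    ranked = np.minimum.accumulate(ranked[::-1])[::-1]
    adjusted = np.empty(m)
    adjusted[order] = np.minimum(ranked, 1.0)
    return adjusted

# Hypothetical per-gene p-values from a differential-expression test.
qvals = bh_adjust([0.001, 0.008, 0.039, 0.041, 0.20, 0.74])
significant = qvals < 0.05
```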
High-throughput electrical measurement and microfluidic sorting of semiconductor nanowires.
Akin, Cevat; Feldman, Leonard C; Durand, Corentin; Hus, Saban M; Li, An-Ping; Hui, Ho Yee; Filler, Michael A; Yi, Jingang; Shan, Jerry W
2016-05-24
Existing nanowire electrical characterization tools not only are expensive and require sophisticated facilities, but are far too slow to enable statistical characterization of highly variable samples. They are also generally not compatible with further sorting and processing of nanowires. Here, we demonstrate a high-throughput, solution-based electro-orientation-spectroscopy (EOS) method, which is capable of automated electrical characterization of individual nanowires by direct optical visualization of their alignment behavior under spatially uniform electric fields of different frequencies. We demonstrate that EOS can quantitatively characterize the electrical conductivities of nanowires over a 6-order-of-magnitude range (10^-5 to 10 S m^-1, corresponding to typical carrier densities of 10^10-10^16 cm^-3), with different fluids used to suspend the nanowires. By implementing EOS in a simple microfluidic device, continuous electrical characterization is achieved, and the sorting of nanowires is demonstrated as a proof-of-concept. With measurement speeds two orders of magnitude faster than direct-contact methods, the automated EOS instrument enables for the first time the statistical characterization of highly variable 1D nanomaterials.
Vomweg, T W; Buscema, M; Kauczor, H U; Teifke, A; Intraligi, M; Terzi, S; Heussel, C P; Achenbach, T; Rieker, O; Mayer, D; Thelen, M
2003-09-01
The aim of this study was to evaluate the capability of improved artificial neural networks (ANN) and additional novel training methods in distinguishing between benign and malignant breast lesions in contrast-enhanced magnetic resonance-mammography (MRM). A total of 604 histologically proven cases of contrast-enhanced lesions of the female breast at MRI were analyzed. Morphological, dynamic and clinical parameters were collected and stored in a database. The data set was divided into several groups using random or experimental methods [Training & Testing (T&T) algorithm] to train and test different ANNs. An additional novel computer program for input variable selection was applied. Sensitivity and specificity were calculated and compared with a statistical method and an expert radiologist. After optimization of the distribution of cases among the training and testing sets by the T&T algorithm and the reduction of input variables by the Input Selection procedure, a highly sophisticated ANN achieved a sensitivity of 93.6% and a specificity of 91.9% in predicting malignancy of lesions within an independent prediction sample set. The best statistical method reached a sensitivity of 90.5% and a specificity of 68.9%. An expert radiologist performed better than the statistical method but worse than the ANN (sensitivity 92.1%, specificity 85.6%). Features extracted from dynamic contrast-enhanced MRM and additional clinical data can be successfully analyzed by advanced ANNs. The quality of the resulting network strongly depends on the training methods, which are improved by the use of novel training tools. The best results of an improved ANN outperform expert radiologists.
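As an illustration of the performance metrics quoted above, here is a minimal sketch of the sensitivity/specificity calculation; the confusion-matrix counts are hypothetical and are not the study's data.

```python
# Sketch of the sensitivity/specificity calculation used to compare classifiers;
# the confusion-matrix counts below are hypothetical, not the study's results.
def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # proportion of malignant lesions correctly flagged
    specificity = tn / (tn + fp)   # proportion of benign lesions correctly cleared
    return sensitivity, specificity

sens, spec = sensitivity_specificity(tp=88, fn=6, tn=91, fp=8)
```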
Australian Curriculum Linked Lessons: Statistics
ERIC Educational Resources Information Center
Day, Lorraine
2014-01-01
Students recognise and analyse data and draw inferences. They represent, summarise and interpret data and undertake purposeful investigations involving the collection and interpretation of data… They develop an increasingly sophisticated ability to critically evaluate chance and data concepts and make reasoned judgments and decisions, as well as…
Narayanan, Roshni; Nugent, Rebecca; Nugent, Kenneth
2015-10-01
Accreditation Council for Graduate Medical Education guidelines require internal medicine residents to develop skills in the interpretation of medical literature and to understand the principles of research. A necessary component is the ability to understand the statistical methods used and their results, material that is not an in-depth focus of most medical school curricula and residency programs. Given the breadth and depth of the current medical literature and an increasing emphasis on complex, sophisticated statistical analyses, the statistical foundation and education necessary for residents are uncertain. We reviewed the statistical methods and terms used in 49 articles discussed at the journal club in the Department of Internal Medicine residency program at Texas Tech University between January 1, 2013 and June 30, 2013. We collected information on the study type and on the statistical methods used for summarizing and comparing samples, determining the relations between independent variables and dependent variables, and estimating models. We then identified the typical statistics education level at which each term or method is learned. A total of 14 articles came from the Journal of the American Medical Association Internal Medicine, 11 from the New England Journal of Medicine, 6 from the Annals of Internal Medicine, 5 from the Journal of the American Medical Association, and 13 from other journals. Twenty reported randomized controlled trials. Summary statistics included mean values (39 articles), category counts (38), and medians (28). Group comparisons were based on t tests (14 articles), χ2 tests (21), and nonparametric ranking tests (10). The relations between dependent and independent variables were analyzed with simple regression (6 articles), multivariate regression (11), and logistic regression (8). Nine studies reported odds ratios with 95% confidence intervals, and seven analyzed test performance using sensitivity and specificity calculations. These papers used 128 statistical terms and context-defined concepts, including some from data analysis (56), epidemiology-biostatistics (31), modeling (24), data collection (12), and meta-analysis (5). Ten different software programs were used in these articles. Based on usual undergraduate and graduate statistics curricula, 64.3% of the concepts and methods used in these papers required at least a master's degree-level statistics education. The interpretation of the current medical literature can require an extensive background in statistical methods at an education level exceeding the material and resources provided to most medical students and residents. Given the complexity and time pressure of medical education, these deficiencies will be hard to correct, but this project can serve as a basis for developing a curriculum in study design and statistical methods needed by physicians-in-training.
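To make the inventory above concrete, here is a brief sketch of two of the most frequently reported analyses (a Welch two-sample t-test and a chi-square test of independence) on hypothetical data; it is illustrative only and unrelated to the reviewed articles.

```python
# Two of the most frequently reported analyses, on hypothetical data:
# a two-sample (Welch) t-test and a chi-square test of independence.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
treatment = rng.normal(loc=5.2, scale=1.1, size=60)   # e.g., days in hospital
control = rng.normal(loc=5.8, scale=1.3, size=60)
t_stat, t_p = stats.ttest_ind(treatment, control, equal_var=False)

# 2x2 table: rows = treatment/control, columns = event yes/no.
table = np.array([[12, 48],
                  [21, 39]])
chi2, chi2_p, dof, expected = stats.chi2_contingency(table)
```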
NASA Technical Reports Server (NTRS)
Parrish, R. V.; Steinmetz, G. G.
1972-01-01
A method of parameter extraction for stability and control derivatives of aircraft from flight test data, implementing maximum likelihood estimation, has been developed and successfully applied to actual lateral flight test data from a modern sophisticated jet fighter. This application demonstrates the important role played by the analyst in combining engineering judgment and estimator statistics to yield meaningful results. During the analysis, the problems of uniqueness of the extracted set of parameters and of longitudinal coupling effects were encountered and resolved. The results for all flight runs are presented in tabular form and as time history comparisons between the estimated states and the actual flight test data.
Lo Presti, Rossella; Barca, Emanuele; Passarella, Giuseppe
2010-01-01
Environmental time series are often affected by the presence of missing data, but when dealing with the data statistically, the need to fill in the gaps by estimating the missing values must be considered. At present, a large number of statistical techniques are available to achieve this objective; they range from very simple methods, such as using the sample mean, to very sophisticated ones, such as multiple imputation. A new methodology for missing data estimation is proposed, which tries to merge the obvious advantages of the simplest techniques (e.g. their ease of implementation) with the strength of the newest techniques. The proposed method consists of the application of two consecutive stages: once it has been ascertained that a specific monitoring station is affected by missing data, the "most similar" monitoring stations are identified among neighbouring stations on the basis of a suitable similarity coefficient; in the second stage, a regressive method is applied in order to estimate the missing data. In this paper, four different regressive methods are applied and compared, in order to determine which is the most reliable for filling in the gaps, using rainfall data series measured in the Candelaro River Basin located in southern Italy.
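A minimal sketch of the two-stage idea described above: pick the most similar neighbouring station and then regress. The similarity coefficient here is plain Pearson correlation and the regression is simple linear, which is only one possible variant of the regressive methods the authors compare, so treat this as an assumption-laden illustration rather than the paper's procedure.

```python
# Minimal sketch of the two-stage gap-filling idea: choose the most similar
# neighbouring station (here: Pearson correlation on overlapping records),
# then fill gaps by simple linear regression. Not the authors' exact method.
import numpy as np

def fill_gaps(target, neighbours):
    """target: 1D array with np.nan gaps; neighbours: dict of name -> 1D array."""
    filled = target.copy()
    observed = ~np.isnan(target)

    # Stage 1: most similar station by correlation over the common record.
    def corr(series):
        ok = observed & ~np.isnan(series)
        return np.corrcoef(target[ok], series[ok])[0, 1]
    best = max(neighbours, key=lambda name: corr(neighbours[name]))
    donor = neighbours[best]

    # Stage 2: regress target on donor, then predict the missing values.
    ok = observed & ~np.isnan(donor)
    slope, intercept = np.polyfit(donor[ok], target[ok], deg=1)
    gaps = np.isnan(target) & ~np.isnan(donor)
    filled[gaps] = slope * donor[gaps] + intercept
    return filled, best
```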
Attitudes about high school physics in relationship to gender and ethnicity: A mixed method analysis
NASA Astrophysics Data System (ADS)
Hafza, Rabieh Jamal
There is an achievement gap and lack of participation in science, technology, engineering, and math (STEM) by minority females. The number of minority females majoring in STEM related fields and earning advanced degrees in these fields has not significantly increased over the past 40 years. Previous research has evaluated the relationship between self-identity concept and factors that promote the academic achievement as well the motivation of students to study different subject areas. This study examined the interaction between gender and ethnicity in terms of physics attitudes in the context of real world connections, personal interest, sense making/effort, problem solving confidence, and problem solving sophistication. The Colorado Learning Attitudes about Science Survey (CLASS) was given to 131 students enrolled in physics classes. There was a statistically significant Gender*Ethnicity interaction for attitude in the context of Real World Connections, Personal Interest, Sense Making/Effort, Problem Solving Confidence, and Problem Solving Sophistication as a whole. There was also a statistically significant Gender*Ethnicity interaction for attitude in the context of Real World Connections, Personal Interest, and Sense Making/Effort individually. Five Black females were interviewed to triangulate the quantitative results and to describe the experiences of minority females taking physics classes. There were four themes that emerged from the interviews and supported the findings from the quantitative results. The data supported previous research done on attitudes about STEM. The results reported that Real World Connections and Personal Interest could be possible factors that explain the lack of participation and achievement gaps that exists among minority females.
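As an illustration of the kind of Gender*Ethnicity interaction test reported above, here is a minimal two-way ANOVA sketch with statsmodels; the data frame and attitude score are simulated stand-ins, not the CLASS survey data.

```python
# Sketch of a Gender*Ethnicity interaction test on an attitude score
# (hypothetical data; not the CLASS survey itself).
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "gender": rng.choice(["F", "M"], size=131),
    "ethnicity": rng.choice(["Black", "Hispanic", "White"], size=131),
    "attitude": rng.normal(60, 10, size=131),
})
model = ols("attitude ~ C(gender) * C(ethnicity)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # interaction row tests Gender*Ethnicity
```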
Yin, Ke; Dou, Xiaomin; Ren, Fei; Chan, Wei-Ping; Chang, Victor Wei-Chung
2018-02-15
Bottom ashes generated from municipal solid waste incineration have gained increasing popularity as alternative construction materials; however, they contain elevated levels of heavy metals, posing a challenge for their unrestricted use. Different leaching methods have been developed to quantify the leaching potential of incineration bottom ashes and to guide their environmentally friendly application. Yet IBA applications are diverse and the in situ environment is often complicated, which challenges regulation. In this study, leaching tests were conducted using batch and column leaching methods with seawater as opposed to deionized water, to unveil the metal leaching potential of IBA subjected to a salty environment, which is commonly encountered when using IBA in land reclamation yet not well understood. Statistical analysis for the different leaching methods suggested disparate performance between seawater and deionized water, ascribed primarily to ionic strength. Impacts of the leachant are metal-specific, depend on the leaching method, and are a function of the intrinsic characteristics of incineration bottom ashes. Leaching performances were further compared from additional perspectives, e.g. leaching approach and liquid-to-solid ratio, indicating sophisticated leaching potentials dominated by combined geochemistry. It is necessary to develop application-oriented leaching methods with corresponding leaching criteria to preclude discriminations between different applications, e.g., terrestrial applications vs. land reclamation. Copyright © 2017 Elsevier B.V. All rights reserved.
Enrollment Projections: Template and Guide.
ERIC Educational Resources Information Center
Findlen, George L.
Small community colleges with enrollments between 500 and 2,500 students have traditionally been unable to afford to hire an institutional researcher or to lease sophisticated statistical packages to perform enrollment analyses, though their needs for enrollment projections are the same as those of larger institutions. Fortunately, with a personal…
ERIC Educational Resources Information Center
Hovey, Sheryl
Statistics indicating that the problem of illiteracy is lessening mask a greater problem--that of functional illiteracy. Functional illiterates may have some reading and writing skills but are not able to apply them as functioning members of society. A 1975 study using the most sophisticated instrument that had ever been used to determine…
The Emergence of Contextual Social Psychology.
Pettigrew, Thomas F
2018-07-01
Social psychology experiences recurring so-called "crises." This article maintains that these episodes actually mark advances in the discipline; these "crises" have enhanced relevance and led to greater methodological and statistical sophistication. New statistical tools have allowed social psychologists to begin to achieve a major goal: placing psychological phenomena in their larger social contexts. This growing trend is illustrated with numerous recent studies; they demonstrate how cultures and social norms moderate basic psychological processes. Contextual social psychology is finally emerging.
Composite Load Spectra for Select Space Propulsion Structural Components
NASA Technical Reports Server (NTRS)
Ho, Hing W.; Newell, James F.
1994-01-01
Generic load models are described with multiple levels of progressive sophistication to simulate the composite (combined) load spectra (CLS) that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for the individual load components (dynamic, acoustic, high-pressure, high-rotational-speed, etc.) using statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods with and without strategically selected experimental data. The entire simulation process is included in a CLS computer code. Applications of the computer code to various components in conjunction with the PSAM (Probabilistic Structural Analysis Method) to perform probabilistic load evaluation and life prediction evaluations are also described to illustrate the effectiveness of the coupled model approach.
NASA Technical Reports Server (NTRS)
Kinne, S.; Wiscombe, Warren; Einaudi, Franco (Technical Monitor)
2001-01-01
Understanding the effect of aerosol on cloud systems is one of the major challenges in atmospheric and climate research. Local studies suggest a multitude of influences on cloud properties. Yet the overall effect on cloud albedo, a critical parameter in climate simulations, remains uncertain. NASA's Triana mission will provide, from its EPIC multi-spectral imager, simultaneous data on aerosol properties and cloud reflectivity. With Triana's unique position in space these data will be available not only globally but also over the entire daytime, well suited to accommodate the often short lifetimes of aerosol and investigations around diurnal cycles. This pilot study explores the ability to detect relationships between aerosol properties and cloud reflectivity with sophisticated statistical methods. Sample results using data from the EOS Terra platform to simulate Triana are presented.
Composite load spectra for select space propulsion structural components
NASA Technical Reports Server (NTRS)
Newell, J. F.; Kurth, R. E.; Ho, H.
1991-01-01
The objective of this program is to develop generic load models with multiple levels of progressive sophistication to simulate the composite (combined) load spectra that are induced in space propulsion system components, representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, liquid oxygen posts and system ducting. The first approach will consist of using state-of-the-art probabilistic methods to describe the individual loading conditions and combinations of these loading conditions to synthesize the composite load spectra simulation. The second approach will consist of developing coupled models for composite load spectra simulation which combine deterministic models for the individual load components (dynamic, acoustic, high-pressure, high-rotational-speed, etc.) using statistically varying coefficients. These coefficients will then be determined using advanced probabilistic simulation methods with and without strategically selected experimental data.
A Statistical Study on Higher Educational Institutions in India
ERIC Educational Resources Information Center
Neelaveni, C.; Manimaran, S.
2014-01-01
This study aims to observe the increased effectiveness of Higher Educational Institutions in India and its competitiveness. It proposes to develop the interest in enhancing the quality in Educational Institutions. It is monitored and evaluated through rapid growth of information technology, which makes sophisticated data collection possible. This…
Societal Boundaries on Cybernetic Action or Decision-Making.
ERIC Educational Resources Information Center
Schulman, Rosalind; Steg, Doreen E.
This paper discusses the development, application, and implications of a statistical technique--a concordance index--for measuring the restrictions and constrictions (legal and societal) which inhibit individual decision making and adapting behavior. It was found that as sophistication sets in there will be less and less tolerance of these…
A Large-Scale Analysis of Variance in Written Language
ERIC Educational Resources Information Center
Johns, Brendan T.; Jamieson, Randall K.
2018-01-01
The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers,…
Student Ratings: The Validity of Use.
ERIC Educational Resources Information Center
McKeachie, Wilbert J.
1997-01-01
Concludes that there is concurrence on the validity of student ratings but that contextual variables affect the level of ratings. However, there is disagreement on the use of statistical corrections for such bias. The basic problem lies in the lack of sophistication of personnel committees who use the ratings. (MMU)
ERIC Educational Resources Information Center
Murakami, Akira
2016-01-01
This article introduces two sophisticated statistical modeling techniques that allow researchers to analyze systematicity, individual variation, and nonlinearity in second language (L2) development. Generalized linear mixed-effects models can be used to quantify individual variation and examine systematic effects simultaneously, and generalized…
Goldfarb, Samantha; Tarver, Will L; Sen, Bisakha
2014-01-01
Previous literature has asserted that family meals are a key protective factor for certain adolescent risk behaviors. It is suggested that the frequency of eating with the family is associated with better psychological well-being and a lower risk of substance use and delinquency. However, it is unclear whether there is evidence of causal links between family meals and adolescent health-risk behaviors. The purpose of this article is to review the empirical literature on family meals and adolescent health behaviors and outcomes in the US. A search was conducted in four academic databases: Social Sciences Full Text, Sociological Abstracts, PsycINFO®, and PubMed/MEDLINE. We included studies that quantitatively estimated the relationship between family meals and health-risk behaviors. Data were extracted on study sample, study design, family meal measurement, outcomes, empirical methods, findings, and major issues. Fourteen studies met the inclusion criteria for the review that measured the relationship between frequent family meals and various risk-behavior outcomes. The outcomes considered by most studies were alcohol use (n=10), tobacco use (n=9), and marijuana use (n=6). Other outcomes included sexual activity (n=2); depression, suicidal ideation, and suicide attempts (n=4); violence and delinquency (n=4); school-related issues (n=2); and well-being (n=5). The associations between family meals and the outcomes of interest were most likely to be statistically significant in unadjusted models or models controlling for basic family characteristics. Associations were less likely to be statistically significant when other measures of family connectedness were included. Relatively few analyses used sophisticated empirical techniques available to control for confounders in secondary data. More research is required to establish whether or not the relationship between family dinners and risky adolescent behaviors is an artifact of underlying confounders. We recommend that researchers make more frequent use of sophisticated methods to reduce the problem of confounders in secondary data, and that the scope of adolescent problem behaviors also be further widened.
NASA Astrophysics Data System (ADS)
Davids, J. C.; Rutten, M.; Van De Giesen, N.
2016-12-01
Hydrologic data has traditionally been collected with permanent installations of sophisticated and relatively accurate but expensive monitoring equipment at limited numbers of sites. Consequently, the spatial coverage of the data is limited and costs are high. Achieving adequate maintenance of sophisticated monitoring equipment often exceeds local technical and resource capacity, and permanently deployed monitoring equipment is susceptible to vandalism, theft, and other hazards. Rather than using expensive, vulnerable installations at a few points, SmartPhones4Water (S4W), a form of Citizen Hydrology, leverages widely available mobile technology to gather hydrologic data at many sites in a manner that is repeatable and scalable. However, there is currently a limited understanding of the impact of decreased observational frequency on the accuracy of key streamflow statistics like minimum flow, maximum flow, and runoff. As a first step towards evaluating the tradeoffs between traditional continuous monitoring approaches and emerging Citizen Hydrology methods, we randomly selected 50 active U.S. Geological Survey (USGS) streamflow gauges in California. We used historical 15 minute flow data from 01/01/2008 through 12/31/2014 to develop minimum flow, maximum flow, and runoff values (7 year total) for each gauge. In order to mimic lower frequency Citizen Hydrology observations, we developed a bootstrap randomized subsampling with replacement procedure. We calculated the same statistics, along with their respective distributions, from 50 subsample iterations with four different subsampling intervals (i.e. daily, three day, weekly, and monthly). Based on our results we conclude that, depending on the types of questions being asked, and the watershed characteristics, Citizen Hydrology streamflow measurements can provide useful and accurate information. Depending on watershed characteristics, minimum flows were reasonably estimated with subsample intervals ranging from daily to monthly. However, maximum flows in most cases were poorly characterized, even at daily subsample intervals. In general, runoff volumes were accurately estimated from daily, three day, weekly, and even in some cases, monthly observations.
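One plausible sketch of the subsampling experiment described above (the authors' exact bootstrap scheme may differ): draw one 15-minute reading at random from each window, recompute minimum, maximum, and a runoff-volume proxy, and repeat. The flow series, window sizes, and iteration count are hypothetical.

```python
# One plausible sketch of the subsampling experiment (the authors' exact
# bootstrap scheme may differ): draw one 15-minute reading at random from
# each window (e.g., each day), recompute the statistics, and repeat.
import numpy as np

def subsample_stats(flow, readings_per_window, n_iter=50, seed=0):
    """flow: 1D array of 15-minute discharges; returns arrays of min/max/total."""
    rng = np.random.default_rng(seed)
    n_windows = flow.size // readings_per_window
    windows = flow[: n_windows * readings_per_window].reshape(n_windows, readings_per_window)
    mins, maxs, totals = [], [], []
    for _ in range(n_iter):
        picks = windows[np.arange(n_windows), rng.integers(0, readings_per_window, n_windows)]
        mins.append(picks.min())
        maxs.append(picks.max())
        # Scale the subsampled mean back up to a full-record volume estimate.
        totals.append(picks.mean() * flow.size)
    return np.array(mins), np.array(maxs), np.array(totals)

# Daily subsampling of a 15-minute record: 96 readings per day.
flow_15min = np.abs(np.random.default_rng(7).normal(10, 4, size=96 * 365))
daily_min, daily_max, daily_runoff = subsample_stats(flow_15min, readings_per_window=96)
```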
Analysing attitude data through ridit schemes.
El-rouby, M G
1994-12-02
The attitudes of individuals and populations on various issues are usually assessed through sample surveys. Responses to survey questions are then scaled and combined into a meaningful whole which defines the measured attitude. The applied scales may be of nominal, ordinal, interval, or ratio nature depending upon the degree of sophistication the researcher wants to introduce into the measurement. This paper discusses methods of analysis for categorical variables of the type used in attitude and human behavior research, and recommends adoption of ridit analysis, a technique which has been successfully applied to epidemiological, clinical investigation, laboratory, and microbiological data. The ridit methodology is described after reviewing some general attitude scaling methods and problems of analysis related to them. The ridit method is then applied to a recent study conducted to assess health care service quality in North Carolina. This technique is conceptually and computationally more simple than other conventional statistical methods, and is also distribution-free. Basic requirements and limitations on its use are indicated.
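A minimal sketch of the basic ridit calculation: each ordered category of the reference group gets a ridit equal to the cumulative proportion below it plus half the proportion within it, and the mean ridit of a comparison group summarizes how its responses shift relative to the reference. The category frequencies below are hypothetical.

```python
# Minimal ridit sketch: reference-group ridits and the mean ridit of a
# comparison group (hypothetical ordered-category frequencies).
import numpy as np

def ridits(reference_counts):
    p = np.asarray(reference_counts, dtype=float)
    p = p / p.sum()
    below = np.concatenate(([0.0], np.cumsum(p)[:-1]))
    return below + 0.5 * p     # ridit of each ordered category

def mean_ridit(comparison_counts, reference_counts):
    r = ridits(reference_counts)
    q = np.asarray(comparison_counts, dtype=float)
    return float((r * q / q.sum()).sum())   # about 0.5 means "similar to reference"

reference = [10, 25, 40, 20, 5]     # e.g., "very poor" ... "very good"
comparison = [4, 15, 35, 30, 16]
print(mean_ridit(comparison, reference))   # > 0.5 => shifted toward higher categories
```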
NASA Astrophysics Data System (ADS)
Byrd, Gene G.; Byrd, Dana
2017-06-01
This paper on improving Ay101 courses has two main purposes: (1) to present some very effective single changes, and (2) to present a method for improving teaching by making just one change at a time and evaluating it statistically against a control-group class. We show how a simple statistical comparison can be done even with Excel in Windows; of course, other more sophisticated and powerful methods could be used if available. One of several examples to be discussed on our poster is our modification of an online introductory astronomy lab course, evaluated by the multiple-choice final exam. We composed questions related to the learning objectives of the course modules (LOQs). Students could “talk to themselves” by discursively answering these for extra credit prior to the final. Results were compared to an otherwise identical, unmodified previous class. Modified classes showed statistically much better final exam average scores (78% vs. 66%). This modification helped those students who most need help: students in the lower third of the class preferentially answered the LOQs, improving their scores and the class average on the exam. These results also show the effectiveness of relevant extra-credit work. Other examples will be discussed as specific instances of evaluating improvement by making one change and then testing it versus a control. Essentially, this is an evolutionary approach in which single favorable “mutations” are retained and the unfavorable ones removed. The temptation to make more than one change each time must be resisted!
Changing Epistemological Beliefs: The Unexpected Impact of a Short-Term Intervention
ERIC Educational Resources Information Center
Kienhues, Dorothe; Bromme, Rainer; Stahl, Elmar
2008-01-01
Background: Previous research has shown that sophisticated epistemological beliefs exert a positive influence on students' learning strategies and learning outcomes. This gives a clear educational relevance to studies on the development of methods for promoting a change in epistemological beliefs and making them more sophisticated. Aims: To…
ERIC Educational Resources Information Center
Lee, Hyeon Woo
2011-01-01
As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…
Sun, Xing; Li, Xiaoyun; Chen, Cong; Song, Yang
2013-01-01
Frequent rise of interval-censored time-to-event data in randomized clinical trials (e.g., progression-free survival [PFS] in oncology) challenges statistical researchers in the pharmaceutical industry in various ways. These challenges exist in both trial design and data analysis. Conventional statistical methods treating intervals as fixed points, which are generally practiced by pharmaceutical industry, sometimes yield inferior or even flawed analysis results in extreme cases for interval-censored data. In this article, we examine the limitation of these standard methods under typical clinical trial settings and further review and compare several existing nonparametric likelihood-based methods for interval-censored data, methods that are more sophisticated but robust. Trial design issues involved with interval-censored data comprise another topic to be explored in this article. Unlike right-censored survival data, expected sample size or power for a trial with interval-censored data relies heavily on the parametric distribution of the baseline survival function as well as the frequency of assessments. There can be substantial power loss in trials with interval-censored data if the assessments are very infrequent. Such an additional dependency controverts many fundamental assumptions and principles in conventional survival trial designs, especially the group sequential design (e.g., the concept of information fraction). In this article, we discuss these fundamental changes and available tools to work around their impacts. Although progression-free survival is often used as a discussion point in the article, the general conclusions are equally applicable to other interval-censored time-to-event endpoints.
Optical turbulence forecast: ready for an operational application
NASA Astrophysics Data System (ADS)
Masciadri, E.; Lascaux, F.; Turchi, A.; Fini, L.
2017-04-01
One of the main goals of the feasibility study MOSE (MOdelling ESO Sites) is to evaluate the performance of a method conceived to forecast the optical turbulence (OT) above the European Southern Observatory (ESO) sites of the Very Large Telescope (VLT) and the European Extremely Large Telescope (E-ELT) in Chile. The method relies on a dedicated code conceived for the OT called ASTRO-MESO-NH. In this paper, we present results obtained at the conclusion of this project concerning the performance of this method in forecasting the most relevant parameters related to the OT (C_N^2, seeing ɛ, isoplanatic angle θ_0 and wavefront coherence time τ_0). Numerical predictions for a very rich statistical sample of nights, uniformly distributed along a solar year and belonging to different years, have been compared to observations, and different statistical operators have been analysed, such as the classical bias, root-mean-squared error and σ, as well as more sophisticated statistical operators derived from contingency tables that are able to quantify the score of success of a predictive method, such as the percentage of correct detection (PC) and the probability of detecting a parameter within a specific range of values (POD). The main conclusion of the study is that the ASTRO-MESO-NH model provides performances that are already good enough to guarantee a non-negligible positive impact on the service mode of top-class telescopes and ELTs. A demonstrator for an automatic and operational version of the ASTRO-MESO-NH model will soon be implemented on the sites of the VLT and E-ELT.
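As an illustration of the verification statistics named above, here is a short sketch computing bias, RMSE, and two contingency-table scores (PC and POD); the seeing threshold, the simulated data, and the exact score definitions are assumptions and may differ in detail from those used in MOSE.

```python
# Sketch of the verification statistics mentioned in the abstract (bias, RMSE,
# and contingency-table scores); thresholds and data are hypothetical, and the
# exact score definitions used in the MOSE study may differ in detail.
import numpy as np

def verify(forecast, observed, threshold):
    forecast, observed = np.asarray(forecast), np.asarray(observed)
    bias = np.mean(forecast - observed)
    rmse = np.sqrt(np.mean((forecast - observed) ** 2))

    # 2x2 contingency table for "seeing above threshold" events.
    f_event, o_event = forecast > threshold, observed > threshold
    hits = np.sum(f_event & o_event)
    misses = np.sum(~f_event & o_event)
    false_alarms = np.sum(f_event & ~o_event)
    correct_negatives = np.sum(~f_event & ~o_event)

    pc = (hits + correct_negatives) / forecast.size               # percentage of correct detection
    pod = hits / (hits + misses) if (hits + misses) else np.nan   # probability of detection
    return bias, rmse, pc, pod

rng = np.random.default_rng(5)
obs = rng.gamma(shape=4.0, scale=0.25, size=200)          # hypothetical seeing values (arcsec)
fcst = obs + rng.normal(0.0, 0.2, size=200)
print(verify(fcst, obs, threshold=1.0))
```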
What do results from coordinate-based meta-analyses tell us?
Albajes-Eizagirre, Anton; Radua, Joaquim
2018-08-01
Coordinate-based meta-analyses (CBMA) methods, such as Activation Likelihood Estimation (ALE) and Seed-based d Mapping (SDM), have become an invaluable tool for summarizing the findings of voxel-based neuroimaging studies. However, the progressive sophistication of these methods may have concealed two particularities of their statistical tests. Common univariate voxelwise tests (such as the t/z-tests used in SPM and FSL) detect voxels that activate, or voxels that show differences between groups. Conversely, the tests conducted in CBMA test for "spatial convergence" of findings, i.e., they detect regions where studies report "more peaks than in most regions", regions that activate "more than most regions do", or regions that show "larger differences between groups than most regions do". The first particularity is that these tests rely on two spatial assumptions (voxels are independent and have the same probability to have a "false" peak), whose violation may make their results either conservative or liberal, though fortunately current versions of ALE, SDM and some other methods consider these assumptions. The second particularity is that the use of these tests involves an important paradox: the statistical power to detect a given effect is higher if there are no other effects in the brain, whereas lower in presence of multiple effects. Copyright © 2018 Elsevier Inc. All rights reserved.
System Synthesis in Preliminary Aircraft Design using Statistical Methods
NASA Technical Reports Server (NTRS)
DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.
1996-01-01
This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point-design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions which are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
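To illustrate the response-surface step, here is a minimal sketch that fits a full quadratic polynomial in two coded design variables by least squares; the variables, coefficients, and noise level are hypothetical and unrelated to the HSCT study itself.

```python
# Minimal response-surface sketch: fit a full quadratic polynomial in two
# design variables by least squares (hypothetical variables, not the HSCT study).
import numpy as np

rng = np.random.default_rng(11)
x1 = rng.uniform(-1, 1, size=30)          # e.g., a coded wing-geometry variable
x2 = rng.uniform(-1, 1, size=30)          # e.g., a coded propulsion variable
y = 3.0 + 1.2 * x1 - 0.8 * x2 + 0.5 * x1 * x2 + 0.9 * x1**2 + rng.normal(0, 0.05, 30)

# Design matrix for y ~ b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
```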
The Role of Salience in the Extraction of Algebraic Rules
ERIC Educational Resources Information Center
Endress, Ansgar D.; Scholl, Brian J.; Mehler, Jacques
2005-01-01
Recent research suggests that humans and other animals have sophisticated abilities to extract both statistical dependencies and rule-based regularities from sequences. Most of this research stresses the flexibility and generality of such processes. Here the authors take up an equally important project, namely, to explore the limits of such…
The Heat Is On! Using a Stylised Graph to Engender Understanding
ERIC Educational Resources Information Center
Fitzallen, Noleine; Watson, Jane; Wright, Suzie
2017-01-01
When working within a meaningful context quite young students are capable of working with sophisticated data. Year 3 students investigate thermal insulation and the transfer of heat in a STEM inquiry, developing skills in measuring temperature by conducting a statistical investigation, and using a stylised graph to interpret their data.
Technical Report of the NAEP 1992 Trial State Assessment Program in Mathematics.
ERIC Educational Resources Information Center
Johnson, Eugene G.; And Others
The "Nation's Report Card," the National Assessment of Educational Progress (NAEP), is the only nationally representative and continuing assessment of what America's students know and can do in various subject areas. This report summarizes some of the sophisticated statistical methodology used in the 1992 Trial State Assessment of…
NASA Astrophysics Data System (ADS)
Lieu, Richard
2018-01-01
A hierarchy of statistics of increasing sophistication and accuracy is proposed, to exploit an interesting and fundamental arithmetic structure in the photon bunching noise of incoherent light of large photon occupation number, with the purpose of suppressing the noise and rendering a more reliable and unbiased measurement of the light intensity. The method does not require any new hardware, rather it operates at the software level, with the help of high precision computers, to reprocess the intensity time series of the incident light to create a new series with smaller bunching noise coherence length. The ultimate accuracy improvement of this method of flux measurement is limited by the timing resolution of the detector and the photon occupation number of the beam (the higher the photon number the better the performance). The principal application is accuracy improvement in the bolometric flux measurement of a radio source.
iTTVis: Interactive Visualization of Table Tennis Data.
Wu, Yingcai; Lan, Ji; Shu, Xinhuan; Ji, Chenyang; Zhao, Kejian; Wang, Jiachen; Zhang, Hui
2018-01-01
The rapid development of information technology paved the way for the recording of fine-grained data, such as stroke techniques and stroke placements, during a table tennis match. This data recording creates opportunities to analyze and evaluate matches from new perspectives. Nevertheless, the increasingly complex data poses a significant challenge to make sense of and gain insights into. Analysts usually employ tedious and cumbersome methods which are limited to watching videos and reading statistical tables. However, existing sports visualization methods cannot be applied to visualizing table tennis competitions due to different competition rules and particular data attributes. In this work, we collaborate with data analysts to understand and characterize the sophisticated domain problem of analysis of table tennis data. We propose iTTVis, a novel interactive table tennis visualization system, which to our knowledge, is the first visual analysis system for analyzing and exploring table tennis data. iTTVis provides a holistic visualization of an entire match from three main perspectives, namely, time-oriented, statistical, and tactical analyses. The proposed system with several well-coordinated views not only supports correlation identification through statistics and pattern detection of tactics with a score timeline but also allows cross analysis to gain insights. Data analysts have obtained several new insights by using iTTVis. The effectiveness and usability of the proposed system are demonstrated with four case studies.
Separation and confirmation of showers
NASA Astrophysics Data System (ADS)
Neslušan, L.; Hajduková, M.
2017-02-01
Aims: Using IAU MDC photographic, IAU MDC CAMS video, SonotaCo video, and EDMOND video databases, we aim to separate all provable annual meteor showers from each of these databases. We intend to reveal the problems inherent in this procedure and answer the question whether the databases are complete and the methods of separation used are reliable. We aim to evaluate the statistical significance of each separated shower. In this respect, we intend to give a list of reliably separated showers rather than a list of the maximum possible number of showers. Methods: To separate the showers, we simultaneously used two methods. The use of two methods enables us to compare their results, and this can indicate the reliability of the methods. To evaluate the statistical significance, we suggest a new method based on the ideas of the break-point method. Results: We give a compilation of the showers from all four databases using both methods. Using the first (second) method, we separated 107 (133) showers, which are in at least one of the databases used. These relatively low numbers are a consequence of discarding any candidate shower with a poor statistical significance. Most of the separated showers were identified as meteor showers from the IAU MDC list of all showers. Many of them were identified as several of the showers in the list. This proves that many showers have been named multiple times with different names. Conclusions: At present, a prevailing share of existing annual showers can be found in the data and confirmed when we use a combination of results from large databases. However, to gain a complete list of showers, we need more-complete meteor databases than the most extensive databases currently are. We also still need a more sophisticated method to separate showers and evaluate their statistical significance. Tables A.1 and A.2 are also available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A40
Comment on the asymptotics of a distribution-free goodness of fit test statistic.
Browne, Michael W; Shapiro, Alexander
2015-03-01
In a recent article Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed that a proof by Browne (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) of the asymptotic distribution of a goodness of fit test statistic is incomplete because it fails to prove that the orthogonal component function employed is continuous. Jennrich and Satorra (Psychometrika 78: 545-552, 2013) showed how Browne's proof can be completed satisfactorily but this required the development of an extensive and mathematically sophisticated framework for continuous orthogonal component functions. This short note provides a simple proof of the asymptotic distribution of Browne's (British Journal of Mathematical and Statistical Psychology 37: 62-83, 1984) test statistic by using an equivalent form of the statistic that does not involve orthogonal component functions and consequently avoids all complicating issues associated with them.
Political Trust and Sophistication: Taking Measurement Seriously.
Turper, Sedef; Aarts, Kees
2017-01-01
Political trust is an important indicator of political legitimacy. Hence, seemingly decreasing levels of political trust in Western democracies have stimulated a growing body of research on the causes and consequences of political trust. However, the neglect of potential measurement problems of political trust raises doubts about the findings of earlier studies. The current study revisits the measurement of political trust and re-examines the relationship between political trust and sophistication in the Netherlands by utilizing European Social Survey (ESS) data across five time points and four-wave panel data from the Panel Component of ESS. Our findings illustrate that high and low political sophistication groups display different levels of political trust even when measurement characteristics of political trust are taken into consideration. However, the relationship between political sophistication and political trust is weaker than it is often suggested by earlier research. Our findings also provide partial support for the argument that the gap between sophistication groups is widening over time. Furthermore, we demonstrate that, although the between-method differences between the latent means and the composite score means of political trust for high- and low sophistication groups are relatively minor, it is important to analyze the measurement characteristics of the political trust construct.
McDermott, Jason E.; Wang, Jing; Mitchell, Hugh; Webb-Robertson, Bobbie-Jo; Hafen, Ryan; Ramey, John; Rodland, Karin D.
2012-01-01
Introduction The advent of high throughput technologies capable of comprehensive analysis of genes, transcripts, proteins and other significant biological molecules has provided an unprecedented opportunity for the identification of molecular markers of disease processes. However, it has simultaneously complicated the problem of extracting meaningful molecular signatures of biological processes from these complex datasets. The process of biomarker discovery and characterization provides opportunities for more sophisticated approaches to integrating purely statistical and expert knowledge-based approaches. Areas covered In this review we will present examples of current practices for biomarker discovery from complex omic datasets and the challenges that have been encountered in deriving valid and useful signatures of disease. We will then present a high-level review of data-driven (statistical) and knowledge-based methods applied to biomarker discovery, highlighting some current efforts to combine the two distinct approaches. Expert opinion Effective, reproducible and objective tools for combining data-driven and knowledge-based approaches to identify predictive signatures of disease are key to future success in the biomarker field. We will describe our recommendations for possible approaches to this problem including metrics for the evaluation of biomarkers. PMID:23335946
Modeling Psychological Contract Violation using Dual Regime Models: An Event-based Approach
Hofmans, Joeri
2017-01-01
A good understanding of the dynamics of psychological contract violation requires theories, research methods and statistical models that explicitly recognize that violation feelings follow from an event that violates one's acceptance limits, after which interpretative processes are set into motion, determining the intensity of these violation feelings. Whereas theories—in the form of the dynamic model of the psychological contract—and research methods—in the form of daily diary research and experience sampling research—are available by now, the statistical tools to model such a two-stage process are still lacking. The aim of the present paper is to fill this gap in the literature by introducing two statistical models—the Zero-Inflated model and the Hurdle model—that closely mimic the theoretical process underlying the elicitation of violation feelings via two model components: a binary distribution that models whether violation has occurred or not, and a count distribution that models how severe the negative impact is. Moreover, covariates can be included for both model components separately, which yields insight into their unique and shared antecedents. By doing this, the present paper offers a methodological-substantive synergy, showing how sophisticated methodology can be used to examine an important substantive issue. PMID:29163316
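A minimal two-part sketch in the spirit of the Hurdle model described above: a logit for whether any violation occurred and a Poisson GLM for intensity among the positive observations. For brevity the count part is not zero-truncated, so this only approximates the full Hurdle likelihood, and the simulated data and the 'breach' covariate are assumptions rather than the paper's example.

```python
# Two-part (hurdle-style) sketch: a logit for "any violation vs none" plus a
# Poisson GLM on the positive counts. The count part is an ordinary Poisson
# rather than a zero-truncated one, so this only approximates the full
# Hurdle likelihood. Data and the 'breach' covariate are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(21)
n = 400
breach = rng.uniform(0, 1, size=n)                      # hypothetical daily predictor
any_violation = rng.random(n) < 1 / (1 + np.exp(-(-2 + 3 * breach)))
intensity = np.where(any_violation, rng.poisson(np.exp(0.2 + 1.5 * breach)), 0)

X = sm.add_constant(breach)

# Part 1: does a violation occur at all?
occurrence = sm.Logit((intensity > 0).astype(int), X).fit(disp=0)

# Part 2: how severe is it, given that it occurred?
positive = intensity > 0
severity = sm.GLM(intensity[positive], X[positive], family=sm.families.Poisson()).fit()
```

Separate coefficient estimates for the two parts mirror the abstract's point that covariates can enter each model component independently.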
How to get statistically significant effects in any ERP experiment (and why you shouldn't).
Luck, Steven J; Gaspelin, Nicholas
2017-01-01
ERP experiments generate massive datasets, often containing thousands of values for each participant, even after averaging. The richness of these datasets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant but bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand-averaged data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multifactor statistical analyses. Reanalyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant but bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions. © 2016 Society for Psychophysiological Research.
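A small simulation of the first problem described above: under a pure null, choosing the measurement window from the grand-average difference and then testing that same window inflates the false-positive rate far beyond the nominal 5%. All sample sizes and the window width below are arbitrary choices, not the authors' simulation settings.

```python
# Small simulation of the "window selected from the grand average" problem:
# with no true effect, picking the time window where the grand-average
# difference is largest and then testing that window inflates the Type I
# error rate well above the nominal 5%. All sizes here are arbitrary.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_subjects, n_timepoints, window, n_experiments = 20, 200, 20, 500
false_positives = 0

for _ in range(n_experiments):
    # Two conditions with identical (null) ERPs: pure noise for every subject.
    cond_a = rng.normal(size=(n_subjects, n_timepoints))
    cond_b = rng.normal(size=(n_subjects, n_timepoints))

    # Choose the window where the grand-average difference is biggest...
    grand_diff = (cond_a - cond_b).mean(axis=0)
    window_means = np.convolve(grand_diff, np.ones(window) / window, mode="valid")
    start = int(np.argmax(np.abs(window_means)))

    # ...then test exactly that window with a paired t-test.
    a = cond_a[:, start:start + window].mean(axis=1)
    b = cond_b[:, start:start + window].mean(axis=1)
    if stats.ttest_rel(a, b).pvalue < 0.05:
        false_positives += 1

print(false_positives / n_experiments)   # typically far above 0.05
```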
Otwombe, Kennedy N.; Petzold, Max; Martinson, Neil; Chirwa, Tobias
2014-01-01
Background Research in the predictors of all-cause mortality in HIV-infected people has widely been reported in literature. Making an informed decision requires understanding the methods used. Objectives We present a review on study designs, statistical methods and their appropriateness in original articles reporting on predictors of all-cause mortality in HIV-infected people between January 2002 and December 2011. Statistical methods were compared between 2002–2006 and 2007–2011. Time-to-event analysis techniques were considered appropriate. Data Sources Pubmed/Medline. Study Eligibility Criteria Original English-language articles were abstracted. Letters to the editor, editorials, reviews, systematic reviews, meta-analysis, case reports and any other ineligible articles were excluded. Results A total of 189 studies were identified (n = 91 in 2002–2006 and n = 98 in 2007–2011) out of which 130 (69%) were prospective and 56 (30%) were retrospective. One hundred and eighty-two (96%) studies described their sample using descriptive statistics while 32 (17%) made comparisons using t-tests. Kaplan-Meier methods for time-to-event analysis were commonly used in the earlier period (n = 69, 76% vs. n = 53, 54%, p = 0.002). Predictors of mortality in the two periods were commonly determined using Cox regression analysis (n = 67, 75% vs. n = 63, 64%, p = 0.12). Only 7 (4%) used advanced survival analysis methods of Cox regression analysis with frailty in which 6 (3%) were used in the later period. Thirty-two (17%) used logistic regression while 8 (4%) used other methods. There were significantly more articles from the first period using appropriate methods compared to the second (n = 80, 88% vs. n = 69, 70%, p-value = 0.003). Conclusion Descriptive statistics and survival analysis techniques remain the most common methods of analysis in publications on predictors of all-cause mortality in HIV-infected cohorts while prospective research designs are favoured. Sophisticated techniques of time-dependent Cox regression and Cox regression with frailty are scarce. This motivates for more training in the use of advanced time-to-event methods. PMID:24498313
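As an illustration of the two workhorse analyses counted most often in the review (Kaplan-Meier estimation and Cox proportional hazards regression), here is a minimal sketch using the lifelines package; the data frame, covariates, and column names are hypothetical stand-ins, not data from any reviewed cohort.

```python
# Minimal sketch of the two most commonly reported analyses (Kaplan-Meier and
# Cox regression), here using the lifelines package on hypothetical data.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(13)
n = 300
df = pd.DataFrame({
    "time": rng.exponential(scale=24, size=n),          # months of follow-up
    "event": rng.integers(0, 2, size=n),                # 1 = died, 0 = censored
    "cd4": rng.normal(350, 120, size=n),                # hypothetical covariates
    "on_art": rng.integers(0, 2, size=n),
})

kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])          # survival curve

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")      # predictors of mortality
cph.print_summary()
```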
The Statistical Consulting Center for Astronomy (SCCA)
NASA Technical Reports Server (NTRS)
Akritas, Michael
2001-01-01
The process by which raw astronomical data acquisition is transformed into scientifically meaningful results and interpretation typically involves many statistical steps. Traditional astronomy limits itself to a narrow range of old and familiar statistical methods: means and standard deviations; least-squares methods like chi-squared (χ²) minimization; and simple nonparametric procedures such as the Kolmogorov-Smirnov tests. These tools are often inadequate for the complex problems and datasets under investigation, and recent years have witnessed an increased usage of maximum-likelihood, survival analysis, multivariate analysis, wavelet and advanced time-series methods. The Statistical Consulting Center for Astronomy (SCCA) assisted astronomers with the use of sophisticated tools and matched these tools to specific problems. The SCCA operated with two professors of statistics and a professor of astronomy working together. Questions were received by e-mail, and were discussed in detail with the questioner. Summaries of those questions and answers leading to new approaches were posted on the Web (www.state.psu.edu/ mga/SCCA). In addition to serving individual astronomers, the SCCA established a Web site for general use that provides hypertext links to selected on-line public-domain statistical software and services. The StatCodes site (www.astro.psu.edu/statcodes) provides over 200 links in the areas of: Bayesian statistics; censored and truncated data; correlation and regression; density estimation and smoothing; general statistics packages and information; image analysis; interactive Web tools; multivariate analysis; multivariate clustering and classification; nonparametric analysis; software written by astronomers; spatial statistics; statistical distributions; time series analysis; and visualization tools. StatCodes has received a remarkably high and constant hit rate of 250 hits/week (over 10,000/year) since its inception in mid-1997. It is of interest to scientists both within and outside of astronomy. The most popular sections are multivariate techniques, image analysis, and time series analysis. Hundreds of copies of the ASURV, SLOPES and CENS-TAU codes developed by SCCA scientists were also downloaded from the StatCodes site. In addition to formal SCCA duties, SCCA scientists continued a variety of related activities in astrostatistics, including refereeing of statistically oriented papers submitted to the Astrophysical Journal, talks in meetings including Feigelson's talk to science journalists entitled "The reemergence of astrostatistics" at the American Association for the Advancement of Science meeting, and published papers of astrostatistical content.
Fienup, Daniel M; Critchfield, Thomas S
2010-01-01
Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904
Astronautic Structures Manual, Volume 3
NASA Technical Reports Server (NTRS)
1975-01-01
This document (Volumes I, II, and III) presents a compilation of industry-wide methods in aerospace strength analysis that can be carried out by hand, that are general enough in scope to cover most structures encountered, and that are sophisticated enough to give accurate estimates of the actual strength expected. It provides analysis techniques for the elastic and inelastic stress ranges. It serves not only as a catalog of methods not usually available, but also as a reference source for the background of the methods themselves. An overview of the manual is as follows: Section A is a general introduction of methods used and includes sections on loads, combined stresses, and interaction curves; Section B is devoted to methods of strength analysis; Section C is devoted to the topic of structural stability; Section D is on thermal stresses; Section E is on fatigue and fracture mechanics; Section F is on composites; Section G is on rotating machinery; and Section H is on statistics. These three volumes supersede Volumes I and II, NASA TM X-60041 and NASA TM X-60042, respectively.
Science as Storytelling for Teaching the Nature of Science and the Science-Religion Interface
ERIC Educational Resources Information Center
Bickmore, Barry R.; Thompson, Kirsten R.; Grandy, David A.; Tomlin, Teagan
2009-01-01
Here we describe a method for teaching the NOS called "Science as Storytelling," which was designed to directly confront naïve realist preconceptions about the NOS and replace them with more sophisticated ideas, while retaining a moderate realist perspective. It was also designed to foster a more sophisticated understanding of the…
ERIC Educational Resources Information Center
Padgett, Ryan D.; Salisbury, Mark H.; An, Brian P.; Pascarella, Ernest T.
2010-01-01
The sophisticated analytical techniques available to institutional researchers give them an array of procedures to estimate a causal effect using observational data. But as many quantitative researchers have discovered, access to a wider selection of statistical tools does not necessarily ensure construction of a better analytical model. Moreover,…
ERIC Educational Resources Information Center
Ammentorp, William
There is much to be gained by using systems analysis in educational administration. Most administrators, presently relying on classical statistical techniques restricted to problems having few variables, should be trained to use more sophisticated tools such as systems analysis. The systems analyst, interested in the basic processes of a group or…
Commentary: Gene by Environment Interplay and Psychopathology--In Search of a Paradigm
ERIC Educational Resources Information Center
Nigg, Joel T.
2013-01-01
The articles in this Special Issue (SI) extend research on G×E in multiple ways, showing the growing importance of specifying kinds of G×E models (e.g., bioecological, susceptibility, stress-diathesis), incorporation of sophisticated ways of measuring types of G×E correlations (rGE), checking effects of statistical artifact, exemplifying an…
Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5
ERIC Educational Resources Information Center
Candiotto, Laura
2018-01-01
This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…
Genetic Control of Meat Quality Traits
NASA Astrophysics Data System (ADS)
Williams, John L.
Meat was originally produced from non-specialized animals that were used for a variety of purposes, in addition to being a source of food. However, selective breeding has resulted in “improved” breeds of cattle that are now used to produce either milk or beef, and specialized chicken lines that produce eggs or meat. These improved breeds are very productive under appropriate management systems. The selection methods used to create these specialized breeds were based on easily measured phenotypic variations, such as growth rate or physical size. Improvement in the desired trait was achieved by breeding directly from animals displaying the desired phenotype. However, more recently, sophisticated genetic models have been developed using statistical approaches that consider phenotypic information collected not only from individual animals but also from their parents, sibs, and progeny.
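The statistical approaches alluded to here are, in essence, mixed "animal" models that pool phenotypic information across relatives. The sketch below is a generic, minimal illustration rather than any breed evaluation actually in use: it builds the numerator relationship matrix for a tiny hypothetical pedigree with the tabular method and solves Henderson's mixed model equations under assumed variance components.

```python
import numpy as np

# Hypothetical 5-animal pedigree: animal -> (sire, dam), 0 = unknown,
# with parents always listed before their offspring.
pedigree = {1: (0, 0), 2: (0, 0), 3: (1, 2), 4: (1, 2), 5: (3, 4)}
n = len(pedigree)

# Numerator relationship matrix A via the tabular method.
A = np.zeros((n, n))
for i in range(1, n + 1):
    s, d = pedigree[i]
    A[i - 1, i - 1] = 1.0 + (0.5 * A[s - 1, d - 1] if s and d else 0.0)
    for j in range(1, i):
        a_js = A[j - 1, s - 1] if s else 0.0
        a_jd = A[j - 1, d - 1] if d else 0.0
        A[i - 1, j - 1] = A[j - 1, i - 1] = 0.5 * (a_js + a_jd)

# Toy phenotypes (e.g., growth rate) and a single fixed effect (the overall mean).
y = np.array([10.0, 12.0, 11.0, 13.0, 12.5])
X = np.ones((n, 1))      # fixed-effect design matrix
Z = np.eye(n)            # each animal has one record of its own
lam = 2.0                # assumed variance ratio sigma_e^2 / sigma_a^2

# Henderson's mixed model equations.
MME = np.block([[X.T @ X, X.T @ Z],
                [Z.T @ X, Z.T @ Z + lam * np.linalg.inv(A)]])
rhs = np.concatenate([X.T @ y, Z.T @ y])
sol = np.linalg.solve(MME, rhs)
print("estimated mean          :", sol[0])
print("estimated breeding values:", sol[1:])
```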
Analytical methods development for supramolecular design in solar hydrogen production
NASA Astrophysics Data System (ADS)
Brown, J. R.; Elvington, M.; Mongelli, M. T.; Zigler, D. F.; Brewer, K. J.
2006-08-01
In the investigation of alternative energy sources, specifically, solar hydrogen production from water, the ability to perform experiments with a consistent and reproducible light source is key to meaningful photochemistry. The design, construction, and evaluation of a series of LED array photolysis systems for high throughput photochemistry have been performed. Three array systems of increasing sophistication are evaluated using calorimetric measurements and potassium tris(oxalato)ferrate(II) chemical actinometry and compared with a traditional 1000 W Xe arc lamp source. The results are analyzed using descriptive statistics and analysis of variance (ANOVA). The third generation array is modular, and controllable in design. Furthermore, the third generation array system is shown to be comparable in both precision and photonic output to a 1000 W Xe arc lamp.
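A one-way ANOVA of the kind reported here takes only a few lines; the sketch below uses made-up photonic-output readings for three hypothetical array generations, since the actual actinometry data are not reproduced in the abstract.

```python
from scipy import stats

# Hypothetical photonic-output measurements for three LED array generations;
# the values are illustrative only and carry no physical meaning.
gen1 = [4.8, 5.1, 4.9, 5.0, 4.7]
gen2 = [5.4, 5.6, 5.5, 5.3, 5.6]
gen3 = [5.9, 6.1, 6.0, 5.8, 6.2]

f_stat, p_value = stats.f_oneway(gen1, gen2, gen3)
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
# A small p-value indicates that mean photonic output differs across the arrays;
# descriptive statistics (means, standard deviations) would normally accompany this.
```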
de Andrade, Jucimara Kulek; de Andrade, Camila Kulek; Komatsu, Emy; Perreault, Hélène; Torres, Yohandra Reyes; da Rosa, Marcos Roberto; Felsner, Maria Lurdes
2017-08-01
Corn syrups, important ingredients used in food and beverage industries, often contain high levels of 5-hydroxymethyl-2-furfural (HMF), a toxic contaminant. In this work, an in-house validation of a difference spectrophotometric method for HMF analysis in corn syrups was developed, for the first time, using sophisticated statistical tools. The methodology showed excellent analytical performance with good selectivity, linearity (R² = 99.9%, r > 0.99), accuracy and low limits (LOD = 0.10 mg L⁻¹ and LOQ = 0.34 mg L⁻¹). Excellent precision was confirmed by repeatability (RSD (%) = 0.30) and intermediate precision (RSD (%) = 0.36) estimates and by the Horrat value (0.07). A detailed study of method precision using a nested design demonstrated that variation sources such as instruments, operators and time did not affect the within-laboratory variability of results and, consequently, the method's intermediate precision. The developed method is environmentally friendly, fast, cheap and easy to implement, making it an attractive alternative for corn syrup quality control in industries and official laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.
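Limits of detection and quantification of this kind are commonly estimated from the calibration line; the sketch below shows one standard (ICH-style) calculation, LOD = 3.3·s/slope and LOQ = 10·s/slope, on made-up absorbance data rather than the paper's own measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data: HMF standard concentration (mg/L) vs. delta-absorbance.
conc = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0])
absorb = np.array([0.031, 0.060, 0.118, 0.242, 0.481, 0.958])

fit = stats.linregress(conc, absorb)
residual_sd = np.std(absorb - (fit.intercept + fit.slope * conc), ddof=2)

lod = 3.3 * residual_sd / fit.slope   # ICH-style detection limit
loq = 10.0 * residual_sd / fit.slope  # ICH-style quantification limit
print(f"R^2 = {fit.rvalue**2:.4f}, LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")
```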
Limited data tomographic image reconstruction via dual formulation of total variation minimization
NASA Astrophysics Data System (ADS)
Jang, Kwang Eun; Sung, Younghun; Lee, Kangeui; Lee, Jongha; Cho, Seungryong
2011-03-01
X-ray mammography is the primary imaging modality for breast cancer screening. For dense breasts, however, the mammogram is often difficult to read due to the tissue overlap problem caused by the superposition of normal tissues. Digital breast tomosynthesis (DBT), which measures several low-dose projections over a limited angle range, may be an alternative modality for breast imaging, since it allows visualization of cross-sectional information of the breast. DBT, however, may suffer from aliasing artifacts and severe noise corruption. To overcome these problems, a total variation (TV) regularized statistical reconstruction algorithm is presented. Inspired by the dual formulation of TV minimization in denoising and deblurring problems, we derived a gradient-type algorithm based on a statistical model of X-ray tomography. The objective function comprises a data fidelity term derived from the statistical model and a TV regularization term. The gradient of the objective function can be easily calculated using simple operations in terms of auxiliary variables. After a descending step, the data fidelity term is renewed in each iteration. Since the proposed algorithm can be implemented without sophisticated operations such as matrix inversion, it provides an efficient way to include the TV regularization in the statistical reconstruction method, which results in fast and robust estimation from low-dose projections over the limited angle range. Initial tests with an experimental DBT system confirmed our findings.
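The core idea, minimizing a data-fidelity term plus a total-variation penalty with a gradient-type iteration, can be illustrated on a much simpler 1D denoising problem. The sketch below uses a smoothed TV gradient and plain gradient descent with arbitrary parameter choices; it is not the dual formulation or the X-ray statistical model developed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Piecewise-constant ground truth observed with Gaussian noise (the forward operator
# is the identity here, i.e. pure denoising rather than tomographic reconstruction).
x_true = np.concatenate([np.zeros(40), np.ones(40), 0.3 * np.ones(40)])
b = x_true + 0.15 * rng.standard_normal(x_true.size)

def smoothed_tv_grad(x, eps=0.05):
    """Gradient of the smoothed TV penalty sum_i sqrt((x[i+1]-x[i])^2 + eps^2)."""
    d = np.diff(x)
    w = d / np.sqrt(d ** 2 + eps ** 2)
    g = np.zeros_like(x)
    g[:-1] -= w    # derivative of the i-th term w.r.t. x[i]   is -w[i]
    g[1:] += w     # derivative of the i-th term w.r.t. x[i+1] is +w[i]
    return g

lam, step = 0.3, 0.05        # illustrative regularization weight and step size
x = b.copy()
for _ in range(1500):
    grad = (x - b) + lam * smoothed_tv_grad(x)   # data fidelity + TV penalty
    x -= step * grad

print("RMSE of noisy data    :", np.sqrt(np.mean((b - x_true) ** 2)))
print("RMSE of TV-regularized:", np.sqrt(np.mean((x - x_true) ** 2)))
```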
NASA Astrophysics Data System (ADS)
Zack, J. W.
2015-12-01
Predictions from Numerical Weather Prediction (NWP) models are the foundation for wind power forecasts for day-ahead and longer forecast horizons. The NWP models directly produce three-dimensional wind forecasts on their respective computational grids. These can be interpolated to the location and time of interest. However, these direct predictions typically contain significant systematic errors ("biases"). This is due to a variety of factors including the limited space-time resolution of the NWP models and shortcomings in the model's representation of physical processes. It has become common practice to attempt to improve the raw NWP forecasts by statistically adjusting them through a procedure that is widely known as Model Output Statistics (MOS). The challenge is to identify complex patterns of systematic errors and then use this knowledge to adjust the NWP predictions. The MOS-based improvements are the basis for much of the value added by commercial wind power forecast providers. There are an enormous number of statistical approaches that can be used to generate the MOS adjustments to the raw NWP forecasts. In order to obtain insight into the potential value of some of the newer and more sophisticated statistical techniques often referred to as "machine learning methods," a MOS-method comparison experiment has been performed for wind power generation facilities in 6 wind resource areas of California. The underlying NWP models that provided the raw forecasts were the two primary operational models of the US National Weather Service: the GFS and NAM models. The focus was on 1- and 2-day ahead forecasts of the hourly wind-based generation. The statistical methods evaluated included: (1) screening multiple linear regression, which served as a baseline method, (2) artificial neural networks, (3) a decision-tree approach called random forests, (4) gradient boosted regression based upon a decision-tree algorithm, (5) support vector regression and (6) analog ensemble, which is a case-matching scheme. The presentation will provide (1) an overview of each method and the experimental design, (2) performance comparisons based on standard metrics such as bias, MAE and RMSE, (3) a summary of the performance characteristics of each approach and (4) a preview of further experiments to be conducted.
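A much-simplified stand-in for such a MOS-method comparison can be assembled with standard libraries; the sketch below regresses synthetic "observed" generation on hypothetical raw NWP predictors and compares a linear baseline against two of the machine-learning methods on bias, MAE and RMSE (the study's actual data and model configurations are not reproduced here).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in data: three raw NWP predictors and "observed" hourly
# generation with a nonlinear, noisy relationship (purely illustrative).
n = 2000
X = rng.uniform(0, 25, size=(n, 3))
y = np.clip((X[:, 0] - 3) ** 2 / 40, 0, 1) * 100 + 5 * rng.standard_normal(n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "linear regression (baseline)": LinearRegression(),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "gradient boosting": GradientBoostingRegressor(random_state=0),
}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    err = pred - y_te
    print(f"{name:30s} bias={err.mean():6.2f}  MAE={np.abs(err).mean():6.2f}  "
          f"RMSE={np.sqrt((err ** 2).mean()):6.2f}")
```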
NASA Astrophysics Data System (ADS)
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail, for models of this kind, because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with growing number of data points. Hamiltonian Monte Carlo algorithms allow us to translate this inference problem to the problem of simulating the dynamics of a statistical mechanics system and give us access to the most sophisticated methods that have been developed in the statistical physics community over the last few decades. We demonstrate that such methods, along with automated differentiation algorithms, allow us to perform a full-fledged Bayesian inference, for a large class of SDE models, in a highly efficient and largely automated manner. Furthermore, our algorithm is highly parallelizable. For our toy model, discretized with a few hundred points, a full Bayesian inference can be performed in a matter of seconds on a standard PC.
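The toy model described, a linear reservoir with constant input and Gaussian noise whose standard deviation scales linearly with the stored volume, can be simulated directly. The sketch below uses a simple Euler-Maruyama discretization with arbitrary illustrative parameters; this is the kind of forward model that an HMC-based Bayesian inference would be wrapped around.

```python
import numpy as np

rng = np.random.default_rng(42)

# dV = (r - V / k) dt + beta * V dW   (linear reservoir, state-proportional noise)
r, k, beta = 1.0, 10.0, 0.15        # illustrative input rate, residence time, noise scale
dt, n_steps = 0.01, 100_000

V = np.empty(n_steps)
V[0] = r * k                         # start at the deterministic steady state
for t in range(1, n_steps):
    dW = np.sqrt(dt) * rng.standard_normal()
    V[t] = V[t - 1] + (r - V[t - 1] / k) * dt + beta * V[t - 1] * dW
    V[t] = max(V[t], 0.0)            # stored volume cannot go negative

runoff = V / k                       # outflow of the linear reservoir
print("mean runoff    :", runoff.mean())
print("runoff skewness:", ((runoff - runoff.mean()) ** 3).mean() / runoff.std() ** 3)
```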
NASA Astrophysics Data System (ADS)
Davids, Jeffrey; Rutten, Martine; van de Giesen, Nick; Mehl, Steffen; Norris, James
2016-04-01
Traditional approaches to hydrologic data collection rely on permanent installations of sophisticated and relatively accurate but expensive monitoring equipment at limited numbers of sites. Consequently, the spatial coverage of the data is limited and the cost is high. Moreover, achieving adequate maintenance of the sophisticated equipment often exceeds local technical and resource capacity, and experience has shown that permanently deployed monitoring equipment is susceptible to vandalism, theft, and other hazards. Rather than using expensive, vulnerable installations at a few points, SmartPhones4Water (S4W), a form of citizen science, leverages widely available mobile technology to gather hydrologic data at many sites in a manner that is highly repeatable and scalable. The tradeoff for increased spatial resolution, however, is reduced observation frequency. As a first step towards evaluating the tradeoffs between the traditional continuous monitoring approach and emerging citizen science methods, 50 U.S. Geological Survey (USGS) streamflow gages were randomly selected from the population of roughly 350 USGS gages operated in California. Gaging station metadata and historical 15 minute flow data for the period from 01/10/2007 through 31/12/2014 were compiled for each of the selected gages. Historical 15 minute flow data were then used to develop daily, monthly, and yearly determinations of average, minimum, maximum streamflow, cumulative runoff, and streamflow distribution. These statistics were then compared to similar statistics developed from randomly selected daily and weekly spot measurements of streamflow. Cumulative runoff calculated from daily and weekly observations were within 10 percent of actual runoff calculated from 15 minute data for 75 percent and 46 percent of sites respectively. As anticipated, larger watersheds with less dynamic temporal variability compared more favorably for all statistics evaluated than smaller watersheds. Based on the results of these analyses it appears that, in certain circumstances, citizen science based observations of hydrologic data can provide sufficiently reliable information for both real-time management and water resources planning purposes. To further evaluate the merits of citizen science methodologies, S4W is launching field pilot projects in Nepal.
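The central comparison, cumulative runoff from a continuous record versus sparse spot observations, is easy to reproduce on synthetic data. The sketch below subsamples a synthetic 15-minute hydrograph at daily and weekly intervals and reports the percent error of the resulting runoff totals; the USGS records themselves are not used here.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic 15-minute streamflow record for one year (96 values per day).
n = 365 * 96
baseflow = 5.0 + 2.0 * np.sin(2 * np.pi * np.arange(n) / n)   # seasonal signal
storms = rng.gamma(shape=0.02, scale=40.0, size=n)            # sporadic storm spikes
q_15min = baseflow + storms                                   # flow in m^3/s

def runoff_error(q, stride):
    """Percent error of cumulative runoff estimated from every `stride`-th sample."""
    true_total = q.sum()                 # proportional to true runoff volume
    spot = q[::stride]
    est_total = spot.mean() * q.size     # scale the spot mean back to the full record
    return 100.0 * (est_total - true_total) / true_total

print("daily spot sampling error : %.1f%%" % runoff_error(q_15min, 96))
print("weekly spot sampling error: %.1f%%" % runoff_error(q_15min, 96 * 7))
```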
Muselík, Jan; Franc, Aleš; Doležel, Petr; Goněc, Roman; Krondlová, Anna; Lukášová, Ivana
2014-09-01
The article describes the development and production of tablets using direct compression of powder mixtures. The aim was to describe the impact of filler particle size and the time of lubricant addition during mixing on content uniformity according to the Good Manufacturing Practice (GMP) process validation requirements. Processes are regulated by complex directives, forcing the producers to validate, using sophisticated methods, the content uniformity of intermediates as well as final products. Cutting down of production time and material, shortening of analyses, and fast and reliable statistical evaluation of results can reduce the final price without affecting product quality. The manufacturing process of directly compressed tablets containing the low dose active pharmaceutical ingredient (API) warfarin, with content uniformity passing validation criteria, is used as a model example. Statistical methods have proved that the manufacturing process is reproducible. Methods suitable for elucidation of various properties of the final blend, e.g., measurement of electrostatic charge by Faraday pail and evaluation of mutual influences of researched variables by partial least squares (PLS) regression, were used. Using these methods, it was proved that the filler with larger particle size increased the content uniformity of both blends and the ensuing tablets. Addition of the lubricant, magnesium stearate, during the blending process improved the content uniformity of blends containing the filler with larger particles. This seems to be caused by reduced sampling error due to the suppression of electrostatic charge.
Goldfarb, Samantha; Tarver, Will L; Sen, Bisakha
2014-01-01
Background Previous literature has asserted that family meals are a key protective factor for certain adolescent risk behaviors. It is suggested that the frequency of eating with the family is associated with better psychological well-being and a lower risk of substance use and delinquency. However, it is unclear whether there is evidence of causal links between family meals and adolescent health-risk behaviors. Purpose The purpose of this article is to review the empirical literature on family meals and adolescent health behaviors and outcomes in the US. Data sources A search was conducted in four academic databases: Social Sciences Full Text, Sociological Abstracts, PsycINFO®, and PubMed/MEDLINE. Study selection We included studies that quantitatively estimated the relationship between family meals and health-risk behaviors. Data extraction Data were extracted on study sample, study design, family meal measurement, outcomes, empirical methods, findings, and major issues. Data synthesis Fourteen studies met the inclusion criteria for the review that measured the relationship between frequent family meals and various risk-behavior outcomes. The outcomes considered by most studies were alcohol use (n=10), tobacco use (n=9), and marijuana use (n=6). Other outcomes included sexual activity (n=2); depression, suicidal ideation, and suicide attempts (n=4); violence and delinquency (n=4); school-related issues (n=2); and well-being (n=5). The associations between family meals and the outcomes of interest were most likely to be statistically significant in unadjusted models or models controlling for basic family characteristics. Associations were less likely to be statistically significant when other measures of family connectedness were included. Relatively few analyses used sophisticated empirical techniques available to control for confounders in secondary data. Conclusion More research is required to establish whether or not the relationship between family dinners and risky adolescent behaviors is an artifact of underlying confounders. We recommend that researchers make more frequent use of sophisticated methods to reduce the problem of confounders in secondary data, and that the scope of adolescent problem behaviors also be further widened. PMID:24627645
The Matrix Element Method: Past, Present, and Future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.
2013-07-12
The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods -- already common -- will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.
A new JPEG-based steganographic algorithm for mobile devices
NASA Astrophysics Data System (ADS)
Agaian, Sos S.; Cherukuri, Ravindranath C.; Schneider, Erik C.; White, Gregory B.
2006-05-01
Currently, cellular phones constitute a significant portion of the global telecommunications market. Modern cellular phones offer sophisticated features such as Internet access, on-board cameras, and expandable memory, which provide these devices with excellent multimedia capabilities. Because of the high volume of cellular traffic, as well as the ability of these devices to transmit nearly all forms of data, the need for an increased level of security in wireless communications is becoming a growing concern. Steganography could provide a solution to this important problem. In this article, we present a new algorithm for JPEG-compressed images which is applicable to mobile platforms. This algorithm embeds sensitive information into quantized discrete cosine transform coefficients obtained from the cover JPEG. These coefficients are rearranged based on certain statistical properties and the inherent processing and memory constraints of mobile devices. Based on the energy variation and block characteristics of the cover image, the sensitive data is hidden by using a switching embedding technique proposed in this article. The proposed system offers high capacity while simultaneously withstanding visual and statistical attacks. Based on simulation results, the proposed method demonstrates an improved retention of first-order statistics when compared to existing JPEG-based steganographic algorithms, while maintaining a capacity which is comparable to F5 for certain cover images.
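As a generic illustration of the kind of coefficient manipulation involved (not the switching embedding technique proposed in the paper), the sketch below hides one bit per 8×8 block in the parity of a quantized mid-frequency DCT coefficient, using a uniform quantization step in place of a real JPEG quantization table.

```python
import numpy as np
from scipy.fft import dctn, idctn

Q = 16          # uniform quantization step (a real JPEG uses a full quantization table)
COEFF = (3, 2)  # mid-frequency coefficient position, chosen arbitrarily

def embed_bit(block, bit):
    """Embed one bit in the parity of a quantized DCT coefficient of an 8x8 block."""
    coeffs = dctn(block.astype(float), norm="ortho")
    q = np.round(coeffs / Q).astype(int)
    if q[COEFF] % 2 != bit:                       # force the parity to encode the bit
        q[COEFF] += 1 if q[COEFF] <= 0 else -1
    return idctn(q * Q, norm="ortho")

def extract_bit(block):
    coeffs = dctn(block.astype(float), norm="ortho")
    return int(np.round(coeffs[COEFF] / Q)) % 2

rng = np.random.default_rng(3)
cover = rng.integers(0, 256, size=(8, 8))         # one cover block of pixel values
stego = embed_bit(cover, bit=1)
print("recovered bit:", extract_bit(stego))
```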
Optimizing the maximum reported cluster size in the spatial scan statistic for ordinal data.
Kim, Sehwi; Jung, Inkyung
2017-01-01
The spatial scan statistic is an important tool for spatial cluster detection. There have been numerous studies on scanning window shapes. However, little research has been done on the maximum scanning window size or maximum reported cluster size. Recently, Han et al. proposed to use the Gini coefficient to optimize the maximum reported cluster size. However, the method has been developed and evaluated only for the Poisson model. We adopt the Gini coefficient to be applicable to the spatial scan statistic for ordinal data to determine the optimal maximum reported cluster size. Through a simulation study and application to a real data example, we evaluate the performance of the proposed approach. With some sophisticated modification, the Gini coefficient can be effectively employed for the ordinal model. The Gini coefficient most often picked the optimal maximum reported cluster sizes that were the same as or smaller than the true cluster sizes with very high accuracy. It seems that we can obtain a more refined collection of clusters by using the Gini coefficient. The Gini coefficient developed specifically for the ordinal model can be useful for optimizing the maximum reported cluster size for ordinal data and helpful for properly and informatively discovering cluster patterns.
Wang, Hongkai; Stout, David B; Chatziioannou, Arion F
2013-05-01
The development of sophisticated and high throughput whole body small animal imaging technologies has created a need for improved image analysis and increased automation. The registration of a digital mouse atlas to individual images is a prerequisite for automated organ segmentation and uptake quantification. This paper presents a fully-automatic method for registering a statistical mouse atlas with individual subjects based on an anterior-posterior X-ray projection and a lateral optical photo of the mouse silhouette. The mouse atlas was trained as a statistical shape model based on 83 organ-segmented micro-CT images. For registration, a hierarchical approach is applied which first registers high contrast organs, and then estimates low contrast organs based on the registered high contrast organs. To register the high contrast organs, a 2D-registration-back-projection strategy is used that deforms the 3D atlas based on the 2D registrations of the atlas projections. For validation, this method was evaluated using 55 subjects of preclinical mouse studies. The results showed that this method can compensate for moderate variations of animal postures and organ anatomy. Two different metrics, the Dice coefficient and the average surface distance, were used to assess the registration accuracy of major organs. The Dice coefficients vary from 0.31 ± 0.16 for the spleen to 0.88 ± 0.03 for the whole body, and the average surface distance varies from 0.54 ± 0.06 mm for the lungs to 0.85 ± 0.10mm for the skin. The method was compared with a direct 3D deformation optimization (without 2D-registration-back-projection) and a single-subject atlas registration (instead of using the statistical atlas). The comparison revealed that the 2D-registration-back-projection strategy significantly improved the registration accuracy, and the use of the statistical mouse atlas led to more plausible organ shapes than the single-subject atlas. This method was also tested with shoulder xenograft tumor-bearing mice, and the results showed that the registration accuracy of most organs was not significantly affected by the presence of shoulder tumors, except for the lungs and the spleen. Copyright © 2013 Elsevier B.V. All rights reserved.
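Of the two evaluation metrics quoted, the Dice coefficient is the simpler to reproduce; the sketch below computes it for two slightly offset synthetic organ masks (the atlas-registration pipeline itself is far more involved).

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice overlap of two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy example: two slightly shifted spherical "organs" on a small voxel grid.
grid = np.indices((40, 40, 40))
organ_ref = ((grid - 20) ** 2).sum(axis=0) < 10 ** 2
organ_reg = ((grid - np.array([22, 20, 20])[:, None, None, None]) ** 2).sum(axis=0) < 10 ** 2
print(f"Dice = {dice_coefficient(organ_ref, organ_reg):.3f}")
```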
Decomposition of the Inequality of Income Distribution by Income Types—Application for Romania
NASA Astrophysics Data System (ADS)
Andrei, Tudorel; Oancea, Bogdan; Richmond, Peter; Dhesi, Gurjeet; Herteliu, Claudiu
2017-09-01
This paper identifies the salient factors that characterize the inequality income distribution for Romania. Data analysis is rigorously carried out using sophisticated techniques borrowed from classical statistics (Theil). Decomposition of the inequalities measured by the Theil index is also performed. This study relies on an exhaustive (11.1 million records for 2014) data-set for total personal gross income of Romanian citizens.
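The Theil index and its between/within decomposition take only a few lines to compute; the sketch below uses synthetic incomes and an arbitrary two-group split by income type, not the Romanian micro-data.

```python
import numpy as np

def theil_index(income):
    """Theil T index: mean of (x / mu) * ln(x / mu) over positive incomes."""
    x = np.asarray(income, dtype=float)
    x = x[x > 0]
    s = x / x.mean()
    return np.mean(s * np.log(s))

def theil_decomposition(income, groups):
    """Split total inequality into between-group and within-group components."""
    x = np.asarray(income, dtype=float)
    g = np.asarray(groups)
    mu = x.mean()
    between, within = 0.0, 0.0
    for label in np.unique(g):
        xg = x[g == label]
        share_inc = xg.sum() / x.sum()            # group's share of total income
        between += share_inc * np.log(xg.mean() / mu)
        within += share_inc * theil_index(xg)
    return between, within

rng = np.random.default_rng(0)
wages = rng.lognormal(mean=8.0, sigma=0.5, size=5000)
pensions = rng.lognormal(mean=7.4, sigma=0.3, size=3000)
income = np.concatenate([wages, pensions])
groups = np.array(["wage"] * wages.size + ["pension"] * pensions.size)

between, within = theil_decomposition(income, groups)
print(f"total Theil  : {theil_index(income):.4f}")
print(f"between-group: {between:.4f}, within-group: {within:.4f} (sum = {between + within:.4f})")
```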
NASA Technical Reports Server (NTRS)
Newchurch, Michael J.; Cunnold, Derek M.; Zawodny, Joseph M.
2000-01-01
The objective of this research project is: to calculate ozone trends in the stratosphere from Dobson Umkehr measurements, to determine the vertical profile of trends at Arosa by using a sophisticated statistical model (MARCH) to separate solar, aerosol, and QBO effects on Dobson Umkehr measurements, and to compare Umkehr trends with SBUV and SAGE I/II trends in the stratosphere.
ERIC Educational Resources Information Center
Jacob, Brian A.
2016-01-01
Contrary to popular belief, modern cognitive assessments--including the new Common Core tests--produce test scores based on sophisticated statistical models rather than the simple percent of items a student answers correctly. While there are good reasons for this, it means that reported test scores depend on many decisions made by test designers,…
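A minimal sketch of what model-based scoring means in practice: under a two-parameter logistic item response model (parameter values here are purely illustrative), two students with the same percent correct can receive different ability estimates depending on which items they answered.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Two-parameter logistic (2PL) model: P(correct) = 1 / (1 + exp(-a * (theta - b))).
a = np.array([1.2, 1.0, 0.8, 1.5, 1.1])     # item discriminations (illustrative)
b = np.array([-1.0, -0.3, 0.2, 0.8, 1.5])   # item difficulties (illustrative)

def ability_estimate(responses):
    """Maximum-likelihood ability (theta) for a 0/1 response pattern."""
    r = np.asarray(responses, dtype=float)
    def neg_log_lik(theta):
        p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
        return -np.sum(r * np.log(p) + (1 - r) * np.log(1 - p))
    return minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded").x

# Both students answer 3 of 5 items correctly, but miss different items.
easy_items_right = [1, 1, 1, 0, 0]
hard_items_right = [0, 0, 1, 1, 1]
print("theta (easy items right):", round(ability_estimate(easy_items_right), 2))
print("theta (hard items right):", round(ability_estimate(hard_items_right), 2))
```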
The current and future status of the concealed information test for field use.
Matsuda, Izumi; Nittono, Hiroshi; Allen, John J B
2012-01-01
The Concealed Information Test (CIT) is a psychophysiological technique for examining whether a person has knowledge of crime-relevant information. Many laboratory studies have shown that the CIT has good scientific validity. However, the CIT has seldom been used for actual criminal investigations. One successful exception is its use by the Japanese police. In Japan, the CIT has been widely used for criminal investigations, although its probative force in court is not strong. In this paper, we first review the current use of the field CIT in Japan. Then, we discuss two possible approaches to increase its probative force: sophisticated statistical judgment methods and combining new psychophysiological measures with classic autonomic measures. On the basis of these considerations, we propose several suggestions for future practice and research involving the field CIT.
[The GIPSY-RECPAM model: a versatile approach for integrated evaluation in cardiologic care].
Carinci, F
2009-01-01
The tree-structured methodology applied in the GISSI-PSICOLOGIA project, although developed in the framework of the earliest GISSI studies, represents a powerful tool for analyzing different aspects of cardiologic care. The GISSI-PSICOLOGIA project has delivered a novel methodology based on the joint application of psychometric tools and sophisticated statistical techniques. Its prospective use could allow effective epidemiological models relevant to the prognosis of the cardiologic patient to be built. The various features of the RECPAM method allow versatile use in the framework of modern e-health projects. The study used the Cognitive Behavioral Assessment H Form (CBA-H) psychometric scales. Its potential for future application in Italian cardiology is considerable, particularly to assist the planning of systems for integrated care and routine evaluation of the cardiologic patient.
Job attitudes, job satisfaction, and job affect: A century of continuity and of change.
Judge, Timothy A; Weiss, Howard M; Kammeyer-Mueller, John D; Hulin, Charles L
2017-03-01
Over the past 100 years, research on job attitudes has improved in the sophistication of methods and in the productive use of theory as a basis for fundamental research into questions of work psychology. Early research incorporated a diversity of methods for measuring potential predictors and outcomes of job attitudes. Over time, methods for statistically assessing these relationships became more rigorous, but the field also became narrower. In recent years, developments in theory and methodology have reinvigorated research, which now addresses a rich panoply of topics related to the daily flow of affect, the complexity of personal motives and dispositions, and the complex interplay of attitude objects and motivation in shaping behavior. Despite these apparent changes, a review of the concepts and substantive arguments that underpin this literature have remained remarkably consistent. We conclude by discussing how we expect that these major themes will be addressed in the future, emphasizing topics that have proven to be enduring guides for understanding the ways that people construe and react to their appraisals of their work. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
DFLOWZ: A free program to evaluate the area potentially inundated by a debris flow
NASA Astrophysics Data System (ADS)
Berti, M.; Simoni, A.
2014-06-01
The transport and deposition mechanisms of debris flows are still poorly understood due to the complexity of the interactions governing the behavior of water-sediment mixtures. Empirical-statistical methods can therefore be used, instead of more sophisticated numerical methods, to predict the depositional behavior of these highly dangerous gravitational movements. We use widely accepted semi-empirical scaling relations and propose an automated procedure (DFLOWZ) to estimate the area potentially inundated by a debris flow event. Besides a digital elevation model (DEM), the procedure has only two input requirements: the debris flow volume and the possible flow-path. The procedure is implemented in Matlab and a Graphical User Interface helps to visualize initial conditions, flow propagation and final results. Different hypotheses about the depositional behavior of an event can be tested together with the possible effect of simple remedial measures. Uncertainties associated with scaling relations can be treated and their impact on results evaluated. Our freeware application aims to facilitate and speed up the process of susceptibility mapping. We discuss the limits and advantages of the method in order to inform inexperienced users.
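Widely used semi-empirical scaling relations of this type estimate the inundated cross-section and planimetric area as powers of the event volume, A = k1·V^(2/3) and B = k2·V^(2/3). The sketch below evaluates them with illustrative coefficients only; they are not the calibrated values used by DFLOWZ.

```python
# Semi-empirical debris-flow scaling relations (coefficients are illustrative only;
# published calibrations differ by region and flow type):
#   cross-sectional area  A = K1 * V**(2/3)
#   planimetric area      B = K2 * V**(2/3)
K1, K2 = 0.1, 20.0

def inundation_estimate(volume_m3):
    """Return (cross-sectional area, planimetric area) in m^2 for a volume in m^3."""
    a_cross = K1 * volume_m3 ** (2.0 / 3.0)
    b_plan = K2 * volume_m3 ** (2.0 / 3.0)
    return a_cross, b_plan

for v in (1e3, 1e4, 1e5):
    a_cross, b_plan = inundation_estimate(v)
    print(f"V = {v:8.0f} m^3 -> cross-section ~ {a_cross:7.1f} m^2, "
          f"planimetric area ~ {b_plan:9.0f} m^2")
```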
Kretschmer, V
1987-09-01
On the basis of a survey, the acute side-effects and technical problems in a total of 77,525 cytaphereses (IFC 36,530, CFC 40,995) in donors at 39 hemapheresis centers were retrospectively analysed statistically. In general, relevant donor side-effects (0.78%-1.05%) were more rare than the primary donor-independent disturbances (1.65%-2.63%). The donor side-effects predominated merely with the use of the cell separators Haemonetics M30/Belco (1.06% vs. 0.57%). These were mainly circulatory reactions (0.83%), which were generally much more frequent with IFC (0.54%) than with CFC (IBM/Cobe 0.11%, CS-3000 0.19%). Potentially fatal complications were not reported. The frequency of side-effects, disturbances and discontinuations correlated inversely with the separation rate of the individual centers per method. Centers in which two or three methods were applied simultaneously reported a higher frequency of side-effects and disturbances. Hemolysis was only observed with IFC (0.09%), but not with the use of the Haemonetics V50. The greater susceptibility to disturbances of technical/methodological/operational origin essentially results from the more elaborate, but not yet perfected technology, including computer control and monitoring, as well as defects in the production of the much more complicated disposable sets. Thus the highest rate of discontinuations was calculated for the system which is so far the most sophisticated technically (CS-3000, 1.85%). Although the primary donor-independent problems sometimes correlate directly with the manifestation of donor side-effects, the greater technological sophistication of automatically controlled and monitored systems cannot be dispensed with, since only in this way can potentially fatal risks for the donors be largely ruled out.(ABSTRACT TRUNCATED AT 250 WORDS)
Diving deeper into Zebrafish development of social behavior: analyzing high resolution data.
Buske, Christine; Gerlai, Robert
2014-08-30
Vertebrate model organisms have been utilized in high throughput screening but only with substantial cost and human capital investment. The zebrafish is a vertebrate model species that is a promising and cost effective candidate for efficient high throughput screening. Larval zebrafish have already been successfully employed in this regard (Lessman, 2011), but adult zebrafish also show great promise. High throughput screening requires the use of a large number of subjects and collection of substantial amount of data. Collection of data is only one of the demanding aspects of screening. However, in most screening approaches that involve behavioral data the main bottleneck that slows throughput is the time consuming aspect of analysis of the collected data. Some automated analytical tools do exist, but often they only work for one subject at a time, eliminating the possibility of fully utilizing zebrafish as a screening tool. This is a particularly important limitation for such complex phenotypes as social behavior. Testing multiple fish at a time can reveal complex social interactions but it may also allow the identification of outliers from a group of mutagenized or pharmacologically treated fish. Here, we describe a novel method using a custom software tool developed within our laboratory, which enables tracking multiple fish, in combination with a sophisticated analytical approach for summarizing and analyzing high resolution behavioral data. This paper focuses on the latter, the analytic tool, which we have developed using the R programming language and environment for statistical computing. We argue that combining sophisticated data collection methods with appropriate analytical tools will propel zebrafish into the future of neurobehavioral genetic research. Copyright © 2014. Published by Elsevier B.V.
Statistical assessment of the learning curves of health technologies.
Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T
2001-01-01
(1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
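One of the simpler analyses discussed, fitting a negative-exponential learning curve to a single operator's consecutive operation times, can be done directly with nonlinear least squares; the sketch below uses simulated case-series data rather than the trial data described above.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Simulated operation times (minutes) for 100 consecutive procedures by one surgeon.
case_no = np.arange(1, 101)
true_plateau, true_gain, true_rate = 70.0, 60.0, 0.05
op_time = (true_plateau + true_gain * np.exp(-true_rate * case_no)
           + rng.normal(0, 8, size=case_no.size))

def learning_curve(x, plateau, gain, rate):
    """Negative-exponential learning curve: plateau + gain * exp(-rate * x)."""
    return plateau + gain * np.exp(-rate * x)

params, _ = curve_fit(learning_curve, case_no, op_time, p0=(60, 50, 0.1))
plateau, gain, rate = params
print(f"estimated plateau {plateau:.1f} min, initial excess {gain:.1f} min, rate {rate:.3f}")
print(f"cases to come within 10% of plateau: {np.log(gain / (0.1 * plateau)) / rate:.0f}")
```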
The clinical value of large neuroimaging data sets in Alzheimer's disease.
Toga, Arthur W
2012-02-01
Rapid advances in neuroimaging and cyberinfrastructure technologies have brought explosive growth in the Web-based warehousing, availability, and accessibility of imaging data on a variety of neurodegenerative and neuropsychiatric disorders and conditions. There has been a prolific development and emergence of complex computational infrastructures that serve as repositories of databases and provide critical functionalities such as sophisticated image analysis algorithm pipelines and powerful three-dimensional visualization and statistical tools. The statistical and operational advantages of collaborative, distributed team science in the form of multisite consortia push this approach in a diverse range of population-based investigations. Copyright © 2012 Elsevier Inc. All rights reserved.
Multicharged and/or water-soluble fluorescent dendrimers: properties and uses.
Caminade, Anne-Marie; Hameau, Aurélien; Majoral, Jean-Pierre
2009-09-21
The fluorescence of water-soluble dendritic compounds can be due to the whole structure or to fluorophores used as core, as peripheral groups, or as branches. Highly sophisticated precisely defined structures with other functional groups usable for material or biological purposes have been synthesised, but many recent examples have shown that dendrimers can be used as versatile platforms for statistically linking various types of functional groups.
When Machines Think: Radiology's Next Frontier.
Dreyer, Keith J; Geis, J Raymond
2017-12-01
Artificial intelligence (AI), machine learning, and deep learning are terms now seen frequently, all of which refer to computer algorithms that change as they are exposed to more data. Many of these algorithms are surprisingly good at recognizing objects in images. The combination of large amounts of machine-consumable digital data, increased and cheaper computing power, and increasingly sophisticated statistical models combine to enable machines to find patterns in data in ways that are not only cost-effective but also potentially beyond humans' abilities. Building an AI algorithm can be surprisingly easy. Understanding the associated data structures and statistics, on the other hand, is often difficult and obscure. Converting the algorithm into a sophisticated product that works consistently in broad, general clinical use is complex and incompletely understood. To show how these AI products reduce costs and improve outcomes will require clinical translation and industrial-grade integration into routine workflow. Radiology has the chance to leverage AI to become a center of intelligently aggregated, quantitative, diagnostic information. Centaur radiologists, formed as a synergy of human plus computer, will provide interpretations using data extracted from images by humans and image-analysis computer algorithms, as well as the electronic health record, genomics, and other disparate sources. These interpretations will form the foundation of precision health care, or care customized to an individual patient. © RSNA, 2017.
NASA Astrophysics Data System (ADS)
Chen, Yue; Cunningham, Gregory; Henderson, Michael
2016-09-01
This study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Second, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ˜ 2°, than those from the three empirical models with averaged errors > ˜ 5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. This study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
Chen, Yue; Cunningham, Gregory; Henderson, Michael
2016-09-21
Our study aims to statistically estimate the errors in local magnetic field directions that are derived from electron directional distributions measured by Los Alamos National Laboratory geosynchronous (LANL GEO) satellites. First, by comparing derived and measured magnetic field directions along the GEO orbit to those calculated from three selected empirical global magnetic field models (including a static Olson and Pfitzer 1977 quiet magnetic field model, a simple dynamic Tsyganenko 1989 model, and a sophisticated dynamic Tsyganenko 2001 storm model), it is shown that the errors in both derived and modeled directions are at least comparable. Furthermore, using a newly developed proxy method as well as comparing results from empirical models, we are able to provide for the first time circumstantial evidence showing that derived magnetic field directions should statistically match the real magnetic directions better, with averaged errors < ~2°, than those from the three empirical models with averaged errors > ~5°. In addition, our results suggest that the errors in derived magnetic field directions do not depend much on magnetospheric activity, in contrast to the empirical field models. Finally, as applications of the above conclusions, we show examples of electron pitch angle distributions observed by LANL GEO and also take the derived magnetic field directions as the real ones so as to test the performance of empirical field models along the GEO orbits, with results suggesting dependence on solar cycles as well as satellite locations. Overall, this study demonstrates the validity and value of the method that infers local magnetic field directions from particle spin-resolved distributions.
Advances in high-resolution mass spectrometry based on metabolomics studies for food--a review.
Rubert, Josep; Zachariasova, Milena; Hajslova, Jana
2015-01-01
Food authenticity has become a necessity for global food policies, since food placed on the market must be authentic. Authentication has always been a challenge: in the past, minor components, also called markers, were mainly monitored by chromatographic methods in order to authenticate food. Nowadays, however, advanced analytical methods allow food fingerprints to be acquired. At the same time, they have been combined with chemometrics, which uses statistical methods to verify food and to extract maximum information from chemical data. These sophisticated methods, whether coupled to different separation techniques or used stand-alone, have recently been combined with high-resolution mass spectrometry (HRMS) in order to verify the authenticity of food. The new generation of HRMS detectors has seen significant advances in resolving power, sensitivity, robustness, extended dynamic range, easier mass calibration and tandem mass capabilities, making HRMS more attractive and useful to the food metabolomics community and therefore a reliable tool for food authenticity. The purpose of this review is to summarise and describe the most recent metabolomics approaches in the area of food metabolomics, and to discuss the strengths and drawbacks of HRMS analytical platforms combined with chemometrics.
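A common entry point for the chemometric side of such fingerprinting work is an unsupervised projection of the HRMS feature table (samples by aligned ion intensities) before any classification is attempted. A minimal sketch with scikit-learn on a synthetic table; the matrix, the class labels and the preprocessing choices are illustrative assumptions, not a published protocol:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
# Hypothetical HRMS feature table: 40 samples x 500 aligned ion intensities.
X = rng.lognormal(mean=5.0, sigma=1.0, size=(40, 500))
labels = np.array(["authentic"] * 20 + ["suspect"] * 20)

# Log-transform and autoscale, as is typical for metabolomic intensities.
Xs = StandardScaler().fit_transform(np.log1p(X))

pca = PCA(n_components=2)
scores = pca.fit_transform(Xs)
print("explained variance ratio:", pca.explained_variance_ratio_)
for lab in np.unique(labels):
    centroid = scores[labels == lab].mean(axis=0)
    print(lab, "centroid in PC space:", centroid)
```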
PyEvolve: a toolkit for statistical modelling of molecular evolution.
Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A
2004-01-05
Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences - ignoring the biological significance of sequence differences. A suite of sophisticated likelihood-based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpGs, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter-rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter-rich likelihood functions solvable within hours on multi-CPU hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.
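The abstract notes that a new substitution model took only a few lines of Python to define, but it does not reproduce PyEvolve's interface. The core computation such toolkits wrap, a likelihood under a substitution model, can be illustrated for the simplest possible case: the Jukes-Cantor (JC69) model on a pair of aligned sequences. The sketch below is generic and deliberately avoids the PyEvolve API:

```python
import math

def jc69_loglik(seq1, seq2, branch_length):
    """Log-likelihood of two aligned DNA sequences under Jukes-Cantor (JC69).

    branch_length: expected substitutions per site separating the sequences.
    """
    e = math.exp(-4.0 * branch_length / 3.0)
    p_same = 0.25 + 0.75 * e          # P(site unchanged over the branch)
    p_diff = 0.25 - 0.25 * e          # P(site changed to one specific other base)
    loglik = 0.0
    for a, b in zip(seq1, seq2):
        # Stationary frequency 1/4 for the first base, times the transition probability.
        loglik += math.log(0.25 * (p_same if a == b else p_diff))
    return loglik

# Crude maximum-likelihood branch length by grid search.
s1 = "ACGTACGTAACCGGTTACGT"
s2 = "ACGTACCTAACCGATTACGA"
grid = [i / 1000.0 for i in range(1, 2001)]
best = max(grid, key=lambda t: jc69_loglik(s1, s2, t))
print("ML branch length:", best)   # the analytic JC69 distance for 3/20 differences is ~0.167
```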
Error Modelling for Multi-Sensor Measurements in Infrastructure-Free Indoor Navigation
Ruotsalainen, Laura; Kirkko-Jaakkola, Martti; Rantanen, Jesperi; Mäkelä, Maija
2018-01-01
The long-term objective of our research is to develop a method for infrastructure-free simultaneous localization and mapping (SLAM) and context recognition for tactical situational awareness. Localization will be realized by propagating motion measurements obtained using a monocular camera, a foot-mounted Inertial Measurement Unit (IMU), sonar, and a barometer. Due to the size and weight requirements set by tactical applications, Micro-Electro-Mechanical (MEMS) sensors will be used. However, MEMS sensors suffer from biases and drift errors that may substantially decrease the position accuracy. Therefore, sophisticated error modelling and implementation of integration algorithms are key for providing a viable result. Algorithms used for multi-sensor fusion have traditionally been different versions of Kalman filters. However, Kalman filters are based on the assumptions that the state propagation and measurement models are linear with additive Gaussian noise. Neither of the assumptions is correct for tactical applications, especially for dismounted soldiers or rescue personnel. Therefore, error modelling and implementation of advanced fusion algorithms are essential for providing a viable result. Our approach is to use particle filtering (PF), which is a sophisticated option for integrating measurements emerging from pedestrian motion having non-Gaussian error characteristics. This paper discusses the statistical modelling of the measurement errors from inertial sensors and vision-based heading and translation measurements in order to include the correct error probability density functions (pdf) in the particle filter implementation. Model fitting is then used to verify the pdfs of the measurement errors. Based on the deduced error models of the measurements, a particle filtering method is developed to fuse all of this information, where the weight of each particle is computed from the specific models derived. The performance of the developed method is tested via two experiments, one at a university’s premises and another in realistic tactical conditions. The results show significant improvement in horizontal localization when the measurement errors are carefully modelled and their inclusion in the particle filtering implementation is correctly realized. PMID:29443918
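As a schematic of the kind of filtering the paper describes (not the authors' implementation), the sketch below tracks a 1-D pedestrian position from position fixes whose errors are deliberately heavy-tailed (Student-t) rather than Gaussian, and weights particles with the corresponding density; the noise scales and motion model are invented for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_steps, n_particles = 100, 2000

# Simulated truth: pedestrian advances ~0.7 m per step along a corridor.
true_pos = np.cumsum(np.full(n_steps, 0.7))
# Position fixes (e.g. from visual measurements) with heavy-tailed Student-t errors.
meas_pos = true_pos + 0.3 * rng.standard_t(df=3, size=n_steps)

particles = np.zeros(n_particles)
weights = np.full(n_particles, 1.0 / n_particles)
estimates = []

for k in range(n_steps):
    # Propagate: nominal step length plus process noise (the motion model).
    particles += 0.7 + 0.05 * rng.normal(size=n_particles)
    # Update: weight by the Student-t measurement density, not a Gaussian.
    weights *= stats.t.pdf(meas_pos[k] - particles, df=3, scale=0.3)
    weights += 1e-300                      # guard against numerical collapse
    weights /= weights.sum()
    estimates.append(np.sum(weights * particles))
    # Resample when the effective sample size gets small.
    if 1.0 / np.sum(weights ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles = particles[idx]
        weights = np.full(n_particles, 1.0 / n_particles)

print("final position error (m):", abs(estimates[-1] - true_pos[-1]))
```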
Of bugs and birds: Markov Chain Monte Carlo for hierarchical modeling in wildlife research
Link, W.A.; Cam, E.; Nichols, J.D.; Cooch, E.G.
2002-01-01
Markov chain Monte Carlo (MCMC) is a statistical innovation that allows researchers to fit far more complex models to data than is feasible using conventional methods. Despite its widespread use in a variety of scientific fields, MCMC appears to be underutilized in wildlife applications. This may be due to a misconception that MCMC requires the adoption of a subjective Bayesian analysis, or perhaps simply to its lack of familiarity among wildlife researchers. We introduce the basic ideas of MCMC and software BUGS (Bayesian inference using Gibbs sampling), stressing that a simple and satisfactory intuition for MCMC does not require extraordinary mathematical sophistication. We illustrate the use of MCMC with an analysis of the association between latent factors governing individual heterogeneity in breeding and survival rates of kittiwakes (Rissa tridactyla). We conclude with a discussion of the importance of individual heterogeneity for understanding population dynamics and designing management plans.
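For readers new to the machinery, the core of MCMC is compact enough to write out directly. A minimal random-walk Metropolis sampler for a single survival probability with a flat prior and binomial data (the counts are hypothetical, and this is far simpler than the hierarchical kittiwake model in the paper) might look like:

```python
import math
import random

random.seed(1)

# Hypothetical data: 34 of 50 marked birds survived the interval.
n, y = 50, 34

def log_posterior(phi):
    # Binomial likelihood with a flat prior on (0, 1); -inf outside the support.
    if not 0.0 < phi < 1.0:
        return float("-inf")
    return y * math.log(phi) + (n - y) * math.log(1.0 - phi)

samples, phi = [], 0.5
for step in range(20000):
    proposal = phi + random.gauss(0.0, 0.05)        # random-walk proposal
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(phi):
        phi = proposal                               # accept
    if step >= 5000:                                 # discard burn-in
        samples.append(phi)

print("posterior mean survival:", sum(samples) / len(samples))
```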
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be compared directly using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update
Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy
2016-01-01
High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
Stewart, Gavin B.; Altman, Douglas G.; Askie, Lisa M.; Duley, Lelia; Simmonds, Mark C.; Stewart, Lesley A.
2012-01-01
Background Individual participant data (IPD) meta-analyses that obtain “raw” data from studies rather than summary data typically adopt a “two-stage” approach to analysis whereby IPD within trials generate summary measures, which are combined using standard meta-analytical methods. Recently, a range of “one-stage” approaches which combine all individual participant data in a single meta-analysis have been suggested as providing a more powerful and flexible approach. However, they are more complex to implement and require statistical support. This study uses a dataset to compare “two-stage” and “one-stage” models of varying complexity, to ascertain whether results obtained from the approaches differ in a clinically meaningful way. Methods and Findings We included data from 24 randomised controlled trials, evaluating antiplatelet agents, for the prevention of pre-eclampsia in pregnancy. We performed two-stage and one-stage IPD meta-analyses to estimate overall treatment effect and to explore potential treatment interactions whereby particular types of women and their babies might benefit differentially from receiving antiplatelets. Two-stage and one-stage approaches gave similar results, showing a benefit of using anti-platelets (Relative risk 0.90, 95% CI 0.84 to 0.97). Neither approach suggested that any particular type of women benefited more or less from antiplatelets. There were no material differences in results between different types of one-stage model. Conclusions For these data, two-stage and one-stage approaches to analysis produce similar results. Although one-stage models offer a flexible environment for exploring model structure and are useful where across study patterns relating to types of participant, intervention and outcome mask similar relationships within trials, the additional insights provided by their usage may not outweigh the costs of statistical support for routine application in syntheses of randomised controlled trials. Researchers considering undertaking an IPD meta-analysis should not necessarily be deterred by a perceived need for sophisticated statistical methods when combining information from large randomised trials. PMID:23056232
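The two-stage approach in particular reduces to familiar arithmetic: each trial contributes a summary effect and its variance, which are then pooled by inverse-variance weighting. A minimal fixed-effect sketch on invented trial counts (not the review's data):

```python
import math

# Hypothetical 2x2 counts per trial: (events_treatment, n_treatment, events_control, n_control)
trials = [(30, 400, 40, 395), (12, 150, 18, 160), (55, 800, 70, 810)]

weights_sum = 0.0
weighted_effect_sum = 0.0
for et, nt, ec, nc in trials:
    # Stage 1: per-trial log relative risk and its approximate variance.
    log_rr = math.log((et / nt) / (ec / nc))
    var = 1 / et - 1 / nt + 1 / ec - 1 / nc
    # Stage 2: inverse-variance (fixed-effect) pooling.
    w = 1.0 / var
    weights_sum += w
    weighted_effect_sum += w * log_rr

pooled = weighted_effect_sum / weights_sum
se = math.sqrt(1.0 / weights_sum)
print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f} to {math.exp(pooled + 1.96 * se):.2f})")
```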
NASA Astrophysics Data System (ADS)
Callahan, Brendan E.
There is a distinct divide between theory and practice in American science education. Research indicates that a constructivist philosophy, in which students construct their own knowledge, is conducive to learning, while in many cases teachers continue to present science in a more traditional manner. This study sought to explore possible relationships between a socioscientific issues-based curriculum and three outcome variables: nature of science understanding, reflective judgment, and argumentation skill. Both quantitative and qualitative methods were used to examine whole-class differences as well as individual differences between the beginning and end of a semester of high school Biology I. Results indicated that the socioscientific issues-based curriculum did not produce statistically significant changes over the course of one semester. However, the treatment group scored better on all three instruments than the comparison group. The small sample size may have contributed to the inability to find statistical significance in this study. The qualitative interviews did indicate that some students provided more sophisticated views on the nature of science and reflective judgment, and were able to provide slightly more complex argumentation structures. Theoretical implications regarding the explicit use of socioscientific issues in the classroom are presented.
Kernel methods and flexible inference for complex stochastic dynamics
NASA Astrophysics Data System (ADS)
Capobianco, Enrico
2008-07-01
Approximation theory suggests that series expansions and projections represent standard tools for random process applications from both numerical and statistical standpoints. Such instruments emphasize the role of both sparsity and smoothness for compression purposes, the decorrelation power achieved in the expansion coefficients space compared to the signal space, and the reproducing kernel property when some special conditions are met. We consider these three aspects central to the discussion in this paper, and attempt to analyze the characteristics of some known approximation instruments employed in a complex application domain such as financial market time series. Volatility models are often built ad hoc, parametrically and through very sophisticated methodologies. But they can hardly deal with stochastic processes with regard to non-Gaussianity, covariance non-stationarity or complex dependence without paying a big price in terms of either model mis-specification or computational efficiency. It is thus a good idea to look at other more flexible inference tools; hence the strategy of combining greedy approximation and space dimensionality reduction techniques, which are less dependent on distributional assumptions and more targeted to achieve computationally efficient performances. Advantages and limitations of their use will be evaluated by looking at algorithmic and model building strategies, and by reporting statistical diagnostics.
Overview of artificial neural networks.
Zou, Jinming; Han, Yi; So, Sung-Sau
2008-01-01
The artificial neural network (ANN), or simply neural network, is a machine learning method evolved from the idea of simulating the human brain. The data explosion in modern drug discovery research requires sophisticated analysis methods to uncover the hidden causal relationships between single or multiple responses and a large set of properties. The ANN is one of many versatile tools to meet the demand in drug discovery modeling. Compared to a traditional regression approach, the ANN is capable of modeling complex nonlinear relationships. The ANN also has excellent fault tolerance and is fast and highly scalable with parallel processing. This chapter introduces the background of ANN development and outlines the basic concepts crucially important for understanding more sophisticated ANNs. Several commonly used learning methods and network setups are discussed briefly at the end of the chapter.
Improved Design of Tunnel Supports : Executive Summary
DOT National Transportation Integrated Search
1979-12-01
This report focuses on improvement of design methodologies related to the ground-structure interaction in tunneling. The design methods range from simple analytical and empirical methods to sophisticated finite element techniques as well as an evalua...
NASA Astrophysics Data System (ADS)
Patton, David R.; Qamar, Farid D.; Ellison, Sara L.; Bluck, Asa F. L.; Simard, Luc; Mendel, J. Trevor; Moreno, Jorge; Torrey, Paul
2016-09-01
We describe a statistical approach for measuring the influence that a galaxy's closest companion has on the galaxy's properties out to arbitrarily wide separations. We begin by identifying the closest companion for every galaxy in a large spectroscopic sample of Sloan Digital Sky Survey galaxies. We then characterize the local environment of each galaxy by using the number of galaxies within 2 Mpc and by determining the isolation of the galaxy pair from other neighbouring galaxies. We introduce a sophisticated algorithm for creating a statistical control sample for each galaxy, matching on stellar mass, redshift, local density and isolation. Unlike traditional studies of close galaxy pairs, this approach is effective in a wide range of environments, regardless of how far away the closest companion is (although a very distant closest companion is unlikely to have a measurable influence on the galaxy in question). We apply this methodology to measurements of galaxy asymmetry, and find that the presence of nearby companions drives a clear enhancement in galaxy asymmetries. The asymmetry excess peaks at the smallest projected separations (<10 kpc), where the mean asymmetry is enhanced by a factor of 2.0 ± 0.2. Enhancements in mean asymmetry decline as pair separation increases, but remain statistically significant (1σ-2σ) out to projected separations of at least 50 kpc.
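The matching step itself can be sketched simply: for each paired galaxy, pick a control galaxy whose stellar mass, redshift and local density fall within set tolerances. The code below is a simplified stand-in with invented catalogues and tolerances (the paper's algorithm also matches on isolation and is more sophisticated in how controls are selected):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical catalogues: paired galaxies and a candidate control pool.
paired = dict(mass=rng.uniform(9.5, 11.0, 200), z=rng.uniform(0.02, 0.15, 200),
              density=rng.uniform(0, 5, 200))
pool = dict(mass=rng.uniform(9.0, 11.5, 5000), z=rng.uniform(0.01, 0.2, 5000),
            density=rng.uniform(0, 5, 5000))

def best_control(i, dm=0.1, dz=0.01, dd=0.5):
    """Index of the pool galaxy best matched to paired galaxy i, or None."""
    ok = ((np.abs(pool["mass"] - paired["mass"][i]) < dm) &
          (np.abs(pool["z"] - paired["z"][i]) < dz) &
          (np.abs(pool["density"] - paired["density"][i]) < dd))
    if not ok.any():
        return None
    # Among candidates within tolerance, take the closest in stellar mass.
    cand = np.flatnonzero(ok)
    return cand[np.argmin(np.abs(pool["mass"][cand] - paired["mass"][i]))]

matches = [best_control(i) for i in range(200)]
print("matched", sum(m is not None for m in matches), "of 200 paired galaxies")
```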
Shiraishi, Y; Yambe, T; Saijo, Y; Sato, F; Tanaka, A; Yoshizawa, M; Sugai, T K; Sakata, R; Luo, Y; Park, Y; Uematsu, M; Umezu, M; Fujimoto, T; Masumoto, N; Liu, H; Baba, A; Konno, S; Nitta, S; Imachi, K; Tabayashi, K; Sasada, H; Homma, D
2008-01-01
The authors have been developing an artificial myocardium, which is capable of supporting natural contractile function from the outside of the ventricle. The system was originally designed by using sophisticated covalent shape memory alloy fibres, and the surface did not implicate blood compatibility. The purpose of our study on the development of artificial myocardium was to achieve the assistance of myocardial functional reproduction by the integrative small mechanical elements without sensors, so that the effective circulatory support could be accomplished. In this study, the authors fabricated the prototype artificial myocardial assist unit composed of the sophisticated shape memory alloy fibre (Biometal), the diameter of which was 100 microns, and examined the mechanical response by using pulse width modulation (PWM) control method in each unit. Prior to the evaluation of dynamic characteristics, the relationship between strain and electric resistance and also the initial response of each unit were obtained. The component for the PWM control was designed in order to regulate the myocardial contractile function, which consisted of an originally-designed RISC microcomputer with the input of displacement, and its output signal was controlled by pulse wave modulation method. As a result, the optimal PWM parameters were confirmed and the fibrous displacement was successfully regulated under the different heat transfer conditions simulating internal body temperature as well as bias tensile loading. Then it was indicated that this control theory might be applied for more sophisticated ventricular passive or active restraint by the artificial myocardium on physiological demand.
Brown, Gregory G; Anderson, Vicki; Bigler, Erin D; Chan, Agnes S; Fama, Rosemary; Grabowski, Thomas J; Zakzanis, Konstantine K
2017-11-01
The American Psychological Association (APA) celebrated its 125th anniversary in 2017. As part of this celebration, the APA journal Neuropsychology has published in its November 2017 issue 11 papers describing some of the advances in the field of neuropsychology over the past 25 years. The papers address three broad topics: assessment and intervention, brain imaging, and theory and methods. The papers describe the rise of new assessment and intervention technologies, the impact of evidence for neuroplasticity on neurorehabilitation, examples of the use of mathematical models of cognition to investigate latent neurobehavioral processes, the development of the field of neuropsychology in select international countries, the increasing sophistication of brain imaging methods, the recent evidence for localizationist and connectionist accounts of neurobehavioral functioning, the advances in neurobehavioral genomics, and descriptions of newly developed statistical models of longitudinal change. Together the papers convey evidence of the vibrant growth in the field of neuropsychology over the quarter century since APA's 100th anniversary in 1992. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Gould, William R.; Kendall, William L.
2013-01-01
Capture-recapture methods were initially developed to estimate human population abundance, but since that time have seen widespread use for fish and wildlife populations to estimate and model various parameters of population, metapopulation, and disease dynamics. Repeated sampling of marked animals provides information for estimating abundance and tracking the fate of individuals in the face of imperfect detection. Mark types have evolved from clipping or tagging to use of noninvasive methods such as photography of natural markings and DNA collection from feces. Survival estimation has been emphasized more recently as have transition probabilities between life history states and/or geographical locations, even where some states are unobservable or uncertain. Sophisticated software has been developed to handle highly parameterized models, including environmental and individual covariates, to conduct model selection, and to employ various estimation approaches such as maximum likelihood and Bayesian approaches. With these user-friendly tools, complex statistical models for studying population dynamics have been made available to ecologists. The future will include a continuing trend toward integrating data types, both for tagged and untagged individuals, to produce more precise and robust population models.
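The simplest member of the family of estimators described, two sampling occasions on a closed population, is the Lincoln-Petersen estimator; its bias-corrected Chapman form is sketched below as a reference point before the highly parameterized models handled by specialized software (the numbers are invented):

```python
def chapman_estimate(n1, n2, m2):
    """Chapman's bias-corrected Lincoln-Petersen abundance estimate.

    n1 : animals captured and marked on occasion 1
    n2 : animals captured on occasion 2
    m2 : marked animals among the n2 captured on occasion 2
    """
    n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
    var = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
           / ((m2 + 1) ** 2 * (m2 + 2)))
    return n_hat, var ** 0.5

# Hypothetical example: 120 marked, 150 captured later, 30 of them marked.
n_hat, se = chapman_estimate(120, 150, 30)
print(f"estimated abundance ~ {n_hat:.0f} (SE ~ {se:.0f})")
```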
Handbook of capture-recapture analysis
Amstrup, Steven C.; McDonald, Trent L.; Manly, Bryan F.J.
2005-01-01
Every day, biologists in parkas, raincoats, and rubber boots go into the field to capture and mark a variety of animal species. Back in the office, statisticians create analytical models for the field biologists' data. But many times, representatives of the two professions do not fully understand one another's roles. This book bridges this gap by helping biologists understand state-of-the-art statistical methods for analyzing capture-recapture data. In so doing, statisticians will also become more familiar with the design of field studies and with the real-life issues facing biologists.Reliable outcomes of capture-recapture studies are vital to answering key ecological questions. Is the population increasing or decreasing? Do more or fewer animals have a particular characteristic? In answering these questions, biologists cannot hope to capture and mark entire populations. And frequently, the populations change unpredictably during a study. Thus, increasingly sophisticated models have been employed to convert data into answers to ecological questions. This book, by experts in capture-recapture analysis, introduces the most up-to-date methods for data analysis while explaining the theory behind those methods. Thorough, concise, and portable, it will be immensely useful to biologists, biometricians, and statisticians, students in both fields, and anyone else engaged in the capture-recapture process.
A ganglion-cell-based primary image representation method and its contribution to object recognition
NASA Astrophysics Data System (ADS)
Wei, Hui; Dai, Zhi-Long; Zuo, Qing-Song
2016-10-01
A visual stimulus is represented by the biological visual system at several levels: in order from low to high, these are photoreceptor cells, ganglion cells (GCs), lateral geniculate nucleus cells and visual cortical neurons. Retinal GCs at the early level need to represent raw data only once, yet meet a wide number of diverse requests from different vision-based tasks. This means the information representation at this level is general and not task-specific. Neurobiological findings have attributed this universal adaptation to GCs' receptive field (RF) mechanisms. For the purpose of developing a highly efficient image representation method that can facilitate information processing and interpretation at later stages, here we design a computational model to simulate the GC's non-classical RF. This new image representation method can extract major structural features from raw data, and is consistent with other statistical measures of the image. Based on the new representation, the performance of other state-of-the-art algorithms in contour detection and segmentation can be upgraded remarkably. This work concludes that applying a sophisticated representation scheme at an early stage is an efficient and promising strategy in visual information processing.
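A standard computational stand-in for a GC receptive field is a difference-of-Gaussians (DoG) centre-surround filter. The sketch below applies one to a toy image to pull out edge-like structural features; the parameters are illustrative, and this is not the authors' non-classical RF model:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_response(image, sigma_center=1.0, sigma_surround=3.0, k=0.9):
    """Centre-surround response: narrow Gaussian minus a wider, weighted one."""
    center = gaussian_filter(image.astype(float), sigma_center)
    surround = gaussian_filter(image.astype(float), sigma_surround)
    return center - k * surround

# Toy image: a bright square on a dark background.
img = np.zeros((64, 64))
img[24:40, 24:40] = 1.0
resp = dog_response(img)
# The response is strongest at the square's edges and weak in uniform regions.
print("max response:", resp.max(), "response at centre:", resp[32, 32])
```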
Alanna Conners and the Origins of Principled Data Analysis
NASA Technical Reports Server (NTRS)
Scargle, Jeffrey D.
2013-01-01
Alanna was one of the most important pioneers in the development of not just sophisticated algorithms for analyzing astronomical data but more importantly an overall viewpoint emphasizing the use of statistically sound principles in place of blind application of cook-book recipes, or black boxes. I will outline some of the threads of this viewpoint, emphasizing time series data, with a focus on the importance of these developments for the Age of Digital Astronomy that we are entering.
Sherrill, Joel T; Sommers, David I; Nierenberg, Andrew A; Leon, Andrew C; Arndt, Stephan; Bandeen-Roche, Karen; Greenhouse, Joel; Guthrie, Donald; Normand, Sharon-Lise; Phillips, Katharine A; Shear, M Katherine; Woolson, Robert
2009-01-01
The authors summarize points for consideration generated in a National Institute of Mental Health (NIMH) workshop convened to provide an opportunity for reviewers from different disciplines, specifically clinical researchers and statisticians, to discuss how their differing and complementary expertise can be well integrated in the review of intervention-related grant applications. A 1-day workshop was convened in October 2004. The workshop featured panel presentations on key topics followed by interactive discussion. This article summarizes the workshop and subsequent discussions, which centered on topics including weighting the statistics/data analysis elements of an application in the assessment of the application's overall merit; the level of statistical sophistication appropriate to different stages of research and for different funding mechanisms; some key considerations in the design and analysis portions of applications; appropriate statistical methods for addressing essential questions posed by an application; and the role of the statistician in the application's development, study conduct, and interpretation and dissemination of results. A number of key elements crucial to the construction and review of grant applications were identified. It was acknowledged that intervention-related studies unavoidably involve trade-offs. Reviewers are helped when applications acknowledge such trade-offs and provide good rationale for their choices. Clear linkage among the design, aims, hypotheses, and data analysis plan, and avoidance of disconnections among these elements, also strengthens applications. The authors identify multiple points to consider when constructing intervention-related grant applications. The points are presented here as questions and do not reflect institute policy or comprise a list of best practices, but rather represent points for consideration.
Survival analysis and classification methods for forest fire size
2018-01-01
Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at “being held” (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at “being held” exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances. PMID:29320497
Survival analysis and classification methods for forest fire size.
Tremblay, Pier-Olivier; Duchesne, Thierry; Cumming, Steven G
2018-01-01
Factors affecting wildland-fire size distribution include weather, fuels, and fire suppression activities. We present a novel application of survival analysis to quantify the effects of these factors on a sample of sizes of lightning-caused fires from Alberta, Canada. Two events were observed for each fire: the size at initial assessment (by the first fire fighters to arrive at the scene) and the size at "being held" (a state when no further increase in size is expected). We developed a statistical classifier to try to predict cases where there will be a growth in fire size (i.e., the size at "being held" exceeds the size at initial assessment). Logistic regression was preferred over two alternative classifiers, with covariates consistent with similar past analyses. We conducted survival analysis on the group of fires exhibiting a size increase. A screening process selected three covariates: an index of fire weather at the day the fire started, the fuel type burning at initial assessment, and a factor for the type and capabilities of the method of initial attack. The Cox proportional hazards model performed better than three accelerated failure time alternatives. Both fire weather and fuel type were highly significant, with effects consistent with known fire behaviour. The effects of initial attack method were not statistically significant, but did suggest a reverse causality that could arise if fire management agencies were to dispatch resources based on a-priori assessment of fire growth potentials. We discuss how a more sophisticated analysis of larger data sets could produce unbiased estimates of fire suppression effect under such circumstances.
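The classification stage described, predicting whether a fire will grow beyond its size at initial assessment, can be sketched with an off-the-shelf logistic regression. The covariates below mirror the kinds used in the paper (a fire weather index, fuel type, initial attack method), but the data and coefficients are simulated, not the Alberta records:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
n = 1000

# Simulated covariates loosely analogous to the paper's.
fwi = rng.gamma(shape=2.0, scale=10.0, size=n)                     # fire weather index
fuel = rng.choice(["conifer", "grass", "mixedwood"], size=n)       # fuel type at assessment
attack = rng.choice(["ground", "airtanker"], size=n)               # initial attack method

# Simulated outcome: growth more likely at high FWI and in conifer fuels.
logit = -3.0 + 0.08 * fwi + 0.8 * (fuel == "conifer") - 0.3 * (attack == "airtanker")
grew = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = pd.get_dummies(pd.DataFrame({"fwi": fwi, "fuel": fuel, "attack": attack}),
                   drop_first=True)
clf = LogisticRegression(max_iter=1000)
print("cross-validated accuracy:", cross_val_score(clf, X, grew, cv=5).mean())
```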
Toward a Model-Based Approach to the Clinical Assessment of Personality Psychopathology
Eaton, Nicholas R.; Krueger, Robert F.; Docherty, Anna R.; Sponheim, Scott R.
2015-01-01
Recent years have witnessed tremendous growth in the scope and sophistication of statistical methods available to explore the latent structure of psychopathology, involving continuous, discrete, and hybrid latent variables. The availability of such methods has fostered optimism that they can facilitate movement from classification primarily crafted through expert consensus to classification derived from empirically based models of psychopathological variation. The explication of diagnostic constructs with empirically supported structures can then facilitate the development of assessment tools that appropriately characterize these constructs. Our goal in this paper is to illustrate how new statistical methods can inform the conceptualization of personality psychopathology and therefore its assessment. We use magical thinking as an example, because both theory and earlier empirical work suggested the possibility of discrete aspects to the latent structure of personality psychopathology, particularly forms of psychopathology involving distortions of reality testing, yet other data suggest that personality psychopathology is generally continuous in nature. For explanatory purposes, we directly compared the fit of a variety of latent variable models to magical thinking data from a sample enriched with clinically significant variation in psychotic symptomatology. Findings generally suggested that a continuous latent variable model best represented magical thinking, but results varied somewhat depending on different indices of model fit. We discuss the implications of the findings for classification and applied personality assessment. We also highlight some limitations of this type of approach that are illustrated by these data, including the importance of substantive interpretation, in addition to use of model fit indices, when evaluating competing structural models. PMID:24007309
Some case studies of geophysical exploration of archaeological sites in Yugoslavia
NASA Astrophysics Data System (ADS)
Komatina, Snezana; Timotijevic, Zoran
1999-03-01
One of the youngest branches of environmental geophysics application is the preservation of national heritage. Numerous digital techniques developed for exploration directed to urban planning can also be applied to investigations of historic buildings. In identifying near-surface layers containing objects of previous civilizations, various sophisticated geophysical methods are used. In the paper, the application of geophysics in quantifying the problems that must be addressed in order to produce an archaeological map of a locality is discussed (Komatina, S., 1996. Sophisticated geophysical methods in the preservation of national heritage. Proc. of Int. Conf. Architecture and Urbanism at the turn of the Millenium, Beograd, pp. 39-44). Finally, several examples of archaeogeophysical exploration at the Divostin, Bedem and Kalenic monastery localities (Serbia, Yugoslavia) are presented.
Statistical mechanics of complex economies
NASA Astrophysics Data System (ADS)
Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo
2017-04-01
In the pursuit of ever increasing efficiency and growth, our economies have evolved to remarkable degrees of complexity, with nested production processes feeding each other in order to create products of greater sophistication from less sophisticated ones, down to raw materials. The engine of such an expansion has been competitive markets that, according to general equilibrium theory (GET), achieve efficient allocations under specific conditions. We study large random economies within the GET framework, as templates of complex economies, and we find that a non-trivial phase transition occurs: the economy freezes in a state where all production processes collapse when either the number of primary goods or the number of available technologies falls below a critical threshold. As in other examples of phase transitions in large random systems, this is an unintended consequence of the growth in complexity. Our findings suggest that the Industrial Revolution can be regarded as a sharp transition between different phases, but also imply that well-developed economies can collapse if too many intermediate goods are introduced.
Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.
2005-01-01
Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
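The proposed decomposition has a direct arithmetic consequence: if p_v is the probability a bird vocalizes during the count and p_d the probability the observer detects that vocalization, the overall detection probability is their product, and a fixed-radius count can be adjusted accordingly. A small illustration with invented numbers:

```python
import math

def adjusted_density(count, p_vocalize, p_detect_given_vocal, radius_m):
    """Convert a raw fixed-radius point count into birds per hectare."""
    p_total = p_vocalize * p_detect_given_vocal   # overall detection probability
    n_hat = count / p_total                       # estimated birds present in the plot
    area_ha = math.pi * radius_m ** 2 / 10_000.0  # plot area in hectares
    return n_hat / area_ha

# Invented example: 6 birds counted, 70% of birds vocalize during the count
# window, 80% of vocalizations are detected, 100 m fixed radius.
print(round(adjusted_density(6, 0.7, 0.8, 100), 2), "birds per hectare")
```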
Textbook of respiratory medicine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murray, J.F.; Nadel, J.
1987-01-01
This book presents a clinical reference of respiratory medicine. It also details basic science aspects of pulmonary physiology and describes recently developed, sophisticated diagnostic tools and therapeutic methods. It also covers anatomy, physiology, pharmacology, and pathology; microbiologic, radiologic, nuclear medicine, and biopsy methods for diagnosis.
Environmental statistics and optimal regulation
NASA Astrophysics Data System (ADS)
Sivak, David; Thomson, Matt
2015-03-01
The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
Philosophy and the practice of Bayesian statistics
Gelman, Andrew; Shalizi, Cosma Rohilla
2015-01-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575
Philosophy and the practice of Bayesian statistics.
Gelman, Andrew; Shalizi, Cosma Rohilla
2013-02-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. © 2012 The British Psychological Society.
Tian, Feifei; Tan, Rui; Guo, Tailin; Zhou, Peng; Yang, Li
2013-07-01
Domain-peptide recognition and interaction are fundamentally important for eukaryotic signaling and regulatory networks. It is thus essential to quantitatively infer the binding stability and specificity of such interactions based upon large-scale but low-accuracy complex structure models, which can be readily obtained from sophisticated molecular modeling procedures. In the present study, a new method is described for the fast and reliable prediction of domain-peptide binding affinity with coarse-grained structure models. This method is designed to tolerate strong random noise involved in domain-peptide complex structures and uses a statistical modeling approach to eliminate systematic bias associated with a group of investigated samples. As a paradigm, this method was employed to model and predict the binding behavior of various peptides to four evolutionarily unrelated peptide-recognition domains (PRDs), i.e. human amph SH3, human nherf PDZ, yeast syh GYF and yeast bmh 14-3-3, and moreover, we explored the molecular mechanism and biological implications underlying the binding of cognate and noncognate peptide ligands to their domain receptors. It is expected that the newly proposed method could be further used to perform genome-wide inference of domain-peptide binding at the three-dimensional structure level. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Shrout, Patrick E; Rodgers, Joseph L
2018-01-04
Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
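The more sophisticated power analyses called for still start from the basic solve-for-n calculation, which statsmodels exposes directly. A quick sketch for a two-group comparison at a modest standardized effect size (the numbers are illustrative):

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a standardized effect of d = 0.3
# with 80% power at alpha = 0.05 (two-sided, equal group sizes).
n_per_group = analysis.solve_power(effect_size=0.3, alpha=0.05, power=0.80)
print(round(n_per_group))   # roughly 175 per group

# Conversely: the power a single replication with 50 per group would have.
print(analysis.solve_power(effect_size=0.3, alpha=0.05, nobs1=50))
```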
Research for Environmental Stewardship and Conservation at the APTRU
USDA-ARS?s Scientific Manuscript database
Research methods for mitigation of off-target spray drift, remote sensing for precision crop management, and irrigation and tillage methods are presented. Research for mitigation of off target spray drift includes development of sophisticated weather apparatus to determine weather conditions unfavor...
Neo-Sophistic Rhetorical Theory: Sophistic Precedents for Contemporary Epistemic Rhetoric.
ERIC Educational Resources Information Center
McComiskey, Bruce
Interest in the sophists has recently intensified among rhetorical theorists, culminating in the notion that rhetoric is epistemic. Epistemic rhetoric has its first and deepest roots in sophistic epistemological and rhetorical traditions, so that the view of rhetoric as epistemic is now being dubbed "neo-sophistic." In epistemic…
Tropical Cyclone Report: Joint Typhoon Warning Center Guam, Mariana Islands, 1991
1991-01-01
Weather support was provided by the unit supporting the 15th Air Base Wing, Hickam AFB, Hawaii; forecasts were issued at synoptic time plus three hours (0300Z, 0900Z, 1500Z and 2100Z). Among the objective aids described are CLIMATOLOGY AND PERSISTENCE (CLIP), a statistical regression technique conducted every six hours through 72 hours and used to gauge the forecast skill of more sophisticated techniques, and the NOGAPS VORTEX TRACKING ROUTINE (NGPS), a dynamic aid applied in the expected vicinity of the storm.
Inelastic Single Pion Signal Study in T2K νe Appearance using Modified Decay Electron Cut
NASA Astrophysics Data System (ADS)
Iwamoto, Konosuke; T2K Collaboration
2015-04-01
The T2K long-baseline neutrino experiment uses sophisticated selection criteria to identify the neutrino oscillation signals among the events reconstructed in the Super-Kamiokande (SK) detector for νe and νμ appearance and disappearance analyses. In current analyses, charged-current quasi-elastic (CCQE) events are used as the signal reaction in the SK detector because the energy can be precisely reconstructed. This talk presents an approach to increase the statistics of the oscillation analysis by including non-CCQE events with one Michel electron and reconstructing them as inelastic single pion production. The gain in statistics, the backgrounds to this new process, and the implications for energy reconstruction will be presented for this enlarged event sample.
Stationarity: Wanted dead or alive?
Lins, Larry F.; Cohn, Timothy A.
2011-01-01
Aligning engineering practice with natural process behavior would appear, on its face, to be a prudent and reasonable course of action. However, if we do not understand the long-term characteristics of hydroclimatic processes, how does one find the prudent and reasonable course needed for water management? We consider this question in light of three aspects of existing and unresolved issues affecting hydroclimatic variability and statistical inference: Hurst-Kolmogorov phenomena; the complications long-term persistence introduces with respect to statistical understanding; and the dependence of process understanding on arbitrary sampling choices. These problems are not easily addressed. In such circumstances, humility may be more important than physics; a simple model with well-understood flaws may be preferable to a sophisticated model whose correspondence to reality is uncertain.
In My End Is My Beginning: eLearning at the Crossroads
ERIC Educational Resources Information Center
Blackburn, Greg
2016-01-01
Despite its increasing popularity, eLearning does not refer to a specific educational method of instruction or method of delivery. The design can have different meanings depending on the sophistication of the educational method employed, the resources made available, and the educator's skills. Unfortunately the application of technology in education…
Using Excel's Solver Function to Facilitate Reciprocal Service Department Cost Allocations
ERIC Educational Resources Information Center
Leese, Wallace R.
2013-01-01
The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated and theoretically incorrect direct or step-down methods. This article illustrates how Excel's Solver…
Using Excel's Matrix Operations to Facilitate Reciprocal Cost Allocations
ERIC Educational Resources Information Center
Leese, Wallace R.; Kizirian, Tim
2009-01-01
The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated direct or step-down methods. Here is a short example demonstrating how Excel's sometimes unknown matrix…
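Outside a spreadsheet, the same reciprocal allocation reduces to one linear solve: with D the service departments' direct costs and A[i, j] the fraction of department j's services consumed by service department i, the fully reciprocated costs are S = (I - A)^-1 D, which are then allocated to the production departments. A small sketch with invented percentages:

```python
import numpy as np

# Two service departments: S1 (maintenance) and S2 (administration).
direct = np.array([100_000.0, 60_000.0])      # direct costs of S1, S2

# A[i, j] = fraction of service department j consumed by service department i.
# Here S1 uses 10% of S2's output and S2 uses 20% of S1's output.
A = np.array([[0.0, 0.10],
              [0.20, 0.0]])

# Reciprocated (grossed-up) service department costs: S = (I - A)^-1 D
S = np.linalg.solve(np.eye(2) - A, direct)

# Fractions of each service department consumed by the production departments P1, P2.
to_production = np.array([[0.50, 0.30],       # from S1: 50% to P1, 30% to P2
                          [0.60, 0.30]])      # from S2: 60% to P1, 30% to P2
allocated = S @ to_production
print("reciprocated service costs:", S.round(2))
print("allocated to P1, P2:", allocated.round(2))   # totals match the direct costs
```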
Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows
2015-06-01
sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in spanwise as well... Manna, M., "Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows," AIAA Journal, Vol. 32, No. 2, Feb. 1994
The Healing Effect of Nettle Extract on Second Degree Burn Wounds
Akbari, Hosein; Fatemi, Mohammad Javad; Iranpour, Maryam; Khodarahmi, Ali; Baghaee, Mehrdad; Pedram, Mir Sepehr; Saleh, Sahar; Araghi, Shirin
2015-01-01
BACKGROUND Numerous studies have been carried out to develop more sophisticated dressings to expedite healing processes and diminish the bacterial burden in burn wounds. This study assessed the healing effect of nettle extract on second-degree burn wounds in rats in comparison with silver sulfadiazine and vaseline. METHODS Forty rats were randomly assigned to four equal groups. A deep second-degree burn was created on the back of each rat using a standard burning procedure. The burns were dressed daily with nettle extract in group 1, silver sulfadiazine in group 2, vaseline in group 3, and without any medication in group 4 as the control group. The response to treatment was assessed by digital photography during the treatment until day 42. Histological scoring was undertaken for scar tissue samples on days 10 and 42. RESULTS A statistically significant difference was observed in group 1 compared with the other groups regarding four scoring parameters after 10 days. A statistically significant difference was seen for the fibrosis parameter after 42 days. In terms of the difference in wound surface area, maximal healing was noticed at the same time point in the nettle group and minimal repair in the control group. CONCLUSION Our findings showed the maximal rate of healing in the nettle group, so nettle extract may be a suitable substitute for silver sulfadiazine and vaseline when available. PMID:25606473
Pore fluids and the LGM ocean salinity-Reconsidered
NASA Astrophysics Data System (ADS)
Wunsch, Carl
2016-03-01
Pore fluid chlorinity/salinity data from deep-sea cores related to the salinity maximum of the last glacial maximum (LGM) are analyzed using estimation methods deriving from linear control theory. With conventional diffusion coefficient values and no vertical advection, results show a very strong dependence upon initial conditions at -100 ky. Earlier inferences that the abyssal Southern Ocean was strongly salt-stratified in the LGM with a relatively fresh North Atlantic Ocean are found to be consistent within uncertainties of the salinity determination, which remain of order ±1 g/kg. However, an LGM Southern Ocean abyss with an important relative excess of salt is an assumption, one not required by existing core data. None of the present results show statistically significant abyssal salinity values above the global average, and results remain consistent, apart from a general increase owing to diminished sea level, with a more conventional salinity distribution having deep values lower than the global mean. The Southern Ocean core does show a higher salinity than the North Atlantic one on the Bermuda Rise at different water depths. Although much more sophisticated models of the pore-fluid salinity can be used, they will only increase the resulting uncertainties, unless considerably more data can be obtained. Results are consistent with complex regional variations in abyssal salinity during deglaciation, but none are statistically significant.
Imbalanced target prediction with pattern discovery on clinical data repositories.
Chan, Tak-Ming; Li, Yuxi; Chiau, Choo-Chiap; Zhu, Jane; Jiang, Jie; Huo, Yong
2017-04-20
Clinical data repositories (CDR) have great potential to improve outcome prediction and risk modeling. However, most clinical studies require careful study design, dedicated data collection efforts, and sophisticated modeling techniques before a hypothesis can be tested. We aim to bridge this gap, so that clinical domain users can perform first-hand prediction on existing repository data without complicated handling, and obtain insightful patterns of imbalanced targets for a formal study before it is conducted. We specifically target for interpretability for domain users where the model can be conveniently explained and applied in clinical practice. We propose an interpretable pattern model which is noise (missing) tolerant for practice data. To address the challenge of imbalanced targets of interest in clinical research, e.g., deaths less than a few percent, the geometric mean of sensitivity and specificity (G-mean) optimization criterion is employed, with which a simple but effective heuristic algorithm is developed. We compared pattern discovery to clinically interpretable methods on two retrospective clinical datasets. They contain 14.9% deaths in 1 year in the thoracic dataset and 9.1% deaths in the cardiac dataset, respectively. In spite of the imbalance challenge shown on other methods, pattern discovery consistently shows competitive cross-validated prediction performance. Compared to logistic regression, Naïve Bayes, and decision tree, pattern discovery achieves statistically significant (p-values < 0.01, Wilcoxon signed rank test) favorable averaged testing G-means and F1-scores (harmonic mean of precision and sensitivity). Without requiring sophisticated technical processing of data and tweaking, the prediction performance of pattern discovery is consistently comparable to the best achievable performance. Pattern discovery has demonstrated to be robust and valuable for target prediction on existing clinical data repositories with imbalance and noise. The prediction results and interpretable patterns can provide insights in an agile and inexpensive way for the potential formal studies.
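The G-mean criterion at the centre of the method is simply the geometric mean of sensitivity and specificity, maximized instead of raw accuracy so that the rare class is not ignored. The sketch below scores a generic classifier this way and tunes its decision threshold; it is not the authors' pattern-discovery algorithm, and the data are synthetic:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

def g_mean(y_true, y_pred):
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    sens = tp / (tp + fn) if tp + fn else 0.0   # sensitivity (recall on positives)
    spec = tn / (tn + fp) if tn + fp else 0.0   # specificity (recall on negatives)
    return np.sqrt(sens * spec)

# Imbalanced toy data, ~9% positives, loosely mimicking a mortality outcome.
X, y = make_classification(n_samples=3000, n_features=20, weights=[0.91],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
probs = LogisticRegression(max_iter=1000).fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

# Pick the probability threshold that maximizes G-mean on the held-out fold.
thresholds = np.linspace(0.05, 0.95, 19)
scores = [g_mean(y_te, (probs >= t).astype(int)) for t in thresholds]
best = thresholds[int(np.argmax(scores))]
print(f"best threshold {best:.2f}, G-mean {max(scores):.3f}")
```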
MetaboLyzer: A Novel Statistical Workflow for Analyzing Post-Processed LC/MS Metabolomics Data
Mak, Tytus D.; Laiakis, Evagelia C.; Goudarzi, Maryam; Fornace, Albert J.
2014-01-01
Metabolomics, the global study of small molecules in a particular system, has in the last few years risen to become a primary –omics platform for the study of metabolic processes. With the ever-increasing pool of quantitative data yielded from metabolomic research, specialized methods and tools with which to analyze and extract meaningful conclusions from these data are becoming more and more crucial. Furthermore, the depth of knowledge and expertise required to undertake a metabolomics oriented study is a daunting obstacle to investigators new to the field. As such, we have created a new statistical analysis workflow, MetaboLyzer, which aims both to simplify analysis for investigators new to metabolomics and to provide experienced investigators with the flexibility to conduct sophisticated analysis. MetaboLyzer’s workflow is specifically tailored to the unique characteristics and idiosyncrasies of post-processed liquid chromatography/mass spectrometry (LC/MS) based metabolomic datasets. It utilizes a wide gamut of statistical tests, procedures, and methodologies that belong to classical biostatistics, as well as several novel statistical techniques that we have developed specifically for metabolomics data. Furthermore, MetaboLyzer conducts rapid putative ion identification and putative biologically relevant analysis via incorporation of four major small molecule databases: KEGG, HMDB, Lipid Maps, and BioCyc. MetaboLyzer incorporates these aspects into a comprehensive workflow that outputs easy-to-understand, statistically significant, and potentially biologically relevant information in the form of heatmaps, volcano plots, 3D visualization plots, correlation maps, and metabolic pathway hit histograms. For demonstration purposes, a urine metabolomics data set from a previously reported radiobiology study, in which samples were collected from mice exposed to gamma radiation, was analyzed. MetaboLyzer was able to identify 243 statistically significant ions out of a total of 1942. Numerous putative metabolites and pathways were found to be biologically significant from the putative ion identification workflow. PMID:24266674
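The ion-screening step can be pictured with a generic sketch: a nonparametric test per ion followed by a Benjamini-Hochberg correction. This is not MetaboLyzer's exact workflow, and the abundance matrices below are simulated stand-ins for post-processed LC/MS intensities.

```python
# Generic sketch: per-ion Mann-Whitney U tests with a Benjamini-Hochberg cutoff.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
n_ions, n_ctrl, n_irr = 200, 10, 10
control = rng.lognormal(mean=1.0, sigma=0.5, size=(n_ions, n_ctrl))
treated = control * rng.lognormal(mean=0.0, sigma=0.5, size=(n_ions, n_irr))
treated[:20] *= 3.0                          # pretend 20 ions truly respond

pvals = np.array([mannwhitneyu(control[i], treated[i],
                               alternative="two-sided").pvalue
                  for i in range(n_ions)])

# Benjamini-Hochberg procedure at FDR 0.05
order = np.argsort(pvals)
ranked = pvals[order]
thresh = 0.05 * (np.arange(1, n_ions + 1) / n_ions)
passed = ranked <= thresh
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant_ions = order[:k]
print(f"{len(significant_ions)} ions flagged as statistically significant")
```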
NASA Astrophysics Data System (ADS)
Follette, Katherine B.; McCarthy, D. W.
2012-01-01
We present preliminary results from a student survey designed to test whether the all-important life skill of numeracy/quantitative literacy can be fostered and improved upon in college students through the vehicle of non-major introductory courses in Astronomy. Many instructors of introductory science courses for non-majors would state that a major goal of our classes is to teach our students to distinguish between science and pseudoscience, truth and fiction, in their everyday lives. It is difficult to believe that such a skill can truly be mastered without a fair amount of mathematical sophistication in the form of arithmetic, statistical and graph reading skills that many American college students unfortunately lack when they enter our classrooms. In teaching what is frequently their "terminal science course in life," can we instill in our students the numerical skills that they need to be savvy consumers, educated citizens and discerning interpreters of the ever-present polls, studies and surveys in which our society is awash? In what may well be their final opportunity to see applied mathematics in the classroom, can we impress upon them the importance of mathematical sophistication in interpreting the statistics that they are bombarded with by the media? Our study is in its second semester, and is designed to investigate to what extent it is possible to improve important quantitative skills in college students through a single semester introductory Astronomy course.
Changing epistemological beliefs: the unexpected impact of a short-term intervention.
Kienhues, Dorothe; Bromme, Rainer; Stahl, Elmar
2008-12-01
Previous research has shown that sophisticated epistemological beliefs exert a positive influence on students' learning strategies and learning outcomes. This gives a clear educational relevance to studies on the development of methods for promoting a change in epistemological beliefs and making them more sophisticated. To investigate the potential for influencing domain-specific epistemological beliefs through a short instructional intervention. On the basis of their performance on a preliminary survey of epistemological beliefs, 58 students at a German university (87.7% females) with a mean age of 21.86 years (SD=2.88) were selected. Half of them had more naive beliefs and the other half had more sophisticated ones. Participants were randomly assigned to one of two groups: one whose epistemological beliefs were challenged through refutational epistemological instruction or another receiving non-challenging informational instruction. The treatment effect was assessed by comparing pre- and post-instructional scores on two instruments tapping different layers of epistemological beliefs (DEBQ and CAEB). Data were subjected to factor analyses and analyses of variance. According to the CAEB, the naive group receiving the refutational epistemological instruction changed towards a more sophisticated view, whereas the sophisticated students receiving the informational instruction changed towards an unintended, more naive standpoint. According to the DEBQ, all research groups except the naive refutational group revealed changes towards a more naive view. This study indicates the possibility of changing domain-specific epistemological beliefs through a short-term intervention. However, it questions the stability and elaborateness of domain-specific epistemological beliefs, particularly when domain knowledge is shallow.
Bayesian probability of success for clinical trials using historical data
Ibrahim, Joseph G.; Chen, Ming-Hui; Lakshminarayanan, Mani; Liu, Guanghan F.; Heyse, Joseph F.
2015-01-01
Developing sophisticated statistical methods for go/no-go decisions is crucial for clinical trials, as planning phase III or phase IV trials is costly and time consuming. In this paper, we develop a novel Bayesian methodology for determining the probability of success of a treatment regimen on the basis of the current data of a given trial. We introduce a new criterion for calculating the probability of success that allows for inclusion of covariates as well as allowing for historical data based on the treatment regimen, and patient characteristics. A new class of prior distributions and covariate distributions is developed to achieve this goal. The methodology is quite general and can be used with univariate or multivariate continuous or discrete data, and it generalizes Chuang-Stein’s work. This methodology will be invaluable for informing the scientist on the likelihood of success of the compound, while including the information of covariates for patient characteristics in the trial population for planning future pre-market or post-market trials. PMID:25339499
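A much-simplified sketch of a probability-of-success calculation (ignoring the paper's covariates, historical data, and prior classes; all numbers are assumed for illustration) is to draw treatment effects from an approximate posterior based on current data and simulate whether a planned future trial would succeed:

```python
# Simplified probability-of-success sketch via posterior predictive simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

effect_hat, se_hat = 0.30, 0.15          # current-trial effect estimate and SE (assumed)
n_future_per_arm, sigma = 200, 1.0       # planned phase III size and outcome SD (assumed)

n_sim = 5_000
posterior_effects = rng.normal(effect_hat, se_hat, size=n_sim)  # flat-prior approximation

successes = 0
for delta in posterior_effects:
    treat = rng.normal(delta, sigma, n_future_per_arm)
    ctrl = rng.normal(0.0, sigma, n_future_per_arm)
    _, p = stats.ttest_ind(treat, ctrl)
    if p < 0.05 and treat.mean() > ctrl.mean():
        successes += 1

print(f"Probability of success ~ {successes / n_sim:.2f}")
```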
A comparison of methods of fitting several models to nutritional response data.
Vedenov, D; Pesti, G M
2008-02-01
A variety of models have been proposed to fit nutritional input-output response data. The models are typically nonlinear; therefore, fitting the models usually requires sophisticated statistical software and training to use it. An alternative tool for fitting nutritional response models was developed by using widely available and easier-to-use Microsoft Excel software. The tool, implemented as an Excel workbook (NRM.xls), allows simultaneous fitting and side-by-side comparisons of several popular models. This study compared the results produced by the tool we developed and PROC NLIN of SAS. The models compared were the broken line (ascending linear and quadratic segments), saturation kinetics, 4-parameter logistics, sigmoidal, and exponential models. The NRM.xls workbook provided results nearly identical to those of PROC NLIN. Furthermore, the workbook successfully fit several models that failed to converge in PROC NLIN. Two data sets were used as examples to compare fits by the different models. The results suggest that no particular nonlinear model is necessarily best for all nutritional response data.
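As an illustration of one of the models compared above, the ascending broken-line (linear-plateau) model can be fitted directly with SciPy; the dose-response values below are invented, and this is not the NRM.xls workbook or PROC NLIN itself:

```python
# Fitting a broken-line (linear-plateau) nutritional response model with SciPy.
import numpy as np
from scipy.optimize import curve_fit

def broken_line(x, plateau, slope, breakpoint):
    """Response rises linearly up to the breakpoint, then stays flat."""
    return np.where(x < breakpoint,
                    plateau - slope * (breakpoint - x),
                    plateau)

dose = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.8, 1.0])
gain = np.array([0.52, 0.61, 0.70, 0.78, 0.83, 0.85, 0.86, 0.86, 0.85])

params, _ = curve_fit(broken_line, dose, gain, p0=[0.85, 0.8, 0.45])
plateau, slope, breakpoint = params
print(f"plateau={plateau:.3f}, slope={slope:.3f}, estimated requirement={breakpoint:.3f}")
```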
Consequences of recent loophole-free experiments on a relaxation of measurement independence
NASA Astrophysics Data System (ADS)
Hnilo, Alejandro A.
2017-02-01
Recent experiments using innovative optical detectors and techniques have strongly increased the capacity to test the violation of Bell's inequalities in Nature. Most of them have used Eberhard's inequality (EI) to close the "detection" loophole. Closing the "locality" loophole has been attempted with spacelike-separated detections and fast changes of the observation bases, driven by random number generators of new design. Pulsed pumping and time-resolved data recording have also been used to close the "time-coincidence" loophole, and sophisticated statistical methods to close the "memory" loophole. In this paper, the meaning of the EI is reviewed. A simple hidden-variables theory based on a relaxation of the condition of "measurement independence," which was devised long ago for the Clauser-Horne-Shimony-Holt (CHSH) inequality, is adapted to the EI case. It is used here to evaluate the significance of the results of the recent experiments, which are briefly described. A table summarizes the main results.
ToF-SIMS PCA analysis of Myrtus communis L.
NASA Astrophysics Data System (ADS)
Piras, F. M.; Dettori, M. F.; Magnani, A.
2009-06-01
There is growing interest among researchers in applying sophisticated analytical techniques, in conjunction with statistical data analysis methods, to the characterization of natural products in order to assure their authenticity and quality, and in the direct analysis of food to obtain maximum information. In this work, time-of-flight secondary ion mass spectrometry (ToF-SIMS) in conjunction with principal components analysis (PCA) is applied to study the chemical composition and variability of Sardinian myrtle (Myrtus communis L.) through the analysis of both berry alcoholic extracts and berry epicarp. ToF-SIMS spectra of the berry epicarp show that the epicuticular waxes consist mainly of carboxylic acids with chain lengths ranging from C20 to C30, or identical species formed from fragmentation of long-chain esters. PCA of ToF-SIMS data from myrtle berry epicarp distinguishes two groups characterized by a different surface concentration of triacontanoic acid. Variability in anthocyanin, flavonol, α-tocopherol, and myrtucommulone contents is shown by ToF-SIMS PCA analysis of the myrtle berry alcoholic extracts.
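A generic sketch of the PCA step, with simulated intensities standing in for ToF-SIMS peak data, might look like this:

```python
# Generic PCA of spectral intensity vectors (simulated data, not ToF-SIMS spectra).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
n_samples, n_peaks = 30, 150
spectra = rng.gamma(shape=2.0, scale=1.0, size=(n_samples, n_peaks))
spectra[:15, 10] += 5.0      # pretend one peak separates two groups of samples

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
print(scores[:5])            # PC1/PC2 scores; plotting them reveals the two clusters
```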
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lieu, Richard
A hierarchy of statistics of increasing sophistication and accuracy is proposed to exploit an interesting and fundamental arithmetic structure in the photon bunching noise of incoherent light of large photon occupation number, with the purpose of suppressing the noise and rendering a more reliable and unbiased measurement of the light intensity. The method does not require any new hardware; rather, it operates at the software level, with the help of high-precision computers, to reprocess the intensity time series of the incident light and create a new series with a smaller bunching noise coherence length. The ultimate accuracy improvement of this method of flux measurement is limited by the timing resolution of the detector and the photon occupation number of the beam (the higher the photon number, the better the performance). The principal application is accuracy improvement in the signal-limited bolometric flux measurement of a radio source.
Morning glory syndrome associated with primary open angle glaucoma--case report.
Bozić, Marija; Hentova-Senćanić, Paraskeva; Marković, Vujica; Marjanović, Ivan
2014-01-01
Morning glory syndrome (MGS) is a rare congenital optic disc anomaly, first reported in 1970. MGS is a nonprogressive and untreatable condition that usually occurs as an isolated ocular anomaly and can be associated with an increased incidence of nonrhegmatogenous retinal detachment, as well as with strabismus, afferent pupillary defect, visual field defects, presence of hyaloid artery remnants, ciliary body cyst, congenital cataract, lid hemangioma and preretinal gliosis. We report a clinical case of MGS associated with primary open angle glaucoma. The use of sophisticated diagnostic tools, such as retinal tomography and visual field testing, is limited if multiple eye conditions are present, since the optic disc does not have the "usual" appearance that can be analyzed against standard statistical databases. In the treatment and follow-up of glaucoma cases associated with other diseases and conditions that affect the appearance and function of the optic nerve head, the use of modern technological methods is sometimes limited by the difficulty of interpreting the obtained results.
NASA Astrophysics Data System (ADS)
Kashinski, D. O.; Nelson, R. G.; Chase, G. M.; di Nallo, O. E.; Byrd, E. F. C.
2016-05-01
We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled harmonic frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). Calculation of approximate global harmonic frequency scaling factors for specific DFT functionals is also in progress. A full statistical analysis and reliability assessment of computational results is currently underway. Work supported by the ARL, DoD-HPCMP, and USMA.
Bayesian depth estimation from monocular natural images.
Su, Che-Chun; Cormack, Lawrence K; Bovik, Alan C
2017-05-01
Estimating an accurate and naturalistic dense depth map from a single monocular photographic image is a difficult problem. Nevertheless, human observers have little difficulty understanding the depth structure implied by photographs. Two-dimensional (2D) images of the real-world environment contain significant statistical information regarding the three-dimensional (3D) structure of the world that the vision system likely exploits to compute perceived depth, monocularly as well as binocularly. Toward understanding how this might be accomplished, we propose a Bayesian model of monocular depth computation that recovers detailed 3D scene structures by extracting reliable, robust, depth-sensitive statistical features from single natural images. These features are derived using well-accepted univariate natural scene statistics (NSS) models and recent bivariate/correlation NSS models that describe the relationships between 2D photographic images and their associated depth maps. This is accomplished by building a dictionary of canonical local depth patterns from which NSS features are extracted as prior information. The dictionary is used to create a multivariate Gaussian mixture (MGM) likelihood model that associates local image features with depth patterns. A simple Bayesian predictor is then used to form spatial depth estimates. The depth results produced by the model, despite its simplicity, correlate well with ground-truth depths measured by a current-generation terrestrial light detection and ranging (LIDAR) scanner. Such a strong form of statistical depth information could be used by the visual system when creating overall estimated depth maps incorporating stereopsis, accommodation, and other conditions. Indeed, even in isolation, the Bayesian predictor delivers depth estimates that are competitive with state-of-the-art "computer vision" methods that utilize highly engineered image features and sophisticated machine learning algorithms.
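The core of such a predictor, a multivariate Gaussian mixture over joint feature-depth vectors with a conditional-mean readout, can be sketched as follows (simulated features rather than real NSS statistics, and not the authors' trained dictionary):

```python
# Sketch: Gaussian mixture over joint (feature, depth) vectors with a
# posterior-weighted conditional-mean depth predictor.
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
n, d_feat = 2000, 4
features = rng.normal(size=(n, d_feat))
depth = (features @ np.array([1.0, -0.5, 0.3, 0.0]))[:, None] \
        + 0.2 * rng.normal(size=(n, 1))

gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
gmm.fit(np.hstack([features, depth]))

def predict_depth(x):
    """Posterior-weighted conditional mean of depth given a feature vector x."""
    f, t = slice(0, d_feat), d_feat          # feature block and depth index
    weights, cond_means = [], []
    for pi, mu, cov in zip(gmm.weights_, gmm.means_, gmm.covariances_):
        Sxx, Sxd = cov[f, f], cov[f, t]
        w = pi * multivariate_normal.pdf(x, mean=mu[f], cov=Sxx)
        m = mu[t] + Sxd @ np.linalg.solve(Sxx, x - mu[f])
        weights.append(w)
        cond_means.append(m)
    weights = np.array(weights) / np.sum(weights)
    return float(weights @ np.array(cond_means))

print(predict_depth(features[0]), depth[0, 0])
```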
ERIC Educational Resources Information Center
Davis, Gary Alan; Kovacs, Paul J.; Scarpino, John; Turchek, John C.
2010-01-01
The emergence of increasingly sophisticated communication technologies and the media-rich extensions of the World Wide Web have prompted universities to use alternatives to the traditional classroom teaching and learning methods. This demand for alternative delivery methods has led to the development of a wide range of eLearning techniques.…
Shepherd, Marilyn Murphy; Wipke-Tevis, Deidre D.; Alexander, Gregory L.
2015-01-01
Purpose: The purpose of this study was to compare pressure ulcer prevention programs in 2 long-term care (LTC) facilities with diverse Information Technology Sophistication (ITS), one with high sophistication and one with low sophistication, and to identify implications for the Wound Ostomy Continence (WOC) Nurse. Design: Secondary analysis of narrative data obtained from a mixed-methods study. Subjects and Setting: The study setting was 2 LTC facilities in the Midwestern United States. The sample comprised 39 staff from the 2 facilities, including 26 from the high-ITS facility and 13 from the low-ITS facility. Respondents included Certified Nurse Assistants, Certified Medical Technicians, Restorative Medical Technicians, Social Workers, Registered Nurses, Licensed Practical Nurses, Information Technology staff, Administrators, and Directors. Methods: This study is a secondary analysis of interviews regarding communication and education strategies in two long-term care agencies. The analysis focused on focus group interviews, which included both direct and non-direct care providers. Results: Eight themes (codes) were identified in the analysis. Three themes are presented individually with exemplars of communication and education strategies. The analysis revealed specific differences between the high-ITS and low-ITS facilities with regard to education and communication involving pressure ulcer prevention. These differences have direct implications for WOC nurses consulting in the LTC setting. Conclusions: Findings from this study suggest that effective strategies for staff education and communication regarding pressure ulcer prevention differ based on the level of ITS within a given facility. Specific strategies for education and communication are suggested for agencies with high ITS and agencies with low ITS sophistication. PMID:25945822
NASA Technical Reports Server (NTRS)
Hirsch, Annette L.; Kala, Jatin; Pitman, Andy J.; Carouge, Claire; Evans, Jason P.; Haverd, Vanessa; Mocko, David
2014-01-01
The authors use a sophisticated coupled land-atmosphere modeling system for a Southern Hemisphere subdomain centered over southeastern Australia to evaluate differences in simulation skill from two different land surface initialization approaches. The first approach uses equilibrated land surface states obtained from offline simulations of the land surface model, and the second uses land surface states obtained from reanalyses. The authors find that land surface initialization using prior offline simulations contribute to relative gains in subseasonal forecast skill. In particular, relative gains in forecast skill for temperature of 10%-20% within the first 30 days of the forecast can be attributed to the land surface initialization method using offline states. For precipitation there is no distinct preference for the land surface initialization method, with limited gains in forecast skill irrespective of the lead time. The authors evaluated the asymmetry between maximum and minimum temperatures and found that maximum temperatures had the largest gains in relative forecast skill, exceeding 20% in some regions. These results were statistically significant at the 98% confidence level at up to 60 days into the forecast period. For minimum temperature, using reanalyses to initialize the land surface contributed to relative gains in forecast skill, reaching 40% in parts of the domain that were statistically significant at the 98% confidence level. The contrasting impact of the land surface initialization method between maximum and minimum temperature was associated with different soil moisture coupling mechanisms. Therefore, land surface initialization from prior offline simulations does improve predictability for temperature, particularly maximum temperature, but with less obvious improvements for precipitation and minimum temperature over southeastern Australia.
The Delphi Method in Rehabilitation Counseling Research
ERIC Educational Resources Information Center
Vazquez-Ramos, Robinson; Leahy, Michael; Estrada Hernandez, Noel
2007-01-01
Rehabilitation researchers have found in the application of the Delphi method a more sophisticated way of obtaining consensus from experts in the field on certain matters. The application of this research methodology has affected and certainly advanced the body of knowledge of the rehabilitation counseling practice. However, the rehabilitation…
Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method
ERIC Educational Resources Information Center
Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo
2012-01-01
This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
USDA-ARS?s Scientific Manuscript database
Increasing availability of genomic data and sophistication of analytical methodology in fungi has elevated the need for functional genomics tools in these organisms. Gene deletion is a critical tool for functional analysis. The targeted deletion of genes requires both a suitable method for the trans...
NASA Astrophysics Data System (ADS)
Li, Tianfang; Wang, Jing; Wen, Junhai; Li, Xiang; Lu, Hongbing; Hsieh, Jiang; Liang, Zhengrong
2004-05-01
To treat the noise in low-dose x-ray CT projection data more accurately, analysis of the noise properties of the data and development of a corresponding efficient noise treatment method are two major problems to be addressed. In order to obtain an accurate and realistic model to describe the x-ray CT system, we acquired thousands of repeated measurements on different phantoms at several fixed scan angles by a GE high-speed multi-slice spiral CT scanner. The collected data were calibrated and log-transformed by the sophisticated system software, which converts the detected photon energy into sinogram data that satisfies the Radon transform. From the analysis of these experimental data, a nonlinear relation between the mean and variance of each datum of the sinogram was obtained. In this paper, we integrated this nonlinear relation into a penalized likelihood statistical framework for an SNR (signal-to-noise ratio) adaptive smoothing of noise in the sinogram. After the proposed preprocessing, the sinograms were reconstructed with the unapodized FBP (filtered backprojection) method. The resulting images were evaluated quantitatively, in terms of noise uniformity and the noise-resolution tradeoff, in comparison with other noise-smoothing methods such as the Hanning and Butterworth filters at different cutoff frequencies. Significant improvement in the noise-resolution tradeoff and in noise properties was demonstrated.
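The weighting idea, a variance predicted from the measured mean through a nonlinear relation and then used in a penalized weighted least-squares smoother, can be sketched for a single sinogram row; the relation var = a * exp(mean / b) and its parameters are placeholders, not the values calibrated from the GE scanner data:

```python
# Sketch: variance from an assumed mean-variance relation, then a penalized
# weighted least-squares smoothing of one sinogram row.
import numpy as np

def smooth_sinogram_row(y, lam=5.0, a=1e-4, b=2.5):
    n = len(y)
    var = a * np.exp(np.clip(y, 0, None) / b)      # assumed mean-variance relation
    W = np.diag(1.0 / var)
    D = np.diff(np.eye(n), axis=0)                 # first-difference operator
    # Solve the normal equations (W + lam * D^T D) s = W y
    return np.linalg.solve(W + lam * D.T @ D, W @ y)

# Example on a noisy synthetic projection profile
rng = np.random.default_rng(4)
truth = 3.0 * np.exp(-((np.arange(256) - 128) / 40.0) ** 2)
noisy = truth + rng.normal(0, np.sqrt(1e-4 * np.exp(truth / 2.5)))
print(np.abs(smooth_sinogram_row(noisy) - truth).mean(),
      np.abs(noisy - truth).mean())
```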
Noninvasive fetal QRS detection using an echo state network and dynamic programming.
Lukoševičius, Mantas; Marozas, Vaidotas
2014-08-01
We address a classical fetal QRS detection problem from abdominal ECG recordings with a data-driven statistical machine learning approach. Our goal is to have a powerful, yet conceptually clean, solution. There are two novel key components at the heart of our approach: an echo state recurrent neural network that is trained to indicate fetal QRS complexes, and several increasingly sophisticated versions of statistics-based dynamic programming algorithms, which are derived from and rooted in probability theory. We also employ a standard technique for preprocessing and removing maternal ECG complexes from the signals, but do not take this as the main focus of this work. The proposed approach is quite generic and can be extended to other types of signals and annotations. Open-source code is provided.
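A toy echo state network (reservoir plus ridge-regression readout) conveys the flavor of the first component; it is trained here on a synthetic spike-indication task rather than abdominal ECG, and it omits the dynamic-programming stage entirely:

```python
# Toy echo state network: random reservoir, ridge-regression readout.
import numpy as np

rng = np.random.default_rng(5)
n_in, n_res, spectral_radius, ridge = 1, 200, 0.9, 1e-2

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0, 1, (n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))   # scale the reservoir

def run_reservoir(u):
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W_in @ np.atleast_1d(ut) + W @ x)
        states[t] = x
    return states

# Synthetic signal: noisy baseline with sparse "QRS-like" spikes to indicate
u = rng.normal(0, 0.1, 2000)
targets = np.zeros(2000)
spikes = rng.choice(2000, 40, replace=False)
u[spikes] += 1.0
targets[spikes] = 1.0

X = run_reservoir(u)
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ targets)
indicator = X @ W_out            # large values flag candidate spike locations
print(indicator[spikes].mean(), indicator.mean())
```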
Biosignature Discovery for Substance Use Disorders Using Statistical Learning.
Baurley, James W; McMahan, Christopher S; Ervin, Carolyn M; Pardamean, Bens; Bergen, Andrew W
2018-02-01
There are limited biomarkers for substance use disorders (SUDs). Traditional statistical approaches are identifying simple biomarkers in large samples, but clinical use cases are still being established. High-throughput clinical, imaging, and 'omic' technologies are generating data from SUD studies and may lead to more sophisticated and clinically useful models. However, analytic strategies suited for high-dimensional data are not regularly used. We review strategies for identifying biomarkers and biosignatures from high-dimensional data types. Focusing on penalized regression and Bayesian approaches, we address how to leverage evidence from existing studies and knowledge bases, using nicotine metabolism as an example. We posit that big data and machine learning approaches will considerably advance SUD biomarker discovery. However, translation to clinical practice will require integrated scientific efforts. Copyright © 2017 Elsevier Ltd. All rights reserved.
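One of the penalized-regression strategies mentioned, L1-regularized logistic regression selecting a sparse biosignature from high-dimensional features, can be sketched with scikit-learn on simulated data:

```python
# Sketch: sparse biomarker selection with L1-penalized logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n, p = 300, 2000                              # many more features than subjects
X = rng.normal(size=(n, p))
y = (X[:, 0] - 0.8 * X[:, 1] + rng.normal(scale=1.0, size=n) > 0).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
model.fit(StandardScaler().fit_transform(X), y)

selected = np.flatnonzero(model.coef_[0])
print(f"{len(selected)} candidate biomarkers retained, e.g. features {selected[:10]}")
```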
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A; Robert, Christian P; Marin, Jean-Michel; Balding, David J; Guillemaud, Thomas; Estoup, Arnaud
2008-12-01
Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc.
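A bare-bones ABC rejection sampler conveys the underlying idea, though DIY ABC itself handles far richer scenarios and summary statistics; the simulator and observed summary below are toy placeholders:

```python
# Bare-bones ABC rejection sampling on a toy one-parameter model.
import numpy as np

rng = np.random.default_rng(7)

def simulate_summary(theta, n_loci=50):
    """Toy simulator: per-locus diversity grows with theta (placeholder model)."""
    return rng.poisson(theta, size=n_loci).mean()

observed_summary = 8.2                     # pretend this came from real microsatellite data
n_draws, epsilon = 50_000, 0.2

prior_draws = rng.uniform(1.0, 20.0, size=n_draws)      # flat prior on theta
sims = np.array([simulate_summary(th) for th in prior_draws])
accepted = prior_draws[np.abs(sims - observed_summary) < epsilon]

print(f"posterior mean ~ {accepted.mean():.2f}, "
      f"95% interval ~ ({np.quantile(accepted, 0.025):.2f}, "
      f"{np.quantile(accepted, 0.975):.2f})")
```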
'Stranger' child-murder: issues relating to causes and controls.
Wilson, P R
1988-02-01
Most industrialised countries are concerned with a perceived increase in the killing of children and adolescents by strangers. Though reliable statistics are lacking, the growth of serial murder suggests that more young persons may be at risk than ever before. Explanations, either of a psychological or sociological kind, of child murder by strangers are inadequately developed. Despite the tendency to see such killers as psychiatrically ill, a number of studies suggest that the majority of offenders do not differ significantly, at least in psychological traits, from non-offenders. Subcultural and other sociological perspectives stressing "social disadvantage" have low levels of explanatory power and do not greatly assist in understanding child killings. Despite sketchy and contradictory evidence on the effects of the media on sexual and violent crime, case study material supports the view that pornography, including popular music, may increase the propensity of individuals to commit atrocities. Counter-measures to control stranger child killing lie in more sophisticated law enforcement (profiling and computer links between police forces), long periods of incarceration of the offender and more sophisticated analyses of the crimes.

Vocal repertoire of the social giant otter.
Leuchtenberger, Caroline; Sousa-Lima, Renata; Duplaix, Nicole; Magnusson, William E; Mourão, Guilherme
2014-11-01
According to the "social intelligence hypothesis," species with complex social interactions have more sophisticated communication systems. Giant otters (Pteronura brasiliensis) live in groups with complex social interactions. It is likely that the vocal communication of giant otters is more sophisticated than previous studies suggest. The objectives of the current study were to describe the airborne vocal repertoire of giant otters in the Pantanal area of Brazil, to analyze call types within different behavioral contexts, and to correlate vocal complexity with level of sociability of mustelids to verify whether or not the result supports the social intelligence hypothesis. The behavior of nine giant otters groups was observed. Vocalizations recorded were acoustically and statistically analyzed to describe the species' repertoire. The repertoire was comprised by 15 sound types emitted in different behavioral contexts. The main behavioral contexts of each sound type were significantly associated with the acoustic variable ordination of different sound types. A strong correlation between vocal complexity and sociability was found for different species, suggesting that the communication systems observed in the family mustelidae support the social intelligence hypothesis.
Intelligent form removal with character stroke preservation
NASA Astrophysics Data System (ADS)
Garris, Michael D.
1996-03-01
A new technique for intelligent form removal has been developed along with a new method for evaluating its impact on optical character recognition (OCR). All the dominant lines in the image are automatically detected using the Hough line transform and intelligently erased while simultaneously preserving overlapping character strokes by computing line width statistics and keying off of certain visual cues. This new method of form removal operates on loosely defined zones with no image deskewing. Any field in which the writer is provided a horizontal line to enter a response can be processed by this method. Several examples of processed fields are provided, including a comparison of results between the new method and a commercially available forms removal package. Even if this new form removal method did not improve character recognition accuracy, it would still be a significant improvement to the technology because the requirement of a priori knowledge of the form's geometric details has been greatly reduced. This relaxes the recognition system's dependence on rigid form design, printing, and reproduction by automatically detecting and removing some of the physical structures (lines) on the form. Using the National Institute of Standards and Technology (NIST) public domain form-based handprint recognition system, the technique was tested on a large number of fields containing randomly ordered handprinted lowercase alphabets, as these letters (especially those with descenders) frequently touch and extend through the line along which they are written. Preserving character strokes improves overall lowercase recognition performance by 3%, which is a net improvement, but a single performance number like this doesn't communicate how the recognition process was really influenced. There are expected to be trade-offs with the introduction of any new technique into a complex recognition system. To understand both the improvements and the trade-offs, a new analysis was designed to compare the statistical distributions of individual confusion pairs between two systems. As OCR technology continues to improve, sophisticated analyses like this are necessary to reduce the errors remaining in complex recognition problems.
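A rough sketch of the idea (not the NIST implementation, which uses line-width statistics and additional visual cues) is to detect dominant horizontal lines with a Hough transform and erase them column by column, keeping any pixel whose vertical ink run is taller than a plausible line thickness, since that suggests an overlapping character stroke. OpenCV's probabilistic Hough transform is assumed available:

```python
# Rough sketch of form-line removal with character-stroke preservation.
# Assumes a binarized uint8 image with ink == 0 on white == 255.
import cv2
import numpy as np

def remove_form_lines(binary, max_line_thickness=5):
    ink = (binary == 0).astype(np.uint8)
    edges = cv2.Canny(binary, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=200,
                            minLineLength=binary.shape[1] // 3, maxLineGap=10)
    cleaned = binary.copy()
    if lines is None:
        return cleaned
    for x1, y1, x2, y2 in lines[:, 0]:
        if abs(y2 - y1) > 2:                      # keep only near-horizontal lines
            continue
        y = (y1 + y2) // 2
        for x in range(min(x1, x2), max(x1, x2) + 1):
            col = ink[:, x]
            # measure the vertical ink run passing through row y
            top = y
            while top > 0 and col[top - 1]:
                top -= 1
            bottom = y
            while bottom < len(col) - 1 and col[bottom + 1]:
                bottom += 1
            if bottom - top + 1 <= max_line_thickness:   # pure line, no stroke
                cleaned[top:bottom + 1, x] = 255
    return cleaned
```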
[Quality assessment in anesthesia].
Kupperwasser, B
1996-01-01
Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and the main targets of quality improvement. The three types of methods used to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and its conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, establishing an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, which, based on the fact that accidents continue to occur despite safety systems and sophisticated technologies, examines all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process). Definition and implementation of corrective measures, based on the findings of the two previous stages, constitute the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcome that occurs before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
PFAAT version 2.0: a tool for editing, annotating, and analyzing multiple sequence alignments.
Caffrey, Daniel R; Dana, Paul H; Mathur, Vidhya; Ocano, Marco; Hong, Eun-Jong; Wang, Yaoyu E; Somaroo, Shyamal; Caffrey, Brian E; Potluri, Shobha; Huang, Enoch S
2007-10-11
By virtue of their shared ancestry, homologous sequences are similar in their structure and function. Consequently, multiple sequence alignments are routinely used to identify trends that relate to function. This type of analysis is particularly productive when it is combined with structural and phylogenetic analysis. Here we describe the release of PFAAT version 2.0, a tool for editing, analyzing, and annotating multiple sequence alignments. Support for multiple annotations is a key component of this release as it provides a framework for most of the new functionalities. The sequence annotations are accessible from the alignment and tree, where they are typically used to label sequences or hyperlink them to related databases. Sequence annotations can be created manually or extracted automatically from UniProt entries. Once a multiple sequence alignment is populated with sequence annotations, sequences can be easily selected and sorted through a sophisticated search dialog. The selected sequences can be further analyzed using statistical methods that explicitly model relationships between the sequence annotations and residue properties. Residue annotations are accessible from the alignment viewer and are typically used to designate binding sites or properties for a particular residue. Residue annotations are also searchable, and allow one to quickly select alignment columns for further sequence analysis, e.g. computing percent identities. Other features include: novel algorithms to compute sequence conservation, mapping conservation scores to a 3D structure in Jmol, displaying secondary structure elements, and sorting sequences by residue composition. PFAAT provides a framework whereby end-users can specify knowledge for a protein family in the form of annotation. The annotations can be combined with sophisticated analysis to test hypotheses that relate to sequence, structure and function.
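One common per-column conservation score, normalized Shannon entropy, can be sketched generically; PFAAT offers several conservation algorithms, and this is only an illustrative example, not its exact computation:

```python
# Per-column conservation as 1 minus normalized Shannon entropy (gaps ignored).
import math
from collections import Counter

def column_conservation(column, alphabet_size=20):
    """1.0 = perfectly conserved, 0.0 = maximally variable."""
    residues = [r for r in column if r not in "-."]
    if not residues:
        return 0.0
    counts = Counter(residues)
    n = len(residues)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return 1.0 - entropy / math.log2(alphabet_size)

alignment = ["MKVLA", "MKILA", "MRVLS", "MKVLA"]
for i, col in enumerate(zip(*alignment)):
    print(i, "".join(col), round(column_conservation(col), 2))
```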
NLSE: Parameter-Based Inversion Algorithm
NASA Astrophysics Data System (ADS)
Sabbagh, Harold A.; Murphy, R. Kim; Sabbagh, Elias H.; Aldrin, John C.; Knopp, Jeremy S.
Chapter 11 introduced us to the notion of an inverse problem and gave us some examples of the value of this idea to the solution of realistic industrial problems. The basic inversion algorithm described in Chap. 11 was based upon the Gauss-Newton theory of nonlinear least-squares estimation and is called NLSE in this book. In this chapter we will develop the mathematical background of this theory more fully, because this algorithm will be the foundation of inverse methods and their applications during the remainder of this book. We hope, thereby, to introduce the reader to the application of sophisticated mathematical concepts to engineering practice without introducing excessive mathematical sophistication.
Lloyd, G C
1996-01-01
Contends that as techniques to motivate, empower and reward staff become ever more sophisticated and expensive, one of the most obvious, though overlooked, ways of tapping the creativity of employees is the suggestion scheme. A staff suggestion scheme may well be dismissed as a simplistic and outdated vehicle by proponents of modern management methods, but to its owners it can be like a classic model--needing just a little care and attention in order for it to run smoothly and at a very low cost. Proposes that readers should spare some time to consider introducing a suggestion scheme as an entry level initiative and a precursor to more sophisticated, elaborate and costly change management mechanisms.
NASA Astrophysics Data System (ADS)
Price-Whelan, Adrian M.
2016-01-01
Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
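A minimal pytest-style example of the kind of test being advocated: a pure function plus assertions that pin down its expected behavior (the function and file name are hypothetical):

```python
# Saved as test_stats.py (hypothetical name); run with the plain `pytest` command.
import math
import pytest

def weighted_mean(values, weights):
    if len(values) != len(weights) or not values:
        raise ValueError("values and weights must be non-empty and equal length")
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def test_weighted_mean_recovers_plain_mean():
    assert math.isclose(weighted_mean([1, 2, 3], [1, 1, 1]), 2.0)

def test_weighted_mean_rejects_mismatched_input():
    with pytest.raises(ValueError):
        weighted_mean([1, 2], [1])
```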
Bi-telescopic, deep, simultaneous meteor observations
NASA Technical Reports Server (NTRS)
Taff, L. G.
1986-01-01
A statistical summary is presented of 10 hours of observing sporadic meteors and two meteor showers using the Experimental Test System of the Lincoln Laboratory. The observatory is briefly described along with the real-time and post-processing hardware, the analysis, and the data reduction. The principal observational results are given for the sporadic meteor zenithal hourly rates. The unique properties of the observatory include twin telescopes to allow the discrimination of meteors by parallax, deep limiting magnitude, good time resolution, and sophisticated real-time and post-observing video processing.
Meta-analysis in applied ecology.
Stewart, Gavin
2010-02-23
This overview examines research synthesis in applied ecology and conservation. Vote counting and pooling unweighted averages are widespread despite the superiority of syntheses based on weighted combination of effects. Such analyses allow exploration of methodological uncertainty in addition to consistency of effects across species, space and time, but exploring heterogeneity remains controversial. Meta-analyses are required to generalize in ecology, and to inform evidence-based decision-making, but the more sophisticated statistical techniques and registers of research used in other disciplines must be employed in ecology to fully realize their benefits.
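The weighted combination of effects referred to above is, in its simplest fixed-effect form, an inverse-variance average; the effect sizes and standard errors below are invented for illustration:

```python
# Fixed-effect meta-analysis: inverse-variance weighted combination of effects.
import numpy as np

effects = np.array([0.42, 0.10, 0.55, 0.31, -0.05])   # per-study effect sizes
ses = np.array([0.20, 0.15, 0.30, 0.10, 0.25])         # their standard errors

weights = 1.0 / ses**2                                  # inverse-variance weights
pooled = np.sum(weights * effects) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))

print(f"pooled effect = {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI)")
```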
Statistical models for causation: what inferential leverage do they provide?
Freedman, David A
2006-12-01
Experiments offer more reliable evidence on causation than observational studies, which is not to gainsay the contribution to knowledge from observation. Experiments should be analyzed as experiments, not as observational studies. A simple comparison of rates might be just the right tool, with little value added by "sophisticated" models. This article discusses current models for causation, as applied to experimental and observational data. The intention-to-treat principle and the effect of treatment on the treated will also be discussed. Flaws in per-protocol and treatment-received estimates will be demonstrated.
Platonic Dialogue, Maieutic Method and Critical Thinking
ERIC Educational Resources Information Center
Leigh, Fiona
2007-01-01
In this paper I offer a reading of one of Plato's later works, the "Sophist", that reveals it to be informed by principles comparable on the face of it with those that have emerged recently in the field of critical thinking. As a development of the famous Socratic method of his teacher, I argue, Plato deployed his own pedagogical method, a…
ERIC Educational Resources Information Center
Kies, Cosette
1975-01-01
A discussion of the way marketing expertise has employed sophisticated and psychological methods in packing a variety of products, including those items stocked by libraries and media centers; books, records, periodicals and audio-visual materials. (Author)
Koutsoukas, Alexios; Monaghan, Keith J; Li, Xiaoli; Huan, Jun
2017-06-28
In recent years, research in artificial neural networks has resurged, now under the deep-learning umbrella, and grown extremely popular. The recently reported success of DL techniques in crowd-sourced QSAR and predictive toxicology competitions has showcased these methods as powerful tools in drug-discovery and toxicology research. The aim of this work was twofold: first, a large number of hyper-parameter configurations were explored to investigate how they affect the performance of DNNs and to provide starting points when tuning DNNs; second, their performance was compared to popular methods widely employed in the field of cheminformatics, namely Naïve Bayes, k-nearest neighbor, random forest and support vector machines. Moreover, the robustness of the machine learning methods to different levels of artificially introduced noise was assessed. The open-source Caffe deep-learning framework and modern NVidia GPU units were utilized to carry out this study, allowing a large number of DNN configurations to be explored. We show that feed-forward deep neural networks are capable of achieving strong classification performance and, when optimized, outperform shallow methods across diverse activity classes. Hyper-parameters found to play a critical role are the activation function, dropout regularization, number of hidden layers and number of neurons. When compared to the other methods, tuned DNNs were found to statistically outperform them, with p-values < 0.01 based on the Wilcoxon signed-rank test. DNNs achieved, on average, MCC values 0.149 units higher than NB, 0.092 higher than kNN, 0.052 higher than SVM with a linear kernel, 0.021 higher than RF and, finally, 0.009 higher than SVM with a radial basis function kernel. When exploring robustness to noise, the non-linear methods were found to perform well at low noise levels (20% or less); however, at higher noise levels (30% or more), the Naïve Bayes method performed well and, at the highest noise level of 50%, even outperformed the more sophisticated methods across several datasets.
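The statistical comparison described, per-fold performance of two classifiers compared with the Wilcoxon signed-rank test, can be sketched as follows (the fold-level MCC values are illustrative, not the paper's results):

```python
# Paired comparison of two classifiers via per-fold MCC and a Wilcoxon test.
import numpy as np
from scipy.stats import wilcoxon

mcc_dnn = np.array([0.61, 0.58, 0.65, 0.70, 0.59, 0.63, 0.66, 0.62, 0.68, 0.60])
mcc_nb = np.array([0.45, 0.44, 0.52, 0.55, 0.43, 0.50, 0.49, 0.47, 0.53, 0.46])

stat, p = wilcoxon(mcc_dnn, mcc_nb)
print(f"mean MCC gain = {np.mean(mcc_dnn - mcc_nb):.3f}, Wilcoxon p = {p:.4f}")
```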
NASA Astrophysics Data System (ADS)
Wang, Bingyuan; Zhang, Yao; Liu, Dongyuan; Ding, Xuemei; Dan, Mai; Pan, Tiantian; Wang, Yihan; Li, Jiao; Zhou, Zhongxing; Zhang, Limin; Zhao, Huijuan; Gao, Feng
2018-02-01
Functional near-infrared spectroscopy (fNIRS) is a non-invasive neuroimaging method for monitoring cerebral hemodynamics through optical changes measured at the scalp surface. It has played an increasingly important role in the psychology and medical imaging communities. Real-time imaging of brain function using NIRS makes it possible to explore sophisticated human brain functions that were previously unexplored. The Kalman estimator has frequently been used in combination with optical topography (OT) based on the modified Beer-Lambert law (MBLL) for real-time brain function imaging. However, the spatial resolution of OT is low, hampering its application to exploring more complicated brain functions. In this paper, we develop a real-time imaging method combining diffuse optical tomography (DOT) and a Kalman estimator, greatly improving the spatial resolution. Instead of presenting one spatially distributed image of the changes in the absorption coefficients at each time point during the recording process, a single image updated in real time by the Kalman estimator is provided, in which each voxel represents the amplitude of the hemodynamic response function (HRF) associated with that voxel. We evaluate this method using simulation experiments, demonstrating that it can obtain images with more reliable spatial resolution. Furthermore, a statistical analysis is conducted to help decide whether a voxel in the field of view is activated or not.
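For a single voxel, the Kalman-estimator idea of tracking an HRF amplitude in a streaming signal reduces to a scalar filter; the HRF regressor, noise variances, and data below are simplified placeholders rather than the paper's DOT formulation:

```python
# Scalar Kalman filter tracking the amplitude of a known HRF regressor.
import numpy as np

rng = np.random.default_rng(8)

T = 400
t = np.arange(T) * 0.5                                   # 2 Hz sampling
hrf = (t % 30) * np.exp(-(t % 30) / 4.0)                 # crude periodic HRF regressor
true_amp = 0.8
y = true_amp * hrf + rng.normal(0, 0.5, T)               # simulated voxel time series

beta, P = 0.0, 1.0          # state estimate (HRF amplitude) and its variance
q, r = 1e-4, 0.25           # process and measurement noise variances (assumed)
estimates = []
for ht, yt in zip(hrf, y):
    P += q                                   # predict (random-walk state model)
    K = P * ht / (ht * ht * P + r)           # Kalman gain
    beta += K * (yt - ht * beta)             # update with the new sample
    P *= (1 - K * ht)
    estimates.append(beta)

print(f"final amplitude estimate = {estimates[-1]:.2f} (true {true_amp})")
```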
Towards an automatic wind speed and direction profiler for Wide Field adaptive optics systems
NASA Astrophysics Data System (ADS)
Sivo, G.; Turchi, A.; Masciadri, E.; Guesalaga, A.; Neichel, B.
2018-05-01
Wide Field Adaptive Optics (WFAO) systems are among the most sophisticated adaptive optics (AO) systems available today on large telescopes. Knowledge of the vertical spatio-temporal distribution of wind speed (WS) and direction (WD) is fundamental to optimize the performance of such systems. Previous studies have already shown that the Gemini Multi-Conjugate AO System (GeMS) is able to retrieve measurements of the WS and WD stratification using the SLOpe Detection And Ranging (SLODAR) technique and to store the measurements in the telemetry data. In order to assess the reliability of these estimates and of the SLODAR technique applied to such complex AO systems, in this study we compared WS and WD values retrieved from GeMS with those obtained with the atmospheric model Meso-NH on a rich statistical sample of nights. It has previously been shown that the latter technique provides excellent agreement with a large sample of radiosoundings, both in statistical terms and on individual flights. It can be considered, therefore, as an independent reference. The excellent agreement between GeMS measurements and the model that we find in this study proves the robustness of the SLODAR approach. To bypass the complex procedures necessary to achieve automatic measurements of the wind with GeMS, we propose a simple automatic method to monitor nightly WS and WD using Meso-NH model estimates. Such a method can be applied to any present or new-generation facility supported by WFAO systems. The interest of this study therefore extends well beyond the optimization of GeMS performance.
Machine cost analysis using the traditional machine-rate method and ChargeOut!
E. M. (Ted) Bilek
2009-01-01
Forestry operations require ever more use of expensive capital equipment. Mechanization is frequently necessary to perform cost-effective and safe operations. Increased capital should mean more sophisticated capital costing methodologies. However the machine rate method, which is the costing methodology most frequently used, dates back to 1942. CHARGEOUT!, a recently...
Shan Gao; Xiping Wang; Michael C. Wiemann; Brian K. Brashaw; Robert J. Ross; Lihai Wang
2017-01-01
Key message Field methods for rapid determination of wood density in trees have evolved from increment borer, torsiometer, Pilodyn, and nail withdrawal into sophisticated electronic tools of resistance drilling measurement. A partial resistance drilling approach coupled with knowledge of internal tree density distribution may...
Sophistry, the Sophists and modern medical education.
Macsuibhne, S P
2010-01-01
The term 'sophist' has become a term of intellectual abuse in both general discourse and that of educational theory. However the actual thought of the fifth century BC Athenian-based philosophers who were the original Sophists was very different from the caricature. In this essay, I draw parallels between trends in modern medical educational practice and the thought of the Sophists. Specific areas discussed are the professionalisation of medical education, the teaching of higher-order characterological attributes such as personal development skills, and evidence-based medical education. Using the specific example of the Sophist Protagoras, it is argued that the Sophists were precursors of philosophical approaches and practices of enquiry underlying modern medical education.
Studying DNA in the Classroom.
ERIC Educational Resources Information Center
Zarins, Silja
1993-01-01
Outlines a workshop for teachers that illustrates a method of extracting DNA and provides instructions on how to do some simple work with DNA without sophisticated and expensive equipment. Provides details on viscosity studies and breaking DNA molecules. (DDR)
Does a better model yield a better argument? An info-gap analysis
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2017-04-01
Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
Overview of Non-Volatile Testing and Screening Methods
NASA Technical Reports Server (NTRS)
Irom, Farokh
2001-01-01
Testing methods for memories and non-volatile memories have become increasingly sophisticated as they become denser and more complex. High frequency and faster rewrite times as well as smaller feature sizes have led to many testing challenges. This paper outlines several testing issues posed by novel memories and approaches to testing for radiation and reliability effects. We discuss methods for measurements of Total Ionizing Dose (TID).
ERIC Educational Resources Information Center
Charconnet, Marie-George
This study describes various patterns of peer tutoring and is based on the use of cultural traditions and endogenous methods, on techniques and equipment acquired from other cultures, on problems presented by the adoption of educational technologies, and on methods needing little sophisticated equipment. A dozen peer tutoring systems are…
A methodological analysis of chaplaincy research: 2000-2009.
Galek, Kathleen; Flannelly, Kevin J; Jankowski, Katherine R B; Handzo, George F
2011-01-01
The present article presents a comprehensive review and analysis of quantitative research conducted in the United States on chaplaincy and closely related topics published between 2000 and 2009. A combined search strategy identified 49 quantitative studies in 13 journals. The analysis focuses on the methodological sophistication of the studies, compared to earlier research on chaplaincy and pastoral care. Cross-sectional surveys of convenience samples still dominate the field, but sample sizes have increased somewhat over the past three decades. Reporting of the validity and reliability of measures continues to be low, although reporting of response rates has improved. Improvements in the use of inferential statistics and statistical controls were also observed, compared to previous research. The authors conclude that more experimental research is needed on chaplaincy, along with an increased use of hypothesis testing, regardless of the research designs that are used.
Automated speech understanding: the next generation
NASA Astrophysics Data System (ADS)
Picone, J.; Ebel, W. J.; Deshmukh, N.
1995-04-01
Modern speech understanding systems merge interdisciplinary technologies from Signal Processing, Pattern Recognition, Natural Language, and Linguistics into a unified statistical framework. These systems, which have applications in a wide range of signal processing problems, represent a revolution in Digital Signal Processing (DSP). Once a field dominated by vector-oriented processors and linear algebra-based mathematics, the current generation of DSP-based systems rely on sophisticated statistical models implemented using a complex software paradigm. Such systems are now capable of understanding continuous speech input for vocabularies of several thousand words in operational environments. The current generation of deployed systems, based on small vocabularies of isolated words, will soon be replaced by a new technology offering natural language access to vast information resources such as the Internet, and provide completely automated voice interfaces for mundane tasks such as travel planning and directory assistance.
Remote sensing for urban planning
NASA Technical Reports Server (NTRS)
Davis, Bruce A.; Schmidt, Nicholas; Jensen, John R.; Cowen, Dave J.; Halls, Joanne; Narumalani, Sunil; Burgess, Bryan
1994-01-01
Utility companies are challenged to provide services to a highly dynamic customer base. With factory closures and shifts in employment becoming a routine occurrence, the utility industry must develop new techniques to maintain records and plan for expected growth. BellSouth Telecommunications, the largest of the Bell telephone companies, currently serves over 13 million residences and 2 million commercial customers. Tracking the movement of customers and scheduling the delivery of service are major tasks for BellSouth that require intensive manpower and sophisticated information management techniques. Through NASA's Commercial Remote Sensing Program Office, BellSouth is investigating the utility of remote sensing and geographic information system techniques to forecast residential development. This paper highlights the initial results of this project, which indicate a high correlation between the U.S. Bureau of Census block group statistics and statistics derived from remote sensing data.
NASA Astrophysics Data System (ADS)
Morgenthaler, George W.; Nuñez, German R.; Botello, Aaron M.; Soto, Jose; Shrairman, Ruth; Landau, Alexander
1998-01-01
Many reaction time experiments have been conducted over the years to observe human responses. However, most of the experiments that were performed did not have quantitatively accurate instruments for measuring change in reaction time under stress. There is a great need for quantitative instruments to measure neuromuscular reaction responses under stressful conditions such as distraction, disorientation, disease, alcohol, drugs, etc. The two instruments used in the experiments reported in this paper are such devices. Their accuracy, portability, ease of use, and biometric character are what makes them very special. PACE™ is a software model used to measure reaction time. VeriFax's Impairoscope measures the deterioration of neuromuscular responses. During the 1997 Summer Semester, various reaction time experiments were conducted on University of Colorado faculty, staff, and students using the PACE™ system. The tests included both two-eye and one-eye unstressed trials and trials with various stresses such as fatigue, distractions in which subjects were asked to perform simple arithmetic during the PACE™ tests, and stress due to rotating-chair dizziness. Various VeriFax Impairoscope tests, both stressed and unstressed, were conducted to determine the Impairoscope's ability to quantitatively measure this impairment. In the 1997 Fall Semester, a Phase II effort was undertaken to increase test sample sizes in order to provide statistical precision and stability. More sophisticated statistical methods remain to be applied to better interpret the data.
Chin, Kok-Yong; Low, Nie Yen; Kamaruddin, Alia Annessa Ain; Dewiputri, Wan Ilma; Soelaiman, Ima-Nirwana
2017-01-01
Background Calcaneal quantitative ultrasound (QUS) is a useful tool in osteoporosis screening. However, a QUS device may not be available at all primary health care settings. The osteoporosis self-assessment tool for Asians (OSTA) is a simple algorithm for osteoporosis screening that does not require any sophisticated instruments. This study explored the possibility of replacing QUS with OSTA by determining their agreement in identifying individuals at risk of osteoporosis. Methods A cross-sectional study was conducted to recruit Malaysian men and women aged ≥50 years. Their bone health status was measured using a calcaneal QUS device and OSTA. The association between OSTA and QUS was determined using Spearman’s correlation, and their agreement was assessed using Cohen’s kappa and the receiver operating characteristic (ROC) curve. Results All QUS indices correlated significantly with OSTA (p<0.05). The agreement between QUS and OSTA was minimal but statistically significant (p<0.05). The performance of OSTA in identifying subjects at risk of osteoporosis according to QUS was poor-to-fair in women (p<0.05), but not statistically significant for men (p>0.05). Changing the cut-off values improved the performance of OSTA in women but not in men. Conclusion The agreement between QUS and OSTA is minimal in categorizing individuals at risk of osteoporosis. Therefore, they cannot be used interchangeably in osteoporosis screening. PMID:29070951
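As a technical aside (not part of the study), the agreement analysis described above can be sketched in a few lines of Python; the data, cut-offs and variable names below are hypothetical stand-ins, not the study's dataset.

# Agreement between two osteoporosis screening tools: a minimal sketch (synthetic data).
import numpy as np
from scipy.stats import spearmanr
from sklearn.metrics import cohen_kappa_score, roc_auc_score

rng = np.random.default_rng(0)
qus_index = rng.normal(-1.0, 1.5, size=200)            # stand-in for a calcaneal QUS index
osta_score = 0.6 * qus_index + rng.normal(0, 1, 200)   # correlated but imperfect OSTA index

rho, p = spearmanr(osta_score, qus_index)

qus_at_risk = (qus_index <= -1.8).astype(int)    # "at risk" by QUS (illustrative cut-off)
osta_at_risk = (osta_score <= -1.0).astype(int)  # "at risk" by OSTA (illustrative cut-off)

kappa = cohen_kappa_score(qus_at_risk, osta_at_risk)
auc = roc_auc_score(qus_at_risk, -osta_score)    # lower OSTA score taken to mean higher risk

print(f"Spearman rho={rho:.2f} (p={p:.3g}), kappa={kappa:.2f}, AUC={auc:.2f}")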
Optimal spectral tracking--adapting to dynamic regime change.
Brittain, John-Stuart; Halliday, David M
2011-01-30
Real world data do not always obey the statistical restraints imposed upon them by sophisticated analysis techniques. In spectral analysis for instance, an ergodic process--the interchangeability of temporal for spatial averaging--is assumed for a repeat-trial design. Many evolutionary scenarios, such as learning and motor consolidation, do not conform to such linear behaviour and should be approached from a more flexible perspective. To this end we previously introduced the method of optimal spectral tracking (OST) in the study of trial-varying parameters. In this extension to our work we modify the OST routines to provide an adaptive implementation capable of reacting to dynamic transitions in the underlying system state. In so doing, we generalise our approach to characterise both slow-varying and rapid fluctuations in time-series, simultaneously providing a metric of system stability. The approach is first applied to a surrogate dataset and compared to both our original non-adaptive solution and spectrogram approaches. The adaptive OST is seen to display fast convergence and desirable statistical properties. All three approaches are then applied to a neurophysiological recording obtained during a study on anaesthetic monitoring. Local field potentials acquired from the posterior hypothalamic region of a deep brain stimulation patient undergoing anaesthesia were analysed. The characterisation of features such as response delay, time-to-peak and modulation brevity are considered. Copyright © 2010 Elsevier B.V. All rights reserved.
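The adaptive OST routines themselves are not reproduced here; as a baseline of the kind the abstract compares against, the following hedged Python sketch computes a conventional sliding-window spectrogram of a signal with an abrupt regime change (all parameters are assumed for illustration).

# Baseline spectrogram of a nonstationary signal (not the authors' OST method).
import numpy as np
from scipy.signal import spectrogram

fs = 250.0                                    # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)
freq = np.where(t < 30, 8.0, 12.0)            # 8 Hz oscillation switching to 12 Hz at t = 30 s
x = np.sin(2 * np.pi * freq * t) + 0.5 * np.random.randn(t.size)

f, tt, Sxx = spectrogram(x, fs=fs, nperseg=512, noverlap=384)
peak_freq = f[Sxx.argmax(axis=0)]             # dominant frequency in each time window
print(peak_freq[:5], peak_freq[-5:])          # tracks the 8 Hz -> 12 Hz transition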
Peripheral Optics with Bifocal Soft and Corneal Reshaping Contact Lenses
Ticak, Anita; Walline, Jeffrey J.
2012-01-01
Purpose To determine whether bifocal soft contact lenses with a distance center design provide myopic defocus to the peripheral retina similar to corneal reshaping contact lenses. Methods Myopic subjects underwent five cycloplegic autorefraction readings centrally and at 10, 20, and 30 degrees temporally, nasally, superiorly, inferiorly while wearing a Proclear Multifocal “D” contact lens with a +2.00 D add (CooperVision, Fairport, NY) and after wearing a Corneal Refractive Therapy (Paragon Vision Sciences, Mesa, AZ) contact lens for two weeks Results Fourteen subjects completed the study. Nine (64%) were female, and 12 (86%) were Caucasian. The average (± standard deviation) spherical equivalent non-cycloplegic manifest refraction for the right eye was −2.84 ± 1.29 D. The average logMAR best-corrected, binocular high contrast visual acuity was −0.17 ± 0.15 while wearing the bifocal soft contact lens, and −0.09 ± 0.16 following corneal reshaping contact lens wear (ANOVA, p = 0.27). The orthokeratology contact lens yielded a more myopic peripheral optical profile than the soft bifocal contact lens at 20 and 30 degrees eccentricity (except inferior at 20 degrees); the two modalities were similar at 10 degrees eccentricity. Conclusions Our data suggest that the two modalities are dissimilar despite the statistical similarities. The corneal reshaping contact lens shows an increase in relative peripheral myopic refraction, a pattern achieved by other studies, but the bifocal lens does not exhibit such a pattern. The low statistical power of the study could be a reason for a lack of providing statistical difference in other positions of gaze, but the graphical representation of the data shows a marked difference in peripheral optical profile between the two modalities. More sophisticated methods of measuring the peripheral optical profile may be necessary to accurately compare the two modalities and to determine the true optical effect of the bifocal soft contact lens on the peripheral retina. PMID:23222924
Assessing the performance of sewer rehabilitation on the reduction of infiltration and inflow.
Staufer, P; Scheidegger, A; Rieckermann, J
2012-10-15
Inflow and Infiltration (I/I) into sewer systems is generally unwanted, because, among other things, it decreases the performance of wastewater treatment plants and increases combined sewage overflows. As sewer rehabilitation to reduce I/I is very expensive, water managers not only need methods to accurately measure I/I, but also they need sound approaches to assess the actual performance of implemented rehabilitation measures. However, such performance assessment is rarely performed. On the one hand, it is challenging to adequately take into account the variability of influential factors, such as hydro-meteorological conditions. On the other hand, it is currently not clear how experimental data can indeed support robust evidence for reduced I/I. In this paper, we therefore statistically assess the performance of rehabilitation measures to reduce I/I. This is possible by using observations in a suitable reference catchment as a control group and assessing the significance of the observed effect by regression analysis, which is well established in other disciplines. We successfully demonstrate the usefulness of the approach in a case study, where rehabilitation reduced groundwater infiltration by 23.9%. A reduction of stormwater inflow of 35.7%, however, was not statistically significant. Investigations into the experimental design of monitoring campaigns confirmed that the variability of the data as well as the number of observations collected before the rehabilitation impact the detection limit of the effect. This implies that it is difficult to improve the data quality after the rehabilitation has been implemented. Therefore, future practical applications should consider a careful experimental design. Further developments could employ more sophisticated monitoring methods, such as stable environmental isotopes, to directly observe the individual infiltration components. In addition, water managers should develop strategies to effectively communicate statistically not significant I/I reduction ratios to decision makers. Copyright © 2012 Elsevier Ltd. All rights reserved.
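The statistical idea described above, comparing before/after changes in a rehabilitated catchment against a reference catchment, resembles a difference-in-differences regression. A hedged Python sketch with entirely hypothetical data (not the study's model or measurements):

# Before/after regression with a reference catchment as control (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "rehabilitated": np.repeat([0, 1], n // 2),     # 0 = reference, 1 = rehabilitated catchment
    "post": np.tile(np.repeat([0, 1], n // 4), 2),  # 0 = before, 1 = after the measure
})
baseline = 100 + 20 * df["rehabilitated"]           # assumed baseline infiltration levels
effect = -24 * df["rehabilitated"] * df["post"]     # assumed reduction in the treated catchment
df["infiltration"] = baseline + effect + rng.normal(0, 8, n)

fit = smf.ols("infiltration ~ rehabilitated * post", data=df).fit()
print(fit.summary().tables[1])   # the interaction term estimates the rehabilitation effect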
Data Analysis and Data Mining: Current Issues in Biomedical Informatics
Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.
2011-01-01
Summary Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine a symposium was organized, that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916
The use of isotope ratios (13C/12C) for vegetable oils authentication
NASA Astrophysics Data System (ADS)
Cristea, G.; Magdas, D. A.; Mirel, V.
2012-02-01
Stable isotopes are now increasingly used for the control of the geographical origin or authenticity of food products. Falsification may be more or less sophisticated, and its sophistication, as well as its cost, increases with the improvement of analytical methods. In this study, 22 vegetable oils (olive, sunflower, palm, maize) commercialized on the Romanian market were investigated by means of δ13C in bulk oil, and the obtained results were compared with those reported in the literature in order to check the labeling of these natural products. The obtained results were in the range of the mean values found in the literature for these types of oils, thus confirming their accurate labeling.
Control technology for future aircraft propulsion systems
NASA Technical Reports Server (NTRS)
Zeller, J. R.; Szuch, J. R.; Merrill, W. C.; Lehtinen, B.; Soeder, J. F.
1984-01-01
The need for more sophisticated engine control systems is discussed. Improvements in thrust-to-weight ratio demand the manipulation of more control inputs, and new technological solutions to the engine control problem are being pursued. The digital electronic engine control (DEEC) system is a step in this evolution. Technology issues are addressed to ensure a growth in confidence in sophisticated electronic controls for aircraft turbine engines. The need for a control system architecture that permits propulsion controls to be functionally integrated with other aircraft systems is established. Areas of technology studied include: (1) control design methodology; (2) improved modeling and simulation methods; and (3) implementation technologies. Objectives, results and future thrusts are summarized.
Automatic initialization and quality control of large-scale cardiac MRI segmentations.
Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F
2018-01-01
Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. The results obtained based on over 1200 cases from the Cardiac Atlas Project show the promise of fully automatic initialization and quality control for population studies. Copyright © 2017 Elsevier B.V. All rights reserved.
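The initialization step described above, regressing from image features to landmark positions with random forests, can be sketched generically as follows; the features here are random stand-ins rather than real MRI descriptors, so this is illustrative only, not the paper's pipeline.

# Random-forest regression from image features to landmark coordinates (sketch only).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 64))                  # stand-in for per-volume image features
W = rng.normal(size=(64, 6))
y = X @ W + rng.normal(0, 0.1, size=(1000, 6))   # stand-in for three landmark (x, y) positions

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
reg = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = reg.predict(X_te)                         # predicted landmark positions for initialization
print("mean absolute error:", np.abs(pred - y_te).mean())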
Scheid, Anika; Nebel, Markus E
2012-07-09
Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst-case - without sacrificing much of the accuracy of the results. Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms.
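The disturbance experiment described above, injecting absolute or relative errors into the sampling probabilities, can be illustrated with a small Python sketch; the probability vector below is a generic stand-in, not the SCFG-derived distribution.

# Perturbing sampling probabilities with absolute vs. relative errors (illustrative).
import numpy as np

rng = np.random.default_rng(3)
p = rng.dirichlet(np.ones(50))                 # stand-in for the exact sampling distribution

def perturb(p, eps, mode):
    if mode == "relative":
        q = p * (1 + rng.uniform(-eps, eps, p.size))
    else:                                      # "absolute"
        q = p + rng.uniform(-eps, eps, p.size)
    q = np.clip(q, 1e-12, None)
    return q / q.sum()                         # renormalize to a valid distribution

for mode in ("relative", "absolute"):
    q = perturb(p, eps=0.2, mode=mode)
    tv = 0.5 * np.abs(p - q).sum()             # total-variation distance from the exact p
    print(mode, f"TV distance = {tv:.3f}")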
Factorization method of quadratic template
NASA Astrophysics Data System (ADS)
Kotyrba, Martin
2017-07-01
Multiplication of two numbers is regarded as a one-way function in mathematics: any attempt to decompose the product back into its factors is called factorization. There are many methods, such as Fermat's factorization, Dixon's method, the quadratic sieve, and the general number field sieve (GNFS), which use sophisticated techniques for fast factorization. All of the above methods use the same basic formula, differing only in how it is applied. This article discusses a newly designed factorization method. Efficient implementation of the method in software is not the aim here; the article only presents the method and clearly defines its properties.
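The newly proposed quadratic-template method is not reproduced here; for context, a short Python implementation of Fermat's factorization, one of the classical methods the abstract cites:

# Fermat's factorization: write an odd composite n as a^2 - b^2 = (a - b)(a + b).
from math import isqrt

def fermat_factor(n):
    assert n > 1 and n % 2 == 1
    a = isqrt(n)
    if a * a < n:
        a += 1
    while True:
        b2 = a * a - n
        b = isqrt(b2)
        if b * b == b2:
            return a - b, a + b    # the two (possibly composite) factors
        a += 1

print(fermat_factor(5959))         # -> (59, 101)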
EXPOSURE ASSESSMENT IN THE NATIONAL CHILDREN'S STUDY-INTRODUCTION
The science of exposure assessment is relatively new and evolving rapidly with the advancement of sophisticated methods for specific measurements at the picogram per gram level or lower in a variety of environmental and biologic matrices. Without this measurement capability, envi...
Handbook of automated data collection methods for the National Transit Database
DOT National Transportation Integrated Search
2003-10-01
In recent years, with the increasing sophistication and capabilities of information processing technologies, there has been a renewed interest on the part of transit systems to tap the rich information potential of the National Transit Database (NTD)...
ERIC Educational Resources Information Center
Moraes, Edgar P.; da Silva, Nilbert S. A.; de Morais, Camilo de L. M.; das Neves, Luiz S.; de Lima, Kassio M. G.
2014-01-01
The flame test is a classical analytical method that is often used to teach students how to identify specific metals. However, some universities in developing countries have difficulties acquiring the sophisticated instrumentation needed to demonstrate how to identify and quantify metals. In this context, a method was developed based on the flame…
Theory of Mind: Did Evolution Fool Us?
Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean
2014-01-01
Theory of Mind (ToM) is the ability to attribute mental states (e.g., beliefs and desires) to other people in order to understand and predict their behaviour. If others are rewarded to compete or cooperate with you, then what they will do depends upon what they believe about you. This is the reason why social interaction induces recursive ToM, of the sort “I think that you think that I think, etc.”. Critically, recursion is the common notion behind the definition of sophistication of human language, strategic thinking in games, and, arguably, ToM. Although sophisticated ToM is believed to have high adaptive fitness, broad experimental evidence from behavioural economics, experimental psychology and linguistics point towards limited recursivity in representing other’s beliefs. In this work, we test whether such apparent limitation may not in fact be proven to be adaptive, i.e. optimal in an evolutionary sense. First, we propose a meta-Bayesian approach that can predict the behaviour of ToM sophistication phenotypes who engage in social interactions. Second, we measure their adaptive fitness using evolutionary game theory. Our main contribution is to show that one does not have to appeal to biological costs to explain our limited ToM sophistication. In fact, the evolutionary cost/benefit ratio of ToM sophistication is non trivial. This is partly because an informational cost prevents highly sophisticated ToM phenotypes to fully exploit less sophisticated ones (in a competitive context). In addition, cooperation surprisingly favours lower levels of ToM sophistication. Taken together, these quantitative corollaries of the “social Bayesian brain” hypothesis provide an evolutionary account for both the limitation of ToM sophistication in humans as well as the persistence of low ToM sophistication levels. PMID:24505296
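The meta-Bayesian ToM models of the paper are not reproduced here; as a generic illustration of the evolutionary game theory step (tracking the frequencies of competing phenotypes given a payoff matrix), a minimal replicator-dynamics sketch in Python with hypothetical payoffs:

# Replicator dynamics for competing phenotypes (generic illustration, hypothetical payoffs).
import numpy as np

# payoff[i, j]: payoff of phenotype i when interacting with phenotype j (assumed values)
payoff = np.array([[2.0, 1.0, 0.5],
                   [1.5, 1.5, 1.0],
                   [1.0, 1.2, 1.3]])
x = np.full(3, 1 / 3)                  # initial population shares of the three phenotypes

for _ in range(2000):
    fitness = payoff @ x               # expected payoff of each phenotype
    x = x * fitness / (x @ fitness)    # discrete-time replicator update
print(np.round(x, 3))                  # long-run phenotype frequencies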
Statistical fluctuations of an ocean surface inferred from shoes and ships
NASA Astrophysics Data System (ADS)
Lerche, Ian; Maubeuge, Frédéric
1995-12-01
This paper shows that it is possible to roughly estimate some ocean properties using simple time-dependent statistical models of ocean fluctuations. Based on a real incident, the loss of a container of Nike shoes from a vessel in the North Pacific Ocean, a statistical model was tested on data sets consisting of the Nike shoes found by beachcombers a few months later. This statistical treatment of the shoes' motion allows one to infer velocity trends of the Pacific Ocean, together with their fluctuation strengths. The idea is to suppose that there is a mean bulk flow speed that can depend on location on the ocean surface and time. The fluctuations of the surface flow speed are then treated as statistically random. The distribution of shoes is described in space and time using Markov probability processes related to the mean and fluctuating ocean properties. The aim of the exercise is to provide some of the properties of the Pacific Ocean that are otherwise calculated using a sophisticated numerical model, OSCURS, which requires extensive input data. Relevant quantities are sharply estimated, which can be useful to (1) constrain output results from OSCURS computations, and (2) elucidate the behavior patterns of ocean flow characteristics on long time scales.
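The statistical model described above, a mean surface current plus random fluctuations driving a Markov process, can be sketched as a random walk with drift; every parameter value below is hypothetical.

# Mean drift plus random fluctuations for floating debris (hypothetical parameters).
import numpy as np

rng = np.random.default_rng(4)
n_shoes, n_days = 1000, 240
u_mean = np.array([8.0, -2.0])       # mean surface current in km/day (assumed)
sigma = 15.0                         # fluctuation strength in km/day (assumed)

pos = np.zeros((n_shoes, 2))
for _ in range(n_days):
    pos += u_mean + rng.normal(0, sigma, size=(n_shoes, 2))   # daily Markov update

print("mean displacement (km):", pos.mean(axis=0))
print("spread (std, km):", pos.std(axis=0))   # grows roughly as sigma * sqrt(days)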
Uncertainty in Vs30-based site response
Thompson, Eric M.; Wald, David J.
2016-01-01
Methods that account for site response range in complexity from simple linear categorical adjustment factors to sophisticated nonlinear constitutive models. Seismic‐hazard analysis usually relies on ground‐motion prediction equations (GMPEs); within this framework site response is modeled statistically with simplified site parameters that include the time‐averaged shear‐wave velocity to 30 m (VS30) and basin depth parameters. Because VS30 is not known in most locations, it must be interpolated or inferred through secondary information such as geology or topography. In this article, we analyze a subset of stations for which VS30 has been measured to address effects of VS30 proxies on the uncertainty in the ground motions as modeled by GMPEs. The stations we analyze also include multiple recordings, which allow us to compute the repeatable site effects (or empirical amplification factors [EAFs]) from the ground motions. Although all methods exhibit similar bias, the proxy methods only reduce the ground‐motion standard deviations at long periods when compared to GMPEs without a site term, whereas measured VS30 values reduce the standard deviations at all periods. The standard deviation of the ground motions are much lower when the EAFs are used, indicating that future refinements of the site term in GMPEs have the potential to substantially reduce the overall uncertainty in the prediction of ground motions by GMPEs.
Short-range quantitative precipitation forecasting using Deep Learning approaches
NASA Astrophysics Data System (ADS)
Akbari Asanjan, A.; Yang, T.; Gao, X.; Hsu, K. L.; Sorooshian, S.
2017-12-01
Predicting short-range quantitative precipitation is very important for flood forecasting, early flood warning and other hydrometeorological purposes. This study aims to improve the precipitation forecasting skills using a recently developed and advanced machine learning technique named Long Short-Term Memory (LSTM). The proposed LSTM learns the changing patterns of clouds from Cloud-Top Brightness Temperature (CTBT) images, retrieved from the infrared channel of Geostationary Operational Environmental Satellite (GOES), using a sophisticated and effective learning method. After learning the dynamics of clouds, the LSTM model predicts the upcoming rainy CTBT events. The proposed model is then merged with a precipitation estimation algorithm termed Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) to provide precipitation forecasts. The results of merged LSTM with PERSIANN are compared to the results of an Elman-type Recurrent Neural Network (RNN) merged with PERSIANN and Final Analysis of Global Forecast System model over the states of Oklahoma, Florida and Oregon. The performance of each model is investigated during 3 storm events each located over one of the study regions. The results indicate the outperformance of merged LSTM forecasts comparing to the numerical and statistical baselines in terms of Probability of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), RMSE and correlation coefficient especially in convective systems. The proposed method shows superior capabilities in short-term forecasting over compared methods.
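For reference, the categorical verification scores cited above (POD, FAR, CSI) follow directly from a binary forecast/observation contingency table; the masks in this Python sketch are synthetic stand-ins, not GOES or PERSIANN fields.

# POD, FAR and CSI from binary forecast/observation masks (illustrative).
import numpy as np

rng = np.random.default_rng(5)
obs = rng.random((100, 100)) < 0.15               # observed rain mask (stand-in)
fcst = obs ^ (rng.random((100, 100)) < 0.05)      # forecast mask with some errors (stand-in)

hits = np.sum(fcst & obs)
misses = np.sum(~fcst & obs)
false_alarms = np.sum(fcst & ~obs)

pod = hits / (hits + misses)                      # probability of detection
far = false_alarms / (hits + false_alarms)        # false alarm ratio
csi = hits / (hits + misses + false_alarms)       # critical success index
print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")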
Rajasingh, Sheeja; Isai, Dona Greta; Samanta, Saheli; Zhou, Zhi-Gang; Dawn, Buddhadeb; Kinsey, William H; Czirok, Andras; Rajasingh, Johnson
2018-04-05
Induced pluripotent stem cell (iPSC)-based cardiac regenerative medicine requires the efficient generation, structural soundness and proper functioning of mature cardiomyocytes, derived from the patient's somatic cells. The most important functional property of cardiomyocytes is the ability to contract. Currently available methods routinely used to test and quantify cardiomyocyte function involve techniques that are labor-intensive, invasive, require sophisticated instruments or can adversely affect cell vitality. We recently developed an optical flow imaging method that analyses and quantifies cardiomyocyte contractile kinetics from video microscopic recordings without compromising cell quality. Specifically, our automated particle image velocimetry (PIV) analysis of phase-contrast video images captured at a high frame rate yields statistical measures characterizing the beating frequency, amplitude, average waveform and beat-to-beat variations. Thus, it can be a powerful assessment tool to monitor cardiomyocyte quality and maturity. Here we demonstrate the ability of our analysis to characterize the chronotropic responses of human iPSC-derived cardiomyocytes to a panel of ion channel modulators and also to doxorubicin, a chemotherapy agent with known cardiotoxic side effects. We conclude that the PIV-derived beat patterns can identify the elongation or shortening of specific phases in the contractility cycle, and the obtained chronotropic responses are in accord with known clinical outcomes. Hence, this system can serve as a powerful tool to screen new and currently available pharmacological compounds for cardiotoxic effects.
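The full PIV analysis is not reproduced here; as a much simpler stand-in for the same idea, extracting a beating signal and its frequency from video, the Python sketch below treats the mean absolute frame-to-frame intensity difference as a motion signal (the signal itself is synthesized, and the frame rate is assumed).

# Crude beating-frequency estimate from a motion signal (simplified stand-in, not the PIV method).
import numpy as np
from scipy.signal import find_peaks

fps = 30.0                                           # video frame rate (assumed)
rng = np.random.default_rng(6)
t = np.arange(0, 20, 1 / fps)
pulses = np.maximum(np.sin(2 * np.pi * 1.2 * t), 0) ** 2   # contraction pulses at ~1.2 Hz
motion = pulses + 0.05 * rng.random(t.size)          # stand-in for mean |frame_k+1 - frame_k|

peaks, _ = find_peaks(motion, height=0.5, distance=int(0.4 * fps))
beat_freq = len(peaks) / t[-1]                       # beats per second
print(f"estimated beating frequency: {beat_freq:.2f} Hz")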
Interrupted time-series analysis: studying trends in neurosurgery.
Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K
2015-12-01
OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
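A minimal segmented-regression sketch of the ITSA idea, estimating a level change and a slope change at the intervention point, is given below in Python; the monthly data and the month-24 intervention are hypothetical.

# Interrupted time-series (segmented regression) sketch with synthetic monthly data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
months = np.arange(48)
intervention = (months >= 24).astype(int)          # policy change at month 24 (assumed)
time_since = np.where(intervention == 1, months - 24, 0)

outcome = 50 + 0.3 * months - 6 * intervention - 0.4 * time_since + rng.normal(0, 2, 48)
df = pd.DataFrame({"y": outcome, "t": months, "x": intervention, "ts": time_since})

fit = smf.ols("y ~ t + x + ts", data=df).fit()     # x: level change, ts: slope change
print(fit.params)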
ERIC Educational Resources Information Center
Gibson, Walker
1993-01-01
Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)
NASA Astrophysics Data System (ADS)
Gloster, Jonathan; Diep, Michael; Dredden, David; Mix, Matthew; Olsen, Mark; Price, Brian; Steil, Betty
2014-06-01
Small-to-medium sized businesses lack resources to deploy and manage high-end advanced solutions to deter sophisticated threats from well-funded adversaries, but evidence shows that these types of businesses are becoming key targets. As malicious code and network attacks become more sophisticated, classic signature-based virus and malware detection methods are less effective. To augment the current malware methods of detection, we developed a proactive approach to detect emerging malware threats using open source tools and intelligence to discover patterns and behaviors of malicious attacks and adversaries. Technical and analytical skills are combined to track adversarial behavior, methods and techniques. We established a controlled (separated domain) network to identify, monitor, and track malware behavior to increase understanding of the methods and techniques used by cyber adversaries. We created a suite of tools that observe the network and system performance looking for anomalies that may be caused by malware. The toolset collects information from open-source tools and provides meaningful indicators that the system was under or has been attacked. When malware is discovered, we analyzed and reverse engineered it to determine how it could be detected and prevented. Results have shown that with minimum resources, cost effective capabilities can be developed to detect abnormal behavior that may indicate malicious software.
Discrete choice experiments of pharmacy services: a systematic review.
Vass, Caroline; Gray, Ewan; Payne, Katherine
2016-06-01
Background Two previous systematic reviews have summarised the application of discrete choice experiments to value preferences for pharmacy services. These reviews identified a total of twelve studies and described how discrete choice experiments have been used to value pharmacy services but did not describe or discuss the application of methods used in the design or analysis. Aims (1) To update the most recent systematic review and critically appraise current discrete choice experiments of pharmacy services in line with published reporting criteria and; (2) To provide an overview of key methodological developments in the design and analysis of discrete choice experiments. Methods The review used a comprehensive strategy to identify eligible studies (published between 1990 and 2015) by searching electronic databases for key terms related to discrete choice and best-worst scaling (BWS) experiments. All healthcare choice experiments were then hand-searched for key terms relating to pharmacy. Data were extracted using a published checklist. Results A total of 17 discrete choice experiments eliciting preferences for pharmacy services were identified for inclusion in the review. No BWS studies were identified. The studies elicited preferences from a variety of populations (pharmacists, patients, students) for a range of pharmacy services. Most studies were from a United Kingdom setting, although examples from Europe, Australia and North America were also identified. Discrete choice experiments for pharmacy services tended to include more attributes than non-pharmacy choice experiments. Few studies reported the use of qualitative research methods in the design and interpretation of the experiments (n = 9) or use of new methods of analysis to identify and quantify preference and scale heterogeneity (n = 4). No studies reported the use of Bayesian methods in their experimental design. Conclusion Incorporating more sophisticated methods in the design of pharmacy-related discrete choice experiments could help researchers produce more efficient experiments which are better suited to valuing complex pharmacy services. Pharmacy-related discrete choice experiments could also benefit from more sophisticated analytical techniques such as investigations into scale and preference heterogeneity. Employing these sophisticated methods for both design and analysis could extend the usefulness of discrete choice experiments to inform health and pharmacy policy.
Jiang, Rui; Yang, Hua; Zhou, Linqi; Kuo, C.-C. Jay; Sun, Fengzhu; Chen, Ting
2007-01-01
The increasing demand for the identification of genetic variation responsible for common diseases has translated into a need for sophisticated methods for effectively prioritizing mutations occurring in disease-associated genetic regions. In this article, we prioritize candidate nonsynonymous single-nucleotide polymorphisms (nsSNPs) through a bioinformatics approach that takes advantages of a set of improved numeric features derived from protein-sequence information and a new statistical learning model called “multiple selection rule voting” (MSRV). The sequence-based features can maximize the scope of applications of our approach, and the MSRV model can capture subtle characteristics of individual mutations. Systematic validation of the approach demonstrates that this approach is capable of prioritizing causal mutations for both simple monogenic diseases and complex polygenic diseases. Further studies of familial Alzheimer diseases and diabetes show that the approach can enrich mutations underlying these polygenic diseases among the top of candidate mutations. Application of this approach to unclassified mutations suggests that there are 10 suspicious mutations likely to cause diseases, and there is strong support for this in the literature. PMID:17668383
Machine learning of frustrated classical spin models. I. Principal component analysis
NASA Astrophysics Data System (ADS)
Wang, Ce; Zhai, Hui
2017-10-01
This work aims at determining whether artificial intelligence can recognize a phase transition without prior human knowledge. If this were successful, it could be applied to, for instance, analyzing data from the quantum simulation of unsolved physical models. Toward this goal, we first need to apply the machine learning algorithm to well-understood models and see whether the outputs are consistent with our prior knowledge, which serves as the benchmark for this approach. In this work, we feed the computer data generated by the classical Monte Carlo simulation for the XY model on frustrated triangular and Union Jack lattices, which has two order parameters and exhibits two phase transitions. We show that the outputs of the principal component analysis agree very well with our understanding of different orders in different phases, and the temperature dependences of the major components detect the nature and the locations of the phase transitions. Our work offers promise for using machine learning techniques to study sophisticated statistical models, and our results can be further improved by using principal component analysis with kernel tricks and the neural network method.
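The analysis step described above, principal component analysis of Monte Carlo spin configurations, can be sketched generically in Python; the configurations below are random stand-ins with two artificial "phases", not samples of the frustrated XY model.

# PCA of spin configurations (generic sketch; random stand-in data, not XY-model samples).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
n_samples, n_sites = 2000, 400

# Stand-in "configurations": an ordered-like group with net magnetization and a disordered group.
ordered = np.sign(rng.normal(0.8, 1.0, size=(n_samples // 2, n_sites)))
disordered = np.sign(rng.normal(0.0, 1.0, size=(n_samples // 2, n_sites)))
X = np.vstack([ordered, disordered])

pca = PCA(n_components=2).fit(X)
proj = pca.transform(X)                      # the leading component separates the two groups
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 3))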
A Naive Bayes machine learning approach to risk prediction using censored, time-to-event data.
Wolfson, Julian; Bandyopadhyay, Sunayan; Elidrisi, Mohamed; Vazquez-Benitez, Gabriela; Vock, David M; Musgrove, Donald; Adomavicius, Gediminas; Johnson, Paul E; O'Connor, Patrick J
2015-09-20
Predicting an individual's risk of experiencing a future clinical outcome is a statistical task with important consequences for both practicing clinicians and public health experts. Modern observational databases such as electronic health records provide an alternative to the longitudinal cohort studies traditionally used to construct risk models, bringing with them both opportunities and challenges. Large sample sizes and detailed covariate histories enable the use of sophisticated machine learning techniques to uncover complex associations and interactions, but observational databases are often 'messy', with high levels of missing data and incomplete patient follow-up. In this paper, we propose an adaptation of the well-known Naive Bayes machine learning approach to time-to-event outcomes subject to censoring. We compare the predictive performance of our method with the Cox proportional hazards model which is commonly used for risk prediction in healthcare populations, and illustrate its application to prediction of cardiovascular risk using an electronic health record dataset from a large Midwest integrated healthcare system. Copyright © 2015 John Wiley & Sons, Ltd.
Theoretical Characterization of Visual Signatures
NASA Astrophysics Data System (ADS)
Kashinski, D. O.; Chase, G. M.; di Nallo, O. E.; Scales, A. N.; Vanderley, D. L.; Byrd, E. F. C.
2015-05-01
We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled vibrational frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). A full statistical analysis and reliability assessment of computational results is currently underway. A comparison of theoretical results to experimental values found in the literature is used to assess any effects of functional choice and basis set on calculation accuracy. The status of this work will be presented at the conference. Work supported by the ARL, DoD HPCMP, and USMA.
Teaching ``The Physics of Energy'' at MIT
NASA Astrophysics Data System (ADS)
Jaffe, Robert
2009-05-01
New physics courses on energy are popping up at colleges and universities across the country. Many require little or no previous physics background, aiming to introduce a broad audience to this complex and critical problem, often augmenting the scientific message with economic and policy discussions. Others are advanced courses, focussing on highly specialized subjects like solar voltaics, nuclear physics, or thermal fluids, for example. About two years ago Washington Taylor and I undertook to develop a course on the ``Physics of Energy'' open to all MIT students who had taken MIT's common core of university level calculus, physics, and chemistry. By avoiding higher level prerequisites, we aimed to attract and make the subject relevant to students in the life sciences, economics, etc. --- as well as physical scientists and engineers --- who want to approach energy issues in a sophisticated and analytical fashion, exploiting their background in calculus, mechanics, and E & M, but without having to take advanced courses in thermodynamics, quantum mechanics, or nuclear physics beforehand. Our object was to interweave teaching the fundamental physics principles at the foundations of energy science with the applications of those principles to energy systems. We envisioned a course that would present the basics of statistical, quantum, and fluid mechanics at a fairly sophisticated level and apply those concepts to the study of energy sources, conversion, transport, losses, storage, conservation, and end use. In the end we developed almost all of the material for the course from scratch. The course debuted this past fall. I will describe what we learned and what general lessons our experience might have for others who contemplate teaching energy physics broadly to a technically sophisticated audience.
Data breach locations, types, and associated characteristics among US hospitals.
Gabriel, Meghan Hufstader; Noblin, Alice; Rutherford, Ashley; Walden, Amanda; Cortelyou-Ward, Kendall
2018-02-01
The objectives of this study were to describe the locations in hospitals where data are breached, the types of breaches that occur most often at hospitals, and hospital characteristics, including health information technology (IT) sophistication and biometric security capabilities, that may be predicting factors of large data breaches that affect 500 or more patients. The Office of Civil Rights breach data from healthcare providers regarding breaches that affected 500 or more individuals from 2009 to 2016 were linked with hospital characteristics from the Health Information Management Systems Society and the American Hospital Association Health IT Supplement databases. Descriptive statistics were used to characterize hospitals with and without breaches, data breach type, and location/mode of data breaches in hospitals. Multivariate logistic regression analysis explored hospital characteristics that were predicting factors of a data breach affecting at least 500 patients, including area characteristics, region, health system membership, size, type, biometric security use, health IT sophistication, and ownership. Of all types of healthcare providers, hospitals accounted for approximately one-third of all data breaches and hospital breaches affected the largest number of individuals. Paper and films were the most frequent location of breached data, occurring in 65 hospitals during the study period, whereas network servers were the least common location but their breaches affected the most patients overall. Adjusted multivariate results showed significant associations among data breach occurrences and some hospital characteristics, including type and size, but not others, including health IT sophistication or biometric use for security. Hospitals should conduct routine audits to allow them to see their vulnerabilities before a breach occurs. Additionally, information security systems should be implemented concurrently with health information technologies. Improving access control and prioritizing patient privacy will be important steps in minimizing future breaches.
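The multivariate logistic regression step described above can be sketched as follows; the covariates, effect sizes and data are hypothetical placeholders, not the linked breach and hospital databases.

# Logistic regression of breach occurrence on hospital characteristics (illustrative).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
n = 2000
df = pd.DataFrame({
    "large": (rng.random(n) < 0.4).astype(int),      # large hospital (placeholder covariate)
    "teaching": (rng.random(n) < 0.3).astype(int),   # teaching hospital (placeholder covariate)
    "biometric": (rng.random(n) < 0.5).astype(int),  # biometric security in use (placeholder)
})
logit_p = -3 + 0.9 * df["large"] + 0.5 * df["teaching"]          # assumed true effects
df["breach"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

fit = smf.logit("breach ~ large + teaching + biometric", data=df).fit(disp=0)
print(np.exp(fit.params))                            # odds ratios for each characteristic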
Four Educators in Plato's "Theaetetus"
ERIC Educational Resources Information Center
Mintz, Avi I.
2011-01-01
Scholars who have taken interest in "Theaetetus'" educational theme argue that Plato contrasts an inferior, even dangerous, sophistic education to a superior, philosophical, Socratic education. I explore the contrasting exhortations, methods, ideals and epistemological foundations of Socratic and Protagorean education and suggest that Socrates'…
Conceptualizing Effectiveness in Disability Research
ERIC Educational Resources Information Center
de Bruin, Catriona L.
2017-01-01
Policies promoting evidence-based practice in education typically endorse evaluations of the effectiveness of teaching strategies through specific experimental research designs and methods. A number of researchers have critiqued this approach to evaluation as narrow and called for greater methodological sophistication. This paper discusses the…
CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.
Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro
2017-03-30
Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over the initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations, and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community freely available at http://CombiROC.eu.
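CombiROC itself is a web tool; the core idea of scanning marker combinations against sensitivity/specificity filters can be sketched in Python as below, using a toy decision rule (a sample is called positive if all markers in the combination exceed their thresholds) and synthetic data.

# Scanning marker combinations for sensitivity/specificity (toy sketch, synthetic data).
import numpy as np
from itertools import combinations

rng = np.random.default_rng(10)
n, n_markers = 300, 5
y = (rng.random(n) < 0.5).astype(int)                      # 1 = diseased, 0 = healthy
X = rng.normal(0, 1, size=(n, n_markers)) + 0.8 * y[:, None] * rng.random(n_markers)

thresholds = X.mean(axis=0)                                # illustrative per-marker cut-offs
for k in (2, 3):
    for combo in combinations(range(n_markers), k):
        idx = list(combo)
        pred = (X[:, idx] > thresholds[idx]).all(axis=1).astype(int)
        sens = (pred[y == 1] == 1).mean()
        spec = (pred[y == 0] == 0).mean()
        if sens > 0.6 and spec > 0.6:                      # simple sensitivity/specificity filter
            print(combo, f"sens={sens:.2f} spec={spec:.2f}")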
Cornuet, Jean-Marie; Santos, Filipe; Beaumont, Mark A.; Robert, Christian P.; Marin, Jean-Michel; Balding, David J.; Guillemaud, Thomas; Estoup, Arnaud
2008-01-01
Summary: Genetic data obtained on population samples convey information about their evolutionary history. Inference methods can extract part of this information but they require sophisticated statistical techniques that have been made available to the biologist community (through computer programs) only for simple and standard situations typically involving a small number of samples. We propose here a computer program (DIY ABC) for inference based on approximate Bayesian computation (ABC), in which scenarios can be customized by the user to fit many complex situations involving any number of populations and samples. Such scenarios involve any combination of population divergences, admixtures and population size changes. DIY ABC can be used to compare competing scenarios, estimate parameters for one or more scenarios and compute bias and precision measures for a given scenario and known values of parameters (the current version applies to unlinked microsatellite data). This article describes key methods used in the program and provides its main features. The analysis of one simulated and one real dataset, both with complex evolutionary scenarios, illustrates the main possibilities of DIY ABC. Availability: The software DIY ABC is freely available at http://www.montpellier.inra.fr/CBGP/diyabc. Contact: j.cornuet@imperial.ac.uk Supplementary information: Supplementary data are also available at http://www.montpellier.inra.fr/CBGP/diyabc PMID:18842597
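DIY ABC itself is not reproduced here; the generic rejection-sampling ABC algorithm it builds on can be sketched in Python: draw parameters from the prior, simulate data, and keep the draws whose summary statistics fall close to the observed ones. The toy generative model and tolerance below are assumptions for illustration.

# Rejection-sampling ABC for a single parameter (generic sketch, toy model).
import numpy as np

rng = np.random.default_rng(11)
obs_summary = 12.0                               # observed summary statistic (stand-in)

def simulate_summary(theta, rng):
    # Toy generative model standing in for a population-genetic simulator.
    data = rng.poisson(theta, size=50)
    return data.mean()

prior_draws = rng.uniform(1, 30, size=100_000)   # prior on the parameter
accepted = [th for th in prior_draws
            if abs(simulate_summary(th, rng) - obs_summary) < 0.5]   # tolerance = 0.5

posterior = np.array(accepted)
print(f"accepted {posterior.size} draws; posterior mean ~ {posterior.mean():.2f}")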
NASA Astrophysics Data System (ADS)
Craciunescu, Teddy; Peluso, Emmanuele; Murari, Andrea; Gelfusa, Michela; JET Contributors
2018-05-01
The total emission of radiation is a crucial quantity to calculate the power balances and to understand the physics of any Tokamak. Bolometric systems are the main tool to measure this important physical quantity through quite sophisticated tomographic inversion methods. On the Joint European Torus, the coverage of the bolometric diagnostic, due to the availability of basically only two projection angles, is quite limited, rendering the inversion a very ill-posed mathematical problem. A new approach, based on the maximum likelihood, has therefore been developed and implemented to alleviate one of the major weaknesses of traditional tomographic techniques: the difficulty to determine routinely the confidence intervals in the results. The method has been validated by numerical simulations with phantoms to assess the quality of the results and to optimise the configuration of the parameters for the main types of emissivity encountered experimentally. The typical levels of statistical errors, which may significantly influence the quality of the reconstructions, have been identified. The systematic tests with phantoms indicate that the errors in the reconstructions are quite limited and their effect on the total radiated power remains well below 10%. A comparison with other approaches to the inversion and to the regularization has also been performed.
Sawamura, Jitsuki; Morishita, Shigeru; Ishigooka, Jun
2016-01-01
We previously presented a group theoretical model that describes psychiatric patient states or clinical data in a graded vector-like format based on modulo groups. Meanwhile, the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5, the current version), is frequently used for diagnosis in daily psychiatric treatments and biological research. The diagnostic criteria of DSM-5 contain simple binominal items relating to the presence or absence of specific symptoms. In spite of its simple form, the practical structure of the DSM-5 system is not sufficiently systemized for data to be treated in a more rationally sophisticated way. To view the disease states in terms of symmetry in the manner of abstract algebra is considered important for the future systematization of clinical medicine. We provide a simple idea for the practical treatment of the psychiatric diagnosis/score of DSM-5 using depressive symptoms in line with our previously proposed method. An expression is given employing modulo-2 and -7 arithmetic (in particular, additive group theory) for Criterion A of a 'major depressive episode' that must be met for the diagnosis of 'major depressive disorder' in DSM-5. For this purpose, the novel concept of an imaginary value 0 that can be recognized as an explicit 0 or implicit 0 was introduced to compose the model. The zeros allow the incorporation or deletion of an item between any other symptoms if they are ordered appropriately. Optionally, a vector-like expression can be used to rate/select only specific items when modifying the criterion/scale. Simple examples are illustrated concretely. Further development of the proposed method for the criteria/scale of a disease is expected to raise the level of formalism of clinical medicine to that of other fields of natural science.
The discovery and development of HIV therapy: the new challenges.
Perno, Carlo Federico
2011-01-01
The therapy of HIV infection has improved dramatically over the years and has allowed the achievement of unexpected results. The availability of many drugs, and the knowledge of HIV-related pathogenesis, have helped in selecting highly effective antiviral therapies, yet a major challenge remains: the selection of the best regimen(s) in clinical practice. In this frame, evidence-based medicine remains a cornerstone of modern medicine, but its structure needs to be adapted to the new challenges, posed by an excess of information (not always fully reliable), by highly sophisticated statistical systems that may overlook clinical practice despite their ability to establish statistical significance, and by the limited number of independent controlled studies. The revision of the criteria of evidence-based medicine, and their adaptation to the new tools available, may allow a better contribution to the definition of the best therapy for each single patient.
An Overview of R in Health Decision Sciences.
Jalal, Hawre; Pechlivanoglou, Petros; Krijkamp, Eline; Alarid-Escudero, Fernando; Enns, Eva; Hunink, M G Myriam
2017-10-01
As the complexity of health decision science applications increases, high-level programming languages are increasingly adopted for statistical analyses and numerical computations. These programming languages facilitate sophisticated modeling, model documentation, and analysis reproducibility. Among the high-level programming languages, the statistical programming framework R is gaining increased recognition. R is freely available, cross-platform compatible, and open source. A large community of users who have generated an extensive collection of well-documented packages and functions supports it. These functions facilitate applications of health decision science methodology as well as the visualization and communication of results. Although R's popularity is increasing among health decision scientists, methodological extensions of R in the field of decision analysis remain isolated. The purpose of this article is to provide an overview of existing R functionality that is applicable to the various stages of decision analysis, including model design, input parameter estimation, and analysis of model outputs.
Stationarity: Wanted dead or alive?
Lins, H.F.; Cohn, T.A.
2011-01-01
Aligning engineering practice with natural process behavior would appear, on its face, to be a prudent and reasonable course of action. However, if we do not understand the long-term characteristics of hydroclimatic processes, how does one find the prudent and reasonable course needed for water management? We consider this question in light of three aspects of existing and unresolved issues affecting hydroclimatic variability and statistical inference: Hurst-Kolmogorov phenomena; the complications long-term persistence introduces with respect to statistical understanding; and the dependence of process understanding on arbitrary sampling choices. These problems are not easily addressed. In such circumstances, humility may be more important than physics; a simple model with well-understood flaws may be preferable to a sophisticated model whose correspondence to reality is uncertain. © 2011 American Water Resources Association. This article is a U.S. Government work and is in the public domain in the USA.
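As a side illustration of the Hurst-Kolmogorov (long-term persistence) behaviour discussed here, not code from the article, the sketch below estimates a Hurst coefficient with the simple aggregated-variance method; white noise should come out near 0.5, while strongly persistent series drift toward 1.

```python
# Aggregated-variance estimate of the Hurst coefficient (illustrative only).
import numpy as np

def hurst_aggvar(x, scales=(1, 2, 4, 8, 16, 32)):
    """Estimate H from the variance of block-aggregated means: var ~ m^(2H - 2)."""
    x = np.asarray(x, dtype=float)
    log_m, log_v = [], []
    for m in scales:
        n_blocks = len(x) // m
        means = x[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        log_m.append(np.log(m))
        log_v.append(np.log(means.var()))
    slope = np.polyfit(log_m, log_v, 1)[0]
    return 1.0 + slope / 2.0

rng = np.random.default_rng(2)
print("white noise H ~", round(hurst_aggvar(rng.normal(size=4096)), 2))
```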
On the simulation of indistinguishable fermions in the many-body Wigner formalism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sellier, J.M., E-mail: jeanmichel.sellier@gmail.com; Dimov, I.
2015-01-01
The simulation of quantum systems consisting of interacting, indistinguishable fermions is an incredible mathematical problem which poses formidable numerical challenges. Many sophisticated methods addressing this problem are available which are based on the many-body Schrödinger formalism. Recently a Monte Carlo technique for the resolution of the many-body Wigner equation has been introduced and successfully applied to the simulation of distinguishable, spinless particles. This numerical approach presents several advantages over other methods. Indeed, it is based on an intuitive formalism in which quantum systems are described in terms of a quasi-distribution function, and highly scalable due to its Monte Carlo nature. In this work, we extend the many-body Wigner Monte Carlo method to the simulation of indistinguishable fermions. To this end, we first show how fermions are incorporated into the Wigner formalism. Then we demonstrate that the Pauli exclusion principle is intrinsic to the formalism. As a matter of fact, a numerical simulation of two strongly interacting fermions (electrons) is performed which clearly shows the appearance of a Fermi (or exchange–correlation) hole in the phase-space, a clear signature of the presence of the Pauli principle. To conclude, we simulate 4, 8 and 16 non-interacting fermions, isolated in a closed box, and show that, as the number of fermions increases, we gradually recover the Fermi–Dirac statistics, a clear proof of the reliability of our proposed method for the treatment of indistinguishable particles.
Mechanics of Composite Materials: Past, Present and Future
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1984-01-01
Composite mechanics disciplines are presented and described at their various levels of sophistication and attendant scales of application. Correlation with experimental data is used as the prime discriminator between alternative methods and level of sophistication. Major emphasis is placed on: (1) where composite mechanics has been; (2) what it has accomplished; (3) where it is headed, based on present research activities; and (4) at the risk of being presumptuous, where it should be headed. The discussion is developed using selected, but typical examples of each composite mechanics discipline identifying degree of success, with respect to correlation with experimental data, and problems remaining. The discussion is centered about fiber/resin composites drawn mainly from the author's research activities/experience spanning two decades at Lewis.
Mechanics of composite materials - Past, present and future
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
1989-01-01
Composite mechanics disciplines are presented and described at their various levels of sophistication and attendant scales of application. Correlation with experimental data is used as the prime discriminator between alternative methods and level of sophistication. Major emphasis is placed on: (1) where composite mechanics has been; (2) what it has accomplished; (3) where it is headed, based on present research activities; and (4) at the risk of being presumptuous, where it should be headed. The discussion is developed using selected, but typical examples of each composite mechanics discipline identifying degree of success, with respect to correlation with experimental data, and problems remaining. The discussion is centered about fiber/resin composites drawn mainly from the author's research activities/experience spanning two decades at Lewis.
Jones, Hayley E; Hickman, Matthew; Kasprzyk-Hordern, Barbara; Welton, Nicky J; Baker, David R; Ades, A E
2014-07-15
Concentrations of metabolites of illicit drugs in sewage water can be measured with great accuracy and precision, thanks to the development of sensitive and robust analytical methods. Based on assumptions about factors including the excretion profile of the parent drug, routes of administration and the number of individuals using the wastewater system, the level of consumption of a drug can be estimated from such measured concentrations. When presenting results from these 'back-calculations', the multiple sources of uncertainty are often discussed, but are not usually explicitly taken into account in the estimation process. In this paper we demonstrate how these calculations can be placed in a more formal statistical framework by assuming a distribution for each parameter involved, based on a review of the evidence underpinning it. Using a Monte Carlo simulation approach, it is then straightforward to propagate uncertainty in each parameter through the back-calculations, producing a distribution for daily or average consumption instead of a single estimate. This can be summarised, for example, by a median and credible interval. To demonstrate this approach, we estimate cocaine consumption in a large urban UK population, using measured concentrations of two of its metabolites, benzoylecgonine and norbenzoylecgonine. We also demonstrate a more sophisticated analysis, implemented within a Bayesian statistical framework using Markov chain Monte Carlo simulation. Our model allows the two metabolites to simultaneously inform estimates of daily cocaine consumption and explicitly allows for variability between days. After accounting for this variability, the resulting credible interval for average daily consumption is appropriately wider, representing additional uncertainty. We discuss possibilities for extensions to the model, and whether analysis of wastewater samples has potential to contribute to a prevalence model for illicit drug use. Copyright © 2014. Published by Elsevier B.V.
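The sketch below shows the Monte Carlo propagation idea in miniature, with invented parameter distributions (concentration, flow, excretion fraction, population) rather than the values reviewed in the paper; the point is simply that repeating the back-calculation with sampled inputs yields a distribution of consumption that can be summarised by a median and interval.

```python
# Monte Carlo propagation of uncertainty through a wastewater back-calculation.
# All parameter distributions below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_sim = 100_000

conc_ng_l  = rng.normal(800.0, 60.0, n_sim)     # metabolite concentration (ng/L), assumed
flow_l_day = rng.normal(2.5e8, 2.0e7, n_sim)    # wastewater flow (L/day), assumed
excreted   = rng.uniform(0.30, 0.40, n_sim)     # fraction of dose excreted as metabolite, assumed
population = rng.normal(9.0e5, 5.0e4, n_sim)    # people served by the works, assumed
mw_ratio   = 303.35 / 289.33                    # cocaine / benzoylecgonine molar mass ratio

dose_mg_day = conc_ng_l * flow_l_day * 1e-6 * mw_ratio / excreted   # total mg/day
per_1000    = dose_mg_day / (population / 1000.0)                   # mg/day/1000 inhabitants

lo, med, hi = np.percentile(per_1000, [2.5, 50, 97.5])
print(f"median {med:.0f} mg/day/1000 people (95% interval {lo:.0f}-{hi:.0f})")
```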
A New Paradigm to Analyze Data Completeness of Patient Data
Nasir, Ayan; Liu, Xinliang
2016-01-01
Background: There is a need to develop a tool that will measure data completeness of patient records using sophisticated statistical metrics. Patient data integrity is important in providing timely and appropriate care. Completeness is an important step, with an emphasis on understanding the complex relationships between data fields and their relative importance in delivering care. This tool will not only help understand where data problems are but also help uncover the underlying issues behind them. Objectives: Develop a tool that can be used alongside a variety of health care database software packages to determine the completeness of individual patient records as well as aggregate patient records across health care centers and subpopulations. Methods: The methodology of this project is encapsulated within the Data Completeness Analysis Package (DCAP) tool, with the major components including concept mapping, CSV parsing, and statistical analysis. Results: The results from testing DCAP with Healthcare Cost and Utilization Project (HCUP) State Inpatient Database (SID) data show that this tool is successful in identifying relative data completeness at the patient, subpopulation, and database levels. These results also solidify a need for further analysis and call for hypothesis driven research to find underlying causes for data incompleteness. Conclusion: DCAP examines patient records and generates statistics that can be used to determine the completeness of individual patient data as well as the general thoroughness of record keeping in a medical database. DCAP uses a component that is customized to the settings of the software package used for storing patient data as well as a Comma Separated Values (CSV) file parser to determine the appropriate measurements. DCAP itself is assessed through a proof of concept exercise using hypothetical data as well as available HCUP SID patient data. PMID:27484918
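A sketch of the kind of completeness statistic such a tool reports, assuming a simple tabular export and a hypothetical weighting of fields by clinical importance; DCAP's concept-mapping and software-specific components are not reproduced.

```python
# Weighted per-record and aggregate completeness from a hypothetical patient table.
import pandas as pd

# Hypothetical field weights: higher = more important for care delivery.
WEIGHTS = {"patient_id": 1.0, "dob": 1.0, "diagnosis": 0.9,
           "medications": 0.8, "allergies": 0.7, "phone": 0.3}

def completeness(df: pd.DataFrame) -> pd.Series:
    """Weighted completeness score per patient record (0 to 1)."""
    present = df[list(WEIGHTS)].notna()
    w = pd.Series(WEIGHTS)
    return (present * w).sum(axis=1) / w.sum()

records = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "dob": ["1970-01-01", None, "1985-06-30"],
    "diagnosis": ["I10", "E11", None],
    "medications": [None, "metformin", "lisinopril"],
    "allergies": ["none", None, None],
    "phone": [None, None, "555-0100"],
})

scores = completeness(records)
print(scores.round(2).tolist())     # per-record completeness
print(round(scores.mean(), 2))      # aggregate (database-level) completeness
```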
On the substance of a sophisticated epistemology
NASA Astrophysics Data System (ADS)
Elby, Andrew; Hammer, David
2001-09-01
Among researchers who study students' epistemologies, a consensus has emerged about what constitutes a sophisticated stance toward scientific knowledge. According to this community consensus, students should understand scientific knowledge as tentative and evolving, rather than certain and unchanging; subjectively tied to scientists' perspectives, rather than objectively inherent in nature; and individually or socially constructed, rather than discovered. Surveys, interview protocols, and other methods used to probe students' beliefs about scientific knowledge broadly reflect this outlook. This article questions the community consensus about epistemological sophistication. We do not suggest that scientific knowledge is objective and fixed; if forced to choose whether knowledge is certain or tentative, with no opportunity to elaborate, we would choose tentative. Instead, our critique consists of two lines of argument. First, the literature fails to distinguish between the correctness and productivity of an epistemological belief. For instance, elementary school students who believe that science is about discovering objective truths to questions, such as whether the earth is round or flat, or whether an asteroid led to the extinction of the dinosaurs, may be more likely to succeed in science than students who believe science is about telling stories that vary with one's perspective. Naïve realism, although incorrect (according to a broad consensus of philosophers and social scientists), may nonetheless be productive for helping those students learn. Second, according to the consensus view as reflected in commonly used surveys, epistemological sophistication consists of believing certain blanket generalizations about the nature of knowledge and learning, generalizations that do not attend to context. These generalizations are neither correct nor productive. For example, it would be unsophisticated for students to view as tentative the idea that the earth is round rather than flat. By contrast, they should take a more tentative stance toward theories of mass extinction. Nonetheless, many surveys and interview protocols tally students as sophisticated not for attending to these contextual nuances, but for subscribing broadly to the view that knowledge is tentative.
Instrumental Surveillance of Water Quality.
ERIC Educational Resources Information Center
Miller, J. A.; And Others
The role analytical instrumentation performs in the surveillance and control of the quality of water resources is reviewed. Commonly performed analyses may range from simple tests for physical parameters to more highly sophisticated radiological or spectrophotometric methods. This publication explores many of these types of water quality analyses…
Censorship: Tactics for Defense.
ERIC Educational Resources Information Center
Lowery, Skip
1998-01-01
Book banners are generally successful because they have a wide network of support, including national coalitions with sophisticated organizational methods--such as electing certain people to school boards. School officials should get organized and devise defensive strategies, such as inviting critics to class, asking what they would like to…
USDA-ARS's Scientific Manuscript database
For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...
Heterogeneous Structure of Stem Cells Dynamics: Statistical Models and Quantitative Predictions
Bogdan, Paul; Deasy, Bridget M.; Gharaibeh, Burhan; Roehrs, Timo; Marculescu, Radu
2014-01-01
Understanding stem cell (SC) population dynamics is essential for developing models that can be used in basic science and medicine, to aid in predicting cell fate. These models can be used as tools, e.g. in studying patho-physiological events at the cellular and tissue level, predicting (mal)functions along the developmental course, and personalized regenerative medicine. Using time-lapsed imaging and statistical tools, we show that the dynamics of SC populations involve a heterogeneous structure consisting of multiple sub-population behaviors. Using non-Gaussian statistical approaches, we identify the co-existence of fast and slow dividing subpopulations, and quiescent cells, in stem cells from three species. The mathematical analysis also shows that, instead of developing independently, SCs exhibit a time-dependent fractal behavior as they interact with each other through molecular and tactile signals. These findings suggest that more sophisticated models of SC dynamics should view SC populations as a collective and avoid the simplifying homogeneity assumption by accounting for the presence of more than one dividing sub-population, and their multi-fractal characteristics. PMID:24769917
A sophisticated simulation for the fracture behavior of concrete material using XFEM
NASA Astrophysics Data System (ADS)
Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili
2017-10-01
The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.
Measurement precision of the anaerobic threshold by means of a portable calorimeter.
Nogueira, Fernando dos Santos; Pompeu, Fernando Augusto Monteiro Sabóia
2010-09-01
Many methods are used for determining the Anaerobic Threshold (AT) by means of sophisticated ergospirometers. The aim was to test the variation in the AT, detected by mathematical models and visual inspection, when a low-cost ergospirometer intended for clinical application is used. Seventy-nine apparently healthy subjects, 57 of them men, volunteered for this study. The VO₂(max) and the ventilatory threshold were determined by indirect, open-circuit calorimetry. The electro-enzymatic method was used for analyzing the lactacidemia and for direct determination of the Lactate Threshold (LT). The AT was determined by two mathematical methods (MM(RSS) and MM(slope)) based on gas exchange, and the LT was determined by the log-log visual method. Two independent investigators determined the AT through visual inspection of three graphs, considering two methods (AT₋(a) = V-slope, EqV; and AT₋(b) = V-slope, EqV and ExCO₂). The data were analyzed by means of parametric statistics to determine the differences between AT₋(a) versus ExCO₂, MM(RSS) and MM(slope); AT₋(b) versus MM(RSS) and MM(slope); and LT versus AT₋(a), AT₋(b), MM(RSS) and MM(slope). The MM(slope) was the only method that presented a significant difference between AT₋(a) and AT₋(b) (p=0.001), with CV% >15. LT versus MM(slope) did not present a significant difference (p=0.274); however, a high CV (24%) was observed. It was concluded that with the low-cost equipment, the MM(RSS) and AT₋(a) methods can be used for determining the AT. The MM(slope) method did not present satisfactory precision to be employed with this equipment.
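To make the gas-exchange approach concrete, the sketch below finds a V-slope-style breakpoint by fitting two straight lines to simulated VCO2 versus VO2 data and minimising the combined residual sum of squares; the study's MM(RSS) and MM(slope) implementations differ in their details.

```python
# Toy V-slope breakpoint detection by two-segment least squares.
import numpy as np

rng = np.random.default_rng(4)
vo2 = np.linspace(1.0, 3.5, 60)                                     # L/min
vco2 = np.where(vo2 < 2.2, 0.9 * vo2, 0.9 * 2.2 + 1.4 * (vo2 - 2.2))
vco2 += rng.normal(0, 0.03, vo2.size)                               # measurement noise

def vslope_breakpoint(x, y, min_pts=5):
    best_bp, best_rss = None, np.inf
    for i in range(min_pts, len(x) - min_pts):
        rss = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            coef = np.polyfit(xs, ys, 1)
            rss += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if rss < best_rss:
            best_bp, best_rss = x[i], rss
    return best_bp

print("estimated anaerobic threshold (VO2, L/min):",
      round(vslope_breakpoint(vo2, vco2), 2))
```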
A Novel Field Deployable Point-of-Care Diagnostic Test for Cutaneous Leishmaniasis
2015-10-01
include localized cutaneous leishmaniasis (LCL), and destructive nasal and oropharyngeal lesions of mucosal leishmaniasis (ML). LCL in the New World...the high costs, personnel training and need of sophisticated equipment. Therefore, novel methods to detect leishmaniasis at the POC are urgently needed...To date, there is no field-standardized molecular method based on DNA amplification coupled with Lateral Flow reading to detect leishmaniasis
Turning up the heat on aircraft structures. [design and analysis for high-temperature conditions
NASA Technical Reports Server (NTRS)
Dobyns, Alan; Saff, Charles; Johns, Robert
1992-01-01
An overview is presented of the current effort in design and development of aircraft structures to achieve the lowest cost for best performance. Enhancements in this area are focused on integrated design, improved design analysis tools, low-cost fabrication techniques, and more sophisticated test methods. 3D CAD/CAM data are becoming the method through which design, manufacturing, and engineering communicate.
Pérez Aparicio, Jesús; Toledano Medina, M Angeles; Lafuente Rosales, Victoria
2007-07-09
Free-choice profile (FCP), developed in the 1980s, is a sensory analysis method that can be carried out by untrained panels. The participants need only to be able to use a scale and be consumers of the product under evaluation. The data are analysed by sophisticated statistical methodologies like Generalized Procrustean Analysis (GPA) or STATIS. To facilitate a wider use of the free-choice profiling procedure, different authors have advocated simpler methods based on principal components analysis (PCA) of merged data sets. The purpose of this work was to apply another easy procedure to this type of data by means of a robust PCA. The most important characteristic of the proposed method is that managers responsible for quality could use this methodology without any scale evaluation. Only the free terms generated by the assessors are necessary to apply the script, thus avoiding the error associated with scale utilization by inexpert assessors. Also, it is possible to use the application with missing data and with differences in the assessors' attendance at sessions. An example was performed to generate the descriptors from different orange juice types. The results were compared with the STATIS method and with the PCA on the merged data sets. The samples evaluated were fresh orange juices with differences in storage days and pasteurized, concentrated and orange nectar drinks from different brands. Eighteen assessors with a low-level training program were used in a six-session free-choice profile framework. The results proved that this script could be of use in marketing decisions and product quality program development.
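A minimal sketch of the "PCA on merged data sets" route mentioned above: per-assessor term counts are concatenated column-wise and a plain PCA (via SVD) maps the samples. The juice samples and terms are invented, and the robust PCA variant the authors actually apply is not reproduced.

```python
# PCA of a merged assessor-by-term table (illustrative data).
import numpy as np

samples = ["fresh", "pasteurised", "concentrate", "nectar"]
# columns: assessor 1 terms (sweet, bitter), assessor 2 terms (fruity, cooked, thick)
merged = np.array([
    [3, 0, 4, 0, 1],
    [2, 1, 2, 2, 1],
    [1, 2, 1, 3, 2],
    [1, 3, 1, 3, 3],
], dtype=float)

X = merged - merged.mean(axis=0)              # column-centre the merged table
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * s                                # sample coordinates on the components

explained = s**2 / np.sum(s**2)
for name, (pc1, pc2) in zip(samples, scores[:, :2]):
    print(f"{name:12s} PC1={pc1:6.2f}  PC2={pc2:6.2f}")
print("variance explained by first two PCs:", explained[:2].round(2))
```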
The Use of Electronic Data Capture Tools in Clinical Trials: Web-Survey of 259 Canadian Trials
Jonker, Elizabeth; Sampson, Margaret; Krleža-Jerić, Karmela; Neisa, Angelica
2009-01-01
Background: Electronic data capture (EDC) tools provide automated support for data collection, reporting, query resolution, randomization, and validation, among other features, for clinical trials. There is a trend toward greater adoption of EDC tools in clinical trials, but there is also uncertainty about how many trials are actually using this technology in practice. A systematic review of EDC adoption surveys conducted up to 2007 concluded that only 20% of trials are using EDC systems, but previous surveys had weaknesses. Objectives: Our primary objective was to estimate the proportion of phase II/III/IV Canadian clinical trials that used an EDC system in 2006 and 2007. The secondary objectives were to investigate the factors that can have an impact on adoption and to develop a scale to assess the extent of sophistication of EDC systems. Methods: We conducted a Web survey to estimate the proportion of trials that were using an EDC system. The survey was sent to the Canadian site coordinators for 331 trials. We also developed and validated a scale using Guttman scaling to assess the extent of sophistication of EDC systems. Trials using EDC were compared by the level of sophistication of their systems. Results: We had a 78.2% response rate (259/331) for the survey. It is estimated that 41% (95% CI 37.5%-44%) of clinical trials were using an EDC system. Trials funded by academic institutions, government, and foundations were less likely to use an EDC system compared to those sponsored by industry. Also, larger trials tended to be more likely to adopt EDC. The EDC sophistication scale had six levels and a coefficient of reproducibility of 0.901 (P< .001) and a coefficient of scalability of 0.79. There was no difference in sophistication based on the funding source, but pediatric trials were likely to use a more sophisticated EDC system. Conclusion: The adoption of EDC systems in clinical trials in Canada is higher than the literature indicated: a large proportion of clinical trials in Canada use some form of automated data capture system. To inform future adoption, research should gather stronger evidence on the costs and benefits of using different EDC systems. PMID:19275984
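For readers unfamiliar with Guttman scaling, the sketch below computes the coefficient of reproducibility, one minus the proportion of responses violating the ideal cumulative pattern, on a hypothetical feature-by-trial matrix; the real scale items and data are in the article.

```python
# Guttman coefficient of reproducibility on hypothetical EDC-feature responses.
import numpy as np

# rows = trials, columns = EDC features ordered from most to least commonly adopted
responses = np.array([
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 0],
    [1, 0, 1, 0, 0, 0],   # one deviation from a perfect cumulative pattern
    [1, 1, 1, 1, 0, 0],
])

def coefficient_of_reproducibility(x):
    errors = 0
    for row in x:
        k = row.sum()                          # scale score for this respondent
        ideal = np.r_[np.ones(k), np.zeros(x.shape[1] - k)]
        errors += np.sum(row != ideal)         # deviations from the ideal pattern
    return 1.0 - errors / x.size

print(round(coefficient_of_reproducibility(responses), 3))
```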
A Critical Review of Some Qualitative Research Methods Used to Explore Rater Cognition
ERIC Educational Resources Information Center
Suto, Irenka
2012-01-01
Internationally, many assessment systems rely predominantly on human raters to score examinations. Arguably, this facilitates the assessment of multiple sophisticated educational constructs, strengthening assessment validity. It can introduce subjectivity into the scoring process, however, engendering threats to accuracy. The present objectives…
EDUCATIONAL SPECIFICATIONS FOR SECONDARY SCHOOLS.
ERIC Educational Resources Information Center
FLANIGAN, VIRGINIA; AND OTHERS
The report can be used as a guide in the preparation of educational specifications for secondary schools. New curricula, methods of instruction, and teaching aids add to the sophistication of education. Programs encompass many areas of education, each requiring professional decisions. These decisions must be organized into written specifications…
Environmental Scanning Practices in Junior, Technical, and Community Colleges.
ERIC Educational Resources Information Center
Friedel, Janice N.; Rosenberg, Dana
1993-01-01
Reports results of a 1991 national survey of environmental scanning practices at two-year institutions. Examines sophistication of scanning efforts; personnel involved; and methods of collecting, compiling, interpreting, communicating, and using scan information. Finds scanning practices in use at 41% of the 601 responding institutions. (PAA)
Elokely, Khaled M; Eldawy, Mohamed A; Elkersh, Mohamed A; El-Moselhy, Tarek F
2011-01-01
A simple spectrofluorometric method has been developed, adapted, and validated for the quantitative estimation of drugs containing α-methylene sulfone/sulfonamide functional groups using N(1)-methylnicotinamide chloride (NMNCl) as fluorogenic agent. The proposed method has been applied successfully to the determination of methyl sulfonyl methane (MSM) (1), tinidazole (2), rofecoxib (3), and nimesulide (4) in pure forms, laboratory-prepared mixtures, pharmaceutical dosage forms, spiked human plasma samples, and in volunteer's blood. The method showed linearity over concentration ranging from 1 to 150 μg/mL, 10 to 1000 ng/mL, 1 to 1800 ng/mL, and 30 to 2100 ng/mL for standard solutions of 1, 2, 3, and 4, respectively, and over concentration ranging from 5 to 150 μg/mL, 10 to 1000 ng/mL, 10 to 1700 ng/mL, and 30 to 2350 ng/mL in spiked human plasma samples of 1, 2, 3, and 4, respectively. The method showed good accuracy, specificity, and precision in both laboratory-prepared mixtures and in spiked human plasma samples. The proposed method is simple, does not need sophisticated instruments, and is suitable for quality control application, bioavailability, and bioequivalency studies. Besides, its detection limits are comparable to other sophisticated chromatographic methods.
Jenkins, Clinton N.; Flocks, J.; Kulp, M.
2006-01-01
Information-processing methods are described that integrate the stratigraphic aspects of large and diverse collections of sea-floor sample data. They efficiently convert common types of sea-floor data into database and GIS (geographical information system) tables, visual core logs, stratigraphic fence diagrams and sophisticated stratigraphic statistics. The input data are held in structured documents, essentially written core logs that are particularly efficient to create from raw input datasets. Techniques are described that permit efficient construction of regional databases consisting of hundreds of cores. The sedimentological observations in each core are located by their downhole depths (metres below sea floor - mbsf) and also by a verbal term that describes the sample 'situation' - a special fraction of the sediment or position in the core. The main processing creates a separate output event for each instance of top, bottom and situation, assigning top-base mbsf values from numeric or, where possible, from word-based relative locational information such as 'core catcher' in reference to sampler device, and recovery or penetration length. The processing outputs represent the sub-bottom as a sparse matrix of over 20 sediment properties of interest, such as grain size, porosity and colour. They can be plotted in a range of core-log programs including an in-built facility that better suits the requirements of sea-floor data. Finally, a suite of stratigraphic statistics are computed, including volumetric grades, overburdens, thicknesses and degrees of layering. © The Geological Society of London 2006.
Thomsen, Sarah; Ng, Nawi; Biao, Xu; Bondjers, Göran; Kusnanto, Hari; Liem, Nguyen Tanh; Mavalankar, Dileep; Målqvist, Mats; Diwan, Vinod
2013-01-01
Background: The Millennium Development Goals (MDGs) are monitored using national-level statistics, which have shown substantial improvements in many countries. These statistics may be misleading, however, and may divert resources from disadvantaged populations within the same countries that are showing progress. The purpose of this article is to set out the relevance and design of the “Evidence for Policy and Implementation project (EPI-4)”. EPI-4 aims to contribute to the reduction of inequities in the achievement of health-related MDGs in China, India, Indonesia and Vietnam through the promotion of research-informed policymaking. Methods: Using a framework provided by the Commission on the Social Determinants of Health (CSDH), we compare national-level MDG targets and results, as well as their social and structural determinants, in China, India, Indonesia and Vietnam. Results: To understand country-level MDG achievements it is useful to analyze their social and structural determinants. This analysis is not sufficient, however, to understand within-country inequities. Specialized analyses are required for this purpose, as is discussion and debate of the results with policymakers, which is the aim of the EPI-4 project. Conclusion: Reducing health inequities requires sophisticated analyses to identify disadvantaged populations within and between countries, and to determine evidence-based solutions that will make a difference. The EPI-4 project hopes to contribute to this goal. PMID:23490302
Coming up short on nonfinancial performance measurement.
Ittner, Christopher D; Larcker, David F
2003-11-01
Companies in increasing numbers are measuring customer loyalty, employee satisfaction, and other nonfinancial areas of performance that they believe affect profitability. But they've failed to relate these measures to their strategic goals or establish a connection between activities undertaken and financial outcomes achieved. Failure to make such connections has led many companies to misdirect their investments and reward ineffective managers. Extensive field research now shows that businesses make some common mistakes when choosing, analyzing, and acting on their nonfinancial measures. Among these mistakes: They set the wrong performance targets because they focus too much on short-term financial results, and they use metrics that lack strong statistical validity and reliability. As a result, the companies can't demonstrate that improvements in nonfinancial measures actually affect their financial results. The authors lay out a series of steps that will allow companies to realize the genuine promise of nonfinancial performance measures. First, develop a model that proposes a causal relationship between the chosen nonfinancial drivers of strategic success and specific outcomes. Next, take careful inventory of all the data within your company. Then use established statistical methods for validating the assumed relationships and continue to test the model as market conditions evolve. Finally, base action plans on analysis of your findings, and determine whether those plans and their investments actually produce the desired results. Nonfinancial measures will offer little guidance unless you use a process for choosing and analyzing them that relies on sophisticated quantitative and qualitative inquiries into the factors actually contributing to economic results.
An introduction to real-time graphical techniques for analyzing multivariate data
NASA Astrophysics Data System (ADS)
Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner
1987-08-01
Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".
NASA Astrophysics Data System (ADS)
Mohammadian, E.; Hamidi, H.; Azdarpour, A.
2018-05-01
CO2 sequestration is considered one of the most anticipated methods to mitigate CO2 concentration in the atmosphere. The solubility mechanism is one of the most important and sophisticated mechanisms by which CO2 is rendered immobile while it is being injected into aquifers. A semi-empirical, easy-to-use model was developed to calculate the solubility of CO2 in NaCl brines under thermodynamic conditions (pressure, temperature) and salinity gradients representative of CO2 sequestration in the Malay basin. The model was compared to previous, more sophisticated models and a good consistency was found among the data obtained using the two models. A sensitivity analysis was also conducted on the model to test its performance beyond its limits.
The First Sophists and the Uses of History.
ERIC Educational Resources Information Center
Jarratt, Susan C.
1987-01-01
Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…
Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application
ERIC Educational Resources Information Center
Kyle, Kristopher; Crossley, Scott A.
2015-01-01
This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…
DOT National Transportation Integrated Search
1977-02-01
The limitations of currently used estimation procedures in socio-economic modeling have been highlighted in the ongoing work of Senge, in which it is shown where more sophisticated estimation procedures may become necessary. One such advanced method ...
How To Teach "Dirty" Books in High School.
ERIC Educational Resources Information Center
O'Malley, William J.
1967-01-01
Today's self-centered, utopian attitudes toward sexual experience compel teachers to avoid both overcaution and over-indulgence in selecting controversial books for classroom use. One method of selection is to rank books in a gradual progression from those requiring little literary and sexual sophistication in the reader to those requiring much…
ERIC Educational Resources Information Center
Begeny, John C.; Krouse, Hailey E.; Brown, Kristina G.; Mann, Courtney M.
2011-01-01
Teacher judgments about students' academic abilities are important for instructional decision making and potential special education entitlement decisions. However, the small number of studies evaluating teachers' judgments are limited methodologically (e.g., sample size, procedural sophistication) and have yet to answer important questions…
Isolation by ion-exchange methods. In Sarker S.D. (ed) Natural Products Isolation, 3rd edition
USDA-ARS's Scientific Manuscript database
The primary goal of many natural products chemists is to extract, isolate, and characterize specific analytes from complex plant, animal, microbial, and food matrices. To achieve this goal, they rely considerably on highly sophisticated and highly hyphenated modern instrumentation. Yet, the vast maj...
USDA-ARS's Scientific Manuscript database
As global trade increases, invasive insects inflict increasing economic damage to agriculture and urban landscapes in the United States yearly, despite a sophisticated array of interception methods and quarantine programs designed to exclude their entry. Insects that are hidden inside soil, wood, or...
Detecting Satisficing in Online Surveys
ERIC Educational Resources Information Center
Salifu, Shani
2012-01-01
The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…
Teaching Economic Growth Theory with Data
ERIC Educational Resources Information Center
Elmslie, Bruce T.; Tebaldi, Edinaldo
2010-01-01
Many instructors in subjects such as economics are frequently concerned with how to teach technical material to undergraduate students with limited mathematical backgrounds. One method that has proven successful for the authors is to connect theoretically sophisticated material with actual data. This enables students to see how the theory relates…
Socially Responsible Knowledge and Behaviors: Comparing Upper vs. Lower Classmen
ERIC Educational Resources Information Center
Kozar, Joy M.; Connell, Kim Y. Hiller
2010-01-01
Utilizing a sample of undergraduate students and survey research methods, this study examined knowledge on issues of social responsibility within the apparel and textiles industry, comparing the sophistication among upper- versus lower-classmen. The study also investigated the differences between students in their socially responsible apparel…
Seeking Relevance: American Political Science and America
ERIC Educational Resources Information Center
Maranto, Robert; Woessner, Matthew C.
2012-01-01
In this article, the authors talk about the relevance of American political science and America. Political science has enormous strengths in its highly talented practitioners and sophisticated methods. However, its disconnection from its host society, while not so severe as for fields like English and sociology, nonetheless poses an existential…
NASA Technical Reports Server (NTRS)
Buntine, Wray
1994-01-01
IND computer program introduces Bayesian and Markov/maximum-likelihood (MML) methods and more-sophisticated methods of searching in growing trees. Produces more-accurate class-probability estimates important in applications like diagnosis. Provides range of features and styles with convenience for casual user, fine-tuning for advanced user or for those interested in research. Consists of four basic kinds of routines: data-manipulation, tree-generation, tree-testing, and tree-display. Written in C language.
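IND itself is a C package distributed by NASA; as a loose modern analogue only, the sketch below grows a depth-limited decision tree with scikit-learn and reads off leaf class-probability estimates, the kind of output highlighted above for diagnosis-style applications.

```python
# Illustrative class-probability estimates from a smoothed decision tree.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Limiting depth and requiring several samples per leaf smooths the class-probability
# estimates, loosely in the spirit of the Bayesian/MML tree options described above.
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10, random_state=0)
tree.fit(X_tr, y_tr)

print("test accuracy:", round(tree.score(X_te, y_te), 3))
print("class probabilities for first 3 cases:\n", tree.predict_proba(X_te[:3]).round(2))
```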
DOE Office of Scientific and Technical Information (OSTI.GOV)
Villarreal, Oscar D.; Yu, Lili; Department of Laboratory Medicine, Yancheng Vocational Institute of Health Sciences, Yancheng, Jiangsu 224006
Computing the ligand-protein binding affinity (or the Gibbs free energy) with chemical accuracy has long been a challenge for which many methods/approaches have been developed and refined with various successful applications. False positives and, even more harmful, false negatives have been and still are a common occurrence in practical applications. Inevitable in all approaches are the errors in the force field parameters we obtain from quantum mechanical computation and/or empirical fittings for the intra- and inter-molecular interactions. These errors propagate to the final results of the computed binding affinities even if we were able to perfectly implement the statistical mechanics of all the processes relevant to a given problem. And they are actually amplified to various degrees even in the mature, sophisticated computational approaches. In particular, the free energy perturbation (alchemical) approaches amplify the errors in the force field parameters because they rely on extracting the small differences between similarly large numbers. In this paper, we develop a hybrid steered molecular dynamics (hSMD) approach to the difficult binding problems of a ligand buried deep inside a protein. Sampling the transition along a physical (not alchemical) dissociation path of opening up the binding cavity, pulling out the ligand, and closing back the cavity, we can avoid the problem of error amplifications by not relying on small differences between similar numbers. We tested this new form of hSMD on retinol inside cellular retinol-binding protein 1 and three cases of a ligand (a benzylacetate, a 2-nitrothiophene, and a benzene) inside a T4 lysozyme L99A/M102Q(H) double mutant. In all cases, we obtained binding free energies in close agreement with the experimentally measured values. This indicates that the force field parameters we employed are accurate and that hSMD (a brute force, unsophisticated approach) is free from the problem of error amplification suffered by many sophisticated approaches in the literature.
Progress in Computational Electron-Molecule Collisions
NASA Astrophysics Data System (ADS)
Rescigno, T. N.
1997-10-01
The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.
Statistical genetics concepts and approaches in schizophrenia and related neuropsychiatric research.
Schork, Nicholas J; Greenwood, Tiffany A; Braff, David L
2007-01-01
Statistical genetics is a research field that focuses on mathematical models and statistical inference methodologies that relate genetic variations (ie, naturally occurring human DNA sequence variations or "polymorphisms") to particular traits or diseases (phenotypes) usually from data collected on large samples of families or individuals. The ultimate goal of such analysis is the identification of genes and genetic variations that influence disease susceptibility. Although of extreme interest and importance, the fact that many genes and environmental factors contribute to neuropsychiatric diseases of public health importance (eg, schizophrenia, bipolar disorder, and depression) complicates relevant studies and suggests that very sophisticated mathematical and statistical modeling may be required. In addition, large-scale contemporary human DNA sequencing and related projects, such as the Human Genome Project and the International HapMap Project, as well as the development of high-throughput DNA sequencing and genotyping technologies have provided statistical geneticists with a great deal of very relevant and appropriate information and resources. Unfortunately, the use of these resources and their interpretation are not straightforward when applied to complex, multifactorial diseases such as schizophrenia. In this brief and largely nonmathematical review of the field of statistical genetics, we describe many of the main concepts, definitions, and issues that motivate contemporary research. We also provide a discussion of the most pressing contemporary problems that demand further research if progress is to be made in the identification of genes and genetic variations that predispose to complex neuropsychiatric diseases.
Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.
ERIC Educational Resources Information Center
Allen, James E.
While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…
ERIC Educational Resources Information Center
Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher
2018-01-01
This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…
The Network Structure of Symptoms of the Diagnostic and Statistical Manual of Mental Disorders.
Boschloo, Lynn; van Borkulo, Claudia D; Rhemtulla, Mijke; Keyes, Katherine M; Borsboom, Denny; Schoevers, Robert A
2015-01-01
Although current classification systems have greatly contributed to the reliability of psychiatric diagnoses, they ignore the unique role of individual symptoms and, consequently, potentially important information is lost. The network approach, in contrast, assumes that psychopathology results from the causal interplay between psychiatric symptoms and focuses specifically on these symptoms and their complex associations. By using a sophisticated network analysis technique, this study constructed an empirically based network structure of 120 psychiatric symptoms of twelve major DSM-IV diagnoses using cross-sectional data of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC, second wave; N = 34,653). The resulting network demonstrated that symptoms within the same diagnosis showed differential associations and indicated that the strategy of summing symptoms, as in current classification systems, leads to loss of information. In addition, some symptoms showed strong connections with symptoms of other diagnoses, and these specific symptom pairs, which both concerned overlapping and non-overlapping symptoms, may help to explain the comorbidity across diagnoses. Taken together, our findings indicated that psychopathology is very complex and can be more adequately captured by sophisticated network models than current classification systems. The network approach is, therefore, promising in improving our understanding of psychopathology and moving our field forward.
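The study used a sophisticated regularised network estimation technique; purely as an illustration of the underlying idea (symptoms as nodes, pairwise associations as weighted edges), the sketch below builds a small network of phi coefficients from simulated binary symptom data.

```python
# Toy symptom network from pairwise phi (Pearson) associations of binary symptoms.
import numpy as np

rng = np.random.default_rng(5)
n_subjects, n_symptoms = 2000, 6

# Simulate correlated binary symptoms via a latent factor (purely illustrative).
latent = rng.normal(size=(n_subjects, 1))
loadings = rng.uniform(0.5, 1.5, size=(1, n_symptoms))
symptoms = (latent @ loadings + rng.normal(size=(n_subjects, n_symptoms)) > 0.8).astype(int)

def phi(a, b):
    """Phi association between two binary symptom vectors."""
    return np.corrcoef(a, b)[0, 1]

edges = np.zeros((n_symptoms, n_symptoms))
for i in range(n_symptoms):
    for j in range(i + 1, n_symptoms):
        edges[i, j] = edges[j, i] = phi(symptoms[:, i], symptoms[:, j])

print(np.round(edges, 2))   # weighted adjacency matrix of the symptom network
```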
Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia
2007-10-01
This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.
Bayes Error Rate Estimation Using Classifier Ensembles
NASA Technical Reports Server (NTRS)
Tumer, Kagan; Ghosh, Joydeep
2003-01-01
The Bayes error rate gives a statistical lower bound on the error achievable for a given classification problem and the associated choice of features. By reliably estimating this rate, one can assess the usefulness of the feature set that is being used for classification. Moreover, by comparing the accuracy achieved by a given classifier with the Bayes rate, one can quantify how effective that classifier is. Classical approaches for estimating or finding bounds for the Bayes error, in general, yield rather weak results for small sample sizes; unless the problem has some simple characteristics, such as Gaussian class-conditional likelihoods. This article shows how the outputs of a classifier ensemble can be used to provide reliable and easily obtainable estimates of the Bayes error with negligible extra computation. Three methods of varying sophistication are described. First, we present a framework that estimates the Bayes error when multiple classifiers, each providing an estimate of the a posteriori class probabilities, are combined through averaging. Second, we bolster this approach by adding an information theoretic measure of output correlation to the estimate. Finally, we discuss a more general method that just looks at the class labels indicated by ensemble members and provides error estimates based on the disagreements among classifiers. The methods are illustrated for artificial data, a difficult four-class problem involving underwater acoustic data, and two problems from the Problem benchmarks. For data sets with known Bayes error, the combiner-based methods introduced in this article outperform existing methods. The estimates obtained by the proposed methods also seem quite reliable for the real-life data sets for which the true Bayes rates are unknown.
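A sketch of the first, averaging-based idea under simplifying assumptions: on a one-dimensional two-Gaussian problem whose Bayes error is known analytically, the error of a plug-in classifier built from ensemble-averaged posterior estimates lands close to the Bayes rate. The article's estimators are more refined than this.

```python
# Ensemble-averaged posteriors as a rough Bayes error estimate on a known problem.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 4000
mu0, mu1 = -1.0, 1.0                       # class means, unit-variance Gaussians
X = np.r_[rng.normal(mu0, 1, n), rng.normal(mu1, 1, n)].reshape(-1, 1)
y = np.r_[np.zeros(n), np.ones(n)]

bayes_error = norm.cdf(-(mu1 - mu0) / 2)   # analytic Bayes rate for this problem

# Ensemble of posterior estimators trained on bootstrap resamples.
probs = []
for _ in range(25):
    idx = rng.integers(0, len(y), len(y))
    clf = LogisticRegression().fit(X[idx], y[idx])
    probs.append(clf.predict_proba(X)[:, 1])
avg_posterior = np.mean(probs, axis=0)

ensemble_error = np.mean((avg_posterior > 0.5) != y)
print(f"analytic Bayes error {bayes_error:.3f}  vs  ensemble estimate {ensemble_error:.3f}")
```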
Network modelling methods for FMRI.
Smith, Stephen M; Miller, Karla L; Salimi-Khorshidi, Gholamreza; Webster, Matthew; Beckmann, Christian F; Nichols, Thomas E; Ramsey, Joseph D; Woolrich, Mark W
2011-01-15
There is great interest in estimating brain "networks" from FMRI data. This is often attempted by identifying a set of functional "nodes" (e.g., spatial ROIs or ICA maps) and then conducting a connectivity analysis between the nodes, based on the FMRI timeseries associated with the nodes. Analysis methods range from very simple measures that consider just two nodes at a time (e.g., correlation between two nodes' timeseries) to sophisticated approaches that consider all nodes simultaneously and estimate one global network model (e.g., Bayes net models). Many different methods are being used in the literature, but almost none has been carefully validated or compared for use on FMRI timeseries data. In this work we generate rich, realistic simulated FMRI data for a wide range of underlying networks, experimental protocols and problematic confounds in the data, in order to compare different connectivity estimation approaches. Our results show that in general correlation-based approaches can be quite successful, methods based on higher-order statistics are less sensitive, and lag-based approaches perform very poorly. More specifically: there are several methods that can give high sensitivity to network connection detection on good quality FMRI data, in particular, partial correlation, regularised inverse covariance estimation and several Bayes net methods; however, accurate estimation of connection directionality is more difficult to achieve, though Patel's τ can be reasonably successful. With respect to the various confounds added to the data, the most striking result was that the use of functionally inaccurate ROIs (when defining the network nodes and extracting their associated timeseries) is extremely damaging to network estimation; hence, results derived from inappropriate ROI definition (such as via structural atlases) should be regarded with great caution. Copyright © 2010 Elsevier Inc. All rights reserved.
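As a small illustration of one of the better-performing approaches named above, the sketch below computes a partial-correlation network from the precision (inverse covariance) matrix of simulated node timeseries; real FMRI preprocessing and node definition are, of course, omitted.

```python
# Partial-correlation network from the precision matrix of simulated timeseries.
import numpy as np

rng = np.random.default_rng(7)
n_time, n_nodes = 600, 5

# Simple ground-truth network: node 0 drives nodes 1 and 2; nodes 3 and 4 are isolated.
ts = rng.normal(size=(n_time, n_nodes))
ts[:, 1] += 0.8 * ts[:, 0]
ts[:, 2] += 0.8 * ts[:, 0]

precision = np.linalg.inv(np.cov(ts, rowvar=False))
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)   # rho_ij = -p_ij / sqrt(p_ii * p_jj)
np.fill_diagonal(partial_corr, 1.0)

print(np.round(partial_corr, 2))             # strong 0-1 and 0-2 edges, weak elsewhere
```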
Automated delineation of radiotherapy volumes: are we going in the right direction?
Whitfield, G A; Price, P; Price, G J; Moore, C J
2013-01-01
ABSTRACT. Rapid and accurate delineation of target volumes and multiple organs at risk, within the enduring International Commission on Radiation Units and Measurement framework, is now hugely important in radiotherapy, owing to the rapid proliferation of intensity-modulated radiotherapy and the advent of four-dimensional image-guided adaption. Nevertheless, delineation is still generally clinically performed with little if any machine assistance, even though it is both time-consuming and prone to interobserver variation. Currently available segmentation tools include those based on image greyscale interrogation, statistical shape modelling and body atlas-based methods. However, all too often these are not able to match the accuracy of the expert clinician, which remains the universally acknowledged gold standard. In this article we suggest that current methods are fundamentally limited by their lack of ability to incorporate essential human clinical decision-making into the underlying models. Hybrid techniques that utilise prior knowledge, make sophisticated use of greyscale information and allow clinical expertise to be integrated are needed. This may require a change in focus from automated segmentation to machine-assisted delineation. Similarly, new metrics of image quality reflecting fitness for purpose would be extremely valuable. We conclude that methods need to be developed to take account of the clinician's expertise and honed visual processing capabilities as much as the underlying, clinically meaningful information content of the image data being interrogated. We illustrate our observations and suggestions through our own experiences with two software tools developed as part of research council-funded projects. PMID:23239689
Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.
ERIC Educational Resources Information Center
Blair, Kristine L.
With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…
From Poetry to Prose: Sophistic Rhetoric and the Epistemic Music of Language.
ERIC Educational Resources Information Center
Katz, Steven B.
Much revisionist scholarship has focused on sophistic epistemology and its relationship to the current revival of epistemic rhetoric in the academy. However, few scholars have recognized the sensuous substance of words as sounds, and the role it played in sophistic philosophy and rhetoric. Before the invention of the Greek alphabet, poetry was…
A Bayesian approach to estimate evoked potentials.
Sparacino, Giovanni; Milani, Stefano; Arslan, Edoardo; Cobelli, Claudio
2002-06-01
Several approaches, based on different assumptions and with various degrees of theoretical sophistication and implementation complexity, have been developed for improving the measurement of evoked potentials (EP) performed by conventional averaging (CA). In many of these methods, one of the major challenges is the exploitation of a priori knowledge. In this paper, we present a new method where the 2nd-order statistical information on the background EEG and on the unknown EP, necessary for the optimal filtering of each sweep in a Bayesian estimation framework, is, respectively, estimated from pre-stimulus data and obtained through a multiple integration of a white noise process model. The latter model is flexible (i.e. it can be employed for a large class of EP) and simple enough to be easily identifiable from the post-stimulus data thanks to a smoothing criterion. The mean EP is determined as the weighted average of the filtered sweeps, where each weight is inversely proportional to the expected value of the norm of the corresponding filter error, a quantity determinable thanks to the employment of the Bayesian approach. The performance of the new approach is shown on both simulated and real auditory EP. A signal-to-noise ratio enhancement is obtained that can allow the (possibly automatic) identification of peak latencies and amplitudes with fewer sweeps than those required by CA. For cochlear EP, the method also allows the audiology investigator to gather new and clinically important information. The possibility of handling single-sweep analysis with further development of the method is also addressed.
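As a minimal sketch of the final combination step described above (assuming the per-sweep Bayesian filtering has already been done; the arrays and names below are hypothetical, not taken from the paper's implementation), the weighted averaging can be written as:

import numpy as np

def weighted_mean_ep(filtered_sweeps, expected_error_norms):
    """Average filtered sweeps with weights inversely proportional to the
    expected norm of each sweep's filter error, as stated in the abstract.

    filtered_sweeps      : (n_sweeps, n_samples) single-sweep EP estimates
    expected_error_norms : (n_sweeps,) expected filter-error norm per sweep
    """
    w = 1.0 / np.asarray(expected_error_norms, dtype=float)
    w /= w.sum()                               # normalise the weights
    return w @ np.asarray(filtered_sweeps)     # weighted average over sweeps

# toy usage: 3 hypothetical filtered sweeps of 4 samples each
sweeps = np.array([[1.0, 2.0, 1.0, 0.0],
                   [0.8, 2.2, 1.1, 0.1],
                   [5.0, 5.0, 5.0, 5.0]])      # poor sweep, large error norm
print(weighted_mean_ep(sweeps, expected_error_norms=[1.0, 1.0, 10.0]))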
Exploring Remote Sensing Through The Use Of Readily-Available Classroom Technologies
NASA Astrophysics Data System (ADS)
Rogers, M. A.
2013-12-01
Frontier geoscience research using remotely-sensed satellite observations routinely requires sophisticated and novel remote sensing techniques to succeed. Describing these techniques in an educational format presents significant challenges to the science educator, especially with regard to the professional development setting, where a small but competent audience has limited instructor contact time to develop the necessary understanding. In this presentation, we describe the use of simple and cheaply available technologies, including ultrasonic transducers, FLIR detectors, and even simple web cameras, to provide a tangible analogue to sophisticated remote sensing platforms. We also describe methods of curriculum development that leverage the use of these simple devices to teach the fundamentals of remote sensing, resulting in a deeper and more intuitive understanding of the techniques used in modern remote sensing research. Sample workshop itineraries using these techniques are provided as well.
Missing data exploration: highlighting graphical presentation of missing pattern.
Zhang, Zhongheng
2015-12-01
Functions shipped with R base can fulfill many tasks of missing data handling. However, because the data volume of an electronic medical record (EMR) system is always very large, more sophisticated methods may be helpful in data management. The article focuses on missing data handling by using advanced techniques. There are three types of missing data, that is, missing completely at random (MCAR), missing at random (MAR) and not missing at random (NMAR). This classification system depends on how missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore missing data patterns. In particular, the VIM package is especially helpful for visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missing data on other variables. Such information is useful in subsequent imputations.
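The abstract refers to R packages; purely as a language-agnostic illustration of what exploring a missing-data pattern means, the short pandas sketch below (invented toy data, not an EMR dataset, and not a substitute for MICE or VIM) tabulates the missingness patterns and checks whether missingness in one variable depends on another.

import numpy as np
import pandas as pd

# toy EMR-like table with missing values (invented data)
df = pd.DataFrame({
    "age":     [65, 72, np.nan, 80, 59],
    "lactate": [1.2, np.nan, np.nan, 3.4, 2.0],
    "outcome": [0, 1, 1, np.nan, 0],
})

# one row per missingness pattern, with its frequency;
# a tabular counterpart of the pattern plots produced by VIM/mice
pattern = df.isna().astype(int)
print(pattern.value_counts().rename("n_rows"))

# crude dependence check: is 'lactate' missing more often for older patients?
print(df.groupby(df["lactate"].isna())["age"].mean())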
Mechanical break junctions: enormous information in a nanoscale package.
Natelson, Douglas
2012-04-24
Mechanical break junctions, particularly those in which a metal tip is repeatedly moved in and out of contact with a metal film, have provided many insights into electronic conduction at the atomic and molecular scale, most often by averaging over many possible junction configurations. This averaging throws away a great deal of information, and Makk et al. in this issue of ACS Nano demonstrate that, with both simulated and real experimental data, more sophisticated two-dimensional analysis methods can reveal information otherwise obscured in simple histograms. As additional measured quantities come into play in break junction experiments, including thermopower, noise, and optical response, these more sophisticated analytic approaches are likely to become even more powerful. While break junctions are not directly practical for useful electronic devices, they are incredibly valuable tools for unraveling the electronic transport physics relevant for ultrascaled nanoelectronics.
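For readers unfamiliar with the two-dimensional analysis being advocated, the sketch below (hypothetical trace format and synthetic data; not the analysis code of Makk et al.) builds a conductance-versus-displacement histogram instead of the usual one-dimensional conductance histogram.

import numpy as np

def conductance_displacement_histogram(traces, z_bins=100, g_bins=100):
    """2-D histogram of log-conductance against electrode displacement.

    traces : list of (displacement, conductance) pairs of 1-D arrays,
             one pair per breaking trace (an assumed data layout).
    """
    z_all = np.concatenate([z for z, g in traces])
    logg_all = np.concatenate(
        [np.log10(np.clip(g, 1e-7, None)) for z, g in traces])
    return np.histogram2d(z_all, logg_all, bins=(z_bins, g_bins))

# toy usage: two synthetic exponentially decaying traces
rng = np.random.default_rng(1)
z = np.linspace(0.0, 1.0, 500)
traces = [(z, 10.0 ** (-3.0 * z + 0.1 * rng.standard_normal(z.size)))
          for _ in range(2)]
H, z_edges, g_edges = conductance_displacement_histogram(traces)
print(H.shape)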
Polydiacetylene-Based Liposomes: An "Optical Tongue" for Bacteria Detection and Identification
ERIC Educational Resources Information Center
West, Matthew R.; Hanks, Timothy W.; Watson, Rhett T.
2009-01-01
Food- and water-borne bacteria are a major health concern worldwide. Current detection methods are time-consuming and require sophisticated equipment that is not always readily available. However, new techniques based on nanotechnology are under development that will result in a new generation of sensors. In this experiment, liposomes are…
Cancer Imaging Phenomics Toolkit (CaPTK) | Informatics Technology for Cancer Research (ITCR)
CaPTk is a tool that facilitates the translation of highly sophisticated methods from medical imaging research to the clinic, helping us gain a comprehensive understanding of the underlying mechanisms of cancer. It replicates basic interactive functionalities of radiological workstations and is distributed under a BSD-style license.
Grimaldi, Loredana
2012-01-01
Recently, there has been a concentrated effort by companies to better understand the needs and desires of their consumers. Such efforts usually employ a variety of sophisticated analysis techniques for monitoring consumers' preferences and how those consumers perceive a specific company's advertising communication campaign.
ERIC Educational Resources Information Center
Longberg, Pauline Oliphant
2012-01-01
As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…
ERIC Educational Resources Information Center
Toussaint, Karen A.; Tiger, Jeffrey H.
2012-01-01
Covert self-injurious behavior (i.e., behavior that occurs in the absence of other people) can be difficult to treat. Traditional treatments typically have involved sophisticated methods of observation and often have employed positive punishment procedures. The current study evaluated the effectiveness of a variable momentary differential…
Physics in one dimension: theoretical concepts for quantum many-body systems.
Schönhammer, K
2013-01-09
Various sophisticated approximation methods exist for the description of quantum many-body systems. It was realized early on that the theoretical description can simplify considerably in one-dimensional systems and various exact solutions exist. The focus in this introductory paper is on fermionic systems and the emergence of the Luttinger liquid concept.
DOT National Transportation Integrated Search
2009-03-21
This study investigates all of the generated soils data in an attempt to use the more 'routine' laboratory tests to determine geotechnical design parameters (such as phi angle, cohesion, wet unit weight, unconfined compression, consolidation character...
How Commercial Banks Use the World Wide Web: A Content Analysis.
ERIC Educational Resources Information Center
Leovic, Lydia K.
New telecommunications vehicles expand the possible ways that business is conducted. The hypermedia portion of the Internet, the World Wide Web, is such a telecommunications device. The Web is presently one of the most flexible and dynamic methods for electronic information dissemination. The level of technological sophistication necessary to…
Microcomputer Based Computer-Assisted Learning System: CASTLE.
ERIC Educational Resources Information Center
Garraway, R. W. T.
The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…
Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta
2018-06-11
We developed a non-regularized, variable kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and save cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In a numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout a digital whole brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field could sufficiently achieve minimal boundary losses and phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.
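For orientation only, the sketch below illustrates the generic spherical-mean-value (SMV) filtering idea that SHARP-type methods build on; it is not the NR-VSHARP algorithm itself (no brain mask, no variable kernels, no deconvolution), and the kernel radius and toy phantom are assumptions.

import numpy as np
from scipy.ndimage import convolve

def smv_kernel(radius):
    """Normalised spherical mean value kernel with the given voxel radius."""
    r = np.arange(-radius, radius + 1)
    x, y, z = np.meshgrid(r, r, r, indexing="ij")
    k = (x**2 + y**2 + z**2 <= radius**2).astype(float)
    return k / k.sum()

def smv_highpass(phase, radius=3):
    """Subtract the spherical mean from the total phase: because the
    background field is approximately harmonic, it is largely removed,
    leaving a rough estimate of the local tissue field."""
    return phase - convolve(phase, smv_kernel(radius), mode="nearest")

# toy phantom: smooth 'background' ramp plus a single local source
background = np.indices((32, 32, 32)).sum(axis=0) / 96.0
local = np.zeros((32, 32, 32))
local[16, 16, 16] = 1.0
print(np.abs(smv_highpass(background + local)).max())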
Classical and all-floating FETI methods for the simulation of arterial tissues
Augustin, Christoph M.; Holzapfel, Gerhard A.; Steinbach, Olaf
2015-01-01
High-resolution and anatomically realistic computer models of biological soft tissues play a significant role in the understanding of the function of cardiovascular components in health and disease. However, the computational effort to handle fine grids to resolve the geometries as well as sophisticated tissue models is very challenging. One possibility to derive a strongly scalable parallel solution algorithm is to consider finite element tearing and interconnecting (FETI) methods. In this study we propose and investigate the application of FETI methods to simulate the elastic behavior of biological soft tissues. As one particular example we choose the artery, which is – like most other biological tissues – characterized by anisotropic and nonlinear material properties. We compare two specific approaches of FETI methods, classical and all-floating, and investigate the numerical behavior of different preconditioning techniques. In comparison to classical FETI, the all-floating approach has not only advantages concerning the implementation but in many cases also concerning the convergence of the global iterative solution method. This behavior is illustrated with numerical examples. We present results of linear elastic simulations to show convergence rates, as expected from the theory, and results from the more sophisticated nonlinear case where we apply a well-known anisotropic model to the realistic geometry of an artery. Although the FETI methods have great applicability to artery simulations, we also discuss some limitations concerning the dependence on material parameters. PMID:26751957
Ng, Yee-Hong; Bettens, Ryan P A
2016-03-03
Using the method of modified Shepard's interpolation to construct potential energy surfaces of the H2O, O3, and HCOOH molecules, we compute vibrationally averaged isotropic nuclear shielding constants ⟨σ⟩ of the three molecules via quantum diffusion Monte Carlo (QDMC). The QDMC results are compared to that of second-order perturbation theory (PT), to see if second-order PT is adequate for obtaining accurate values of nuclear shielding constants of molecules with large amplitude motions. ⟨σ⟩ computed by the two approaches differ for the hydrogens and carbonyl oxygen of HCOOH, suggesting that for certain molecules such as HCOOH where big displacements away from equilibrium happen (internal OH rotation), ⟨σ⟩ of experimental quality may only be obtainable with the use of more sophisticated and accurate methods, such as quantum diffusion Monte Carlo. The approach of modified Shepard's interpolation is also extended to construct shielding constants σ surfaces of the three molecules. By using a σ surface with the equilibrium geometry as a single data point to compute isotropic nuclear shielding constants for each descendant in the QDMC ensemble representing the ground state wave function, we reproduce the results obtained through ab initio computed σ to within statistical noise. Development of such an approach could thereby alleviate the need for any future costly ab initio σ calculations.
Functional brain imaging in neuropsychology over the past 25 years.
Roalf, David R; Gur, Ruben C
2017-11-01
Outline effects of functional neuroimaging on neuropsychology over the past 25 years. Functional neuroimaging methods and studies will be described that provide a historical context, offer examples of the utility of neuroimaging in specific domains, and discuss the limitations and future directions of neuroimaging in neuropsychology. Tracking the history of publications on functional neuroimaging related to neuropsychology indicates early involvement of neuropsychologists in the development of these methodologies. Initial progress in neuropsychological application of functional neuroimaging has been hampered by costs and the exposure to ionizing radiation. With rapid evolution of functional methods-in particular functional MRI (fMRI)-neuroimaging has profoundly transformed our knowledge of the brain. Its current applications span the spectrum of normative development to clinical applications. The field is moving toward applying sophisticated statistical approaches that will help elucidate distinct neural activation networks associated with specific behavioral domains. The impact of functional neuroimaging on clinical neuropsychology is more circumscribed, but the prospects remain enticing. The theoretical insights and empirical findings of functional neuroimaging have been led by many neuropsychologists and have transformed the field of behavioral neuroscience. Thus far they have had limited effects on the clinical practices of neuropsychologists. Perhaps it is time to add training in functional neuroimaging to the clinical neuropsychologist's toolkit and from there to the clinic or bedside. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Rosthøj, S; Keiding, N; Schmiegelow, K
2012-02-28
Childhood acute lymphoblastic leukaemia is treated with long-term intensive chemotherapy. During the latter part of the treatment, the maintenance therapy, the patients receive oral doses of two cytostatics. The doses are tailored to blood counts measured on a weekly basis, and the treatment is therefore highly dynamic. In 1992-1996, the Nordic Society of Paediatric Haematology and Oncology (NOPHO) conducted a randomised study (NOPHO-ALL-92) to investigate the effect of a new and more sophisticated dynamic treatment strategy. Unexpectedly, the new strategy worsened the outcome for the girls, whereas there were no treatment differences for the boys. There are as yet no general guidelines for optimising the treatment. On basis of the data from this study, our goal is to formulate an alternative dosing strategy. We use recently developed methods proposed by van der Laan et al. to obtain statistical models that may be used in the guidance of how the physicians should assign the doses to the patients to obtain the target of the treatment. We present a possible strategy and discuss the reliability of this strategy. The implementation is complicated, and we touch upon the limitations of the methods in relation to the formulation of alternative dosing strategies for the maintenance therapy. Copyright © 2011 John Wiley & Sons, Ltd.
Limb-Enhancer Genie: An accessible resource of accurate enhancer predictions in the developing limb
Monti, Remo; Barozzi, Iros; Osterwalder, Marco; ...
2017-08-21
Epigenomic mapping of enhancer-associated chromatin modifications facilitates the genome-wide discovery of tissue-specific enhancers in vivo. However, reliance on single chromatin marks leads to high rates of false-positive predictions. More sophisticated, integrative methods have been described, but commonly suffer from limited accessibility to the resulting predictions and reduced biological interpretability. Here we present the Limb-Enhancer Genie (LEG), a collection of highly accurate, genome-wide predictions of enhancers in the developing limb, available through a user-friendly online interface. We predict limb enhancers using a combination of > 50 published limb-specific datasets and clusters of evolutionarily conserved transcription factor binding sites, taking advantage of the patterns observed at previously in vivo validated elements. By combining different statistical models, our approach outperforms current state-of-the-art methods and provides interpretable measures of feature importance. Our results indicate that including a previously unappreciated score that quantifies tissue-specific nuclease accessibility significantly improves prediction performance. We demonstrate the utility of our approach through in vivo validation of newly predicted elements. Moreover, we describe general features that can guide the type of datasets to include when predicting tissue-specific enhancers genome-wide, while providing an accessible resource to the general biological community and facilitating the functional interpretation of genetic studies of limb malformations.
Machine learning patterns for neuroimaging-genetic studies in the cloud.
Da Mota, Benoit; Tudoran, Radu; Costan, Alexandru; Varoquaux, Gaël; Brasche, Goetz; Conrod, Patricia; Lemaitre, Herve; Paus, Tomas; Rietschel, Marcella; Frouin, Vincent; Poline, Jean-Baptiste; Antoniu, Gabriel; Thirion, Bertrand
2014-01-01
Brain imaging is a natural intermediate phenotype to understand the link between genetic information and behavior or brain pathologies risk factors. Massive efforts have been made in the last few years to acquire high-dimensional neuroimaging and genetic data on large cohorts of subjects. The statistical analysis of such data is carried out with increasingly sophisticated techniques and represents a great computational challenge. Fortunately, increasing computational power in distributed architectures can be harnessed, if new neuroinformatics infrastructures are designed and training to use these new tools is provided. Combining a MapReduce framework (TomusBLOB) with machine learning algorithms (Scikit-learn library), we design a scalable analysis tool that can deal with non-parametric statistics on high-dimensional data. End-users describe the statistical procedure to perform and can then test the model on their own computers before running the very same code in the cloud at a larger scale. We illustrate the potential of our approach on real data with an experiment showing how the functional signal in subcortical brain regions can be significantly fit with genome-wide genotypes. This experiment demonstrates the scalability and the reliability of our framework in the cloud with a 2 weeks deployment on hundreds of virtual machines.
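A single-machine sketch of the kind of non-parametric test such a framework distributes is given below (hypothetical stand-in data and model; this is not the TomusBLOB/cloud code, just scikit-learn's permutation test run on one node).

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import permutation_test_score

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 500))             # stand-in for genotype-derived features
y = 0.5 * X[:, 0] + rng.standard_normal(100)    # stand-in for a subcortical brain signal

# permutation test of whether the fitted model beats chance
score, perm_scores, pvalue = permutation_test_score(
    Ridge(alpha=10.0), X, y, scoring="r2", cv=5,
    n_permutations=200, random_state=0)
print(f"R^2 = {score:.2f}, permutation p-value = {pvalue:.3f}")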
Martian cratering 11. Utilizing decameter scale crater populations to study Martian history
NASA Astrophysics Data System (ADS)
Hartmann, W. K.; Daubar, I. J.
2017-03-01
New information has been obtained in recent years regarding formation rates and the production size-frequency distribution (PSFD) of decameter-scale primary Martian craters formed during recent orbiter missions. Here we compare the PSFD of the currently forming small primaries (P) with new data on the PSFD of the total small crater population that includes primaries and field secondaries (P + fS), which represents an average over longer time periods. The two data sets, if used in a combined manner, have extraordinary potential for clarifying not only the evolutionary history and resurfacing episodes of small Martian geological formations (as small as one or few km2) but also possible episodes of recent climatic change. In response to recent discussions of statistical methodologies, we point out that crater counts do not produce idealized statistics, and that inherent uncertainties limit improvements that can be made by more sophisticated statistical analyses. We propose three mutually supportive procedures for interpreting crater counts of small craters in this context. Applications of these procedures support suggestions that topographic features in upper meters of mid-latitude ice-rich areas date only from the last few periods of extreme Martian obliquity, and associated predicted climate excursions.
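A small numerical example of the point that crater counts carry irreducible counting (Poisson) uncertainty is given below; the bins, counts and area are invented for illustration.

import numpy as np

# hypothetical crater counts per diameter bin over a small counting area
diam_km = np.array([0.016, 0.032, 0.063, 0.125])    # bin centres
counts = np.array([180, 64, 21, 5])                  # craters per bin
area_km2 = 2.5                                       # counted area

density = counts / area_km2
rel_err = 1.0 / np.sqrt(counts)      # sqrt(N) relative uncertainty
for d, n, rho, e in zip(diam_km, counts, density, rel_err):
    print(f"D = {d:.3f} km: N = {n:3d}, density = {rho:6.1f}/km^2 +/- {100 * e:.0f}%")

With only 5 craters in the largest bin, the derived density is uncertain at roughly the 45% level regardless of how sophisticated the subsequent statistical treatment is.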
Bonneville Power Administration Communication Alarm Processor expert system:
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goeltz, R.; Purucker, S.; Tonn, B.
This report describes the Communications Alarm Processor (CAP), a prototype expert system developed for the Bonneville Power Administration by Oak Ridge National Laboratory. The system is designed to receive and diagnose alarms from Bonneville's Microwave Communications System (MCS). The prototype encompasses one of seven branches of the communications network and a subset of alarm systems and alarm types from each system. The expert system employs a backward chaining approach to diagnosing alarms. Alarms are fed into the expert system directly from the communication system via RS232 ports and sophisticated alarm filtering and mailbox software. Alarm diagnoses are presented to operators for their review and concurrence before the diagnoses are archived. Statistical software is incorporated to allow analysis of archived data for report generation and maintenance studies. The delivered system resides on a Digital Equipment Corporation VAX 3200 workstation and utilizes Nexpert Object and SAS for the expert system and statistical analysis, respectively. 11 refs., 23 figs., 7 tabs.
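The backward-chaining inference style mentioned above can be illustrated with a toy Python resolver; the rules and alarm facts below are invented and are not part of the CAP knowledge base.

def backward_chain(goal, rules, facts):
    """Try to prove 'goal' from known facts and if-then rules.
    rules maps a conclusion to a list of alternative premise lists."""
    if goal in facts:
        return True
    for premises in rules.get(goal, []):
        if all(backward_chain(p, rules, facts) for p in premises):
            return True
    return False

# invented toy knowledge base
rules = {
    "microwave_link_down": [["loss_of_signal", "power_ok"]],
    "power_ok": [["no_power_alarm"]],
}
facts = {"loss_of_signal", "no_power_alarm"}
print(backward_chain("microwave_link_down", rules, facts))   # True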
Grand canonical validation of the bipartite international trade network.
Straka, Mika J; Caldarelli, Guido; Saracco, Fabio
2017-08-01
Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.
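The grand canonical (bipartite configuration model) validation itself is analytically involved; purely as a simplified illustration of the idea of keeping only statistically significant co-occurrence links, the sketch below tests one country pair against a hypergeometric null (invented numbers; this is a much cruder null model than the one used in the paper).

from scipy.stats import hypergeom

n_products = 1000        # products in the bipartite network (invented)
deg_a, deg_b = 120, 90   # number of products exported by countries A and B
shared = 40              # products exported by both

# probability of observing >= 'shared' common products if A's export
# basket were drawn uniformly at random from the product set
p = hypergeom.sf(shared - 1, n_products, deg_a, deg_b)
print(f"p-value for the A-B co-export link: {p:.2e}")

A link would be kept in the monopartite projection only if such a p-value survives the chosen significance threshold after multiple-testing correction.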
Grand canonical validation of the bipartite international trade network
NASA Astrophysics Data System (ADS)
Straka, Mika J.; Caldarelli, Guido; Saracco, Fabio
2017-08-01
Devising strategies for economic development in a globally competitive landscape requires a solid and unbiased understanding of countries' technological advancements and similarities among export products. Both can be addressed through the bipartite representation of the International Trade Network. In this paper, we apply the recently proposed grand canonical projection algorithm to uncover country and product communities. Contrary to past endeavors, our methodology, based on information theory, creates monopartite projections in an unbiased and analytically tractable way. Single links between countries or products represent statistically significant signals, which are not accounted for by null models such as the bipartite configuration model. We find stable country communities reflecting the socioeconomic distinction in developed, newly industrialized, and developing countries. Furthermore, we observe product clusters based on the aforementioned country groups. Our analysis reveals the existence of a complicated structure in the bipartite International Trade Network: apart from the diversification of export baskets from the most basic to the most exclusive products, we observe a statistically significant signal of an export specialization mechanism towards more sophisticated products.
pyblocxs: Bayesian Low-Counts X-ray Spectral Analysis in Sherpa
NASA Astrophysics Data System (ADS)
Siemiginowska, A.; Kashyap, V.; Refsdal, B.; van Dyk, D.; Connors, A.; Park, T.
2011-07-01
Typical X-ray spectra have low counts and should be modeled using the Poisson distribution. However, the χ2 statistic is often applied as an alternative and the data are assumed to follow the Gaussian distribution. A variety of weights to the statistic or a binning of the data is performed to overcome the low-count issues. However, such modifications introduce biases and/or a loss of information. Standard modeling packages such as XSPEC and Sherpa provide the Poisson likelihood and allow computation of rudimentary MCMC chains, but so far do not allow for setting a full Bayesian model. We have implemented a sophisticated Bayesian MCMC-based algorithm to carry out spectral fitting of low-count sources in the Sherpa environment. The code is a Python extension to Sherpa and allows fitting a predefined Sherpa model to high-energy X-ray spectral data and other generic data. We present the algorithm and discuss several issues related to the implementation, including flexible definition of priors and allowing for variations in the calibration information.
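As a minimal, generic illustration of MCMC with a Poisson likelihood for low-count bins (not the pyblocxs/Sherpa implementation; the one-parameter constant-rate model, flat prior and toy counts are assumptions), consider:

import numpy as np

def log_posterior(mu, counts):
    """Poisson log-likelihood for a constant expected count mu per bin,
    with an (improper) flat prior on mu > 0; constant log(k!) terms are dropped."""
    if mu <= 0:
        return -np.inf
    return np.sum(counts * np.log(mu) - mu)

def metropolis(counts, n_steps=5000, step=0.3, seed=0):
    rng = np.random.default_rng(seed)
    mu = max(counts.mean(), 0.1)
    chain = []
    for _ in range(n_steps):
        prop = mu + step * rng.standard_normal()
        if np.log(rng.random()) < log_posterior(prop, counts) - log_posterior(mu, counts):
            mu = prop                     # accept the proposed value
        chain.append(mu)
    return np.array(chain)

counts = np.random.default_rng(1).poisson(2.0, size=50)   # low-count 'spectrum'
print(metropolis(counts)[1000:].mean())    # posterior mean after burn-in

Unlike χ2 fitting, nothing here assumes Gaussian errors or requires rebinning the data.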
Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing.
Xiao, Hao; Sun, Tianyang; Meng, Bo; Cheng, Lihong
2017-01-01
The rise of global value chains (GVCs) characterized by the so-called "outsourcing", "fragmentation production", and "trade in tasks" has been considered one of the most important phenomena for the 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013) in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into some sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs that is generally invisible from the traditional trade statistics.
Bioconductor Workflow for Microbiome Data Analysis: from raw reads to community analyses
Callahan, Ben J.; Sankaran, Kris; Fukuyama, Julia A.; McMurdie, Paul J.; Holmes, Susan P.
2016-01-01
High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or OTU composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, whether parametric or nonparametric. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests and nonparametric testing using community networks and the ggnetwork package. PMID:27508062
NASA Astrophysics Data System (ADS)
Lee, Silvia Wen-Yu; Liang, Jyh-Chong; Tsai, Chin-Chung
2016-10-01
This study investigated the relationships among college students' epistemic beliefs in biology (EBB), conceptions of learning biology (COLB), and strategies of learning biology (SLB). EBB includes four dimensions, namely 'multiple-source,' 'uncertainty,' 'development,' and 'justification.' COLB is further divided into 'constructivist' and 'reproductive' conceptions, while SLB represents deep strategies and surface learning strategies. Questionnaire responses were gathered from 303 college students. The results of the confirmatory factor analysis and structural equation modelling showed acceptable model fits. Mediation testing further revealed two paths with complete mediation. In sum, students' epistemic beliefs of 'uncertainty' and 'justification' in biology were statistically significant in explaining the constructivist and reproductive COLB, respectively; and 'uncertainty' was statistically significant in explaining the deep SLB as well. The results of mediation testing further revealed that 'uncertainty' predicted surface strategies through the mediation of 'reproductive' conceptions; and the relationship between 'justification' and deep strategies was mediated by 'constructivist' COLB. This study provides evidence for the essential roles some epistemic beliefs play in predicting students' learning.
PATHA: Performance Analysis Tool for HPC Applications
Yoo, Wucherl; Koo, Michelle; Cao, Yi; ...
2016-02-18
Large science projects rely on complex workflows to analyze terabytes or petabytes of data. These jobs are often running over thousands of CPU cores and simultaneously performing data accesses, data movements, and computation. It is difficult to identify bottlenecks or to debug the performance issues in these large workflows. In order to address these challenges, we have developed the Performance Analysis Tool for HPC Applications (PATHA) using state-of-the-art open source big data processing tools. Our framework can ingest system logs to extract key performance measures, and apply the most sophisticated statistical tools and data mining methods on the performance data. Furthermore, it utilizes an efficient data processing engine to allow users to interactively analyze a large amount of different types of logs and measurements. To illustrate the functionality of PATHA, we conduct a case study on the workflows from an astronomy project known as the Palomar Transient Factory (PTF). This study processed 1.6 TB of system logs collected on the NERSC supercomputer Edison. Using PATHA, we were able to identify performance bottlenecks, which reside in three tasks of the PTF workflow with a dependency on the density of celestial objects.
Storing and using health data in a virtual private cloud.
Regola, Nathan; Chawla, Nitesh V
2013-03-13
Electronic health records are being adopted at a rapid rate due to increased funding from the US federal government. Health data provide the opportunity to identify possible improvements in health care delivery by applying data mining and statistical methods to the data and will also enable a wide variety of new applications that will be meaningful to patients and medical professionals. Researchers are often granted access to health care data to assist in the data mining process, but HIPAA regulations mandate comprehensive safeguards to protect the data. Often universities (and presumably other research organizations) have an enterprise information technology infrastructure and a research infrastructure. Unfortunately, neither of these infrastructures is generally appropriate for sensitive research data such as that governed by HIPAA, as such data require special accommodations on the part of the enterprise information technology (or increased security on the part of the research computing environment). Cloud computing, which is a concept that allows organizations to build complex infrastructures on leased resources, is rapidly evolving to the point that it is possible to build sophisticated network architectures with advanced security capabilities. We present a prototype infrastructure in Amazon's Virtual Private Cloud to allow researchers and practitioners to utilize the data in a HIPAA-compliant environment.
Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors.
Qu, Chen; Bi, Du-Yan; Sui, Ping; Chao, Ai-Nong; Wang, Yun-Fei
2017-09-22
The CMOS (Complementary Metal-Oxide-Semiconductor) sensor is a new type of solid-state image sensor widely used in object tracking, object recognition, intelligent navigation, and related fields. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing a reduction in image contrast, color distortion problems, and so on. In view of this, we propose a novel dehazing approach based on a locally consistent Markov random field (MRF) framework. The neighboring clique in the traditional MRF is extended to a non-neighboring clique, which is defined on locally consistent blocks based on two clues, namely that both the atmospheric light and the transmission map satisfy the property of local consistency. In this framework, our model can strengthen the restriction of the whole image while incorporating more sophisticated statistical priors, resulting in more expressive modeling power and thus effectively solving inadequate detail recovery and alleviating color distortion. Moreover, the locally consistent MRF framework can recover details while maintaining better dehazing results, which effectively improves the quality of images captured by the CMOS image sensor. Experimental results verified that the proposed method has the combined advantages of detail recovery and color preservation.
Genetic Bases of Stuttering: The State of the Art, 2011
Kraft, Shelly Jo; Yairi, Ehud
2011-01-01
Objective The literature on the genetics of stuttering is reviewed with special reference to the historical development from psychosocial explanations leading up to current biological research of gene identification. Summary A gradual progression has been made from the early crude methods of counting percentages of stuttering probands who have relatives who stutter to recent studies using entire genomes of DNA collected from each participant. Despite the shortcomings of some early studies, investigators have accumulated a substantial body of data showing a large presence of familial stuttering. This encouraged more refined research in the form of twin studies. Concordance rates among twins were sufficiently high to lend additional support to the genetic perspective of stuttering. More sophisticated aggregation studies and segregation analyses followed, producing data that matched recognized genetic models, providing the final ‘go ahead’ to proceed from the behavior/statistical genetics into the sphere of biological genetics. Recent linkage and association studies have begun to reveal contributing genes to the disorder. Conclusion No definitive findings have been made regarding which transmission model, chromosomes, genes, or sex factors are involved in the expression of stuttering in the population at large. Future research and clinical implications are discussed. PMID:22067705
NASA Astrophysics Data System (ADS)
Konik, Arda; Madsen, Mark T.; Sunderland, John J.
2012-10-01
In human emission tomography, combined PET/CT and SPECT/CT cameras provide accurate attenuation maps for sophisticated scatter and attenuation corrections. Having proven their potential, these scanners are being adapted for small animal imaging using similar correction approaches. However, attenuation and scatter effects in small animal imaging are substantially less than in human imaging. Hence, the value of sophisticated corrections is not obvious for small animal imaging considering the additional cost and complexity of these methods. In this study, using the GATE Monte Carlo package, we simulated the Inveon small animal SPECT (single pinhole collimator) scanner to find the scatter fractions of various sizes of the NEMA-mouse (diameter: 2-5.5 cm, length: 7 cm), NEMA-rat (diameter: 3-5.5 cm, length: 15 cm) and MOBY (diameter: 2.1-5.5 cm, length: 3.5-9.1 cm) phantoms. The simulations were performed for three radionuclides commonly used in small animal SPECT studies: 99mTc (140 keV), 111In (171 keV 90% and 245 keV 94%) and 125I (effective 27.5 keV). For the MOBY phantoms, the total Compton scatter fractions ranged (over the range of phantom sizes) from 4-10% for 99mTc (126-154 keV), 7-16% for 111In (154-188 keV), 3-7% for 111In (220-270 keV) and 17-30% for 125I (15-45 keV), including the scatter contributions from the tungsten collimator, lead shield and air (inside and outside the camera heads). For the NEMA-rat phantoms, the scatter fractions ranged from 10-15% (99mTc), 17-23% (111In: 154-188 keV), 8-12% (111In: 220-270 keV) and 32-40% (125I). Our results suggest that energy window methods based solely on emission data are sufficient for all mouse and most rat studies for 99mTc and 111In. However, more sophisticated methods may be needed for 125I.
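As an illustration of how a scatter fraction within a photopeak energy window can be tallied from Monte Carlo output (the per-photon record format, window limits and toy data below are assumptions, not the GATE output format):

import numpy as np

def scatter_fraction(energies_keV, scattered_flags, window=(126.0, 154.0)):
    """Fraction of counts inside the energy window that were scattered."""
    e = np.asarray(energies_keV)
    s = np.asarray(scattered_flags, dtype=bool)
    in_win = (e >= window[0]) & (e <= window[1])
    return s[in_win].sum() / in_win.sum()

# toy usage: a 99mTc-like photopeak plus a lower-energy scattered tail
rng = np.random.default_rng(2)
energies = np.concatenate([rng.normal(140.0, 5.0, 9000),
                           rng.uniform(90.0, 140.0, 1000)])
flags = np.concatenate([np.zeros(9000, bool), np.ones(1000, bool)])
print(f"scatter fraction in window: {scatter_fraction(energies, flags):.2%}")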
Dougherty, Edward R.; Highfield, Roger R.
2016-01-01
The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035
Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R
2016-11-13
The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.
Techniques to derive geometries for image-based Eulerian computations
Dillard, Seth; Buchholz, James; Vigmostad, Sarah; Kim, Hyunggun; Udaykumar, H.S.
2014-01-01
Purpose The performance of three frequently used level set-based segmentation methods is examined for the purpose of defining features and boundary conditions for image-based Eulerian fluid and solid mechanics models. The focus of the evaluation is to identify an approach that produces the best geometric representation from a computational fluid/solid modeling point of view. In particular, extraction of geometries from a wide variety of imaging modalities and noise intensities, to supply to an immersed boundary approach, is targeted. Design/methodology/approach Two- and three-dimensional images, acquired from optical, X-ray CT, and ultrasound imaging modalities, are segmented with active contours, k-means, and adaptive clustering methods. Segmentation contours are converted to level sets and smoothed as necessary for use in fluid/solid simulations. Results produced by the three approaches are compared visually and with contrast ratio, signal-to-noise ratio, and contrast-to-noise ratio measures. Findings While the active contours method possesses built-in smoothing and regularization and produces continuous contours, the clustering methods (k-means and adaptive clustering) produce discrete (pixelated) contours that require smoothing using speckle-reducing anisotropic diffusion (SRAD). Thus, for images with high contrast and low to moderate noise, active contours are generally preferable. However, adaptive clustering is found to be far superior to the other two methods for images possessing high levels of noise and global intensity variations, due to its more sophisticated use of local pixel/voxel intensity statistics. Originality/value It is often difficult to know a priori which segmentation will perform best for a given image type, particularly when geometric modeling is the ultimate goal. This work offers insight to the algorithm selection process, as well as outlining a practical framework for generating useful geometric surfaces in an Eulerian setting. PMID:25750470
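To make the k-means option concrete, the sketch below segments a synthetic two-phase image by clustering pixel intensities (an illustrative Python example with invented data; it omits the SRAD smoothing and level-set conversion described in the paper).

import numpy as np
from sklearn.cluster import KMeans

def kmeans_segment(image, n_clusters=2, seed=0):
    """Cluster pixel intensities and return a label image (the discrete,
    'pixelated' contour source discussed in the abstract)."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed)
    return km.fit_predict(image.reshape(-1, 1)).reshape(image.shape)

# synthetic noisy two-phase image
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
img += 0.3 * rng.standard_normal(img.shape)
print(np.unique(kmeans_segment(img), return_counts=True))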
Discrimination of particulate matter emission sources using stochastic methods
NASA Astrophysics Data System (ADS)
Szczurek, Andrzej; Maciejewska, Monika; Wyłomańska, Agnieszka; Sikora, Grzegorz; Balcerek, Michał; Teuerle, Marek
2016-12-01
Particulate matter (PM) is one of the criteria pollutants that has been determined to be harmful to public health and the environment. For this reason the ability to recognize its emission sources is very important. There are a number of measurement methods that allow PM to be characterized in terms of concentration, particle size distribution, and chemical composition. All this information is useful for establishing a link between the dust found in the air, its emission sources, and its influence on humans as well as the environment. However, the methods are typically quite sophisticated and not applicable outside laboratories. In this work, we considered a PM emission source discrimination method which is based on continuous measurements of PM concentration with a relatively cheap instrument and stochastic analysis of the obtained data. The stochastic analysis is focused on the temporal variation of PM concentration and it involves two steps: (1) recognition of the category of distribution for the data, i.e. stable or in the domain of attraction of a stable distribution, and (2) finding the best matching distribution out of the Gaussian, stable and normal-inverse Gaussian (NIG) distributions. We examined six PM emission sources. They were associated with material processing in an industrial environment, namely machining and welding aluminum, forged carbon steel and plastic with various tools. As shown by the obtained results, PM emission sources may be distinguished based on the statistical distribution of PM concentration variations. The major factors responsible for the differences detectable with our method were the type of material processing and the tool applied. When different materials were processed with the same tool, the distinction of emission sources was difficult. For successful discrimination it was crucial to consider size-segregated mass fraction concentrations. In our opinion the presented approach is very promising and deserves further study and development.
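Step (2) of the analysis can be sketched in Python as a comparison of candidate distributions by information criterion (synthetic data and a reduced candidate set; the paper's full procedure also involves stable distributions and step (1), which are not reproduced here).

import numpy as np
from scipy import stats

def best_fit(increments):
    """Compare Gaussian and normal-inverse Gaussian (NIG) fits to
    concentration increments by AIC and return the preferred model."""
    x = np.asarray(increments)
    candidates = {"gaussian": stats.norm, "nig": stats.norminvgauss}
    aic = {}
    for name, dist in candidates.items():
        params = dist.fit(x)                       # maximum-likelihood fit
        ll = np.sum(dist.logpdf(x, *params))
        aic[name] = 2 * len(params) - 2 * ll
    return min(aic, key=aic.get), aic

# toy data: heavy-tailed increments should prefer the NIG model
x = stats.t.rvs(df=3, size=2000, random_state=0)
print(best_fit(x))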
Menéndez González, Manuel; Suárez-Sanmartin, Esther; García, Ciara; Martínez-Camblor, Pablo; Westman, Eric; Simmons, Andy
2016-03-26
Though a disproportionate rate of atrophy in the medial temporal lobe (MTA) represents a reliable marker of Alzheimer's disease (AD) pathology, measurement of the MTA is not currently widely used in daily clinical practice. This is mainly because the methods available to date are sophisticated and difficult to implement in clinical practice (volumetric methods), are poorly explored (linear and planimetric methods), or lack objectivity (visual rating). Here, we aimed to compare the results of a manual planimetric measure (the yearly rate of absolute atrophy of the medial temporal lobe, 2D-yrA-MTL) with the results of an automated volumetric measure (the yearly rate of atrophy of the hippocampus, 3D-yrA-H). A series of 1.5T MRI studies on 290 subjects in the age range of 65-85 years, including patients with AD (n = 100), mild cognitive impairment (MCI) (n = 100), and matched controls (n = 90) from the AddNeuroMed study, were examined by two independent subgroups of researchers: one in charge of volumetric measures and the other in charge of planimetric measures. The means of both methods were significantly different between AD and the other two diagnostic groups. In the differential diagnosis of AD against controls, 3D-yrA-H performed significantly better than 2D-yrA-MTL while differences were not statistically significant in the differential diagnosis of AD against MCI. Automated volumetry of the hippocampus is superior to manual planimetry of the MTL in the diagnosis of AD. Nevertheless, the 2D-yrA-MTL is a simpler method that could be easily implemented in clinical practice when volumetry is not available.
Simple, empirical approach to predict neutron capture cross sections from nuclear masses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couture, Aaron Joseph; Casten, Richard F.; Cakirli, R. B.
Here, neutron capture cross sections are essential to understanding the astrophysical s and r processes, the modeling of nuclear reactor design and performance, and for a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into attempting to develop sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly differing by an order of magnitude a few nucleons from the last measurement.
Simple, empirical approach to predict neutron capture cross sections from nuclear masses
Couture, Aaron Joseph; Casten, Richard F.; Cakirli, R. B.
2017-12-20
Here, neutron capture cross sections are essential to understanding the astrophysical s and r processes, the modeling of nuclear reactor design and performance, and for a wide variety of nuclear forensics applications. Often, cross sections are needed for nuclei where experimental measurements are difficult. Enormous effort, over many decades, has gone into attempting to develop sophisticated statistical reaction models to predict these cross sections. Such work has met with some success but is often unable to reproduce measured cross sections to better than 40%, and has limited predictive power, with predictions from different models rapidly differing by an order of magnitude a few nucleons from the last measurement.
"Big Data" in Rheumatology: Intelligent Data Modeling Improves the Quality of Imaging Data.
Landewé, Robert B M; van der Heijde, Désirée
2018-05-01
Analysis of imaging data in rheumatology is a challenge. Reliability of scores is an issue for several reasons. Signal-to-noise ratio of most imaging techniques is rather unfavorable (too little signal in relation to too much noise). Optimal use of all available data may help to increase credibility of imaging data, but knowledge of complicated statistical methodology and the help of skilled statisticians are required. Clinicians should appreciate the merits of sophisticated data modeling and liaise with statisticians to increase the quality of imaging results, as proper imaging studies in rheumatology imply more than a supersensitive imaging technique alone. Copyright © 2018 Elsevier Inc. All rights reserved.
Acoustic emission monitoring of polymer composite materials
NASA Technical Reports Server (NTRS)
Bardenheier, R.
1981-01-01
The technique of acoustic emission monitoring of polymer composite materials is described. It is a highly sensitive, quasi-nondestructive testing method that indicates the origin and behavior of flaws in such materials when they are submitted to different load exposures. With the use of sophisticated signal analysis methods it is possible to distinguish between different types of failure mechanisms, such as fiber fracture, delamination, or fiber pull-out. Imperfections can be detected while monitoring complex composite structures by acoustic emission measurements.
Roman sophisticated surface modification methods to manufacture silver counterfeited coins
NASA Astrophysics Data System (ADS)
Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.
2017-11-01
By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS) the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that Romans were able systematically to chemically and metallurgically manipulate alloys at a micro scale to produce adherent precious metal layers with a uniform thickness up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as much as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade and/or a combination of social, political and economic factors that required a change in the money supply. Finally, some information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.
Applications of an OO Methodology and CASE to a DAQ System
NASA Astrophysics Data System (ADS)
Bee, C. P.; Eshghi, S.; Jones, R.; Kolos, S.; Magherini, C.; Maidantchik, C.; Mapelli, L.; Mornacchi, G.; Niculescu, M.; Patel, A.; Prigent, D.; Spiwoks, R.; Soloviev, I.; Caprini, M.; Duval, P. Y.; Etienne, F.; Ferrato, D.; Le van Suu, A.; Qian, Z.; Gaponenko, I.; Merzliakov, Y.; Ambrosini, G.; Ferrari, R.; Fumagalli, G.; Polesello, G.
The RD13 project has evaluated the use of the Object Oriented Information Engineering (OOIE) method during the development of several software components connected to the DAQ system. The method is supported by a sophisticated commercial CASE tool (Object Management Workbench) and programming environment (Kappa) which covers the full life-cycle of the software including model simulation, code generation and application deployment. This paper gives an overview of the method, CASE tool, DAQ components which have been developed and we relate our experiences with the method and tool, its integration into our development environment and the spiral lifecycle it supports.
Experimental research of flow servo-valve
NASA Astrophysics Data System (ADS)
Takosoglu, Jakub
Positional control of pneumatic drives is particularly important in pneumatic systems. Several methods of positioning pneumatic cylinders for changeover and tracking control are known; the choking method is the most development-oriented and has the greatest potential. An optimal and effective method, particularly one suited to pneumatic drives, has long been sought, and sophisticated control systems with algorithms utilizing artificial intelligence methods are designed for this purpose. In order to design the control algorithm, knowledge about the real parameters of the servo-valves used in the control systems of electro-pneumatic servo-drives is required. The paper presents experimental research on a flow servo-valve.
ERIC Educational Resources Information Center
Johnson, Mark M.
2009-01-01
Clay is one of the oldest materials known to humanity and has been used for utilitarian purposes and creative expression since prehistoric times. As civilizations evolved, ceramic materials, techniques, purposes and design all became more sophisticated and expressive. With the addition of different minerals and firing methods, clay was used to…
Perceptions of Biometric Experts on Whether or Not Biometric Modalities Will Combat Identity Fraud
ERIC Educational Resources Information Center
Edo, Galaxy Samson
2012-01-01
Electronic-authentication methods, no matter how sophisticated they are in preventing fraud, must be able to identify people to a reasonable degree of certainty before any credentials are assured (Personix, 2006). User authentication is different from identity verification, and both are separate but vital steps in the process of securing…
ERIC Educational Resources Information Center
Best, Linda M.; Shelley, Daniel J.
2018-01-01
This article examines the effects of the social media applications Facebook, Twitter, Snapchat/Instagram, texting and various smartphone applications on academic dishonesty in higher education. The study employed a mixed-methods approach conducted through an emailed QuestionPro student survey consisting of 20 questions. The results of the study…
[Development of operation patient security detection system].
Geng, Shu-Qin; Tao, Ren-Hai; Zhao, Chao; Wei, Qun
2008-11-01
This paper describes a patient security detection system developed with two-dimensional bar codes, wireless communication and removable storage techniques. Using the system, nurses and related personnel check the codes of patients awaiting operations in order to prevent errors. The tests show that the system is effective, and that its objectivity and currency are more scientific and sophisticated than the traditional methods currently used in domestic hospitals.
Moral foundations and political attitudes: The moderating role of political sophistication.
Milesi, Patrizia
2016-08-01
Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.
The Social Bayesian Brain: Does Mentalizing Make a Difference When We Learn?
Devaine, Marie; Hollard, Guillaume; Daunizeau, Jean
2014-01-01
When it comes to interpreting others' behaviour, we almost irrepressibly engage in the attribution of mental states (beliefs, emotions…). Such "mentalizing" can become very sophisticated, eventually endowing us with highly adaptive skills such as convincing, teaching or deceiving. Here, sophistication can be captured in terms of the depth of our recursive beliefs, as in "I think that you think that I think…" In this work, we test whether such sophisticated recursive beliefs subtend learning in the context of social interaction. We asked participants to play repeated games against artificial (Bayesian) mentalizing agents, which differ in their sophistication. Critically, we made people believe either that they were playing against each other, or that they were gambling like in a casino. Although both framings are similarly deceiving, participants win against the artificial (sophisticated) mentalizing agents in the social framing of the task, and lose in the non-social framing. Moreover, we find that participants' choice sequences are best explained by sophisticated mentalizing Bayesian learning models only in the social framing. This study is the first demonstration of the added-value of mentalizing on learning in the context of repeated social interactions. Importantly, our results show that we would not be able to decipher intentional behaviour without a priori attributing mental states to others. PMID:25474637
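As a much simpler illustration of the recursive-belief idea ("I think that you think that I think..."), the sketch below implements a depth-k reasoner for a matching-pennies game in Python. It is not the Bayesian mentalizing model used in the study; the level-0 rule (assume the opponent repeats its most frequent past move) and the game itself are assumptions made only to show how belief depth enters the computation.

import random

def predict_opponent(k, my_history, opp_history, opp_role):
    # Predicted next move of an opponent assumed to reason at depth k.
    if k == 0:
        # Depth-0 model: the opponent simply repeats its most frequent past move.
        if not opp_history:
            return random.randint(0, 1)
        return max((0, 1), key=opp_history.count)
    # A depth-k opponent models us as a depth-(k-1) reasoner and best-responds.
    my_role = 'matcher' if opp_role == 'mismatcher' else 'mismatcher'
    our_predicted_move = predict_opponent(k - 1, opp_history, my_history, my_role)
    return our_predicted_move if opp_role == 'matcher' else 1 - our_predicted_move

def choose(k, my_history, opp_history, my_role):
    # Depth-k action: best-respond to the move predicted for a depth-(k-1) opponent.
    opp_role = 'matcher' if my_role == 'mismatcher' else 'mismatcher'
    opp_move = predict_opponent(k - 1, my_history, opp_history, opp_role)
    return opp_move if my_role == 'matcher' else 1 - opp_move

# A depth-2 "matcher" against a depth-1 "mismatcher": the deeper reasoner should
# anticipate the shallower one and win most rounds.
hist_a, hist_b = [], []
for _ in range(10):
    a = choose(2, hist_a, hist_b, 'matcher')
    b = choose(1, hist_b, hist_a, 'mismatcher')
    hist_a.append(a)
    hist_b.append(b)
print(sum(int(a == b) for a, b in zip(hist_a, hist_b)), "wins for the matcher out of 10")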
Sousa, Marcelo R; Jones, Jon P; Frind, Emil O; Rudolph, David L
2013-01-01
In contaminant travel from ground surface to groundwater receptors, the time taken in travelling through the unsaturated zone is known as the unsaturated zone time lag. Depending on the situation, this time lag may or may not be significant within the context of the overall problem. A method is presented for assessing the importance of the unsaturated zone in the travel time from source to receptor in terms of estimates of both the absolute and the relative advective times. A choice of different techniques for both unsaturated and saturated travel time estimation is provided. This method may be useful for practitioners to decide whether to incorporate unsaturated processes in conceptual and numerical models and can also be used to roughly estimate the total travel time between points near ground surface and a groundwater receptor. This method was applied to a field site located in a glacial aquifer system in Ontario, Canada. Advective travel times were estimated using techniques with different levels of sophistication. The application of the proposed method indicates that the time lag in the unsaturated zone is significant at this field site and should be taken into account. For this case, sophisticated and simplified techniques lead to similar assessments when the same knowledge of the hydraulic conductivity field is assumed. When there is significant uncertainty regarding the hydraulic conductivity, simplified calculations did not lead to a conclusive decision. Copyright © 2012 Elsevier B.V. All rights reserved.
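A minimal numerical sketch of the kind of screening comparison described above, using a piston-flow estimate for the unsaturated zone and a Darcy-based advective estimate for the saturated zone. All parameter values are hypothetical and the formulas are among the simplest of the techniques the method allows; they are not the site-specific calculations from the paper.

def unsaturated_travel_time(depth_to_water_m, recharge_m_per_yr, water_content):
    """Piston-flow estimate: t = L * theta / R (years)."""
    return depth_to_water_m * water_content / recharge_m_per_yr

def saturated_travel_time(distance_m, K_m_per_yr, gradient, porosity):
    """Darcy-based advective estimate: t = x / (K * i / n) (years)."""
    return distance_m / (K_m_per_yr * gradient / porosity)

# Hypothetical parameters: 10 m to the water table, 0.3 m/yr recharge,
# 500 m saturated path to the receptor.
t_unsat = unsaturated_travel_time(10.0, 0.3, 0.25)        # ~8.3 years
t_sat = saturated_travel_time(500.0, 3000.0, 0.005, 0.3)  # ~10 years
print(t_unsat, t_sat, "unsaturated fraction of total:", t_unsat / (t_unsat + t_sat))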
NASA Technical Reports Server (NTRS)
Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.
1995-01-01
The simulation of heart arrhythmias and fibrillation is an important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well-known simplified models are compared and modified to bring the rate of depolarization and the action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat, and the subsequent traveling of the spiral wave inside the simulated tissue, are studied.
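The abstract does not name the simplified model, so the sketch below assumes the FitzHugh-Nagumo equations as a stand-in and shows the method-of-lines idea in one spatial dimension: space is discretized into nodes and the resulting coupled ODE system is stepped explicitly. It is only meant to illustrate the discretization, not the parallelized 2D spiral-wave code of the study.

import numpy as np

# Method-of-lines sketch in 1D using the FitzHugh-Nagumo equations as an assumed
# stand-in for the "simplified model": the cable is discretized into N nodes and
# the resulting ODE system is advanced with explicit Euler.
N, L = 600, 300.0
dx = L / (N - 1)
D, a, b, eps = 1.0, 0.7, 0.8, 0.08

v = np.zeros(N)            # fast, voltage-like variable
w = np.zeros(N)            # slow recovery variable
v[:10] = 1.5               # stimulate the left end to launch a pulse

dt = 0.9 * dx**2 / (2 * D) # time step limited by the explicit diffusion stencil
for _ in range(500):
    lap = np.empty(N)
    lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2
    lap[0] = 2 * (v[1] - v[0]) / dx**2          # no-flux boundaries
    lap[-1] = 2 * (v[-2] - v[-1]) / dx**2
    v_new = v + dt * (v - v**3 / 3 - w + D * lap)
    w_new = w + dt * (eps * (v + a - b * w))
    v, w = v_new, w_new

print("pulse peak v =", round(float(v.max()), 2), "near node", int(v.argmax()))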
Missing data exploration: highlighting graphical presentation of missing pattern
2015-01-01
Functions shipped with base R can fulfill many missing-data handling tasks. However, because the data volume of electronic medical record (EMR) systems is always very large, more sophisticated methods may be helpful in data management. The article focuses on missing data handling using advanced techniques. There are three types of missing data, that is, missing completely at random (MCAR), missing at random (MAR) and not missing at random (NMAR). This classification depends on how missing values are generated. Two packages, Multivariate Imputation by Chained Equations (MICE) and Visualization and Imputation of Missing Values (VIM), provide sophisticated functions to explore missing data patterns. In particular, the VIM package is especially helpful for visual inspection of missing data. Finally, correlation analysis provides information on the dependence of missingness on other variables. Such information is useful in subsequent imputations. PMID:26807411
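MICE and VIM are R packages; as a language-neutral illustration of the same chained-equations idea, the sketch below uses Python with scikit-learn's IterativeImputer (inspired by MICE) on synthetic data with values deleted completely at random. The data, the missingness rate, and the choice of imputer are assumptions for the example, not the workflow from the article.

import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(200, 4)), columns=list("abcd"))
df["b"] += 0.8 * df["a"]                       # a correlated column helps imputation
df.loc[rng.random(200) < 0.15, "b"] = np.nan   # inject ~15% missingness (MCAR here)

# Inspect the missingness pattern: counts per column and per row pattern.
print(df.isna().sum())
print(df.isna().value_counts().head())

# Chained-equations-style imputation: each incomplete column is modelled
# from the others, iterating until the estimates stabilize.
imputed = IterativeImputer(random_state=0).fit_transform(df)
print(pd.DataFrame(imputed, columns=df.columns).describe().round(2))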
Development of BEM for ceramic composites
NASA Technical Reports Server (NTRS)
Henry, D. P.; Banerjee, P. K.; Dargush, G. F.
1991-01-01
It is evident that for proper micromechanical analysis of ceramic composites, one needs to use a numerical method that is capable of idealizing the individual fibers or individual bundles of fibers embedded within a three-dimensional ceramic matrix. The analysis must be able to account for high stress or temperature gradients from diffusion of stress or temperature from the fiber to the ceramic matrix and allow for interaction between the fibers through the ceramic matrix. The analysis must be sophisticated enough to deal with the failure of fibers described by a series of increasingly sophisticated constitutive models. Finally, the analysis must deal with micromechanical modeling of the composite under nonlinear thermal and dynamic loading. This report details progress made towards the development of a boundary element code designed for the micromechanical studies of an advanced ceramic composite. Additional effort has been made in generalizing the implementation to allow the program to be applicable to real problems in the aerospace industry.
Pala, Eva M; Dey, Sudip
2016-02-01
Conventional and highly sophisticated analytical methods (Cyria et al., 1989; Massar et al., 2012a) were used to analyze micro-structural and micro-analytical aspects of the blood of the snakehead fish, Channa gachua, exposed to municipal wastes and city garbage. Red (RBC) and white blood cell (WBC) counts and hemoglobin content were found to be higher in pollution-affected fish as compared with controls. Scanning electron microscopy revealed the occurrence of abnormal erythrocytes such as crenated cells, echinocytes, lobopodial projections, membrane internalization, spherocytes, ruptured cells, contracted cells, depression, and uneven elongation of erythrocyte membranes in fish inhabiting the polluted sites. Energy-dispersive X-ray spectroscopy (EDS) revealed the presence of silicon and lead in the RBCs of pollution-affected fish. The significance of the study includes the highly sophisticated analytical approach, which revealed the aforementioned micro-structural abnormalities.
Financial Literacy and Financial Sophistication in the Older Population
Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa
2017-01-01
Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191
Chemistry vs. Physics: A Comparison of How Biology Majors View Each Discipline
NASA Astrophysics Data System (ADS)
Perkins, K. K.; Barbera, J.; Adams, W. K.; Wieman, C. E.
2007-01-01
A student's beliefs about science and learning science may be more or less sophisticated depending on the specific science discipline. In this study, we used the physics and chemistry versions of the Colorado Learning Attitudes about Science Survey (CLASS) to measure student beliefs in the large, introductory physics and chemistry courses, respectively. We compare how biology majors — generally required to take both of the courses — view these two disciplines. We find that these students' beliefs are more sophisticated about physics (more like the experts in that discipline) than they are about chemistry. At the start of the term, the average % Overall Favorable score on the CLASS is 59% in physics and 53% in chemistry. The students' responses are statistically more expert-like in physics than in chemistry on 10 statements (P ⩽ 0.01), indicating that these students think chemistry is more about memorizing disconnected pieces of information and sample problems, and has less to do with the real world. In addition, these students' view of chemistry degraded over the course of the term: their favorable scores shifted -5.7% and -13.5% in 'Overall' and the 'Real World Connection' category, respectively. In the physics course, which used a variety of research-based teaching practices, these scores shifted 0.0% and +0.3%, respectively. The chemistry shifts are comparable to those previously observed in traditional introductory physics courses.
NASA Astrophysics Data System (ADS)
Zielinski, Jerzy S.
The dramatic increase in the number and volume of digital images produced in medical diagnostics, and the escalating demand for rapid access to these relevant medical data, along with the need for interpretation and retrieval, have become of paramount importance to a modern healthcare system. Therefore, there is an ever growing need for processed, interpreted and saved images of various types. Due to the high cost and unreliability of human-dependent image analysis, it is necessary to develop an automated method for feature extraction, using sophisticated mathematical algorithms and reasoning. This work is focused on digital image signal processing of biological and biomedical data in one-, two- and three-dimensional space. Methods and algorithms presented in this work were used to acquire data from genomic sequences, breast cancer images, and biofilm images. One-dimensional analysis was applied to DNA sequences, which were represented as a non-stationary sequence and modeled by a time-dependent autoregressive moving average (TD-ARMA) model. Two-dimensional analyses used a 2D-ARMA model and applied it to detect breast cancer from X-ray mammograms or ultrasound images. Three-dimensional detection and classification techniques were applied to biofilm images acquired using confocal laser scanning microscopy. Modern medical images are geometrically arranged arrays of data. The broadening scope of imaging as a way to organize our observations of the biophysical world has led to a dramatic increase in our ability to apply new processing techniques and to combine multiple channels of data into sophisticated and complex mathematical models of physiological function and dysfunction. With the explosion of the amount of data produced in the field of biomedicine, it is crucial to be able to construct accurate mathematical models of the data at hand. The two main purposes of signal modeling are data size conservation and parameter extraction. Specifically, in biomedical imaging there are four key problems that were addressed in this work: (i) registration, i.e. automated methods of data acquisition and the ability to align multiple data sets with each other; (ii) visualization and reconstruction, i.e. the environment in which registered data sets can be displayed on a plane or in multidimensional space; (iii) segmentation, i.e. automated and semi-automated methods to create models of relevant anatomy from images; (iv) simulation and prediction, i.e. techniques that can be used to simulate growth and evolution of the researched phenomenon. Mathematical models can not only be used to verify experimental findings, but also to make qualitative and quantitative predictions that might serve as guidelines for the future development of technology and/or treatment.
2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrington, David Bradley; Waters, Jiajia
Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the models and of the software. We also continue to improve the physical modeling methods. We are developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or may alter with various models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.
Electromagnetic Imaging Methods for Nondestructive Evaluation Applications
Deng, Yiming; Liu, Xin
2011-01-01
Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). Recent advances in sensing technology, in hardware and software development dedicated to imaging and image processing, and in materials science have greatly expanded the application fields, made system designs more sophisticated, and made the potential of electromagnetic NDE imaging seem almost unlimited. This review provides a comprehensive summary of research on electromagnetic imaging methods for NDE applications, followed by a summary and discussion of future directions. PMID:22247693
NASA Technical Reports Server (NTRS)
Johnson, Paul E.; Smith, Milton O.; Adams, John B.
1992-01-01
Algorithms were developed, based on Hapke's (1981) equations, for remote determinations of mineral abundances and particle sizes from reflectance spectra. In this method, spectra are modeled as a function of end-member abundances and illumination/viewing geometry. The method was tested on a laboratory data set. It is emphasized that, although there exist more sophisticated models, the present algorithms are particularly suited for remotely sensed data, where little opportunity exists to independently measure reflectance versus particle size and phase function.
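For intuition about the abundance-estimation step, the sketch below solves a purely linear end-member mixing problem with nonnegative least squares on synthetic spectra. This is a simplification: Hapke-based methods mix linearly in single-scattering albedo rather than reflectance, and the end-member spectra and noise level here are invented.

import numpy as np
from scipy.optimize import nnls

# Hypothetical end-member spectra: rows are wavelength channels, columns are end-members.
rng = np.random.default_rng(1)
channels = 50
E = np.abs(rng.normal(0.4, 0.1, size=(channels, 3)))

true_fractions = np.array([0.6, 0.3, 0.1])
mixed = E @ true_fractions + rng.normal(0, 0.005, channels)   # observed spectrum + noise

fractions, residual = nnls(E, mixed)       # nonnegativity-constrained least squares
fractions = fractions / fractions.sum()    # renormalize to fractional abundances
print("estimated abundances:", np.round(fractions, 3), "residual:", round(residual, 4))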
Numerical realization of the variational method for generating self-trapped beams
NASA Astrophysics Data System (ADS)
Duque, Erick I.; Lopez-Aguayo, Servando; Malomed, Boris A.
2018-03-01
We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation in performing analytical Lagrangian integration and differentiation. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating robustness of the beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.
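A reduced, one-dimensional sketch of the numerical Rayleigh-Ritz idea (the paper treats two-dimensional beams): for the focusing cubic Schrödinger equation, evaluate the Hamiltonian of a Gaussian trial profile on a grid rather than analytically, and minimize it over the trial width at fixed power. The equation normalization and the power value are assumptions for the example.

import numpy as np
from scipy.optimize import minimize_scalar

x = np.linspace(-40, 40, 4001)
dx = x[1] - x[0]
P = 2.0                                      # prescribed beam power (an assumed value)

def hamiltonian(width):
    # Gaussian trial profile with its amplitude fixed by the prescribed power.
    amp = np.sqrt(P / (width * np.sqrt(np.pi)))
    u = amp * np.exp(-x**2 / (2 * width**2))
    ux = np.gradient(u, dx)
    # H[u] = integral of |u_x|^2/2 - |u|^4/2, evaluated numerically on the grid.
    return float(np.sum(0.5 * ux**2 - 0.5 * u**4) * dx)

best = minimize_scalar(hamiltonian, bounds=(0.1, 20.0), method="bounded")
print("optimal width:", round(best.x, 3))    # analytic value for this ansatz is sqrt(2*pi)/P ~ 1.253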
Biostatistics Series Module 10: Brief Overview of Multivariate Methods.
Hazra, Avijit; Gogtay, Nithya
2017-01-01
Multivariate analysis refers to statistical techniques that simultaneously look at three or more variables in relation to the subjects under investigation with the aim of identifying or clarifying the relationships between them. These techniques have been broadly classified as dependence techniques, which explore the relationship between one or more dependent variables and their independent predictors, and interdependence techniques, which make no such distinction but treat all variables equally in a search for underlying relationships. Multiple linear regression models a situation where a single numerical dependent variable is to be predicted from multiple numerical independent variables. Logistic regression is used when the outcome variable is dichotomous in nature. The log-linear technique models count-type data and can be used to analyze cross-tabulations where more than two variables are included. Analysis of covariance is an extension of analysis of variance (ANOVA), in which an additional independent variable of interest, the covariate, is brought into the analysis. It tries to examine whether a difference persists after "controlling" for the effect of the covariate that can impact the numerical dependent variable of interest. Multivariate analysis of variance (MANOVA) is a multivariate extension of ANOVA used when multiple numerical dependent variables have to be incorporated in the analysis. Interdependence techniques are more commonly applied to psychometrics, social sciences and market research. Exploratory factor analysis and principal component analysis are related techniques that seek to extract, from a larger number of metric variables, a smaller number of composite factors or components, which are linearly related to the original variables. Cluster analysis aims to identify, in a large number of cases, relatively homogeneous groups called clusters, without prior information about the groups. The calculation-intensive nature of multivariate analysis has so far precluded most researchers from using these techniques routinely. The situation is now changing with the wider availability and increasing sophistication of statistical software, and researchers should no longer shy away from exploring the applications of multivariate methods to real-life data sets.
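A brief illustration, on synthetic data, of one technique from each family named above: multiple linear regression (a dependence technique) and principal component analysis (an interdependence technique), using Python and scikit-learn. The data and coefficients are invented for the example.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 4))                   # four numerical predictors
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

# Dependence technique: multiple linear regression, one numerical outcome
# predicted from several numerical predictors.
reg = LinearRegression().fit(X, y)
print("coefficients:", np.round(reg.coef_, 2), "R^2:", round(reg.score(X, y), 3))

# Interdependence technique: principal component analysis, no outcome variable;
# a smaller number of composite components is extracted from correlated variables.
Z = np.column_stack([X[:, 0], X[:, 0] + 0.1 * rng.normal(size=n), X[:, 1]])
pca = PCA(n_components=2).fit(Z)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))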
Peripheral optics with bifocal soft and corneal reshaping contact lenses.
Ticak, Anita; Walline, Jeffrey J
2013-01-01
To determine whether bifocal soft contact lenses with a distance center design provide myopic defocus to the peripheral retina similar to corneal reshaping contact lenses. Myopic subjects underwent five cycloplegic autorefraction readings centrally and at 10, 20, and 30 degrees temporally, nasally, superiorly, and inferiorly while wearing Proclear Multifocal "D" contact lenses with a +2.00-diopter add power (CooperVision, Fairport, NY) and after wearing Corneal Refractive Therapy (Paragon Vision Sciences, Mesa, AZ) contact lenses for 2 weeks. Fourteen subjects completed the study. Nine (64%) were female, and 12 (86%) were white. The average (± SD) spherical equivalent noncycloplegic manifest refraction for the right eye was -2.84 ± 1.29 diopters. The average logMAR best-corrected, binocular, high-contrast visual acuity was -0.17 ± 0.15 while wearing the bifocal soft contact lenses and -0.09 ± 0.16 after corneal reshaping contact lens wear (analysis of variance, p = 0.27). The orthokeratology contact lens yielded a more myopic peripheral optical profile than the soft bifocal contact lens at 20 and 30 degrees eccentricity (except inferior at 20 degrees); the two modalities were similar at 10 degrees eccentricity. Our data suggest that the two modalities are dissimilar despite the statistical similarities. The corneal reshaping contact lens shows an increase in relative peripheral myopic refraction, a pattern also observed in other studies, but the bifocal lens does not exhibit such a pattern. The low statistical power of the study could be a reason for the lack of statistically significant differences at other positions of gaze, but the graphical representation of the data shows a marked difference in the peripheral optical profile between the two modalities. More sophisticated methods of measuring the peripheral optical profile may be necessary to accurately compare the two modalities and to determine the true optical effect of the bifocal soft contact lens on the peripheral retina.
NASA Astrophysics Data System (ADS)
Gu, Jiangyue
Epistemic beliefs are individuals' beliefs about the nature of knowledge, how knowledge is constructed, and how knowledge can be justified. This study employed a mixed-methods approach to examine: (a) middle and high school students' self-reported epistemic beliefs (quantitative) and epistemic beliefs revealed from practice (qualitative) during a problem-based, scientific inquiry unit, (b) how middle and high school students' epistemic beliefs contribute to the construction of their problem-solving processes, and (c) how and why students' epistemic beliefs change by engaging in PBL. Twenty-one middle and high school students participated in a summer science class to investigate local water quality in a 2-week-long problem-based learning (PBL) unit. The students worked in small groups to conduct water quality tests in their local watershed and visited several stakeholders for their investigation. Pretest and posttest versions of the Epistemological Beliefs Questionnaire were conducted to assess students' self-reported epistemic beliefs before and after the unit. I videotaped and interviewed three groups of students during the unit and conducted discourse analysis to examine their epistemic beliefs revealed from scientific inquiry activities and triangulate with their self-reported data. There are three main findings from this study. First, students in this study self-reported relatively sophisticated epistemic beliefs on the pretest. However, the comparison between their self-reported beliefs and beliefs revealed from practice indicated that some students were able to apply sophisticated beliefs during the unit while others failed to do so. The inconsistency between these two types of epistemic beliefs may be due to students' inadequate cognitive ability, the low validity of the self-report measure, and the influence of contextual factors. Second, qualitative analysis indicated that students' epistemic beliefs about the nature of knowing influenced their problem-solving processes and construction of arguments during their inquiry activities. Students with more sophisticated epistemic beliefs acquired knowledge, presented solid evidence, and used it to support their claims more effectively than their peers. Third, students' self-reported epistemic beliefs became significantly more sophisticated by engaging in PBL. Findings from this study can potentially help researchers to better understand the relation between students' epistemic beliefs and their scientific inquiry practice,
Kuhn, M A; Burch, M; Chinnock, R E; Fenton, M J
2017-10-01
Intravascular ultrasound (IVUS) has been routinely used in some centers to investigate cardiac allograft vasculopathy in pediatric heart transplant recipients. We present an alternative method using more sophisticated imaging software. This study presents a comparison of this method with an established standard method. All patients who had IVUS performed in 2014 were retrospectively evaluated. The standard technique consisted of analysis of 10 operator-selected segments along the vessel. Each study was re-evaluated using a longitudinal technique, taken at every third cardiac cycle, along the entire vessel. Semiautomatic edge detection software was used to detect vessel imaging planes. Measurements included outer and inner diameter, total and luminal area, maximal intimal thickness (MIT), and intimal index. Each IVUS was graded for severity using the Stanford classification. All results were given as mean ± standard deviation (SD). Groups were compared using Student t test. A P value <.05 was considered significant. There were 59 IVUS studies performed on 58 patients. There was no statistically significant difference between outer diameter, inner diameter, or total area. In the longitudinal group, there was a significantly smaller luminal area, higher MIT, and higher intimal index. Using the longitudinal technique, there was an increase in Stanford classification in 20 patients. The longitudinal technique appeared more sensitive in assessing the degree of cardiac allograft vasculopathy and may play a role in the increase in the degree of thickening seen. It may offer an alternative way of grading severity of cardiac allograft vasculopathy in pediatric heart transplant recipients. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Siu-Siu, Guo; Qingxuan, Shi
2017-03-01
In this paper, single-degree-of-freedom (SDOF) systems subjected to Gaussian white noise and Gaussian/non-Gaussian colored noise excitations are investigated. By expressing the colored noise excitation as a second-order filtered white noise process and introducing the colored noise as an additional state variable, the equation of motion for an SDOF system under colored noise is artificially transformed into that of a multi-degree-of-freedom (MDOF) system under white noise excitations, described by four coupled first-order differential equations. As a consequence, the corresponding Fokker-Planck-Kolmogorov (FPK) equation governing the joint probability density function (PDF) of the state variables becomes four-dimensional (4-D), and the solution procedure and computer program become much more sophisticated. The exponential-polynomial closure (EPC) method, widely applied to SDOF systems under white noise excitations, is developed and improved for systems under colored noise excitations and for solving the complex 4-D FPK equation. The Monte Carlo simulation (MCS) method is used to test the approximate EPC solutions. Two examples associated with Gaussian and non-Gaussian colored noise excitations are considered, and the corresponding band-limited power spectral densities (PSDs) of the colored noise excitations are given. Numerical studies show that the developed EPC method provides relatively accurate estimates of the stationary probabilistic solutions, especially in the tail regions of the PDFs. Moreover, the statistical parameter of the mean up-crossing rate (MCR) is taken into account, which is important for reliability and failure analysis. Hopefully, the present work can provide insights into the investigation of structures under random loadings.
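The sketch below mimics only the Monte Carlo benchmark side of such a study, not the EPC solution: Gaussian white noise is passed through a second-order linear filter to produce the colored excitation (the state-augmentation step described above), which then drives a Duffing-type SDOF oscillator, and the stationary response is summarized from the simulated trajectory. The oscillator, filter parameters, noise level, and white-noise scaling convention are all hypothetical.

import numpy as np

rng = np.random.default_rng(0)
dt, steps = 2e-3, 400_000
zeta, omega0 = 0.05, 1.0            # oscillator damping ratio and natural frequency
zf, wf, S0 = 0.3, 2.0, 0.1          # filter damping, filter frequency, white-noise level
sigma = np.sqrt(2 * np.pi * S0 / dt)    # discrete-time white-noise standard deviation

x = v = q = p = 0.0                 # oscillator state (x, v) and filter state (q, p)
samples = []
for i in range(steps):
    white = sigma * rng.standard_normal()
    # Second-order filter q'' + 2*zf*wf*q' + wf^2*q = white  ->  colored excitation q.
    p += dt * (-2 * zf * wf * p - wf**2 * q + white)
    q += dt * p
    # Duffing-type oscillator driven by the colored noise.
    v += dt * (-2 * zeta * omega0 * v - omega0**2 * (x + 0.5 * x**3) + q)
    x += dt * v
    if i > steps // 10 and i % 50 == 0:     # discard the transient, thin the record
        samples.append(x)

samples = np.asarray(samples)
print("stationary std of displacement:", round(float(samples.std()), 3))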
Cannon, Tyrone D; Thompson, Paul M; van Erp, Theo G M; Huttunen, Matti; Lonnqvist, Jouko; Kaprio, Jaakko; Toga, Arthur W
2006-01-01
There is an urgent need to decipher the complex nature of genotype-phenotype relationships within the multiple dimensions of brain structure and function that are compromised in neuropsychiatric syndromes such as schizophrenia. Doing so requires sophisticated methodologies to represent population variability in neural traits and to probe their heritable and molecular genetic bases. We have recently developed and applied computational algorithms to map the heritability of, as well as genetic linkage and association to, neural features encoded using brain imaging in the context of three-dimensional (3D), population-based, statistical brain atlases. One set of algorithms builds on our prior work using classical twin study methods to estimate heritability by fitting biometrical models for additive genetic, unique, and common environmental influences. Another set of algorithms performs regression-based (Haseman-Elston) identical-by-descent linkage analysis and genetic association analysis of DNA polymorphisms in relation to neural traits of interest in the same 3D population-based brain atlas format. We demonstrate these approaches using samples of healthy monozygotic (MZ) and dizygotic (DZ) twin pairs, as well as MZ and DZ twin pairs discordant for schizophrenia, but the methods can be generalized to other classes of relatives and to other diseases. The results confirm prior evidence of genetic influences on gray matter density in frontal brain regions. They also provide converging evidence that the chromosome 1q42 region is relevant to schizophrenia by demonstrating linkage and association of markers of the Translin-Associated Factor X and Disrupted-In-Schizophrenia-1 genes with prefrontal cortical gray matter deficits in twins discordant for schizophrenia.
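For intuition only, the sketch below applies the much simpler Falconer approximation, h2 ≈ 2(rMZ - rDZ), to synthetic trait values at a single "voxel"; the study fits full biometrical ACE models voxel-wise, which this does not reproduce. The pair counts and correlations are invented.

import numpy as np

rng = np.random.default_rng(0)

def simulate_pairs(n, r):
    """Draw n twin pairs of a trait with intraclass correlation r."""
    shared = rng.normal(size=n)
    return (np.sqrt(r) * shared + np.sqrt(1 - r) * rng.normal(size=n),
            np.sqrt(r) * shared + np.sqrt(1 - r) * rng.normal(size=n))

# Hypothetical gray-matter-density values for MZ and DZ pairs at one voxel.
mz1, mz2 = simulate_pairs(200, 0.8)
dz1, dz2 = simulate_pairs(200, 0.5)

r_mz = np.corrcoef(mz1, mz2)[0, 1]
r_dz = np.corrcoef(dz1, dz2)[0, 1]
h2 = 2 * (r_mz - r_dz)                 # Falconer's estimate of heritability
print("rMZ", round(r_mz, 2), "rDZ", round(r_dz, 2), "h^2 ~", round(h2, 2))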
An approach for quantitative image quality analysis for CT
NASA Astrophysics Data System (ADS)
Rahimi, Amir; Cochran, Joe; Mooney, Doug; Regensburger, Joe
2016-03-01
An objective and standardized approach to assess image quality of Computed Tomography (CT) systems is required in a wide variety of imaging processes to identify CT systems appropriate for a given application. We present an overview of the framework we have developed to help standardize and to objectively assess CT image quality for different models of CT scanners used for security applications. Within this framework, we have developed methods to quantitatively measure metrics that should correlate with feature identification, detection accuracy and precision, and image registration capabilities of CT machines, and to identify strengths and weaknesses in different CT imaging technologies in transportation security. To that end we have designed, developed and constructed phantoms that allow for systematic and repeatable measurements of roughly 88 image quality metrics, representing modulation transfer function, noise equivalent quanta, noise power spectra, slice sensitivity profiles, streak artifacts, CT number uniformity, CT number consistency, object length accuracy, CT number path length consistency, and object registration. Furthermore, we have developed a sophisticated MATLAB-based image analysis toolkit to analyze CT-generated images of phantoms and report these metrics in a format that is standardized across the considered models of CT scanners, allowing for comparative image quality analysis within a CT model or between different CT models. In addition, we have developed a modified sparse principal component analysis (SPCA) method that, in contrast to standard principal component analysis (PCA), generates components with sparse loadings; these are used in conjunction with the Hotelling T2 statistical analysis method to compare, qualify, and detect faults in the tested systems.
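As one concrete example of the listed metrics, the sketch below computes a two-dimensional noise power spectrum from replicate uniform-region ROIs. The normalization follows one common convention (conventions differ between groups), and the white-noise ROIs are synthetic stand-ins for phantom scans.

import numpy as np

def noise_power_spectrum(rois, pixel_size_mm):
    """2D NPS from an array of uniform-region ROIs with shape (n, Nx, Ny).

    NPS(u, v) = (dx * dy / (Nx * Ny)) * mean over ROIs of |FFT(roi - mean)|^2.
    """
    n, Nx, Ny = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)
    ft = np.fft.fftshift(np.fft.fft2(detrended), axes=(1, 2))
    return (pixel_size_mm**2 / (Nx * Ny)) * np.mean(np.abs(ft) ** 2, axis=0)

# Synthetic example: white noise with std 10 and 0.5 mm pixels gives a roughly
# flat NPS with mean value near variance * pixel area = 100 * 0.25 = 25.
rng = np.random.default_rng(0)
rois = rng.normal(0, 10, size=(32, 64, 64))
nps = noise_power_spectrum(rois, pixel_size_mm=0.5)
print(nps.shape, round(float(nps.mean()), 2))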
Wu, Mengmeng; Zeng, Wanwen; Liu, Wenqiang; Lv, Hairong; Chen, Ting; Jiang, Rui
2018-06-03
Genome-wide association studies (GWAS) have successfully discovered a number of disease-associated genetic variants in the past decade, providing an unprecedented opportunity for deciphering the genetic basis of human inherited diseases. However, it is still a challenging task to extract biological knowledge from the GWAS data, due to such issues as missing heritability and weak interpretability. Indeed, the fact that the majority of discovered loci fall into noncoding regions without clear links to genes has prevented the characterization of their functions and calls for a sophisticated approach to bridge genetic and genomic studies. To address this problem, network-based prioritization of candidate genes, which performs integrated analysis of gene networks with GWAS data, has emerged as a promising direction and attracted much attention. However, most existing methods overlook the sparse and noisy properties of gene networks and thus may lead to suboptimal performance. Motivated by this understanding, we proposed a novel method called REGENT for integrating multiple gene networks with GWAS data to prioritize candidate genes for complex diseases. We leveraged a technique called network representation learning to embed a gene network into a compact and robust feature space, and then designed a hierarchical statistical model to integrate features of multiple gene networks with GWAS data for the effective inference of genes associated with a disease of interest. We applied our method to six complex diseases and demonstrated the superior performance of REGENT over existing approaches in recovering known disease-associated genes. We further conducted a pathway analysis and showed the ability of REGENT to discover disease-associated pathways. We expect to see applications of our method to a broad spectrum of diseases for post-GWAS analysis. REGENT is freely available at https://github.com/wmmthu/REGENT. Copyright © 2018 Elsevier Inc. All rights reserved.
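REGENT itself combines network representation learning with a hierarchical statistical model; as a far simpler baseline that conveys the network-propagation intuition, the sketch below scores genes in a toy network by a random walk with restart from a GWAS "seed" gene. The network, seed, and restart probability are invented for the example.

import numpy as np

genes = ["g0", "g1", "g2", "g3", "g4", "g5"]
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (1, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

W = A / A.sum(axis=0, keepdims=True)       # column-normalized transition matrix
seeds = np.array([1.0, 0, 0, 0, 0, 0])     # g0 carries the GWAS signal
seeds /= seeds.sum()

alpha, p = 0.3, seeds.copy()               # restart probability and initial scores
for _ in range(200):
    p = alpha * seeds + (1 - alpha) * W @ p   # iterate to the stationary scores

ranking = sorted(zip(genes, np.round(p, 3)), key=lambda t: -t[1])
print(ranking)                             # genes close to the seed rank higher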
Monitoring and Evaluation: Statistical Support for Life-cycle Studies, Annual Report 2003.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Skalski, John
2003-11-01
The ongoing mission of this project is the development of statistical tools for analyzing fisheries tagging data in the most precise and appropriate manner possible. This mission also includes providing statistical guidance on the best ways to design large-scale tagging studies. This mission continues because the technologies for conducting fish tagging studies continuously evolve. In just the last decade, fisheries biologists have seen the evolution from freeze-brands and coded wire tags (CWT) to passive integrated transponder (PIT) tags, balloon-tags, radiotelemetry, and now, acoustic-tags. With each advance, the technology holds the promise of more detailed and precise information. However, the technology for analyzing and interpreting the data also becomes more complex as the tagging techniques become more sophisticated. The goal of the project is to develop the analytical tools in parallel with the technical advances in tagging studies, so that maximum information can be extracted on a timely basis. Associated with this mission is the transfer of these analytical capabilities to the field investigators to assure consistency and the highest levels of design and analysis throughout the fisheries community. Consequently, this project provides detailed technical assistance on the design and analysis of tagging studies to groups requesting assistance throughout the fisheries community. Ideally, each project and each investigator would invest in the statistical support needed for the successful completion of their study. However, this is an ideal that is rarely if ever attained. Furthermore, there is only a small pool of highly trained scientists in this specialized area of tag analysis here in the Northwest. Project 198910700 provides the financial support to sustain this local expertise on the statistical theory of tag analysis at the University of Washington and make it available to the fisheries community. Piecemeal and fragmented support from various agencies and organizations would be incapable of maintaining a center of expertise. The mission of the project is to help assure tagging studies are designed and analyzed from the onset to extract the best available information using state-of-the-art statistical methods. The overarching goal of the project is to assure statistically sound survival studies so that fish managers can focus on the management implications of their findings and not be distracted by concerns about whether the studies are statistically reliable. Specific goals and objectives of the study include the following: (1) Provide consistent application of statistical methodologies for survival estimation across all salmon life cycle stages to assure comparable performance measures and assessment of results through time, to maximize learning and adaptive management opportunities, and to improve and maintain the ability to responsibly evaluate the success of implemented Columbia River FWP salmonid mitigation programs and identify future mitigation options. (2) Improve analytical capabilities to conduct research on survival processes of wild and hatchery chinook and steelhead during smolt outmigration, to improve monitoring and evaluation capabilities and assist in-season river management to optimize operational and fish passage strategies to maximize survival. (3) Extend statistical support to estimate ocean survival and in-river survival of returning adults. Provide statistical guidance in implementing a river-wide adult PIT-tag detection capability.
(4) Develop statistical methods for survival estimation for all potential users and make this information available through peer-reviewed publications, statistical software, and technology transfers to organizations such as NOAA Fisheries, the Fish Passage Center, US Fish and Wildlife Service, US Geological Survey (USGS), US Army Corps of Engineers (USACE), Public Utility Districts (PUDs), the Independent Scientific Advisory Board (ISAB), and other members of the Northwest fisheries community. (5) Provide and maintain statistical software for tag analysis and user support. (6) Provide improvements in statistical theory and software as requested by user groups. These improvements include extending software capabilities to address new research issues, adapting tagging techniques to new study designs, and extending the analysis capabilities to new technologies such as radio-tags and acoustic-tags.
de Heer, Brooke
2016-02-01
Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers of serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the rapist's level of criminal sophistication and use of force were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes involving a White rapist and White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of the number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.
ERIC Educational Resources Information Center
Gold, Stephanie
2005-01-01
The concept of data-driven professional development is both straightforward and sensible. Implementing this approach is another story, which is why many administrators are turning to sophisticated tools to help manage data collection and analysis. These tools allow educators to assess and correlate student outcomes, instructional methods, and…
Simultaneous master-slave Omega pairs. [navigation system featuring low cost receiver
NASA Technical Reports Server (NTRS)
Burhans, R. W.
1974-01-01
Master-slave sequence ordering of the Omega system is suggested as a method of improving the pair geometry for the benefit of low-cost receiver users. The sequence change will not affect present sophisticated-processor users other than requiring new labels for some pair combinations, but it may require worldwide transmitter operators to slightly alter their long-range synchronizing techniques.
Ralph J. Alig
2004-01-01
Over the past 25 years, renewable resource assessments have addressed demand, supply, and inventory of various renewable resources in increasingly sophisticated fashion, including simulation and optimization analyses of area changes in land uses (e.g., urbanization) and land covers (e.g., plantations vs. naturally regenerated forests). This synthesis reviews related...
Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph
ERIC Educational Resources Information Center
Pizauro, Joao M., Jr.; Ferro, Jesus A.; de Lima, Andrea C. F.; Routman, Karina S.; Portella, Maria Celia
2004-01-01
The present research describes an efficient procedure to obtain high levels of trypsinogen and chymotrypsinogen by using a simple, rapid, and easily reproducible method. The extraction process and the time-course of activation of zymogens can be carried out in a single laboratory period, without sophisticated equipment. The main objective was to…
ERIC Educational Resources Information Center
Erickson, Frederick
The limits and boundaries of anthropology are briefly discussed, along with a general description of lay attitudes towards the field. A research case is given to illustrate the way in which anthropological study methods can contribute to educational research. Noted among these contributions is an informed distrust that anthropologists exhibit…
Structural Uncertainties in Numerical Induction Models
2006-07-01
"divide and conquer" modelling approach. Analytical inputs are then assessments, quantitative or qualitative, of the value, performance, or some... said to be naïve because it relies heavily on the inductive method itself. Sophisticated Induction (Logical Positivism): this form of induction... falters. Popper's Falsification: Karl Popper around 1959 introduced a variant to the above Logical Positivism, known as the inductive-hypothetico...
Speed genome editing by transient CRISPR/Cas9 targeting and large DNA fragment deletion.
Luo, Jing; Lu, Liaoxun; Gu, Yanrong; Huang, Rong; Gui, Lin; Li, Saichao; Qi, Xinhui; Zheng, Wenping; Chao, Tianzhu; Zheng, Qianqian; Liang, Yinming; Zhang, Lichen
2018-06-07
Genetic engineering of cell lines and model organisms has been facilitated enormously by the CRISPR/Cas9 system. However, in cell lines it remains labor intensive and time consuming to obtain desirable mutant clones due to the difficulties in isolating the mutated clones and sophisticated genotyping. In this study, we have validated fluorescent protein reporter aided cell sorting which enables the isolation of maximal diversity in mutant cells. We further applied two spectrally distinct fluorescent proteins DsRed2 and ECFP as reporters for independent CRISPR/Cas9 mediated targeting, which allows for one-cell-one-well sorting of the mutant cells. Because of ultra-high efficiency of the CRISPR/Cas9 system with dual reporters and large DNA fragment deletion resulting from independent loci cleavage, monoclonal mutant cells could be easily identified by conventional PCR. In the speed genome editing method presented here, sophisticated genotyping methods are not necessary to identify loss of function mutations after CRISPR/Cas9 genome editing, and desirable loss of function mutant clones could be obtained in less than one month following transfection. Copyright © 2018 Elsevier B.V. All rights reserved.
Current management of overactive bladder.
Cartwright, Rufus; Renganathan, Arasee; Cardozo, Linda
2008-10-01
The concept of overactive bladder has helped us address the problem of urgency and urge incontinence from a symptomatic perspective. In this review, we provide a critical summary of clinically relevant recent publications, focusing in particular on advances in our understanding of assessment methods and therapeutic interventions for overactive bladder in women. According to current definitions, the prevalence of overactive bladder in western nations is now estimated as 13.0%. Although the prevalence increases with age, the symptoms of overactive bladder may follow a relapsing and remitting course. There has been a proliferation of validated symptom and quality of life measures and increasing sophistication in the analysis of bladder diaries. The role of urodynamics in the evaluation of urgency remains uncertain, with many trials showing limited benefit as a preoperative investigation. Fluid restriction and bladder retraining remain important first-line interventions. Many new anticholinergic medications have been licensed, with limited benefits compared with existing preparations. Intravesical botulinum toxin has become a popular alternative for patients who fail oral therapies. Although there have been few important therapeutic innovations, recent publications have led to greater sophistication in assessment methods and a clearer understanding of the role of existing interventions.
Health workforce metrics pre- and post-2015: a stimulus to public policy and planning.
Pozo-Martin, Francisco; Nove, Andrea; Lopes, Sofia Castro; Campbell, James; Buchan, James; Dussault, Gilles; Kunjumen, Teena; Cometto, Giorgio; Siyam, Amani
2017-02-15
Evidence-based health workforce policies are essential to ensure the provision of high-quality health services and to support the attainment of universal health coverage (UHC). This paper describes the main characteristics of available health workforce data for 74 of the 75 countries identified under the 'Countdown to 2015' initiative as accounting for more than 95% of the world's maternal, newborn and child deaths. It also discusses best practices in the development of health workforce metrics post-2015. Using available health workforce data from the Global Health Workforce Statistics database from the Global Health Observatory, we generated descriptive statistics to explore the current status, recent trends in the number of skilled health professionals (SHPs: physicians, nurses, midwives) per 10 000 population, and future requirements to achieve adequate levels of health care in the 74 countries. A rapid literature review was conducted to obtain an overview of the types of methods and the types of data sources used in human resources for health (HRH) studies. There are large intercountry and interregional differences in the density of SHPs to progress towards UHC in Countdown countries: a median of 10.2 per 10 000 population with range 1.6 to 142 per 10 000. Substantial efforts have been made in some countries to increase the availability of SHPs as shown by a positive average exponential growth rate (AEGR) in SHPs in 51% of Countdown countries for which there are data. Many of these countries will require large investments to achieve levels of workforce availability commensurate with UHC and the health-related sustainable development goals (SDGs). The availability, quality and comparability of global health workforce metrics remain limited. Most published workforce studies are descriptive, but more sophisticated needs-based workforce planning methods are being developed. There is a need for high-quality, comprehensive, interoperable sources of HRH data to support all policies towards UHC and the health-related SDGs. The recent WHO-led initiative of supporting countries in the development of National Health Workforce Accounts is a very promising move towards purposive health workforce metrics post-2015. Such data will allow more countries to apply the latest methods for health workforce planning.
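Two of the indicators mentioned above, the density of SHPs per 10 000 population and the average exponential growth rate (AEGR), reduce to simple formulas; the sketch below computes both for a hypothetical country (all numbers invented).

import math

def aegr(count_start, count_end, years):
    """Average exponential growth rate between two observations (per year)."""
    return math.log(count_end / count_start) / years

def density_per_10k(workers, population):
    return 10_000 * workers / population

# Hypothetical country: 42,000 SHPs in 2010, 55,000 in 2015, population 38 million.
print("AEGR:", round(aegr(42_000, 55_000, 5), 4))                 # ~0.054 per year
print("density:", round(density_per_10k(55_000, 38_000_000), 1))  # ~14.5 per 10,000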
Quantifying biodiversity using digital cameras and automated image analysis.
NASA Astrophysics Data System (ADS)
Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.
2009-04-01
Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour-intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect: the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the amount of information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been in continuous operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed methods based on artificial intelligence to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data, it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions for simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and enabling automatic deletion of images generated by erroneous triggering (e.g. cloud movements). This is the first step towards a hierarchical image processing framework, where situation subclasses such as birds or climatic conditions can be fed into more appropriate automated or semi-automated data mining software.
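A minimal sketch of the unsupervised side of such a pipeline: describe each frame by crude per-channel intensity histograms and cluster the descriptors with k-means so that unusual frames surface in a small cluster. The synthetic arrays stand in for real trail-camera JPEGs, and the feature and cluster choices are assumptions, not the methods used at Moor House.

import numpy as np
from sklearn.cluster import KMeans

def histogram_features(image, bins=8):
    """Concatenated per-channel intensity histograms as a crude image descriptor."""
    return np.concatenate([
        np.histogram(image[..., c], bins=bins, range=(0, 255), density=True)[0]
        for c in range(image.shape[-1])
    ])

# Stand-in for a folder of trail-camera frames: mostly dark "background" images
# plus a few brighter "activity" frames (a real pipeline would load JPEG files).
rng = np.random.default_rng(0)
frames = [rng.integers(40, 90, size=(64, 64, 3)) for _ in range(95)]
frames += [rng.integers(120, 220, size=(64, 64, 3)) for _ in range(5)]

X = np.array([histogram_features(f) for f in frames])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))   # the small cluster flags unusual frames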
75 FR 63067 - Interpretation of “Children's Product”
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-14
... a level of sophistication required to operate the locomotives. Additionally, the commenters note... railroad hobbyists, the costs involved, and the level of sophistication required to operate them. Model...
The conceptualization and measurement of cognitive health sophistication.
Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J
2013-01-01
This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.
VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data
Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel
2014-01-01
This work is in line with an on-going effort tending toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198
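VBA performs variational Bayesian inference; purely to illustrate the simulate-then-estimate workflow such a toolbox supports, the sketch below generates data from an invented nonlinear "learning-curve" model and recovers its parameters by ordinary least squares with SciPy. It is a non-Bayesian stand-in, not the VBA toolbox itself.

import numpy as np
from scipy.optimize import curve_fit

def model(x, rate, asymptote):
    """Hypothetical saturating learning curve."""
    return asymptote * (1 - np.exp(-rate * x))

rng = np.random.default_rng(0)
trials = np.arange(1, 41)
data = model(trials, rate=0.15, asymptote=0.9) + rng.normal(0, 0.05, trials.size)

# Parameter estimation: recover the generating parameters from the noisy data.
estimates, _ = curve_fit(model, trials, data, p0=[0.1, 1.0])
print("recovered rate and asymptote:", np.round(estimates, 3))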
Microbiological testing of Skylab foods.
NASA Technical Reports Server (NTRS)
Heidelbaugh, N. D.; Mcqueen, J. L.; Rowley, D. B.; Powers, E. M.; Bourland, C. T.
1973-01-01
Review of some of the unique food microbiology problems and problem-generating circumstances the Skylab manned space flight program involves. The situations these problems arise from include: extended storage times, variations in storage temperatures, no opportunity to resupply or change foods after launch of the Skylab Workshop, first use of frozen foods in space, first use of a food-warming device in weightlessness, relatively small size of production lots requiring statistically valid sampling plans, and use of food as an accurately controlled part in a set of sophisticated life science experiments. Consideration of all of these situations produced the need for definite microbiological tests and test limits. These tests are described along with the rationale for their selection. Reported test results show good compliance with the test limits.
Surveillance of sexually transmitted infections in England and Wales.
Hughes, G; Paine, T; Thomas, D
2001-05-01
Surveillance of sexually transmitted infections (STIs) in England and Wales has, in the past, relied principally on aggregated statistical data submitted by all genitourinary medicine clinics to the Communicable Disease Surveillance Centre, supplemented by various laboratory reporting systems. Although these systems provide comparatively robust surveillance data, they do not provide sufficient information on risk factors to target STI control and prevention programmes appropriately. Over recent years, substantial rises in STIs, the emergence of numerous outbreaks of STIs, and changes in gonococcal resistance patterns have necessitated the introduction of more sophisticated surveillance mechanisms. This article describes current STI surveillance systems in England and Wales, including new systems that have recently been introduced or are currently being developed to meet the need for enhanced STI surveillance data.
Supervised Classification Techniques for Hyperspectral Data
NASA Technical Reports Server (NTRS)
Jimenez, Luis O.
1997-01-01
The recent development of more sophisticated remote sensing systems enables the measurement of radiation in many more spectral intervals than previously possible. An example of this technology is the AVIRIS system, which collects image data in 220 bands. The increased dimensionality of such hyperspectral data provides a challenge to the current techniques for analyzing such data. Human experience in three-dimensional space tends to mislead one's intuition of geometrical and statistical properties in high-dimensional space, properties which must guide our choices in the data analysis process. In this paper, high-dimensional space properties are discussed together with their implications for high-dimensional data analysis, in order to illuminate the next steps that need to be taken for the next generation of hyperspectral data classifiers.
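One of the counter-intuitive properties alluded to above can be demonstrated numerically: as the number of spectral bands grows, pairwise distances between random points concentrate around their mean, which undermines distance-based intuition carried over from 3-D space. The band counts and sample size below are arbitrary choices for illustration.

```python
# Concentration of pairwise distances as dimensionality grows.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
for d in (3, 20, 220):                      # 220 ~ the AVIRIS band count
    x = rng.normal(size=(500, d))           # 500 random "pixels" in d bands
    dists = pdist(x)                        # all pairwise Euclidean distances
    print(d, round(dists.std() / dists.mean(), 3))   # relative spread shrinks with d
```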
Cover estimation and payload location using Markov random fields
NASA Astrophysics Data System (ADS)
Quach, Tu-Thach
2014-02-01
Payload location is an approach to find the message bits hidden in steganographic images, but not necessarily their logical order. Its success relies primarily on the accuracy of the underlying cover estimators and can be improved if more estimators are used. This paper presents an approach based on Markov random field to estimate the cover image given a stego image. It uses pairwise constraints to capture the natural two-dimensional statistics of cover images and forms a basis for more sophisticated models. Experimental results show that it is competitive against current state-of-the-art estimators and can locate payload embedded by simple LSB steganography and group-parity steganography. Furthermore, when combined with existing estimators, payload location accuracy improves significantly.
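To make the general idea concrete, the sketch below uses a simple local-mean filter as a stand-in for the paper's Markov-random-field cover estimator: pixel positions whose estimated cover LSB disagrees with the stego LSB across many stego images accumulate votes as likely payload locations. The filter choice and the vote threshold are assumptions of the example, not the paper's model.

```python
# Hedged sketch of residual-voting payload location (stand-in cover estimator).
import numpy as np
from scipy.ndimage import uniform_filter

def lsb_disagreement(stego):
    """Boolean map: where the stego LSB differs from an estimated cover LSB."""
    cover_est = np.rint(uniform_filter(stego.astype(float), size=3)).astype(int)
    return (stego.astype(int) & 1) != (cover_est & 1)

def locate_payload(stego_images, vote_fraction=0.6):
    votes = sum(lsb_disagreement(s).astype(int) for s in stego_images)
    return votes >= vote_fraction * len(stego_images)   # candidate payload map
```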
M.S.L.A.P. Modular Spectral Line Analysis Program documentation
NASA Technical Reports Server (NTRS)
Joseph, Charles L.; Jenkins, Edward B.
1991-01-01
MSLAP is a software package for analyzing spectra, providing the basic structure to identify spectral features, to make quantitative measurements of these features, and to store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis where the computer carries most of the computational and data organizational burden and the investigator is responsible only for judgement decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
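The profile moments mentioned above reduce to simple integrals over the normalised absorption depth. The sketch below assumes wavelength, flux and continuum arrays are already in hand and uses a hand-rolled trapezoidal rule, so it illustrates the definitions rather than MSLAP's interactive fitting.

```python
# Equivalent width (zeroth moment) and first/second moments of a line profile.
import numpy as np

def _trapz(y, x):
    """Trapezoidal integral of samples y over abscissae x."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def line_moments(wave, flux, continuum):
    depth = 1.0 - flux / continuum                      # normalised absorption depth
    w0 = _trapz(depth, wave)                            # equivalent width
    w1 = _trapz(depth * wave, wave) / w0                # centroid (first moment)
    w2 = _trapz(depth * (wave - w1) ** 2, wave) / w0    # variance-like width (second moment)
    return w0, w1, w2
```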
Single image super-resolution reconstruction algorithm based on edge selection
NASA Astrophysics Data System (ADS)
Zhang, Yaolan; Liu, Yijun
2017-05-01
Super-resolution (SR) has become increasingly important because it can generate high-quality high-resolution (HR) images from low-resolution (LR) input images. At present, a lot of work is concentrated on developing sophisticated image priors to improve image quality, while paying much less attention to estimating and incorporating the blur model, which can also impact the reconstruction results. We present a new reconstruction method based on edge selection. This method takes full account of the factors that affect blur kernel estimation and accurately estimates the blur process. When compared with state-of-the-art methods, our method has comparable performance.
Berlin, R H; Janzon, B; Rybeck, B; Schantz, B; Seeman, T
1982-01-01
A standard methodology for estimating the energy transfer characteristics of small calibre bullets and other fast missiles is proposed, consisting of firings against targets made of soft soap. The target is evaluated by measuring the size of the permanent cavity remaining in it after the shot. The method is very simple to use and does not require access to any sophisticated measuring equipment. It can be applied under all circumstances, even under field conditions. Adequate methods of calibration to ensure good accuracy are suggested. The precision and limitations of the method are discussed.
Covariance Manipulation for Conjunction Assessment
NASA Technical Reports Server (NTRS)
Hejduk, M. D.
2016-01-01
Use of the probability of collision (Pc) has brought sophistication to conjunction assessment (CA). It was made possible by the JSpOC precision catalogue, which provides covariance information, and it has essentially replaced miss distance as the basic CA parameter. The embrace of Pc has elevated methods that 'manipulate' the covariance to enable or improve CA calculations. Two such methods are examined here: compensation for absent or unreliable covariances through 'Maximum Pc' calculation constructs, and projection (not propagation) of epoch covariances forward in time to try to enable better risk assessments. Two questions are answered about each: the situations to which such approaches are properly applicable, and the amount of utility that such methods offer.
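For reference, the basic Pc computation that the covariance feeds is a 2-D Gaussian integral over the combined hard-body circle in the conjunction plane. The sketch below assumes the relative miss vector and combined covariance have already been projected into that plane; it is a generic illustration, not the operational JSpOC/CARA implementation.

```python
# 2-D probability-of-collision integral in the conjunction (encounter) plane.
import numpy as np
from scipy import integrate, linalg

def probability_of_collision(miss_xy, cov_2d, hard_body_radius):
    inv = linalg.inv(cov_2d)
    norm = 1.0 / (2.0 * np.pi * np.sqrt(linalg.det(cov_2d)))
    def pdf(y, x):                       # Gaussian density of the relative position
        d = np.array([x, y]) - miss_xy
        return norm * np.exp(-0.5 * d @ inv @ d)
    r = hard_body_radius
    pc, _ = integrate.dblquad(pdf, -r, r,
                              lambda x: -np.sqrt(r**2 - x**2),
                              lambda x: np.sqrt(r**2 - x**2))
    return pc

# e.g. probability_of_collision(np.array([150.0, 80.0]),
#                               np.diag([200.0**2, 120.0**2]), 20.0)
```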
Recognition of isotropic plane target from RCS diagram
NASA Astrophysics Data System (ADS)
Saillard, J.; Chassay, G.
1981-06-01
The use of electromagnetic waves for the recognition of a structure represented by point scatterers is seen as posing a fundamental problem. It is noted that much research has been done on this subject and that the study of aircraft observed in the yaw plane gives interesting results. To apply these methods, however, it is necessary to use many sophisticated acquisition systems. A method is proposed which can be applied to plane structures composed of isotropic scatterers. The method is considered to be of interest because it uses only power measurements and requires only a classical tracking radar.
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
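The sampling framework described above can be summarised in a short sketch. Everything here is an assumption made for illustration: the exponential mapping from spectral counts to probabilities, the number of samples, and the user-supplied `binary_method`, which stands for any existing scorer that accepts a binary interaction matrix.

```python
# Ensemble sampling over binary interaction matrices derived from spectral counts.
import numpy as np

rng = np.random.default_rng(0)

def counts_to_prob(counts, scale=2.0):
    """Assumed saturating map: higher spectral counts -> higher interaction probability."""
    return 1.0 - np.exp(-np.asarray(counts, dtype=float) / scale)

def ensemble_scores(spectral_counts, binary_method, n_samples=100):
    p = counts_to_prob(spectral_counts)
    samples = (rng.random(p.shape) < p for _ in range(n_samples))
    scores = [binary_method(s) for s in samples]   # existing binary-input method
    return np.mean(scores, axis=0)                 # aggregate over the ensemble
```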
Wiebe, Nicholas J P; Meyer, Irmtraud M
2010-06-24
The prediction of functional RNA structures has attracted increased interest, as it allows us to study the potential functional roles of many genes. RNA structure prediction methods, however, assume that there is a unique functional RNA structure and also do not predict functional features required for in vivo folding. In order to understand how functional RNA structures form in vivo, we require sophisticated experiments or reliable prediction methods. So far, there exist only a few, experimentally validated transient RNA structures. On the computational side, there exist several computer programs which aim to predict the co-transcriptional folding pathway in vivo, but these make a range of simplifying assumptions and do not capture all features known to influence RNA folding in vivo. We want to investigate if evolutionarily related RNA genes fold in a similar way in vivo. To this end, we have developed a new computational method, Transat, which detects conserved helices of high statistical significance. We introduce the method, present a comprehensive performance evaluation and show that Transat is able to predict the structural features of known reference structures including pseudo-knotted ones as well as those of known alternative structural configurations. Transat can also identify unstructured sub-sequences bound by other molecules and provides evidence for new helices which may define folding pathways, supporting the notion that homologous RNA sequences not only assume a similar reference RNA structure, but also fold similarly. Finally, we show that the structural features predicted by Transat differ from those assuming thermodynamic equilibrium. Unlike the existing methods for predicting folding pathways, our method works in a comparative way. This has the disadvantage of not being able to predict features as a function of time, but has the considerable advantage of highlighting conserved features and of not requiring a detailed knowledge of the cellular environment.
Two statistics for evaluating parameter identifiability and error reduction
Doherty, John; Hunt, Randall J.
2009-01-01
Two statistics are presented that can be used to rank input parameters utilized by a model in terms of their relative identifiability based on a given or possible future calibration dataset. Identifiability is defined here as the capability of model calibration to constrain parameters used by a model. Both statistics require that the sensitivity of each model parameter be calculated for each model output for which there are actual or presumed field measurements. Singular value decomposition (SVD) of the weighted sensitivity matrix is then undertaken to quantify the relation between the parameters and observations that, in turn, allows selection of calibration solution and null spaces spanned by unit orthogonal vectors. The first statistic presented, "parameter identifiability", is quantitatively defined as the direction cosine between a parameter and its projection onto the calibration solution space. This varies between zero and one, with zero indicating complete non-identifiability and one indicating complete identifiability. The second statistic, "relative error reduction", indicates the extent to which the calibration process reduces error in estimation of a parameter from its pre-calibration level where its value must be assigned purely on the basis of prior expert knowledge. This is more sophisticated than identifiability, in that it takes greater account of the noise associated with the calibration dataset. Like identifiability, it has a maximum value of one (which can only be achieved if there is no measurement noise). Conceptually it can fall to zero; and even below zero if a calibration problem is poorly posed. An example, based on a coupled groundwater/surface-water model, is included that demonstrates the utility of the statistics.
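A bare-bones version of the first statistic can be written directly from the description above: weight the sensitivity (Jacobian) matrix, take its SVD, and project each unit parameter vector onto the space spanned by the first k right singular vectors. The truncation level k, which in practice is determined from the calibration problem itself, is left as an input here.

```python
# Parameter identifiability from the SVD of the weighted sensitivity matrix.
import numpy as np

def parameter_identifiability(J, weights, k):
    """J: (n_obs, n_par) sensitivities; weights: (n_obs,) observation weights."""
    Jw = np.sqrt(weights)[:, None] * J
    _, _, Vt = np.linalg.svd(Jw, full_matrices=False)
    V_sol = Vt[:k].T                  # parameter-space basis of the calibration solution space
    # direction cosine of each unit parameter vector onto the solution space
    return np.sqrt(np.sum(V_sol**2, axis=1))   # 0 = non-identifiable, 1 = fully identifiable
```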
Constructing Noise-Invariant Representations of Sound in the Auditory Pathway
Rabinowitz, Neil C.; Willmore, Ben D. B.; King, Andrew J.; Schnupp, Jan W. H.
2013-01-01
Identifying behaviorally relevant sounds in the presence of background noise is one of the most important and poorly understood challenges faced by the auditory system. An elegant solution to this problem would be for the auditory system to represent sounds in a noise-invariant fashion. Since a major effect of background noise is to alter the statistics of the sounds reaching the ear, noise-invariant representations could be promoted by neurons adapting to stimulus statistics. Here we investigated the extent of neuronal adaptation to the mean and contrast of auditory stimulation as one ascends the auditory pathway. We measured these forms of adaptation by presenting complex synthetic and natural sounds, recording neuronal responses in the inferior colliculus and primary fields of the auditory cortex of anaesthetized ferrets, and comparing these responses with a sophisticated model of the auditory nerve. We find that the strength of both forms of adaptation increases as one ascends the auditory pathway. To investigate whether this adaptation to stimulus statistics contributes to the construction of noise-invariant sound representations, we also presented complex, natural sounds embedded in stationary noise, and used a decoding approach to assess the noise tolerance of the neuronal population code. We find that the code for complex sounds in the periphery is affected more by the addition of noise than the cortical code. We also find that noise tolerance is correlated with adaptation to stimulus statistics, so that populations that show the strongest adaptation to stimulus statistics are also the most noise-tolerant. This suggests that the increase in adaptation to sound statistics from auditory nerve to midbrain to cortex is an important stage in the construction of noise-invariant sound representations in the higher auditory brain. PMID:24265596
Real-time Data Display System of the Korean Neonatal Network
Lee, Byong Sop; Moon, Wi Hwan
2015-01-01
Real-time data reporting in clinical research networks can provide network members with interim analyses of the registered data, which can facilitate further studies and quality improvement activities. The aim of this report was to describe the building process of the data display system (DDS) of the Korean Neonatal Network (KNN) and its basic structure. After member verification at the KNN member's site, users can choose a variable of interest that is listed in the in-hospital data statistics (for 90 variables) or in the follow-up data statistics (for 54 variables). The statistical results of the outcome variables are displayed in HyperText Markup Language 5-based chart graphs and tables. Participating hospitals can compare their performance with that of the KNN as a whole and identify trends over time. The ranking of each participating hospital is also displayed in terms of key outcome variables such as mortality and major neonatal morbidities, with the names of other centers blinded. The most powerful function of the DDS is the ability to perform 'conditional filtering', which allows users to review only the records of interest. Further collaboration is needed to upgrade the DDS to a more sophisticated analytical system and to provide a more user-friendly interface. PMID:26566352
Visualizing blood vessel trees in three dimensions: clinical applications
NASA Astrophysics Data System (ADS)
Bullitt, Elizabeth; Aylward, Stephen
2005-04-01
A connected network of blood vessels surrounds and permeates almost every organ of the human body. The ability to define detailed blood vessel trees enables a variety of clinical applications. This paper discusses four such applications and some of the visualization challenges inherent to each. Guidance of endovascular surgery: 3D vessel trees offer important information unavailable by traditional x-ray projection views. How best to combine the 2- and 3D image information is unknown. Planning/guidance of tumor surgery: During tumor resection it is critical to know which blood vessels can be interrupted safely and which cannot. Providing efficient, clear information to the surgeon together with measures of uncertainty in both segmentation and registration can be a complex problem. Vessel-based registration: Vessel-based registration allows pre-and intraoperative images to be registered rapidly. The approach both provides a potential solution to a difficult clinical dilemma and offers a variety of visualization opportunities. Diagnosis/staging of disease: Almost every disease affects blood vessel morphology. The statistical analysis of vessel shape may thus prove to be an important tool in the noninvasive analysis of disease. A plethora of information is available that must be presented meaningfully to the clinician. As medical image analysis methods increase in sophistication, an increasing amount of useful information of varying types will become available to the clinician. New methods must be developed to present a potentially bewildering amount of complex data to individuals who are often accustomed to viewing only tissue slices or flat projection views.
Newson, Robyn; King, Lesley; Rychetnik, Lucie; Bauman, Adrian E; Redman, Sally; Milat, Andrew J; Schroeder, Jacqueline; Cohen, Gillian; Chapman, Simon
2015-01-01
Objectives: To investigate researchers' perceptions about the factors that influenced the policy and practice impacts (or lack of impact) of one of their own funded intervention research studies. Design: Mixed method, cross-sectional study. Setting: Intervention research conducted in Australia and funded by Australia's National Health and Medical Research Council between 2003 and 2007. Participants: The chief investigators from 50 funded intervention research studies were interviewed to determine if their study had achieved policy and practice impacts, how and why these impacts had (or had not) occurred and the approach to dissemination they had employed. Results: We found that statistically significant intervention effects and publication of results influenced whether there were policy and practice impacts, along with factors related to the nature of the intervention itself, the researchers' experience and connections, their dissemination and translation efforts, and the postresearch context. Conclusions: This study indicates that sophisticated approaches to intervention development, dissemination actions and translational efforts are actually widespread among experienced researchers, and can achieve policy and practice impacts. However, it was the links between the intervention results, further dissemination actions by researchers and a variety of postresearch contextual factors that ultimately determined whether a study had policy and practice impacts. Given the complicated interplay between the various factors, there appears to be no simple formula for determining which intervention studies should be funded in order to achieve optimal policy and practice impacts. PMID:26198428
NASA Astrophysics Data System (ADS)
Zamuruyev, Konstantin O.; Zrodnikov, Yuriy; Davis, Cristina E.
2017-01-01
The excellent chemical and physical properties of glass over a range of operating conditions make it a preferred material for chemical detection systems in analytical chemistry, biology, and the environmental sciences. However, it is often replaced by SU8, PDMS, or Parylene materials because of the sophisticated mask preparation required for wet etching of glass. Here, we report our efforts toward developing a photolithography-free, laser-patterned, hydrofluoric acid-resistant chromium-polyimide tape mask for rapid prototyping of microfluidic systems in glass. The patterns are defined in the masking layer with a diode-pumped solid-state laser. Minimum feature size is limited to the diameter of the laser beam (30 µm); minimum spacing between features is limited by the thermal shrinkage and adhesive contact of the polyimide tape to 40 µm. The patterned glass substrates are etched in 49% hydrofluoric acid at ambient temperature with soft agitation (in time increments, up to 60 min duration). In spite of its simplicity, our method demonstrates results comparable to other, more sophisticated current masking methods in terms of etched depth (up to 300 µm in borosilicate glass), feature under-etch ratio in isotropic etching (~1.36), and low mask hole density. The method demonstrates high yield and reliability. To our knowledge, this is the first proposed technique for rapid prototyping of microfluidic systems in glass with such high performance parameters. The proposed method of fabrication can potentially be implemented in research institutions without access to a standard clean-room facility.
Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien; Masi, Shelly; Daunizeau, Jean
2017-11-01
Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.
Complex Network Analysis for Characterizing Global Value Chains in Equipment Manufacturing
Meng, Bo; Cheng, Lihong
2017-01-01
The rise of global value chains (GVCs) characterized by the so-called “outsourcing”, “fragmentation production”, and “trade in tasks” has been considered one of the most important phenomena for the 21st century trade. GVCs also can play a decisive role in trade policy making. However, due to the increasing complexity and sophistication of international production networks, especially in the equipment manufacturing industry, conventional trade statistics and the corresponding trade indicators may give us a distorted picture of trade. This paper applies various network analysis tools to the new GVC accounting system proposed by Koopman et al. (2014) and Wang et al. (2013) in which gross exports can be decomposed into value-added terms through various routes along GVCs. This helps to divide the equipment manufacturing-related GVCs into some sub-networks with clear visualization. The empirical results of this paper significantly improve our understanding of the topology of equipment manufacturing-related GVCs as well as the interdependency of countries in these GVCs that is generally invisible from the traditional trade statistics. PMID:28081201
Tom, Jennifer A.; Sinsheimer, Janet S.; Suchard, Marc A.
2015-01-01
Massive datasets in the gigabyte and terabyte range combined with the availability of increasingly sophisticated statistical tools yield analyses at the boundary of what is computationally feasible. Compromising in the face of this computational burden by partitioning the dataset into more tractable sizes results in stratified analyses, removed from the context that justified the initial data collection. In a Bayesian framework, these stratified analyses generate intermediate realizations, often compared using point estimates that fail to account for the variability within and correlation between the distributions these realizations approximate. However, although the initial concession to stratify generally precludes the more sensible analysis using a single joint hierarchical model, we can circumvent this outcome and capitalize on the intermediate realizations by extending the dynamic iterative reweighting MCMC algorithm. In doing so, we reuse the available realizations by reweighting them with importance weights, recycling them into a now tractable joint hierarchical model. We apply this technique to intermediate realizations generated from stratified analyses of 687 influenza A genomes spanning 13 years allowing us to revisit hypotheses regarding the evolutionary history of influenza within a hierarchical statistical framework. PMID:26681992
Fitting and Modeling in the ASC Data Analysis Environment
NASA Astrophysics Data System (ADS)
Doe, S.; Siemiginowska, A.; Joye, W.; McDowell, J.
As part of the AXAF Science Center (ASC) Data Analysis Environment, we will provide to the astronomical community a Fitting Application. We present a design of the application in this paper. Our design goal is to give the user the flexibility to use a variety of optimization techniques (Levenberg-Marquardt, maximum entropy, Monte Carlo, Powell, downhill simplex, CERN-Minuit, and simulated annealing) and fit statistics (chi-squared, Cash, variance, and maximum likelihood); our modular design allows the user easily to add their own optimization techniques and/or fit statistics. We also present a comparison of the optimization techniques to be provided by the Application. The high spatial and spectral resolutions that will be obtained with AXAF instruments require a sophisticated data modeling capability. We will provide not only a suite of astronomical spatial and spectral source models, but also the capability of combining these models into source models of up to four data dimensions (i.e., into source functions f(E,x,y,t)). We will also provide tools to create instrument response models appropriate for each observation.
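The plug-in structure described above (swap the optimizer or the fit statistic without touching the rest) can be illustrated with a small sketch. It is not the ASC application's actual interface; the function names, the simplified Cash statistic (with its data-dependent constant dropped), and the use of scipy optimizers are assumptions of the example.

```python
# Modular fitting: the statistic and the optimizer are both plug-in choices.
import numpy as np
from scipy.optimize import minimize

def chi2(data, model_vals, errors):
    return np.sum(((data - model_vals) / errors) ** 2)

def cash(data, model_vals, errors=None):
    # Poisson-based statistic, up to a model-independent constant
    return 2.0 * np.sum(model_vals - data * np.log(model_vals))

def fit(model, params0, data, errors, statistic=chi2, method="Nelder-Mead"):
    objective = lambda p: statistic(data, model(p), errors)
    return minimize(objective, params0, method=method)

# fit(my_model, p0, counts, sigma)                    # chi-squared + downhill simplex
# fit(my_model, p0, counts, None, statistic=cash)     # swap in the Cash statistic
```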
The Center for Computational Biology: resources, achievements, and challenges.
Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott
2012-01-01
The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators include the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.
The Organization for Economic Cooperation and Development
2010-02-08
experience of OECD members with bilateral treaties, the increasingly sophisticated methods for tax evasion, and the development of new and more complex...new efforts to curtail the use of tax havens for tax avoidance, combined with efforts since the terrorist...addition, on May 4, 2009, President Obama announced a set of proposals to "crack down on illegal overseas tax evasion, close loopholes, and make it
3D Chemical Patterning of Micromaterials for Encoded Functionality.
Ceylan, Hakan; Yasa, Immihan Ceren; Sitti, Metin
2017-03-01
Programming local chemical properties of microscale soft materials with 3D complex shapes is indispensable for creating sophisticated functionalities, which has not yet been possible with existing methods. Precise spatiotemporal control of two-photon crosslinking is employed as an enabling tool for 3D patterning of microprinted structures for encoding versatile chemical moieties.
Quality of Big Data in Healthcare
Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay
2015-01-01
The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.
Multifunctional Thermal Structures Using Cellular Contact-Aided Compliant Mechanisms
2016-10-31
control. During this research effort, designs of increasing sophistication consistently outstripped the ability to fabricate them. Basic questions...using non-dimensional models. In continuing design research, a topology optimization approach was crafted to maximize the thermal performance of the...methods could conceivably produce the elegant but complex material and geometric designs contemplated. Continued research is needed to improve the
Multifunctional Thermal Structures Using Cellular Contact-Aided Compliant Mechanisms
2017-01-26
control. During this research effort, designs of increasing sophistication consistently outstripped the ability to fabricate them. Basic questions...using non-dimensional models. In continuing design research, a topology optimization approach was crafted to maximize the thermal performance of the...methods could conceivably produce the elegant but complex material and geometric designs contemplated. Continued research is needed to improve the
Using a 401(h) account to fund retiree health benefits from your pension plan.
Lee, David; Singerman, Eduardo
2003-06-01
If a health and welfare plan covering retirees faces financial shortfalls, administrators and trustees can fund retiree health benefit payments from a related pension plan that may be in better condition. This method is legal and ethical, but it requires sophisticated accounting techniques for creating an account that provides retiree members with promised benefits while meeting statutory and regulatory requirements.
NASA Astrophysics Data System (ADS)
Boix Mansilla, Veronica Maria
The study presented examined 16 award-winning high school students' beliefs about the criteria by which scientific theories and historical narratives are deemed trustworthy. It sought to (a) describe such beliefs as students reasoned within each discipline; (b) examine the degree to which such beliefs were organized as coherent systems of thought; and (c) explore the relationship between students' beliefs and their prior disciplinary research experience. Students were multiple-year award-winners at the Massachusetts Science Fair and the National History Day---two pre-collegiate State-level competitions. Two consecutive semi-structured interviews invited students to assess and enhance the trustworthiness of competing accounts of genetic inheritance and the Holocaust in science and history respectively. A combined qualitative and quantitative data analysis yielded the following results: (a) Students valued three standards of acceptability that were common across disciplines: e.g. empirical strength, explanatory power and formal and presentational strength. However, when reasoning within each discipline they tended to define each standard in disciplinary-specific ways. Students also valued standards of acceptability that were not shared across disciplines: i.e., external validity in science and human understanding in history. (b) In science, three distinct epistemological orientations were identified---i.e., "faith in method," "trusting the scientific community" and "working against error." In history students held two distinct epistemologies---i.e., "reproducing the past" and "organizing the past". Students' epistemological orientations tended to operate as collections of mutually supporting ideas about what renders a theory or a narrative acceptable. (c) Contrary to the standard position to date in the literature on epistemological beliefs, results revealed that students' research training in a particular discipline (e.g., science or history) was strongly related to the ways in which they interpreted problems, methods, and satisfactory solutions in each domain. Students trained in science favored a sophisticated "working against error" epistemology of science and a naive "reproducing the past" epistemology of history. Students trained in history revealed a sophisticated "organizing the past" epistemology in that discipline and a naive "faith in method" epistemology in science. Students trained in both domains revealed sophisticated epistemologies in both disciplines.
Determining Angle of Humeral Torsion Using Image Software Technique.
Patil, Sachin; Sethi, Madhu; Vasudeva, Neelam
2016-10-01
Several studies have measured the angle of humeral torsion in different parts of the world. Previously described methods were complicated, not very accurate, cumbersome, or required sophisticated instruments. The present study was conducted with the aim of determining the angle of humeral torsion with a newer, simple technique using digital images and image tool software. A total of 250 dry normal adult human humeri were obtained from the bone bank of the Department of Anatomy. The length and mid-shaft circumference of each bone were measured with a measuring tape. The angle of humeral torsion was measured directly from the digital images by image analysis using the Image Tool 3.0 software program. The data were analysed statistically with SPSS version 17 using the unpaired t-test and Spearman's rank order correlation coefficient. The mean angle of torsion was 64.57°±7.56°. On the right side it was 66.84°±9.69°, whereas on the left side it was found to be 63.31°±9.50°. The mean humeral length was 31.6 cm on the right side and 30.33 cm on the left side. Mid-shaft circumference was 5.79 cm on the right side and 5.63 cm on the left side. No statistical differences were seen in angles between right and left humeri (p>0.001). From our study, it was concluded that the circumference of the shaft is inversely proportional to the angle of humeral torsion. The length and side of the humerus have no relation to humeral torsion. With the advancement of digital technology, it is better to use new image software for anatomical studies.
Zhu, Yuerong; Zhu, Yuelin; Xu, Wei
2008-01-01
Background Though microarray experiments are very popular in life science research, managing and analyzing microarray data are still challenging tasks for many biologists. Most microarray programs require users to have sophisticated knowledge of mathematics, statistics and computer skills for usage. With accumulating microarray data deposited in public databases, easy-to-use programs to re-analyze previously published microarray data are in high demand. Results EzArray is a web-based Affymetrix expression array data management and analysis system for researchers who need to organize microarray data efficiently and get data analyzed instantly. EzArray organizes microarray data into projects that can be analyzed online with predefined or custom procedures. EzArray performs data preprocessing and detection of differentially expressed genes with statistical methods. All analysis procedures are optimized and highly automated so that even novice users with limited pre-knowledge of microarray data analysis can complete initial analysis quickly. Since all input files, analysis parameters, and executed scripts can be downloaded, EzArray provides maximum reproducibility for each analysis. In addition, EzArray integrates with Gene Expression Omnibus (GEO) and allows instantaneous re-analysis of published array data. Conclusion EzArray is a novel Affymetrix expression array data analysis and sharing system. EzArray provides easy-to-use tools for re-analyzing published microarray data and will help both novice and experienced users perform initial analysis of their microarray data from the location of data storage. We believe EzArray will be a useful system for facilities with microarray services and laboratories with multiple members involved in microarray data analysis. EzArray is freely available from . PMID:18218103
Sokolova, Elena; Groot, Perry; Claassen, Tom; van Hulzen, Kimm J.; Glennon, Jeffrey C.; Franke, Barbara
2016-01-01
Background Numerous factor analytic studies consistently support a distinction between two symptom domains of attention-deficit/hyperactivity disorder (ADHD), inattention and hyperactivity/impulsivity. Both dimensions show high internal consistency and moderate to strong correlations with each other. However, it is not clear what drives this strong correlation. The aim of this paper is to address this issue. Method We applied a sophisticated approach for causal discovery on three independent data sets of scores of the two ADHD dimensions in NeuroIMAGE (total N = 675), ADHD-200 (N = 245), and IMpACT (N = 164), assessed by different raters and instruments, and further used information on gender or a genetic risk haplotype. Results In all data sets we found strong statistical evidence for the same pattern: the clear dependence between hyperactivity/impulsivity symptom level and an established genetic factor (either gender or risk haplotype) vanishes when one conditions upon inattention symptom level. Under reasonable assumptions, e.g., that phenotypes do not cause genotypes, a causal model that is consistent with this pattern contains a causal path from inattention to hyperactivity/impulsivity. Conclusions The robust dependency cancellation observed in three different data sets suggests that inattention is a driving factor for hyperactivity/impulsivity. This causal hypothesis can be further validated in intervention studies. Our model suggests that interventions that affect inattention will also have an effect on the level of hyperactivity/impulsivity. On the other hand, interventions that affect hyperactivity/impulsivity would not change the level of inattention. This causal model may explain earlier findings on heritable factors causing ADHD reported in the study of twins with learning difficulties. PMID:27768717
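The dependency-cancellation pattern reported above can be checked on any comparable dataset with a simple partial-correlation test, shown below. This is a schematic stand-in for the authors' causal-discovery procedure, not a reproduction of it; the variable names G (genetic factor), H (hyperactivity/impulsivity) and I (inattention) are placeholders.

```python
# Does the G-H association vanish once inattention I is conditioned on?
import numpy as np
from scipy import stats

def partial_corr(x, y, z):
    """Correlation of x and y after linearly regressing z out of both."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return stats.pearsonr(rx, ry)

# stats.pearsonr(G, H) clearly non-zero while partial_corr(G, H, I) is near zero
# is the pattern consistent with a causal path from inattention to
# hyperactivity/impulsivity (under the assumption that phenotypes do not cause
# genotypes).
```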
NASA Astrophysics Data System (ADS)
Dang, H.; Stayman, J. W.; Sisniega, A.; Xu, J.; Zbijewski, W.; Yorkston, J.; Aygun, N.; Koliatsos, V.; Siewerdsen, J. H.
2015-03-01
Traumatic brain injury (TBI) is a major cause of death and disability. The current front-line imaging modality for TBI detection is CT, which reliably detects intracranial hemorrhage (fresh blood contrast 30-50 HU, size down to 1 mm) in non-contrast-enhanced exams. Compared to CT, flat-panel detector (FPD) cone-beam CT (CBCT) systems offer lower cost, greater portability, and smaller footprint suitable for point-of-care deployment. We are developing FPD-CBCT to facilitate TBI detection at the point-of-care such as in emergent, ambulance, sports, and military applications. However, current FPD-CBCT systems generally face challenges in low-contrast, soft-tissue imaging. Model-based reconstruction can improve image quality in soft-tissue imaging compared to conventional filtered back-projection (FBP) by leveraging high-fidelity forward model and sophisticated regularization. In FPD-CBCT TBI imaging, measurement noise characteristics undergo substantial change following artifact correction, resulting in non-negligible noise amplification. In this work, we extend the penalized weighted least-squares (PWLS) image reconstruction to include the two dominant artifact corrections (scatter and beam hardening) in FPD-CBCT TBI imaging by correctly modeling the variance change following each correction. Experiments were performed on a CBCT test-bench using an anthropomorphic phantom emulating intra-parenchymal hemorrhage in acute TBI, and the proposed method demonstrated an improvement in blood-brain contrast-to-noise ratio (CNR = 14.2) compared to FBP (CNR = 9.6) and PWLS using conventional weights (CNR = 11.6) at fixed spatial resolution (1 mm edge-spread width at the target contrast). The results support the hypothesis that FPD-CBCT can fulfill the image quality requirements for reliable TBI detection, using high-fidelity artifact correction and statistical reconstruction with accurate post-artifact-correction noise models.
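For orientation, the PWLS objective referred to above has the generic form sketched below, where the per-measurement weights are inverse variances that would be updated after the scatter and beam-hardening corrections. The quadratic first-difference penalty and the dense system matrix A are simplifications for illustration; the paper's forward model, regularizer, and variance propagation are more elaborate.

```python
# Generic PWLS objective with statistical (inverse-variance) weights.
import numpy as np

def pwls_objective(x, A, y, var, beta, shape):
    """0.5 * (y - Ax)^T W (y - Ax) + beta * R(x), with W = diag(1 / var)."""
    resid = y - A @ x                         # A: system matrix, y: corrected data
    data_term = 0.5 * np.sum(resid**2 / var)  # var updated after each artifact correction
    img = x.reshape(shape)                    # image stored as a flat vector
    roughness = np.sum(np.diff(img, axis=0)**2) + np.sum(np.diff(img, axis=1)**2)
    return data_term + beta * roughness       # minimised by an iterative solver
```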
An accessible method for implementing hierarchical models with spatio-temporal abundance data
Ross, Beth E.; Hooten, Melvin B.; Koons, David N.
2012-01-01
A common goal in ecology and wildlife management is to determine the causes of variation in population dynamics over long periods of time and across large spatial scales. Many assumptions must nevertheless be overcome to make appropriate inference about spatio-temporal variation in population dynamics, such as autocorrelation among data points, excess zeros, and observation error in count data. To address these issues, many scientists and statisticians have recommended the use of Bayesian hierarchical models. Unfortunately, hierarchical statistical models remain somewhat difficult to use because of the necessary quantitative background needed to implement them, or because of the computational demands of using Markov Chain Monte Carlo algorithms to estimate parameters. Fortunately, new tools have recently been developed that make it more feasible for wildlife biologists to fit sophisticated hierarchical Bayesian models (i.e., Integrated Nested Laplace Approximation, ‘INLA’). We present a case study using two important game species in North America, the lesser and greater scaup, to demonstrate how INLA can be used to estimate the parameters in a hierarchical model that decouples observation error from process variation, and accounts for unknown sources of excess zeros as well as spatial and temporal dependence in the data. Ultimately, our goal was to make unbiased inference about spatial variation in population trends over time.
Binding Sites Analyser (BiSA): Software for Genomic Binding Sites Archiving and Overlap Analysis
Khushi, Matloob; Liddle, Christopher; Clarke, Christine L.; Graham, J. Dinny
2014-01-01
Genome-wide mapping of transcription factor binding and histone modification reveals complex patterns of interactions. Identifying overlaps in binding patterns by different factors is a major objective of genomic studies, but existing methods to archive large numbers of datasets in a personalised database lack sophistication and utility. Therefore we have developed transcription factor DNA binding site analyser software (BiSA), for archiving of binding regions and easy identification of overlap with or proximity to other regions of interest. Analysis results can be restricted by chromosome or base pair overlap between regions or maximum distance between binding peaks. BiSA is capable of reporting overlapping regions that share common base pairs; regions that are nearby; regions that are not overlapping; and average region sizes. BiSA can identify genes located near binding regions of interest, genomic features near a gene or locus of interest and statistical significance of overlapping regions can also be reported. Overlapping results can be visualized as Venn diagrams. A major strength of BiSA is that it is supported by a comprehensive database of publicly available transcription factor binding sites and histone modifications, which can be directly compared to user data. The documentation and source code are available on http://bisa.sourceforge.net PMID:24533055
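The core region-to-region test that such overlap analyses perform is compact enough to sketch directly; regions are assumed here to be (chromosome, start, end) tuples and the maximum-gap threshold is arbitrary, so this illustrates the comparison logic rather than BiSA's database-backed implementation.

```python
# Overlap / proximity test between two genomic regions.
def regions_relate(a, b, max_gap=1000):
    chrom_a, start_a, end_a = a
    chrom_b, start_b, end_b = b
    if chrom_a != chrom_b:
        return "different chromosomes"
    overlap = min(end_a, end_b) - max(start_a, start_b)
    if overlap > 0:
        return f"overlapping by {overlap} bp"
    if -overlap <= max_gap:
        return f"nearby ({-overlap} bp apart)"
    return "not overlapping"

print(regions_relate(("chr1", 100, 250), ("chr1", 240, 400)))   # overlapping by 10 bp
print(regions_relate(("chr1", 100, 250), ("chr1", 900, 990)))   # nearby (650 bp apart)
```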
Storing and Using Health Data in a Virtual Private Cloud
Regola, Nathan
2013-01-01
Electronic health records are being adopted at a rapid rate due to increased funding from the US federal government. Health data provide the opportunity to identify possible improvements in health care delivery by applying data mining and statistical methods to the data and will also enable a wide variety of new applications that will be meaningful to patients and medical professionals. Researchers are often granted access to health care data to assist in the data mining process, but HIPAA regulations mandate comprehensive safeguards to protect the data. Often universities (and presumably other research organizations) have an enterprise information technology infrastructure and a research infrastructure. Unfortunately, both of these infrastructures are generally not appropriate for sensitive research data such as HIPAA, as they require special accommodations on the part of the enterprise information technology (or increased security on the part of the research computing environment). Cloud computing, which is a concept that allows organizations to build complex infrastructures on leased resources, is rapidly evolving to the point that it is possible to build sophisticated network architectures with advanced security capabilities. We present a prototype infrastructure in Amazon’s Virtual Private Cloud to allow researchers and practitioners to utilize the data in a HIPAA-compliant environment. PMID:23485880
Application of a clustering-remote sensing method in analyzing security patterns
NASA Astrophysics Data System (ADS)
López-Caloca, Alejandra; Martínez-Viveros, Elvia; Chapela-Castañares, José Ignacio
2009-04-01
In Mexican academic and government circles, research on criminal spatial behavior has been neglected. Only recently has there been an interest in geo-referencing criminal data. However, more sophisticated spatial analysis models are needed to disclose spatial patterns of crime and pinpoint their changes over time. The main use of these models lies in supporting policy making and strategic intelligence. In this paper we present a model for finding patterns associated with crime. It is based on a fuzzy logic algorithm which finds the best fit over the number of clusters and the shapes of the groupings. We describe the methodology for building the model and its validation. The model was applied to annual data for types of felonies from 2005 to 2006 in the Mexican city of Hermosillo. The results are visualized as a standard deviational ellipse computed for the points identified as belonging to a cluster. These areas indicate a high to low demand for public security, and they were cross-related to urban structure analyzed by SPOT images and statistical data such as population, poverty levels, urbanization, and available services. The fusion of the model results with other geospatial data allows the detection of obstacles and opportunities for crime commission in specific high-risk zones and can guide police activities and criminal investigations.
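The standard deviational ellipse used for visualisation above follows directly from the covariance of the clustered point coordinates; the sketch below is a generic version of that computation (the one-standard-deviation scaling and the return format are choices of the example, not the paper's).

```python
# Standard deviational ellipse of a set of clustered 2-D points.
import numpy as np

def standard_deviational_ellipse(points_xy, n_std=1.0):
    centre = points_xy.mean(axis=0)
    cov = np.cov(points_xy, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)            # ascending eigenvalues
    semi_axes = n_std * np.sqrt(eigvals)              # minor, major semi-axes
    major = eigvecs[:, -1]
    angle_deg = np.degrees(np.arctan2(major[1], major[0]))  # orientation of the major axis
    return centre, semi_axes, angle_deg
```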
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wiley, H. S.
There comes a time in every field of science when things suddenly change. While it might not be immediately apparent that things are different, a tipping point has occurred. Biology is now at such a point. The reason is the introduction of high-throughput genomics-based technologies. I am not talking about the consequences of the sequencing of the human genome (and every other genome within reach). The change is due to new technologies that generate an enormous amount of data about the molecular composition of cells. These include proteomics, transcriptional profiling by sequencing, and the ability to globally measure microRNAs and post-translational modifications of proteins. These mountains of digital data can be mapped to a common frame of reference: the organism's genome. With the new high-throughput technologies, we can generate tens of thousands of data points from each sample. Data are now measured in terabytes and the time necessary to analyze data can now require years. Obviously, we can't wait to interpret the data fully before the next experiment. In fact, we might never be able to even look at all of it, much less understand it. This volume of data requires sophisticated computational and statistical methods for its analysis and is forcing biologists to approach data interpretation as a collaborative venture.
[Macronutrient food sources in a probabilistic sample of Brazilian adults].
de Souza, Danielle Ribeiro; dos Anjos, Luiz Antonio; Wahrlich, Vivian; de Vasconcellos, Mauricio Teixeira Leite
2015-05-01
Once it is available, the information on food intake (FI) may enable the development of strategies to intervene, monitor and explore dietary patterns with more sophisticated statistical methods. Thus, the purpose of this study was to document the quantitative dietary characteristics in a probabilistic sample of adults in Niterói in the State of Rio de Janeiro. A 24-hour dietary recall of a typical day was conducted. The food eaten by most adults (> 50%) was white rice, coffee, black beans, refined sugar and French bread. Whole milk was ingested by more adults than skimmed or semi-skimmed milk. Beef was ingested by more adults than chicken, fish or pork. More adults ingested sodas than fruit juices and fruits were eaten by a relatively high percentage of adults (63.3%). The combination of white rice, black beans, beef and French bread was responsible for at least 25% of energy, protein and carbohydrate and 17% of lipids. A total of 65 food items accounted for approximately 90% of energy and macronutrients. The list generated is somewhat similar to the one used in a similar survey conducted in São Paulo. The list can serve as the basis for a single food frequency questionnaire to be used for the southeastern Brazilian urban population.
The state of the art of predicting noise-induced sleep disturbance in field settings.
Fidell, Sanford; Tabachnick, Barbara; Pearsons, Karl S
2010-01-01
Several relationships between intruding noises (largely aircraft) and sleep disturbance have been inferred from the findings of a handful of field studies. Comparisons of sleep disturbance rates predicted by the various relationships are complicated by inconsistent data collection methods and definitions of predictor variables and predicted quantities. None of the relationships is grounded in theory-based understanding, and some depend on questionable statistical assumptions and analysis procedures. The credibility, generalizability, and utility of sleep disturbance predictions are also limited by small and nonrepresentative samples of test participants, and by restricted (airport-specific and relatively short duration) circumstances of exposure. Although expedient relationships may be the best available, their predictions are of only limited utility for policy analysis and regulatory purposes, because they account for very little variance in the association between environmental noise and sleep disturbance, have characteristically shallow slopes, have not been well validated in field settings, are highly context-dependent, and do not squarely address the roles and relative importance of nonacoustic factors in sleep disturbance. Such relationships offer the appearance more than the substance of precision and objectivity. Truly useful, population-level prediction and genuine understanding of noise-induced sleep disturbance will remain beyond reach for the foreseeable future, until the findings of field studies of broader scope and more sophisticated design become available.
How do strategic decisions and operative practices affect operating room productivity?
Peltokorpi, Antti
2011-12-01
Surgical operating rooms are cost-intensive parts of health service production. Managing operating units efficiently is essential when hospitals and healthcare systems aim to maximize health outcomes with limited resources. Previous research on operating room management has focused on studying the effect of management practices and decisions on efficiency, mainly by utilizing modeling approaches or before-after analyses of single hospital cases. The purpose of this research is to analyze the synergic effect of strategic decisions and operative management practices on operating room productivity and to use a multiple case study method enabling statistical hypothesis testing with empirical data. Eleven hypotheses proposing connections between the use of strategic and operative practices and productivity were tested in a multi-hospital study that included 26 units. The results indicate that operative practices, such as personnel management, case scheduling and performance measurement, affect productivity more markedly than do strategic decisions that relate to, e.g., units' size, scope or academic status. Units with different strategic positions should apply different operative practices: focused hospital units benefit most from sophisticated case scheduling and parallel processing, whereas central and ambulatory units should apply flexible working hours, incentives and multi-skilled personnel. Operating units should be more active in applying management practices that are adequate for their strategic orientation.
McCormack, James L; Sittig, Dean F; Wright, Adam; McMullen, Carmit; Bates, David W
2012-01-01
Objective Computerized provider order entry (CPOE) with clinical decision support (CDS) can help hospitals improve care. Little is known, however, about what CDS is presently in use and how it is managed, especially in community hospitals. This study sought to address this knowledge gap by identifying standard practices related to CDS in US community hospitals with mature CPOE systems. Materials and Methods Representatives of 34 community hospitals, each of which had over 5 years' experience with CPOE, were interviewed to identify standard practices related to CDS. Data were analyzed with a mix of descriptive statistics and qualitative approaches to the identification of patterns, themes and trends. Results This broad sample of community hospitals had robust levels of CDS despite their small size and the independent nature of many of their physician staff members. The hospitals uniformly used medication alerts and order sets, had sophisticated governance procedures for CDS, and employed staff to customize CDS. Discussion The level of customization needed for most CDS before implementation was greater than expected. Customization requires skilled individuals who represent an emerging manpower need at this type of hospital. Conclusion These results bode well for robust diffusion of CDS to similar hospitals in the process of adopting CDS and suggest that national policies to promote CDS use may be successful. PMID:22707744
Office-Based Elastographic Technique for Quantifying Mechanical Properties of Skeletal Muscle
Ballyns, Jeffrey J.; Turo, Diego; Otto, Paul; Shah, Jay P.; Hammond, Jennifer; Gebreab, Tadesse; Gerber, Lynn H.; Sikdar, Siddhartha
2012-01-01
Objectives Our objectives were to develop a new, efficient, and easy-to-administer approach to ultrasound elastography and assess its ability to provide quantitative characterization of viscoelastic properties of skeletal muscle in an outpatient clinical environment. We sought to show its validity and clinical utility in assessing myofascial trigger points, which are associated with myofascial pain syndrome. Methods Ultrasound imaging was performed while the muscle was externally vibrated at frequencies in the range of 60 to 200 Hz using a handheld vibrator. The spatial gradient of the vibration phase yielded the shear wave speed, which is related to the viscoelastic properties of tissue. The method was validated using a calibrated experimental phantom, the biceps brachii muscle in healthy volunteers (n = 6), and the upper trapezius muscle in symptomatic patients with axial neck pain (n = 13) and asymptomatic (pain-free) control participants (n = 9). Results Using the experimental phantom, our method was able to quantitatively measure the shear moduli with error rates of less than 20%. The mean shear modulus ± SD in the normal biceps brachii measured 12.5 ± 3.4 kPa, within the range of published values using more sophisticated methods. Shear wave speeds in active myofascial trigger points and the surrounding muscle tissue were significantly higher than those in normal tissue at high frequency excitations (>100 Hz; P < .05). Conclusions Off-the-shelf office-based equipment can be used to quantitatively characterize skeletal muscle viscoelastic properties with estimates comparable to those using more sophisticated methods. Our preliminary results using this method indicate that patients with spontaneous neck pain and symptomatic myofascial trigger points have increased tissue heterogeneity at the trigger point site and the surrounding muscle tissue. PMID:22837285
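For readers unfamiliar with the underlying estimate, the shear wave speed follows from the spatial gradient of the vibration phase at the excitation frequency, c = 2πf / |dφ/dx|, and for a purely elastic medium the shear modulus would then follow as G = ρc². The numpy sketch below illustrates that calculation on a synthetic phase profile; the linear phase fit and the idealized data are assumptions for illustration, not the authors' processing chain.

```python
import numpy as np

def shear_wave_speed(phase_rad, positions_m, freq_hz):
    """Estimate shear wave speed from the spatial gradient of vibration phase.
    phase_rad: vibration phase (radians) along the imaging line at freq_hz.
    positions_m: corresponding spatial positions (m)."""
    slope, _ = np.polyfit(positions_m, np.unwrap(phase_rad), 1)   # dphi/dx in rad/m
    return 2.0 * np.pi * freq_hz / abs(slope)                     # c = omega / |k|

# Synthetic example: 100 Hz vibration travelling at a shear wave speed of 4 m/s.
f, c_true = 100.0, 4.0
x = np.linspace(0.0, 0.03, 64)                 # 3 cm imaging line
phase = -2.0 * np.pi * f / c_true * x          # ideal linear phase profile
print(shear_wave_speed(phase, x, f))           # ~4.0; G = rho * c**2 for an elastic medium
```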
Environmental statistics and optimal regulation.
Sivak, David A; Thomson, Matt
2014-09-01
Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies (such as constitutive expression or a graded response) for regulating protein levels in response to environmental inputs. We propose a general framework, here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient, to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and in the measurement apparatus, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) the relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
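As a toy illustration of question (ii), the sketch below compares a fixed-threshold rule with a Bayesian rule that weighs a noisy nutrient reading against a prior over two environmental states. The Gaussian measurement model, payoff numbers, and two-state environment are assumptions chosen for the example, not parameters from the paper; the point is simply how posterior-weighted payoffs replace a fixed cutoff.

```python
import numpy as np
from scipy.stats import norm

# Toy two-state environment: the nutrient concentration is either "low" or "high".
P_HIGH = 0.3
MU_LOW, MU_HIGH = 1.0, 3.0        # mean nutrient readings in each state (arbitrary units)
GAIN_HIGH, COST_LOW = 4.0, 1.0    # payoff of expressing the enzyme in each state

def bayes_rule(reading, sigma):
    """Express the enzyme iff the posterior-expected payoff is positive."""
    w_high = norm.pdf(reading, MU_HIGH, sigma) * P_HIGH
    w_low = norm.pdf(reading, MU_LOW, sigma) * (1.0 - P_HIGH)
    post_high = w_high / (w_high + w_low)
    return post_high * GAIN_HIGH - (1.0 - post_high) * COST_LOW > 0.0

def fixed_threshold(reading, sigma, cutoff=2.0):
    """Naive rule: express whenever the raw reading exceeds a fixed cutoff."""
    return reading > cutoff

rng = np.random.default_rng(1)
for sigma in (0.2, 1.0, 3.0):                     # increasing measurement uncertainty
    high = rng.random(20000) < P_HIGH
    readings = np.where(high, MU_HIGH, MU_LOW) + rng.normal(0.0, sigma, high.size)
    for rule in (fixed_threshold, bayes_rule):
        act = np.array([rule(r, sigma) for r in readings])
        payoff = np.where(act, np.where(high, GAIN_HIGH, -COST_LOW), 0.0).mean()
        print(f"sigma={sigma:.1f}  {rule.__name__:15s}  mean payoff {payoff:+.3f}")
```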
NASA Astrophysics Data System (ADS)
Posselt, D.; L'Ecuyer, T.; Matsui, T.
2009-05-01
Cloud resolving models are typically used to examine the characteristics of clouds and precipitation and their relationship to radiation and the large-scale circulation. As such, they are not required to reproduce the exact location of each observed convective system, much less each individual cloud. Some of the most relevant information about clouds and precipitation is provided by instruments located on polar-orbiting satellite platforms, but these observations are intermittent "snapshots" in time, making assessment of model performance challenging. In contrast to direct comparison, model results can be evaluated statistically. This avoids the requirement for the model to reproduce the observed systems, while returning valuable information on the performance of the model in a climate-relevant sense. The focus of this talk is a model evaluation study, in which updates to the microphysics scheme used in a three-dimensional version of the Goddard Cumulus Ensemble (GCE) model are evaluated using statistics of observed clouds, precipitation, and radiation. We present the results of multiday (non-equilibrium) simulations of organized deep convection using single- and double-moment versions of the model's cloud microphysical scheme. Statistics of TRMM multi-sensor derived clouds, precipitation, and radiative fluxes are used to evaluate the GCE results, as are simulated TRMM measurements obtained using a sophisticated instrument simulator suite. We present advantages and disadvantages of performing model comparisons in retrieval and measurement space and conclude by motivating the use of data assimilation techniques for analyzing and improving model parameterizations.
Preserved Statistical Learning of Tonal and Linguistic Material in Congenital Amusia
Omigie, Diana; Stewart, Lauren
2011-01-01
Congenital amusia is a lifelong disorder whereby individuals have pervasive difficulties in perceiving and producing music. In contrast, typical individuals display a sophisticated understanding of musical structure, even in the absence of musical training. Previous research has shown that they acquire this knowledge implicitly, through exposure to music's statistical regularities. The present study tested the hypothesis that congenital amusia may result from a failure to internalize statistical regularities, specifically lower-order transitional probabilities. To explore the specificity of any potential deficits to the musical domain, learning was examined with both tonal and linguistic material. Participants were exposed to structured tonal and linguistic sequences and, in a subsequent test phase, were required to identify items which had been heard in the exposure phase, as distinct from foils comprising elements that had been present during exposure but presented in a different temporal order. Amusic and control individuals showed comparable learning, for both tonal and linguistic material, even when the tonal stream included pitch intervals of around one semitone. However, analysis of binary confidence ratings revealed that amusic individuals have less confidence in their abilities and that their performance in learning tasks may not be contingent on explicit knowledge formation or on the level of awareness seen in typical individuals. The current findings suggest that the difficulties amusic individuals have with real-world music cannot be accounted for by an inability to internalize lower-order statistical regularities but may arise from other factors. PMID:21779263
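The lower-order regularities at issue are first-order transitional probabilities, P(next element | current element), which can be estimated directly from an exposure stream. The short sketch below computes them for a made-up tone-word stream in the spirit of such exposure phases; the alphabet, the three "words", and the stream length are illustrative assumptions.

```python
import random
from collections import Counter, defaultdict

def transitional_probabilities(sequence):
    """First-order transitional probabilities P(next | current) from a sequence."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    current_counts = Counter(sequence[:-1])
    probs = defaultdict(dict)
    for (cur, nxt), n in pair_counts.items():
        probs[cur][nxt] = n / current_counts[cur]
    return dict(probs)

# Hypothetical exposure stream built from three tone "words" (letters stand for pitches):
# transitions inside a word are perfectly predictable, transitions across word
# boundaries are not, which is exactly the regularity statistical learning exploits.
random.seed(0)
words = ["ABC", "DEF", "GHI"]
stream = "".join(random.choice(words) for _ in range(400))
tp = transitional_probabilities(stream)
print(tp["A"])   # {'B': 1.0}                       within-word transition
print(tp["C"])   # roughly 1/3 each to 'A', 'D', 'G' across word boundaries
```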
The use of audio-visual methods in radiology and physics courses
NASA Astrophysics Data System (ADS)
Holmberg, Peter
1987-03-01
Today's medicine utilizes sophisticated equipment for radiological, biochemical and microbiological investigation procedures and analyses. Hence it is necessary that physicians have adequate scientific and technical knowledge of the apparatus they are using, so that the equipment can be used in the most effective way. This knowledge is obtained partly from science-orientated courses in the preclinical stage of the study program for medical students. To increase the motivation to study science courses (medical physics), audio-visual methods are used to describe diagnostic and therapeutic procedures in the clinical routines.
Nanocrystal synthesis in microfluidic reactors: where next?
Phillips, Thomas W; Lignos, Ioannis G; Maceiczyk, Richard M; deMello, Andrew J; deMello, John C
2014-09-07
The past decade has seen a steady rise in the use of microfluidic reactors for nanocrystal synthesis, with numerous studies reporting improved reaction control relative to conventional batch chemistry. However, flow synthesis procedures continue to lag behind batch methods in terms of chemical sophistication and the range of accessible materials, with most reports having involved simple one- or two-step chemical procedures directly adapted from proven batch protocols. Here we examine the current status of microscale methods for nanocrystal synthesis, and consider what role microreactors might ultimately play in laboratory-scale research and industrial production.
Species-Level Identification of Orthopoxviruses with an Oligonucleotide Microchip
Lapa, Sergey; Mikheev, Maxim; Shchelkunov, Sergei; Mikhailovich, Vladimir; Sobolev, Alexander; Blinov, Vladimir; Babkin, Igor; Guskov, Alexander; Sokunova, Elena; Zasedatelev, Alexander; Sandakhchiev, Lev; Mirzabekov, Andrei
2002-01-01
A method for species-specific detection of orthopoxviruses pathogenic for humans and animals is described. The method is based on hybridization of a fluorescently labeled amplified DNA specimen with the oligonucleotide DNA probes immobilized on a microchip (MAGIChip). The probes identify species-specific sites within the crmB gene encoding the viral analogue of tumor necrosis factor receptor, one of the most important determinants of pathogenicity in this genus of viruses. The diagnostic procedure takes 6 h and does not require any sophisticated equipment (a portable fluorescence reader can be used). PMID:11880388
Numerical realization of the variational method for generating self-trapped beams.
Duque, Erick I; Lopez-Aguayo, Servando; Malomed, Boris A
2018-03-19
We introduce a numerical variational method based on the Rayleigh-Ritz optimization principle for predicting two-dimensional self-trapped beams in nonlinear media. This technique overcomes the limitation of the traditional variational approximation in performing analytical Lagrangian integration and differentiation. Approximate soliton solutions of a generalized nonlinear Schrödinger equation are obtained, demonstrating robustness of the beams of various types (fundamental, vortices, multipoles, azimuthons) in the course of their propagation. The algorithm offers possibilities to produce more sophisticated soliton profiles in general nonlinear models.
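The core idea, evaluating the Lagrangian or Hamiltonian of a trial profile by numerical quadrature and optimizing the ansatz parameters numerically instead of requiring closed-form integrals, can be illustrated in one dimension. The sketch below minimizes the Hamiltonian of a fixed-norm Gaussian ansatz for the focusing cubic nonlinear Schrödinger equation; the 1D reduction, the particular ansatz, and the use of scipy are simplifying assumptions for illustration, not the authors' two-dimensional algorithm.

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.linspace(-40.0, 40.0, 4001)
dx = x[1] - x[0]
N = 4.0                                   # fixed beam power (integral of |psi|^2)

def hamiltonian(width):
    """H[psi] = int |psi_x|^2 dx - (1/2) int |psi|^4 dx for a Gaussian trial profile,
    evaluated by numerical quadrature rather than closed-form integration."""
    amp = np.sqrt(N / (width * np.sqrt(np.pi)))     # normalization so int |psi|^2 = N
    psi = amp * np.exp(-x**2 / (2.0 * width**2))
    kinetic = np.sum(np.gradient(psi, x) ** 2) * dx
    nonlinear = -0.5 * np.sum(psi**4) * dx
    return kinetic + nonlinear

res = minimize_scalar(hamiltonian, bounds=(0.1, 20.0), method="bounded")
# For this ansatz the stationary width is also known analytically,
# a* = 2*sqrt(2*pi)/N ~ 1.253 for N = 4, which checks the numerical optimum.
print(f"optimal width a* ~ {res.x:.3f}, H(a*) = {res.fun:.4f}")
```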
Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.
2013-01-01
Children born very low birth weight (<1500 grams, VLBW) are at increased risk for developmental delays. Play is an important developmental outcome to the extent that a child's play and social communication are related to later development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish-speaking Hispanic, English-speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish-speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish-speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English-speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English-speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, and cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed by play sophistication. Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as maternal flexibility must account for ethnic-cultural differences in order to promote toddler developmental outcomes through play paradigms. PMID:22982287
A Global Fitting Approach For Doppler Broadening Thermometry
NASA Astrophysics Data System (ADS)
Amodio, Pasquale; Moretti, Luigi; De Vizia, Maria Domenica; Gianfrani, Livio
2014-06-01
Very recently, a spectroscopic determination of the Boltzmann constant, kB, has been performed at the Second University of Naples by means of a rather sophisticated implementation of Doppler Broadening Thermometry (DBT) [1]. Performed on an 18O-enriched water sample, at a wavelength of 1.39 µm, the experiment has provided a value for kB with a combined uncertainty of 24 parts in 10^6, which is the best result obtained so far by using an optical method. In the spectral analysis procedure, the partially correlated speed-dependent hard-collision (pC-SDHC) model was adopted. The uncertainty budget has clearly revealed that the major contributions come from the statistical uncertainty (type A) and from the uncertainty associated with the line-shape model (type B) [2]. In the present work, we present the first results of a theoretical and numerical study aimed at reducing these uncertainty components. It is well known that molecular line shapes exhibit clear deviations from the time-honoured Voigt profile. Even in the case of a well isolated spectral line, under the influence of binary collisions, in the Doppler regime, the shape can be quite complicated by the joint occurrence of velocity-changing collisions and speed-dependent effects. The partially correlated speed-dependent Keilson-Storer profile (pC-SDKS) has recently been proposed as a very realistic model, capable of reproducing very accurately the absorption spectra of self-colliding water molecules in the near infrared [3]. Unfortunately, the model is so complex that it cannot be implemented in a fitting routine for the analysis of experimental spectra. Therefore, we have developed a MATLAB code to simulate a variety of H218O spectra in thermodynamic conditions identical to those of our DBT experiment, using the pC-SDKS model. The numerical calculations required to determine such a profile have a very large computational cost, resulting from a very sophisticated iterative procedure. Hence, the numerically simulated spectra (with the addition of random noise) have been used to test the validity of simplified line-shape models, such as the speed-dependent Galatry (SDG) profile and the pC-SDHC model. In particular, we have used the global fitting procedure described in Amodio et al. [4]. Such a procedure is very effective in reducing the uncertainty resulting from statistical correlation among free parameters. Therefore, the analysis of large amounts of simulated spectra has allowed us to study the influence of the choice of the model and to quantify the achievable precision and accuracy levels at the present value of the signal-to-noise ratio.
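The structure of such a global fit, one shared and physically meaningful parameter constrained across many spectra while per-spectrum nuisance parameters stay free, can be sketched with far simpler ingredients than the pC-SDKS or pC-SDHC profiles. The toy below fits several simulated spectra with a common Doppler width using scipy least squares; the pure Gaussian profile, noise level, and parameter set are placeholders for illustration only.

```python
import numpy as np
from scipy.optimize import least_squares

nu = np.linspace(-3.0, 3.0, 400)                 # detuning axis (arbitrary units)

def gaussian(nu, area, center, doppler_w):
    return area / (doppler_w * np.sqrt(np.pi)) * np.exp(-((nu - center) / doppler_w) ** 2)

# Simulate a few spectra that share the same Doppler width (i.e. the same temperature).
rng = np.random.default_rng(2)
true_wd, areas, centers = 0.35, [1.0, 1.6, 2.4], [0.00, 0.02, -0.01]
spectra = [gaussian(nu, a, c, true_wd) + rng.normal(0.0, 0.002, nu.size)
           for a, c in zip(areas, centers)]

def residuals(params):
    """First parameter is the Doppler width, shared by every spectrum;
    each spectrum then contributes its own (area, center) pair."""
    wd, rest = params[0], params[1:]
    out = []
    for spec, (a, c) in zip(spectra, rest.reshape(-1, 2)):
        out.append(spec - gaussian(nu, a, c, wd))
    return np.concatenate(out)

p0 = np.concatenate([[0.3], np.ravel([[1.0, 0.0]] * len(spectra))])
fit = least_squares(residuals, p0)
print(f"fitted shared Doppler width: {fit.x[0]:.4f} (true {true_wd})")
```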
LIFE CYCLE IMPACT ASSESSMENT SOPHISTICATION
An international workshop was held in Brussels on 11/29-30/1998 to discuss LCIA Sophistication. LCA experts from North America, Europe, and Asia attended. Critical reviews of associated factors, including current limitations of available assessment methodologies, and comparison...
Revisiting competition in a classic model system using formal links between theory and data.
Hart, Simon P; Burgin, Jacqueline R; Marshall, Dustin J
2012-09-01
Formal links between theory and data are a critical goal for ecology. However, while our current understanding of competition provides the foundation for solving many derived ecological problems, this understanding is fractured because competition theory and data are rarely unified. Conclusions from seminal studies in space-limited benthic marine systems, in particular, have been very influential for our general understanding of competition, but rely on traditional empirical methods with limited inferential power and compatibility with theory. Here we explicitly link mathematical theory with experimental field data to provide a more sophisticated understanding of competition in this classic model system. In contrast to predictions from conceptual models, our estimates of competition coefficients show that a dominant space competitor can be equally affected by interspecific competition with a poor competitor (traditionally defined) as it is by intraspecific competition. More generally, the often-invoked competitive hierarchies and intransitivities in this system might be usefully revisited using more sophisticated empirical and analytical approaches.
Lauridsen, S M R; Norup, M S; Rossel, P J H
2007-12-01
Rationing healthcare is a difficult task, which includes preventing patients from accessing potentially beneficial treatments. Proponents of implicit rationing argue that politicians cannot resist pressure from strong patient groups for treatments and conclude that physicians should ration without informing patients or the public. The authors subdivide this specific programme of implicit rationing, or "hidden rationing", into local hidden rationing, unsophisticated global hidden rationing and sophisticated global hidden rationing. They evaluate the appropriateness of these methods of rationing from the perspectives of individual and political autonomy and conclude that local hidden rationing and unsophisticated global hidden rationing clearly violate patients' individual autonomy, that is, their right to participate in medical decision-making. While sophisticated global hidden rationing avoids this charge, the authors point out that it nonetheless violates the political autonomy of patients, that is, their right to engage in public affairs as citizens. A defence of any of the forms of hidden rationing is therefore considered to be incompatible with a defence of autonomy.
Konrad, Christopher P.
2014-01-01
Marine bivalves such as clams, mussels, and oysters are an important component of the food web, which influence nutrient dynamics and water quality in many estuaries. The role of bivalves in nutrient dynamics and, particularly, the contribution of commercial shellfish activities, are not well understood in Puget Sound, Washington. Numerous approaches have been used in other estuaries to quantify the effects of bivalves on nutrient dynamics, ranging from simple nutrient budgeting to sophisticated numerical models that account for tidal circulation, bioenergetic fluxes through food webs, and biochemical transformations in the water column and sediment. For nutrient management in Puget Sound, it might be possible to integrate basic biophysical indicators (residence time, phytoplankton growth rates, and clearance rates of filter feeders) as a screening tool to identify places where nutrient dynamics and water quality are likely to be sensitive to shellfish density and, then, apply more sophisticated methods involving in-situ measurements and simulation models to quantify those dynamics.
Anzalone, Gerald C; Glover, Alexandra G; Pearce, Joshua M
2013-04-19
The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories.
Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.
2013-01-01
The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial portable colorimeter. Performance is evaluated with commercial vials prepared for the closed reflux chemical oxygen demand (COD) method. This approach reduced the cost of reliable closed reflux COD by two orders of magnitude making it an economic alternative for the vast majority of potential users. The open-source colorimeter demonstrated good reproducibility and serves as a platform for further development and derivation of the design for other, similar purposes such as nephelometry. This approach promises unprecedented access to sophisticated instrumentation based on low-cost sensors by those most in need of it, under-developed and developing world laboratories. PMID:23604032
“Black Swans” of Hydrology: Can our Models Address the Science of Hydrologic Change?
NASA Astrophysics Data System (ADS)
Kumar, P.
2009-12-01
Coupled models of terrestrial hydrology and climate have grown in complexity leading to better understanding of the coupling between the hydrosphere, biosphere, and the climate system. During the past two decades, these models have evolved through generational changes as they have grown in sophistication in their ability to resolve spatial heterogeneity as well as vegetation dynamics and biogeochemistry. These developments have, in part, been driven by data collection efforts ranging from focused field campaigns to long-term observational networks, advances in remote sensing and other measurement technologies, along with sophisticated estimation and assimilation methods. However, the hydrologic cycle is changing leading to unexpected and unanticipated behavior through emergent dynamics and patterns that are not part of the historical milieu. Is there a new thinking that is needed to address this challenge? The goal of this talk is to draw from the modeling developments in the past two decades to foster a debate for moving forward.
A regional assessment of information technology sophistication in Missouri nursing homes.
Alexander, Gregory L; Madsen, Richard; Wakefield, Douglas
2010-08-01
To provide a state profile of information technology (IT) sophistication in Missouri nursing homes. Primary survey data were collected from December 2006 to August 2007. A descriptive, exploratory cross-sectional design was used to investigate dimensions of IT sophistication (technological, functional, and integration) related to resident care, clinical support, and administrative processes. Each dimension was used to describe the clinical domains and demographics (ownership, regional location, and bed size). The final sample included 185 nursing homes. A wide range of IT sophistication is being used in administrative and resident care management processes, but very little in clinical support activities. Evidence suggests nursing homes in Missouri are expanding use of IT beyond traditional administrative and billing applications to patient care and clinical applications. This trend is important to provide support for capabilities which have been implemented to achieve national initiatives for meaningful use of IT in health care settings.
Structural DNA nanotechnology: from design to applications.
Zadegan, Reza M; Norton, Michael L
2012-01-01
The exploitation of DNA for the production of nanoscale architectures presents a young yet paradigm breaking approach, which addresses many of the barriers to the self-assembly of small molecules into highly-ordered nanostructures via construct addressability. There are two major methods to construct DNA nanostructures, and in the current review we will discuss the principles and some examples of applications of both the tile-based and DNA origami methods. The tile-based approach is an older method that provides a good tool to construct small and simple structures, usually with multiply repeated domains. In contrast, the origami method, at this time, would appear to be more appropriate for the construction of bigger, more sophisticated and exactly defined structures.
Vibration criteria for transit systems in close proximity to university research activities
NASA Astrophysics Data System (ADS)
Wolf, Steven
2004-05-01
As some of the newer LRT projects get closer to research facilities, the question arises: "How do you assess the potential impact of train operations on the activities within these types of facilities?" There are several new LRT projects that have proposed alignments near or under university research facilities. The traditional ground vibration analysis at these locations is no longer valid but requires a more sophisticated approach to identifying both criteria and impact. APTA, ISO, IES, and FTA vibration criteria may not be adequate for the most sensitive activities involving single-cell and nanotechnology research. The use of existing ambient vibration levels is evaluated as a potential criterion. A statistical approach is used to better understand how the train vibration would affect the ambient vibration levels.
Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.
Sadowski, Michael I; Grant, Chris; Fell, Tim S
2016-03-01
Building robust manufacturing processes from biological components is a highly complex task that requires sophisticated tools to describe processes, inputs, and measurements and to manage knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require the application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires execution of large-scale structured experimentation, for which laboratory automation is necessary. It also requires the development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Thompson, Glenn L
2006-05-01
Sophisticated univariate outlier screening procedures are not yet available in widely used statistical packages such as SPSS. However, SPSS can accept user-supplied programs for executing these procedures. Failing this, researchers tend to rely on simplistic alternatives that can distort data because they do not adjust to cell-specific characteristics. Despite their popularity, these simple procedures may be especially ill-suited for some applications (e.g., data from reaction time experiments). A user-friendly SPSS Production Facility implementation of the shifting z-score criterion procedure (Van Selst & Jolicoeur, 1994) is presented in an attempt to make it easier to use. In addition to outlier screening, optional syntax modules can be added that will perform tedious database management tasks (e.g., restructuring or computing means).
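The "shifting" criterion means the z cutoff used for trimming depends on the number of observations in the cell, so small cells are not over-trimmed. The Python sketch below illustrates a recursive version of that logic; the cutoff table is a rough placeholder, not the published Van Selst and Jolicoeur (1994) values, and the SPSS syntax itself is not reproduced.

```python
import numpy as np

# Placeholder criterion table: z cutoff as a function of cell size (illustrative values
# only; the published procedure supplies its own table of size-dependent criteria).
SIZES = np.array([4, 8, 15, 30, 50, 100])
CUTOFFS = np.array([1.5, 1.9, 2.2, 2.4, 2.5, 2.5])

def shifting_z_screen(values, max_passes=20):
    """Recursively remove the most extreme value while it exceeds a z criterion
    that depends on the current cell size."""
    data = np.asarray(values, dtype=float)
    for _ in range(max_passes):
        if data.size < 4:
            break
        cutoff = np.interp(data.size, SIZES, CUTOFFS)
        z = np.abs(data - data.mean()) / data.std(ddof=1)
        worst = int(np.argmax(z))
        if z[worst] <= cutoff:
            break
        data = np.delete(data, worst)
    return data

rts = [512, 486, 540, 530, 498, 505, 1920, 522]   # hypothetical reaction times (ms)
print(shifting_z_screen(rts))                      # the 1920 ms observation is trimmed
```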
Variability in Humoral Immunity to Measles Vaccine: New Developments
Haralambieva, Iana H.; Kennedy, Richard B.; Ovsyannikova, Inna G.; Whitaker, Jennifer A.; Poland, Gregory A.
2015-01-01
Despite the existence of an effective measles vaccine, resurgence in measles cases in the United States and across Europe has occurred, including in individuals vaccinated with two doses of the vaccine. Host genetic factors result in inter-individual variation in measles vaccine-induced antibodies, and play a role in vaccine failure. Studies have identified HLA and non-HLA genetic influences that individually or jointly contribute to the observed variability in the humoral response to vaccination among healthy individuals. In this exciting era, new high-dimensional approaches and techniques including vaccinomics, systems biology, GWAS, epitope prediction and sophisticated bioinformatics/statistical algorithms, provide powerful tools to investigate immune response mechanisms to the measles vaccine. These might predict, on an individual basis, outcomes of acquired immunity post measles vaccination. PMID:26602762
Work-family Conflict and Alcohol Use: Examination of a Moderated Mediation Model
Wolff, Jennifer M.; Rospenda, Kathleen M.; Richman, Judith A.; Liu, Li; Milner, Lauren A.
2013-01-01
Research consistently documents the negative effects of work-family conflict; however, little focuses on alcohol use. This study embraces a tension-reduction theory of drinking, wherein alcohol use is thought to reduce the negative effects of stress. The purpose of the present study was to test a moderated mediation model of the relationship between work-family conflict and alcohol use in a Chicagoland community sample of 998 caregivers. Structural equation models showed that distress mediated the relationship between work-family conflict and alcohol use. Furthermore, tension reduction expectancies of alcohol exacerbated the relationship between distress and alcohol use. The results advance the study of work-family conflict and alcohol use, helping explain this complicated relationship using sophisticated statistical techniques. Implications for theory and practice are discussed. PMID:23480251
Work-family conflict and alcohol use: examination of a moderated mediation model.
Wolff, Jennifer M; Rospenda, Kathleen M; Richman, Judith A; Liu, Li; Milner, Lauren A
2013-01-01
Research consistently documents the negative effects of work-family conflict; however, little research focuses on alcohol use. This study embraces a tension reduction theory of drinking, wherein alcohol use is thought to reduce the negative effects of stress. The purpose of the study was to test a moderated mediation model of the relationship between work-family conflict and alcohol use in a Chicagoland community sample of 998 caregivers. Structural equation models showed that distress mediated the relationship between work-family conflict and alcohol use. Furthermore, tension reduction expectancies of alcohol exacerbated the relationship between distress and alcohol use. The results advance the study of work-family conflict and alcohol use, helping explain this complicated relationship using sophisticated statistical techniques. Implications for theory and practice are discussed.
Laser Doppler velocimetry primer
NASA Technical Reports Server (NTRS)
Bachalo, William D.
1985-01-01
Advanced research in experimental fluid dynamics requires familiarity with sophisticated measurement techniques. In some cases, the development and application of new techniques is required for difficult measurements. Optical methods, and in particular the laser Doppler velocimeter (LDV), are now recognized as the most reliable means for performing measurements in complex turbulent flows. As such, the experimental fluid dynamicist should be familiar with the principles of operation of the method and the details associated with its application. Thus, the goals of this primer are to efficiently transmit the basic concepts of the LDV method to potential users and to provide references that describe the specific areas in greater detail.
Strategies for Optimizing Strength, Power, and Muscle Hypertrophy in Women.
1997-09-01
the injury risks and inefficiencies of other methods for the more sophisticated assessment of human muscular strength and power. To provide...an environment of total safety. Limiting catches prevent injury through falling or loss of control of the loaded bar and a specially designed...
QUALITY CONTROL OF PHARMACEUTICALS.
LEVI, L; WALKER, G C; PUGSLEY, L I
1964-10-10
Quality control is an essential operation of the pharmaceutical industry. Drugs must be marketed as safe and therapeutically active formulations whose performance is consistent and predictable. New and better medicinal agents are being produced at an accelerated rate. At the same time more exacting and sophisticated analytical methods are being developed for their evaluation. Requirements governing the quality control of pharmaceuticals in accordance with the Canadian Food and Drugs Act are cited and discussed.
A Time-of-Flight Method to Measure the Speed of Sound Using a Stereo Sound Card
ERIC Educational Resources Information Center
Carvalho, Carlos C.; dos Santos, J. M. B. Lopes; Marques, M. B.
2008-01-01
Most homes in developed countries have a sophisticated data acquisition board, namely the PC sound board. Designed to be able to reproduce CD-quality stereo sound, it must have a sampling rate of at least 44 kHz and have very accurate timing between the two stereo channels. With a very simple adaptation of a pair of regular PC microphones, a…
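With the two microphones a known distance apart along the path of a sharp sound such as a hand clap, the speed of sound is the separation divided by the inter-channel delay, which can be read off the cross-correlation of the two recorded channels. The sketch below shows that estimate in Python rather than with the article's hardware setup; the WAV file name, microphone separation, and the assumption of a short two-channel recording are placeholders.

```python
import numpy as np
from scipy.io import wavfile

MIC_SEPARATION_M = 1.00                          # distance between the two microphones (m)

rate, stereo = wavfile.read("clap_stereo.wav")   # hypothetical short 2-channel recording
left = stereo[:, 0].astype(float)
right = stereo[:, 1].astype(float)

# Cross-correlate the channels; the lag of the peak is the inter-channel delay in samples.
corr = np.correlate(left - left.mean(), right - right.mean(), mode="full")
lag_samples = np.argmax(np.abs(corr)) - (len(right) - 1)
delay_s = abs(lag_samples) / rate

print(f"delay = {delay_s * 1e3:.3f} ms, speed of sound ~ {MIC_SEPARATION_M / delay_s:.1f} m/s")
```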
Analysis of Vertiport Studies Funded by the Airport Improvement Program (AIP)
1994-05-01
the general population and travel behavior factors from surveys and other sources. FEASIBILITY The vertiport studies recognize the need to address the ... behavior factors obtained from surveys and other sources. All of the methods were dependent upon various secondary data and/or information sources that...economic responses and of travel behavior . The five types, in order of increasing analytical sophistication, are briefly identified as follows. I
Environmental Restoration - Expedient Methods and Technologies: A User Guide with Case Studies
1998-03-01
benzene, high fructose corn syrup , raw molasses, butane gas, sodium benzoate, or acetate. Enhanced anaerobic biodegradation of jet fuels in ground water...appendix discusses technology applications that are deemed impractical because of high cost, difficulty of use, or other factors. Also included is a...conversations with knowledgeable 1 Technologically sophisticated processes are not addressed in this study because of high cost, which includes the engineering
Technical and Scientific Evaluation of EM-APEX in Hurricane Frances
2006-09-30
as part of the 2004 CBLAST experiment (Figure 1). Four of these initial floats were deployed again in the 2005 EDDIES experiment (NSF) near Bermuda. In... triangles indicate ascending and descending profiles, respectively. Circles indicate 500 m deep profiles, while the rest are 200 m deep. The figure shows...times marked with triangles can be used to reconstruct surface wave properties using more sophisticated methods. RESULTS: Technical Results.
Use of 16S rRNA gene for identification of a broad range of clinically relevant bacterial pathogens
Srinivasan, Ramya; Karaoz, Ulas; Volegova, Marina; ...
2015-02-06
According to World Health Organization statistics of 2011, infectious diseases remain in the top five causes of mortality worldwide. However, despite sophisticated research tools for microbial detection, rapid and accurate molecular diagnostics for identification of infection in humans have not been extensively adopted. Time-consuming culture-based methods remain to the forefront of clinical microbial detection. The 16S rRNA gene, a molecular marker for identification of bacterial species, is ubiquitous to members of this domain and, thanks to ever-expanding databases of sequence information, a useful tool for bacterial identification. In this study, we assembled an extensive repository of clinical isolates (n = 617), representing 30 medically important pathogenic species and originally identified using traditional culture-based or non-16S molecular methods. This strain repository was used to systematically evaluate the ability of 16S rRNA for species level identification. To enable the most accurate species level classification based on the paucity of sequence data accumulated in public databases, we built a Naïve Bayes classifier representing a diverse set of high-quality sequences from medically important bacterial organisms. We show that for species identification, a model-based approach is superior to an alignment based method. Overall, between 16S gene based and clinical identities, our study shows a genus-level concordance rate of 96% and a species-level concordance rate of 87.5%. We point to multiple cases of probable clinical misidentification with traditional culture based identification across a wide range of gram-negative rods and gram-positive cocci as well as common gram-negative cocci.
Use of 16S rRNA Gene for Identification of a Broad Range of Clinically Relevant Bacterial Pathogens
Srinivasan, Ramya; Karaoz, Ulas; Volegova, Marina; MacKichan, Joanna; Kato-Maeda, Midori; Miller, Steve; Nadarajan, Rohan; Brodie, Eoin L.; Lynch, Susan V.
2015-01-01
According to World Health Organization statistics of 2011, infectious diseases remain in the top five causes of mortality worldwide. However, despite sophisticated research tools for microbial detection, rapid and accurate molecular diagnostics for identification of infection in humans have not been extensively adopted. Time-consuming culture-based methods remain to the forefront of clinical microbial detection. The 16S rRNA gene, a molecular marker for identification of bacterial species, is ubiquitous to members of this domain and, thanks to ever-expanding databases of sequence information, a useful tool for bacterial identification. In this study, we assembled an extensive repository of clinical isolates (n = 617), representing 30 medically important pathogenic species and originally identified using traditional culture-based or non-16S molecular methods. This strain repository was used to systematically evaluate the ability of 16S rRNA for species level identification. To enable the most accurate species level classification based on the paucity of sequence data accumulated in public databases, we built a Naïve Bayes classifier representing a diverse set of high-quality sequences from medically important bacterial organisms. We show that for species identification, a model-based approach is superior to an alignment based method. Overall, between 16S gene based and clinical identities, our study shows a genus-level concordance rate of 96% and a species-level concordance rate of 87.5%. We point to multiple cases of probable clinical misidentification with traditional culture based identification across a wide range of gram-negative rods and gram-positive cocci as well as common gram-negative cocci. PMID:25658760
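A model-based 16S classifier of the kind described typically represents each sequence by its k-mer content and fits a multinomial Naive Bayes model over taxon labels. The scikit-learn sketch below shows that structure on made-up fragments; the k-mer length, toy training sequences, and library choice are illustrative assumptions, not the classifier built in the study.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

def kmers(seq, k=8):
    """Represent a 16S sequence as a space-separated list of overlapping k-mers."""
    return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

# Toy training set: (sequence fragment, species label). Real training data would be
# curated, full-length 16S sequences from reference databases.
train = [
    ("ACGTACGTGGCCTTAACGTACGTGGCCTTA", "Escherichia coli"),
    ("ACGTACGTGGCCTTCACGTACGTGGCCTTC", "Escherichia coli"),
    ("TTGACCGGAATTCCGTTGACCGGAATTCCG", "Staphylococcus aureus"),
    ("TTGACCGGAATTCAGTTGACCGGAATTCAG", "Staphylococcus aureus"),
]
X = [kmers(seq) for seq, _ in train]
y = [label for _, label in train]

clf = make_pipeline(CountVectorizer(analyzer="word", token_pattern=r"\S+"), MultinomialNB())
clf.fit(X, y)
print(clf.predict([kmers("ACGTACGTGGCCTTGACGTACGTGGCCTTG")]))   # -> ['Escherichia coli']
```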
Kademoglou, Katerina; Williams, Adrian C; Collins, Chris D
2018-04-15
Human uptake of flame retardants (FRs) such as polybrominated diphenyl ethers (PBDEs) via indoor dust ingestion is commonly considered to be 100% bioaccessible, leading to potential risk overestimation. Here, we present a novel in vitro colon-extended physiologically-based extraction test (CE-PBET) with Tenax TA® as an absorptive "sink" capable of enhancing PBDE gut bioaccessibility. A cellulose-based dialysis membrane (MW cut-off 3.5 kDa) with high pH and temperature tolerance was used to encapsulate the Tenax TA®, facilitating efficient physical separation between the absorbent and the dust while minimizing re-absorption of the ingested PBDEs to the dust particles. As a proof of concept, PBDE-spiked indoor dust samples (n=3) were tested under four different conditions: without any Tenax TA® addition (control) and with three different Tenax TA® loadings (i.e. 0.25, 0.5 or 0.75 g). Our results show that in order to maintain a constant sorptive gradient for the low MW PBDEs, 0.5 g of Tenax TA® is required in CE-PBET. Tenax TA® inclusion (0.5 g) resulted in 40% gut bioaccessibility for BDE153 and BDE183, whereas greater bioaccessibility values were seen for less hydrophobic PBDEs such as BDE28 and BDE47 (~60%). When tested using SRM 2585 (n=3), our new Tenax TA® method did not show any statistically significant difference (p>0.05) between non-spiked and PBDE-spiked SRM 2585 treatments. Our study describes an efficient method in which, owing to the sophisticated design, Tenax TA® recovery and subsequent bioaccessibility determination can be achieved simply and reliably. Copyright © 2017 Elsevier B.V. All rights reserved.
How to select electrical end-use meters for proper measurement of DSM impact estimates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, M.
1994-12-31
Does metering actually provide higher accuracy impact estimates? The answer is sometimes yes, sometimes no. It depends on how the metered data will be used. DSM impact estimates can be achieved in a variety of ways, including engineering algorithms, modeling and statistical methods. Yet for all of these methods, impacts can be calculated as the difference in pre- and post-installation annual load shapes. Increasingly, end-use metering is being used to either adjust and calibrate a particular estimate method, or measure load shapes directly. It is therefore not surprising that metering has become synonymous with higher accuracy impact estimates. If metered data is used as a component in an estimating methodology, its relative contribution to accuracy can be analyzed through propagation of error or "POE" analysis. POE analysis is a framework which can be used to evaluate different metering options and their relative effects on cost and accuracy. If metered data is used to directly measure pre- and post-installation load shapes to calculate energy and demand impacts, then the accuracy of the whole metering process directly affects the accuracy of the impact estimate. This paper is devoted to the latter case, where the decision has been made to collect high-accuracy metered data of electrical energy and demand. The underlying assumption is that all meters can yield good results if applied within the scope of their limitations. The objective is to know the application, understand what meters are actually doing to measure and record power, and decide with confidence when a sophisticated meter is required, and when a less expensive type will suffice.
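In the simplest case, where the impact is the difference between independently estimated pre- and post-installation values, propagation of error gives the variance of the impact as the sum of the component variances. The short sketch below works through that arithmetic with hypothetical numbers; the figures and the 95% confidence multiplier are illustrative only.

```python
import math

# Hypothetical metered annual energy use (kWh) and the standard error of each estimate.
pre_kwh, pre_se = 125_000.0, 2_500.0      # pre-installation
post_kwh, post_se = 101_000.0, 2_100.0    # post-installation

impact = pre_kwh - post_kwh                       # estimated savings
impact_se = math.sqrt(pre_se**2 + post_se**2)     # POE for a difference of independent estimates

ci_low, ci_high = impact - 1.96 * impact_se, impact + 1.96 * impact_se
print(f"savings = {impact:,.0f} kWh  (95% CI {ci_low:,.0f} to {ci_high:,.0f} kWh)")
print(f"relative precision = {1.96 * impact_se / impact:.1%}")
```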
Tensile properties of the transverse carpal ligament and carpal tunnel complex.
Ugbolue, Ukadike C; Gislason, Magnus K; Carter, Mark; Fogg, Quentin A; Riches, Philip E; Rowe, Philip J
2015-08-01
A new sophisticated method that uses video analysis techniques together with a Maillon Rapide Delta to determine the tensile properties of the transverse carpal ligament-carpal tunnel complex has been developed. Six embalmed cadaveric specimens amputated at the mid-forearm and aged (mean (SD)) 82 (6.29) years were tested. The six hands were from three males (four hands) and one female (two hands). Using trigonometry and geometry, the elongation and strain of the transverse carpal ligament and carpal arch were calculated. The cross-sectional area of the transverse carpal ligament was determined. Tensile properties of the transverse carpal ligament-carpal tunnel complex and load-displacement data were also obtained. Descriptive statistics, one-way ANOVA together with a post-hoc analysis (Tukey), and t-tests were incorporated. A novel transverse carpal ligament-carpal tunnel complex testing method has been developed. The results suggest that there were no significant differences between the original transverse carpal ligament width and the transverse carpal ligament width at peak elongation (P=0.108). There were significant differences between the original carpal arch width and the carpal arch width at peak elongation (P=0.002). The transverse carpal ligament failed either at the mid-substance or at its bony attachments. At maximum deformation the peak load and maximum transverse carpal ligament displacements ranged from 285.74 N to 1369.66 N and from 7.09 mm to 18.55 mm, respectively. The transverse carpal ligament cross-sectional area mean (SD) was 27.21 (3.41) mm². Using this method, the results provide useful biomechanical information and data about the tensile properties of the transverse carpal ligament-carpal tunnel complex. Copyright © 2015 Elsevier Ltd. All rights reserved.
Dynamic Gate Product and Artifact Generation from System Models
NASA Technical Reports Server (NTRS)
Jackson, Maddalena; Delp, Christopher; Bindschadler, Duane; Sarrel, Marc; Wollaeger, Ryan; Lam, Doris
2011-01-01
Model Based Systems Engineering (MBSE) is gaining acceptance as a way to formalize systems engineering practice through the use of models. The traditional method of producing and managing a plethora of disjointed documents and presentations ("Power-Point Engineering") has proven both costly and limiting as a means to manage the complex and sophisticated specifications of modern space systems. We have developed a tool and method to produce sophisticated artifacts as views and by-products of integrated models, allowing us to minimize the practice of "Power-Point Engineering" in model-based projects and demonstrate the ability of MBSE to work within and supersede traditional engineering practices. This paper describes how we have created and successfully used model-based document generation techniques to extract paper artifacts from complex SysML and UML models in support of successful project reviews. Use of formal SysML and UML models for architecture and system design enables production of review documents, textual artifacts, and analyses that are consistent with one another and require virtually no labor-intensive maintenance across small-scale design changes and multiple authors. This effort thus enables approaches that focus more on rigorous engineering work and less on "PowerPoint engineering" and the production of paper-based documents or their "office-productivity" file equivalents.
AN INTERNATIONAL WORKSHOP ON LIFE CYCLE IMPACT ASSESSMENT SOPHISTICATION
On November 29-30, 1998, in Brussels, an international workshop was held to discuss Life Cycle Impact Assessment (LCIA) Sophistication. Approximately 50 LCA experts attended the workshop from North America, Europe, and Asia. Prominent practitioners and researchers were invited to ...
2007-06-15
Al Qaeda is a product of the forces of globalization. Increasing access to global finances, international travel, and sophisticated technology is...evolution.
Information technology sophistication in nursing homes.
Alexander, Gregory L; Wakefield, Douglas S
2009-07-01
There is growing recognition that a more sophisticated information technology (IT) infrastructure is needed to improve the quality of nursing home care in the United States. The purpose of this study was to explore the concept of IT sophistication in nursing homes considering the level of technological diversity, maturity and level of integration in resident care, clinical support, and administration. Twelve IT stakeholders were interviewed from 4 nursing homes considered to have high IT sophistication using focus groups and key informant interviews. Common themes were derived using qualitative analytics and axial coding from field notes collected during interviews and focus groups. Respondents echoed the diversity of the innovative IT systems being implemented; these included resident alerting mechanisms for clinical decision support, enhanced reporting capabilities of patient-provider interactions, remote monitoring, and networking among affiliated providers. Nursing home IT is in its early stages of adoption; early adopters are beginning to realize benefits across clinical domains including resident care, clinical support, and administrative activities. The most important thread emerging from these discussions was the need for further interface development between IT systems to enhance integrity and connectivity. The study shows that some early adopters of sophisticated IT systems in nursing homes are beginning to achieve added benefit for resident care, clinical support, and administrative activities.
Office-based elastographic technique for quantifying mechanical properties of skeletal muscle.
Ballyns, Jeffrey J; Turo, Diego; Otto, Paul; Shah, Jay P; Hammond, Jennifer; Gebreab, Tadesse; Gerber, Lynn H; Sikdar, Siddhartha
2012-08-01
Our objectives were to develop a new, efficient, and easy-to-administer approach to ultrasound elastography and assess its ability to provide quantitative characterization of viscoelastic properties of skeletal muscle in an outpatient clinical environment. We sought to show its validity and clinical utility in assessing myofascial trigger points, which are associated with myofascial pain syndrome. Ultrasound imaging was performed while the muscle was externally vibrated at frequencies in the range of 60 to 200 Hz using a handheld vibrator. The spatial gradient of the vibration phase yielded the shear wave speed, which is related to the viscoelastic properties of tissue. The method was validated using a calibrated experimental phantom, the biceps brachii muscle in healthy volunteers (n = 6), and the upper trapezius muscle in symptomatic patients with axial neck pain (n = 13) and asymptomatic (pain-free) control participants (n = 9). Using the experimental phantom, our method was able to quantitatively measure the shear moduli with error rates of less than 20%. The mean shear modulus ± SD in the normal biceps brachii measured 12.5 ± 3.4 kPa, within the range of published values using more sophisticated methods. Shear wave speeds in active myofascial trigger points and the surrounding muscle tissue were significantly higher than those in normal tissue at high frequency excitations (>100 Hz; P < .05). Off-the-shelf office-based equipment can be used to quantitatively characterize skeletal muscle viscoelastic properties with estimates comparable to those using more sophisticated methods. Our preliminary results using this method indicate that patients with spontaneous neck pain and symptomatic myofascial trigger points have increased tissue heterogeneity at the trigger point site and the surrounding muscle tissue.
Reliability-Based Model to Analyze the Performance and Cost of a Transit Fare Collection System.
DOT National Transportation Integrated Search
1985-06-01
The collection of transit system fares has become more sophisticated in recent years, with more flexible structures requiring more sophisticated fare collection equipment to process tickets and admit passengers. However, this new and complex equipmen...
Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index
Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy
2012-01-01
Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable, due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the use of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124
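One way to collapse a combinatorial profile into a single number is to weight the fraction of cells performing i of n functions by an increasing function of i/n. The sketch below implements that general idea with an adjustable exponent; the specific weighting and the example profiles are assumptions made for illustration, and readers should consult the paper for the exact definition of the published index.

```python
import numpy as np

def polyfunctionality_index(fractions, q=1.0):
    """Summarize a polyfunctionality profile as a single number.

    fractions[i] is the fraction of responding cells performing exactly i functions,
    for i = 0..n. Cells performing more simultaneous functions are weighted more
    heavily via (i/n)**q, with q tuning how strongly polyfunctional cells dominate.
    (Illustrative weighting only; see the paper for the published definition.)"""
    f = np.asarray(fractions, dtype=float)
    f = f / f.sum()                           # normalize the profile to sum to 1
    n = f.size - 1
    weights = (np.arange(n + 1) / n) ** q
    return float(np.sum(f * weights))

# Hypothetical profiles over 0..5 simultaneous functions.
mostly_monofunctional = [0.10, 0.60, 0.20, 0.07, 0.02, 0.01]
more_polyfunctional = [0.05, 0.25, 0.30, 0.20, 0.12, 0.08]
for profile in (mostly_monofunctional, more_polyfunctional):
    print(round(polyfunctionality_index(profile, q=1.0), 3))   # higher = more polyfunctional
```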